Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All eyes had glaucomatous disks and fields, and reliable visual fields were performed every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. The kappa agreement coefficient between methods, and the sensitivity and specificity of each method using expert opinion as the gold standard, were calculated. The incidence of glaucoma progression was 16% to 18% over 3 years, but only 3 cases showed progression with all 3 methods. The kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between each of these and GPA trend analysis (k=0.57). Sensitivity and specificity were 71% and 96% for GPA event analysis, and 57% and 93% for GPA trend analysis. The 3 methods detected similar numbers of progressing cases. GPA event analysis and expert subjective assessment showed high agreement with each other and moderate agreement with GPA trend analysis. Over a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
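The kappa coefficients reported in this abstract can be reproduced with a short routine. Below is a minimal sketch of Cohen's kappa for a square agreement table; the 2x2 table of progressed/stable calls is hypothetical, invented for illustration, not the study's data.

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (list of rows)."""
    k = len(table)
    n = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(k)) / n   # observed agreement
    # chance agreement expected from the marginal totals
    p_exp = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(k)
    )
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical progressed/stable calls by two methods on 37 eyes:
# rows = method A (progressed, stable), columns = method B likewise
calls = [[6, 1],
         [2, 28]]
print(round(cohens_kappa(calls), 2))  # → 0.75
```

A value of 0.75 for this invented table would sit between the study's reported 0.57 and 0.82, illustrating the "moderate to high" range the abstract describes.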
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account their unique features: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.
Survival analysis: Part I — analysis of time-to-event
2018-01-01
Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
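As a concrete illustration of the first method named in this review, here is a minimal Kaplan-Meier estimator in pure Python. The follow-up times and censoring flags are hypothetical, in the spirit of the review's own hypothetical examples; a production analysis would use a vetted library such as lifelines.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        c = sum(1 for tt, e in data if tt == t)   # all leaving risk set at t
        if d > 0:
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= c
        i += c
    return curve

# Hypothetical data: times in months, 0 marks a censored observation
times  = [6, 7, 10, 15, 19, 25]
events = [1, 0, 1, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

The survival estimate steps down only at observed event times; censored observations simply shrink the risk set, which is the key distinction from a naive proportion.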
Research on Visual Analysis Methods of Terrorism Events
NASA Astrophysics Data System (ADS)
Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing
2016-06-01
As terrorism events occur with increasing frequency throughout the world, improving the capability to respond to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse the spatio-temporal distribution characteristics of events, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Six types of visual analysis are then proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the practicability of the methods.
Multi-scale comparison of source parameter estimation using empirical Green's function approach
NASA Astrophysics Data System (ADS)
Chen, X.; Cheng, Y.
2015-12-01
Analysis of earthquake source parameters requires correction for path effects, site response, and instrument response. The empirical Green's function (EGF) method is one of the most effective ways to remove path effects and station responses, by taking the spectral ratio between a larger and a smaller event. The traditional EGF method requires identifying suitable event pairs and analyzing each event individually. This allows high-quality estimates for strictly selected events; however, the quantity of resolvable source parameters is limited, which challenges the interpretation of spatio-temporal coherency. On the other hand, methods that exploit the redundancy of event-station pairs have been proposed, which use stacking to obtain systematic source parameter estimates for a large number of events at the same time. This allows large quantities of events to be examined systematically, facilitating analysis of spatio-temporal patterns and scaling relationships. However, it is unclear how much resolution is sacrificed in this process. Beyond the empirical Green's function calculation itself, the choice of model parameters and fitting methods also introduces biases. Here, using two regional focused arrays, the OBS array in the Mendocino region and the borehole array in the Salton Sea geothermal field, we compare the results from large-scale stacking analysis, small-scale cluster analysis, and single event-pair analysis with different fitting methods within completely different tectonic environments, in order to quantify the consistency and inconsistency in source parameter estimations and the associated problems.
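The core of the EGF idea, that the spectral ratio of a colocated large and small event cancels the shared path and site terms, can be sketched with synthetic spectra. Everything below (the Brune-type source model, the corner frequencies, the attenuation curve) is an illustrative assumption, not data or parameters from the study.

```python
import numpy as np

n, dt = 1024, 0.01
freqs = np.fft.rfftfreq(n, dt)   # frequency axis, Hz

def brune(f, omega0, fc):
    """Brune-type source spectrum: flat at omega0 below the corner fc."""
    return omega0 / (1.0 + (f / fc) ** 2)

path = np.exp(-freqs / 30.0)               # shared path/site attenuation
big   = brune(freqs, 100.0, 2.0) * path    # target event spectrum x path
small = brune(freqs, 1.0, 10.0) * path     # smaller EGF event, same path

ratio = big / small       # the shared path term cancels in the ratio
print(ratio[0])           # low-frequency ratio recovers the moment ratio, 100.0
```

Because the path term multiplies both spectra identically, it divides out exactly; in real data the cancellation is only approximate, which is one source of the biases the abstract discusses.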
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
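The rare-event machinery that BUS draws on can be illustrated with the simplest of the tools listed, importance sampling: shift the sampling density toward the failure domain and reweight by the density ratio. The threshold and sample size below are arbitrary choices for the sketch, not values from the paper.

```python
import math, random

random.seed(1)
t = 4.0                                        # rare-event threshold {X > t}
exact = 0.5 * math.erfc(t / math.sqrt(2.0))    # exact P(X > 4), ~3.2e-5

# Crude Monte Carlo would need on the order of 1/p samples to see any
# failures; importance sampling draws from N(t, 1) so failures are common,
# then reweights each hit by standard-normal density / proposal density.
N, acc = 100_000, 0.0
for _ in range(N):
    x = random.gauss(t, 1.0)
    if x > t:
        acc += math.exp(-0.5 * x * x + 0.5 * (x - t) ** 2)
acc /= N
print(f"estimate {acc:.2e}, exact {exact:.2e}")
```

The same reweighting logic is what lets BUS reuse reliability estimators (FORM, IS, SuS) for Bayesian updating: the posterior normalizing constant is itself a small probability.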
Kahan, Brennan C; Harhay, Michael O
2015-12-01
Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
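Among the center-adjustment methods the review examines is the Mantel-Haenszel estimator. A minimal sketch of the pooled odds ratio over per-center 2x2 tables, with hypothetical sparse data of the kind the review warns about (only a handful of events per center):

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel pooled odds ratio over (a, b, c, d) 2x2 tables:
    a, b = events/non-events on treatment; c, d = events/non-events on control."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical per-center tables with few events per center
centers = [(3, 7, 1, 9), (2, 8, 1, 9), (4, 6, 2, 8)]
print(round(mantel_haenszel_or(centers), 2))  # → 2.85
```

Note that a center contributing zero events adds nothing to either sum, which is one concrete way sparse centers degrade such estimators and why the review points toward random-effects models or GEE instead.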
Tudur Smith, Catrin; Gueyffier, François; Kolamunnage‐Dona, Ruwanthi
2017-01-01
Background Joint modelling of longitudinal and time‐to‐event data is often preferred over separate longitudinal or time‐to‐event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time‐to‐event outcomes. The joint modelling literature focuses mainly on the analysis of single studies with no methods currently available for the meta‐analysis of joint model estimates from multiple studies. Methods We propose a 2‐stage method for meta‐analysis of joint model estimates. These methods are applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta‐analyses of separate longitudinal or time‐to‐event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Results Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of use of joint rather than separate methods in a meta‐analytic setting where association exists between the longitudinal and time‐to‐event outcomes. Conclusions Where evidence of association between longitudinal and time‐to‐event outcomes exists, results from joint models over standalone analyses should be pooled in 2‐stage meta‐analyses. PMID:29250814
A comparison of analysis methods to estimate contingency strength.
Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T
2018-05-09
To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.
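An event-based contingency estimate is often operationalized as the difference P(reinforcement | response) − P(reinforcement | no response). The sketch below assumes that operationalization, which is one common choice and not necessarily the exact formulation compared in the study; the coded observations are hypothetical.

```python
def contingency_strength(stream):
    """Event-based contingency estimate from coded observations of
    (response occurred, reinforcer followed): P(Sr|R) - P(Sr|no R)."""
    r  = [s for b, s in stream if b]       # outcomes following a response
    nr = [s for b, s in stream if not b]   # outcomes with no response
    p_given_r  = sum(r)  / len(r)  if r  else 0.0
    p_given_nr = sum(nr) / len(nr) if nr else 0.0
    return p_given_r - p_given_nr

# Hypothetical observations: (response occurred, reinforcer followed)
obs = [(1, 1), (1, 1), (1, 0), (1, 1), (0, 0), (0, 1), (0, 0), (0, 0)]
print(contingency_strength(obs))  # → 0.5
```

A value near 0 would indicate response-independent reinforcement; a value near 1, a strong response-dependent schedule, which is the sensitivity the study's simulations probe.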
EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.
Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan
2018-01-01
Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method, the other implementing a symbolic analysis method into a unified event-based decision analysis software system for realtime detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAMs), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
The pros and cons of researching events ethnographically
2017-01-01
Events (remarkable, disruptive happenings) are important subjects of study for understanding processes of change. In this essay, I reflect upon the issue of what the ethnographic method has to offer for the analysis of this social phenomenon. To do so, I review three recently published ethnographic studies of events. My conclusion is that it is indeed a very useful method for understanding the feelings and ideas of people who are experiencing eventful situations, for instance around protests or natural disasters. However, using this method also brings about practical difficulties, such as the ‘luck’ that an event occurs at the ethnographic fieldwork site. Next, as transformative responses to events are not bound by the place or time of the happening, other methods (interviews, discourse analysis, surveys) that make it easier to follow them in varying locations and periods might be more suitable for getting a comprehensive picture of their meaning-making dynamics. PMID:29081715
Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.
Bender, Ralf; Beckmann, Lars; Lange, Stefan
2016-07-01
The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd.
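The paper's central caution, that naive proportions mislead when follow-up differs between arms, can be made concrete with invented numbers. Below, both arms show the same crude proportion of patients with an adverse event, yet exposure-adjusted incidence rates differ twofold because follow-up on one arm is half as long; all figures are hypothetical.

```python
def naive_proportion(events, n):
    """Crude proportion: patients with an event / patients randomized."""
    return events / n

def incidence_rate(events, person_years):
    """Exposure-adjusted rate: events / person-time under observation."""
    return events / person_years

# Hypothetical arms: equal event counts, unequal follow-up because
# patients on drug A switch treatment (and stop being observed) earlier
events_a, n_a, py_a = 10, 100, 50.0    # mean follow-up 0.5 years
events_b, n_b, py_b = 10, 100, 100.0   # mean follow-up 1.0 year

print(naive_proportion(events_a, n_a), naive_proportion(events_b, n_b))  # 0.1 0.1
print(incidence_rate(events_a, py_a), incidence_rate(events_b, py_b))    # 0.2 0.1
```

Even the incidence rate can mislead with recurrent events, as the paper notes, which is why it recommends full survival-time methods for adverse events as well.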
NASA Astrophysics Data System (ADS)
Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.
2012-04-01
This paper introduces an application of the novel EventTracker platform for instantaneous sensitivity analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods, because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on system state is of value, as events can cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event-tracking sensitivity analysis method describes the system variables and states as a collection of events. The more often an input variable occurs during the triggering of an event, the greater its potential impact on the final analysis of the system state. Experiments were designed to compare the proposed event-tracking sensitivity analysis with an existing entropy-based sensitivity analysis method. The results showed a 10% improvement in computational efficiency with no compromise in accuracy. They also showed that the computational time needed to perform the sensitivity analysis is 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs.
One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs, with the final aim of recovering the content of those reservoirs, both in onshore and offshore regions. Drilling a well is always guided by technical, economic, and safety constraints to protect crew, equipment, and the environment from injury, damage, and pollution. Although risk assessment and local practice provide a high degree of safety, uncertainty arises from the behaviour of the formation, which may cause critical situations at the rig. To overcome such uncertainties, real-time sensor measurements form a basis for predicting, and thus preventing, such crises; the proposed method supports the identification of the data necessary for this.
Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.
Lilly, Jonathan M
2017-04-01
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
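The detection step, locating events at maxima of the wavelet transform magnitude, can be sketched with a Morlet wavelet standing in for the generalized Morse wavelets the paper actually uses. The signal, noise level, and wavelet scale below are arbitrary assumptions for illustration.

```python
import numpy as np

n, dt = 1024, 1.0
t = np.arange(n) * dt
rng = np.random.default_rng(2)

# One time-localized 'event': a Gaussian-windowed oscillation at t = 400
center, f0, width = 400.0, 0.05, 20.0
signal = np.exp(-0.5 * ((t - center) / width) ** 2) \
    * np.cos(2 * np.pi * f0 * (t - center))
signal = signal + 0.1 * rng.standard_normal(n)

# Analytic Morlet wavelet matched to the event's frequency and scale
# (a stand-in for the paper's generalized Morse wavelets)
tau = np.arange(-128, 129) * dt
wavelet = np.exp(2j * np.pi * f0 * tau) * np.exp(-0.5 * (tau / width) ** 2)
wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))

# Correlate; the transform magnitude peaks at the event time
transform = np.abs(np.convolve(signal, np.conj(wavelet[::-1]), mode="same"))
peak_time = t[np.argmax(transform)]
print(peak_time)
```

Because the wavelet is analytic, the magnitude is a smooth envelope, so a single maximum marks the event rather than one maximum per oscillation cycle; element analysis adds the significance and region-of-influence criteria on top of this basic picture.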
A Method for Automated Detection of Usability Problems from Client User Interface Events
Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.
2005-01-01
Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher-order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
Simplex and duplex event-specific analytical methods for functional biotech maize.
Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young
2009-08-26
Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems and in living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis of biotech maize events 3272 and LY 038 on the basis of their 3' flanking regions. The qualitative primers confirmed specificity, yielding a single PCR product, and sensitivity down to 0.05% as the limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and the two event-specific 3' flanking DNA sequences of events 3272 and LY 038 as reference molecules. In-house validation of the quantitative methods was performed using six levels of mixing samples, from 0.1% to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of +/-30%. The limits of quantitation (LOQs) were 0.1% for the simplex real-time PCRs of events 3272 and LY 038 and 0.5% for the duplex real-time PCR of LY 038. This study shows that the event-specific analytical methods are applicable for qualitative and quantitative analysis of biotech maize events 3272 and LY 038.
Combining conversation analysis and event sequencing to study health communication.
Pecanac, Kristen E
2018-06-01
Good communication is essential in patient-centered care. The purpose of this paper is to describe conversation analysis and event sequencing and explain how integrating these methods strengthened the analysis in a study of communication between clinicians and surrogate decision makers in an intensive care unit. Conversation analysis was first used to determine how clinicians introduced the need for decision-making regarding life-sustaining treatment and how surrogate decision makers responded. Event sequence analysis then was used to determine the transitional probability (probability of one event leading to another in the interaction) that a given type of clinician introduction would lead to surrogate resistance or alignment. Conversation analysis provides a detailed analysis of the interaction between participants in a conversation. When combined with a quantitative analysis of the patterns of communication in an interaction, these data add information on the communication strategies that produce positive outcomes. Researchers can apply this mixed-methods approach to identify beneficial conversational practices and design interventions to improve health communication. © 2018 Wiley Periodicals, Inc.
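The transitional probability in the study's second stage is, in its simplest form, a conditional relative frequency over coded antecedent-response pairs. A minimal sketch with hypothetical codes (the category labels below are invented for illustration, not the study's coding scheme):

```python
def transitional_probability(pairs, antecedent, consequent):
    """P(consequent | antecedent) estimated from coded
    (antecedent, response) event pairs, as in event-sequence analysis."""
    matches = [c for a, c in pairs if a == antecedent]
    if not matches:
        return 0.0
    return matches.count(consequent) / len(matches)

# Hypothetical coded conversations: how the clinician introduced the
# decision, and whether the surrogate aligned or resisted
coded = [
    ("recommendation", "align"), ("recommendation", "resist"),
    ("recommendation", "align"), ("open_question", "align"),
    ("open_question", "align"), ("recommendation", "resist"),
]
print(transitional_probability(coded, "recommendation", "resist"))  # → 0.5
```

Comparing such probabilities across introduction types is what quantifies, in the mixed-methods design described above, which conversational practices tend to produce alignment.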
Detection and analysis of a transient energy burst with beamforming of multiple teleseismic phases
NASA Astrophysics Data System (ADS)
Retailleau, Lise; Landès, Matthieu; Gualtieri, Lucia; Shapiro, Nikolai M.; Campillo, Michel; Roux, Philippe; Guilbert, Jocelyn
2018-01-01
Seismological detection methods are traditionally based on picking techniques. These methods cannot be used to analyse emergent signals whose arrivals cannot be picked. Here, we detect and locate seismic events by applying a beamforming method that combines multiple body-wave phases to USArray data. This method exploits the consistency and characteristic behaviour of teleseismic body waves recorded by a large-scale, yet dense, seismic network. We perform time-slowness analysis of the signals and correlate it with the time-slowness signatures of the different body-wave phases predicted by a global traveltime calculator, to determine the occurrence of an event with no a priori information about it. We apply this method continuously to one year of data to analyse the different events that generate signals reaching the USArray network. In particular, we analyse in detail a low-frequency secondary microseismic event that occurred on 1 February 2010. This event, which lasted one day, has a narrow frequency band around 0.1 Hz, and it occurred at a distance of 150° from the USArray network, south of Australia. We show that the most energetic phase observed is the PKPab phase. Direct amplitude analysis of regional seismograms confirms the occurrence of this event. We compare the seismic observations with models of the spectral density of the pressure field generated by interference between oceanic waves. We attribute the observed signals to a storm-generated microseismic event that occurred along the Southeast Indian Ridge.
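At the core of time-slowness beamforming is a delay-and-sum stack: each trace is shifted according to a trial slowness and the plane-wave moveout across the array, and the slowness that maximizes stack energy identifies the incoming phase. A simplified one-dimensional sketch on synthetic data (this is the generic technique, not the authors' processing chain):

```python
def delay_and_sum(traces, positions, slowness, dt):
    """Shift each trace by slowness * position (plane-wave moveout) and stack."""
    n = len(traces[0])
    stack = [0.0] * n
    for trace, x in zip(traces, positions):
        shift = int(round(slowness * x / dt))
        for i in range(n):
            j = i + shift
            if 0 <= j < n:
                stack[i] += trace[j]
    return [s / len(traces) for s in stack]

# Synthetic impulse crossing a 4-station line at slowness 0.5 s/km
positions = [0.0, 10.0, 20.0, 30.0]  # station offsets in km
dt, true_slowness, t0 = 1.0, 0.5, 5
traces = []
for x in positions:
    tr = [0.0] * 60
    tr[int(t0 + true_slowness * x / dt)] = 1.0  # arrival delayed by moveout
    traces.append(tr)

def stack_energy(s):
    return sum(v * v for v in delay_and_sum(traces, positions, s, dt))

best = max([0.0, 0.25, 0.5, 0.75], key=stack_energy)
print(best)  # grid point at which the arrivals align coherently
```

In the study's setting this scan runs over a two-dimensional slowness grid and is compared against predicted time-slowness curves for candidate teleseismic phases.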
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bley, D.C.; Cooper, S.E.; Forester, J.A.
ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.
NASA Astrophysics Data System (ADS)
Otto, Friederike E. L.; van der Wiel, Karin; van Oldenborgh, Geert Jan; Philip, Sjoukje; Kew, Sarah F.; Uhe, Peter; Cullen, Heidi
2018-02-01
On 4-6 December 2015, storm Desmond caused very heavy rainfall in Northern England and Southern Scotland, which led to widespread flooding. A week after the event we provided an initial assessment of the influence of anthropogenic climate change on the likelihood of one-day precipitation events averaged over an area encompassing Northern England and Southern Scotland, using data and methods available immediately after the event occurred. The analysis was based on three independent methods of extreme event attribution: historical observed trends, coupled climate model simulations, and a large ensemble of regional model simulations. All three methods agreed that the effect of climate change was positive, making precipitation events like this about 40% more likely, with a provisional 2.5%-97.5% confidence interval of 5%-80%. Here we revisit the assessment using more station data, an additional monthly event definition, a second global climate model, and regional model simulations of winter 2015/16. The overall result of the analysis is similar to the real-time analysis, with a best estimate of a 59% increase in event frequency, but a larger confidence interval that does include no change. It is important to highlight that the observational data in the additional monthly analysis represent not only the rainfall associated with storm Desmond but also that of storms Eve and Frank, which occurred towards the end of the month.
Statistical analysis of life history calendar data.
Eerola, Mervi; Helske, Satu
2016-04-01
The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis with the model-free data mining method, sequence analysis. In event history analysis, we estimate the cumulative prediction probabilities of life events over the entire trajectory instead of transition hazards. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns, while event history analysis is needed for causal inquiries. © The Author(s) 2012.
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and a cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV, and group similar audit event sequences based on cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in some realistic cases where normal and attack activities are intermingled.
Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions
NASA Technical Reports Server (NTRS)
Mcguire, Stephen C.
1987-01-01
The use of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon is described. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark-gluon plasma. The techniques include chi-square null hypothesis tests, discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. The most effective application of these methods relies upon the availability of many events, or of single events with very high multiplicities.
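The composite unit-vector idea can be illustrated with the mean resultant length of unit vectors placed at the emission azimuths: values near zero indicate isotropy, values near one indicate strong azimuthal clustering. A hedged sketch on synthetic angles (not the JACEE data, and a simplification of the composite-vector test actually used):

```python
import math
import random

def resultant_length(angles):
    """Mean resultant length R of unit vectors at the given azimuths (radians).
    R near 0 -> isotropic emission; R near 1 -> strongly anisotropic."""
    c = sum(math.cos(a) for a in angles) / len(angles)
    s = sum(math.sin(a) for a in angles) / len(angles)
    return math.hypot(c, s)

random.seed(0)
iso = [random.uniform(0.0, 2.0 * math.pi) for _ in range(500)]   # isotropic azimuths
clustered = [random.gauss(1.0, 0.3) for _ in range(500)]         # clustered azimuths
r_iso = resultant_length(iso)
r_clustered = resultant_length(clustered)
print(r_iso, r_clustered)
```

For an isotropic sample of n angles, R is expected to scale like 1/sqrt(n), which gives a simple null benchmark against which an observed event's azimuthal structure can be judged.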
Lognormal Approximations of Fault Tree Uncertainty Distributions.
El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P
2018-01-26
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions: therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
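The sampling baselines compared in such studies can be sketched in a few lines: Monte Carlo propagation of lognormal basic-event uncertainties through a rare-event fault-tree approximation, plus a Wilks-style bound taken as the maximum of 59 samples (which upper-bounds the 95th percentile with 95% confidence). The tree structure and parameters below are invented for illustration, not taken from the article:

```python
import math
import random
import statistics

random.seed(0)

def sample_top(n):
    """Top event = (A AND B) OR C, with lognormally distributed basic-event
    probabilities. Rare-event approximation: P(A and B) ~ pA*pB, P(or) ~ sum."""
    tops = []
    for _ in range(n):
        p_a = random.lognormvariate(math.log(1e-3), 0.5)
        p_b = random.lognormvariate(math.log(2e-3), 0.5)
        p_c = random.lognormvariate(math.log(1e-4), 0.5)
        tops.append(p_a * p_b + p_c)
    return tops

full = sample_top(20000)
median = statistics.median(full)
p95 = statistics.quantiles(full, n=20)[-1]  # 95th percentile estimate
wilks_bound = max(sample_top(59))           # one-sided 95/95 Wilks bound
print(median, p95, wilks_bound)
```

The contrast the article draws is visible here: the full Monte Carlo run needs thousands of samples to resolve the tail, while the Wilks bound uses only 59 evaluations, and a closed-form lognormal approximation would need none.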
Revisiting a Meta-Analysis of Helpful Aspects of Therapy in a Community Counselling Service
ERIC Educational Resources Information Center
Quick, Emma L; Dowd, Claire; Spong, Sheila
2018-01-01
This small scale mixed methods study examines helpful events in a community counselling setting, categorising impacts of events according to Timulak's [(2007). Identifying core categories of client-identified impact of helpful events in psychotherapy: A qualitative meta-analysis. "Psychotherapy Research," 17, 305-314] meta-synthesis of…
Why conventional detection methods fail in identifying the existence of contamination events.
Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han
2016-04-15
Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. It also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
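A minimal version of a Euclidean-distance (MED-style) detector flags a sample when its multi-parameter water-quality vector drifts too far from a recent baseline mean. This is an illustrative reduction of the idea, not the paper's implementation; the parameter vector and threshold are assumptions:

```python
import math

def med_alarm(baseline_window, current, threshold):
    """Alarm if the Euclidean distance between the current multi-parameter
    sample (e.g., pH, turbidity, residual chlorine) and the mean of a
    recent baseline window exceeds the threshold."""
    n = len(baseline_window)
    mean = [sum(col) / n for col in zip(*baseline_window)]
    return math.dist(current, mean) > threshold

baseline = [[7.2, 0.5, 1.0], [7.1, 0.6, 1.1], [7.3, 0.5, 0.9]]
print(med_alarm(baseline, [7.2, 0.5, 1.0], threshold=1.0))  # near-baseline sample
print(med_alarm(baseline, [6.0, 3.0, 0.1], threshold=1.0))  # spike-like deviation
```

The paper's finding is intuitive from this structure: a detector keyed to distance from a baseline responds strongly to sudden spikes but can miss real incidents whose signature is a correlated, gradual shift across parameters.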
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated process monitoring methods to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework, which utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time-series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete-event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision making can benefit not only from known time-series relationships among measured signals but also from known event-sequence relationships among generated events. This available knowledge at both the time-series and discrete-event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then demonstrated on an illustrative monitored system based on pyroprocessing, and results are discussed.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
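The precursor coincidence rate at the heart of event coincidence analysis is simply the fraction of events in one series that are preceded, within a tolerance window ΔT at lag τ, by an event in the other series. A minimal sketch with made-up timestamps (the flood/outbreak labels echo the paper's application but the numbers are invented):

```python
def coincidence_rate(series_a, series_b, delta_t, tau=0.0):
    """Fraction of events in series_a preceded by at least one event in
    series_b within the window [t_a - tau - delta_t, t_a - tau]."""
    hits = 0
    for t_a in series_a:
        if any(0.0 <= (t_a - tau) - t_b <= delta_t for t_b in series_b):
            hits += 1
    return hits / len(series_a)

floods = [2.0, 5.0, 9.0]      # hypothetical event times (e.g., months)
outbreaks = [2.5, 9.4]
print(coincidence_rate(outbreaks, floods, delta_t=1.0))  # both outbreaks follow a flood
```

Significance testing then compares the observed rate against rates obtained under a null model, e.g., by redrawing one series as a Poisson process of the same intensity.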
Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana
2016-01-01
Objectives There is no consensus on whether studies with no observed events in both the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handle them differently depending on the choice of effect measure and the authors' discretion. Our objective was to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Methods We simulated 2500 data sets for different scenarios, varying the parameters of baseline event rate, treatment effect, number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis, namely Peto, Mantel-Haenszel with fixed-effects and random-effects models, and the inverse variance method with fixed-effects and random-effects models, using bias, root mean square error, length of the 95% CI, and coverage. Results The overall performance of including or excluding BA0E studies in meta-analysis varied according to the magnitude of the true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the estimates from excluding BA0E studies showed smaller bias than those from including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect.
Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effects are unclear. PMID:27531725
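The Peto pooling referenced above combines observed-minus-expected event counts and hypergeometric variances across trials; a both-armed zero-event trial contributes zero to both sums, which is why the Peto estimate itself is unchanged by including or excluding such trials. A hedged sketch with invented trial counts:

```python
import math

def peto_pooled_or(trials):
    """Peto pooled odds ratio. trials: (events_trt, n_trt, events_ctl, n_ctl)."""
    sum_oe = sum_v = 0.0
    for a, n1, c, n2 in trials:
        n, m = n1 + n2, a + c
        if m == 0 or m == n:
            continue  # uninformative: O - E = 0 and V = 0 for such trials
        e = m * n1 / n                                  # expected events, treatment arm
        v = m * (n - m) * n1 * n2 / (n ** 2 * (n - 1))  # hypergeometric variance
        sum_oe += a - e
        sum_v += v
    return math.exp(sum_oe / sum_v)

trials = [(1, 100, 3, 100), (2, 200, 5, 200)]   # invented counts
with_ba0e = trials + [(0, 50, 0, 50)]           # add a both-armed zero-event study
print(peto_pooled_or(trials), peto_pooled_or(with_ba0e))  # identical by construction
```

Note this identity holds for the point estimate of the Peto method specifically; other pooling methods (and continuity corrections) can shift when BA0E studies are added, which is the behaviour the simulation study probes.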
Interval sampling methods and measurement error: a computer simulation.
Wirth, Oliver; Slaven, James; Taylor, Matthew A
2014-01-01
A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
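The three sampling methods can be reproduced in a few lines: momentary time sampling (MTS) checks occupancy at interval boundaries, partial-interval recording (PIR) scores an interval if the event occurs at any point within it, and whole-interval recording (WIR) requires the event to span the entire interval. A toy simulation (the parameters are arbitrary, not those of the study) that exhibits PIR's overestimation and WIR's underestimation of true prevalence:

```python
import random

random.seed(1)

def occupied_fraction(onsets, dur, lo, hi, step=0.1):
    """Fraction of sampled instants in [lo, hi) during which any event is active."""
    n = max(1, int(round((hi - lo) / step)))
    ts = [lo + i * step for i in range(n)]
    return sum(any(s <= t < s + dur for s in onsets) for t in ts) / n

def simulate(obs_len=600.0, interval=10.0, n_events=20, dur=3.0):
    onsets = [random.uniform(0.0, obs_len - dur) for _ in range(n_events)]
    n_int = int(obs_len / interval)
    mts = sum(any(s <= i * interval < s + dur for s in onsets)
              for i in range(n_int)) / n_int
    pir = sum(occupied_fraction(onsets, dur, i * interval, (i + 1) * interval) > 0
              for i in range(n_int)) / n_int
    wir = sum(occupied_fraction(onsets, dur, i * interval, (i + 1) * interval) > 0.999
              for i in range(n_int)) / n_int
    true_prev = occupied_fraction(onsets, dur, 0.0, obs_len)
    return true_prev, mts, pir, wir

true_prev, mts, pir, wir = simulate()
print(true_prev, mts, pir, wir)  # PIR >= true prevalence >= WIR
```

Repeating the simulation over many seeds and parameter combinations, as the study does, turns these single-run errors into error distributions per method.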
Methods for improved forewarning of critical events across multiple data channels
Hively, Lee M [Philadelphia, TN]
2007-04-24
This disclosed invention concerns improvements in forewarning of critical events via phase-space dissimilarity analysis of data from mechanical devices, electrical devices, biomedical data, and other physical processes. First, a single channel of process-indicative data is selected that can be used in place of multiple data channels without sacrificing consistent forewarning of critical events. Second, the method discards data of inadequate quality via statistical analysis of the raw data, because the analysis of poor quality data always yields inferior results. Third, two separate filtering operations are used in sequence to remove both high-frequency and low-frequency artifacts using a zero-phase quadratic filter. Fourth, the method constructs phase-space dissimilarity measures (PSDM) by combining multi-channel time-serial data into a multi-channel time-delay phase-space reconstruction. Fifth, the method uses a composite measure of dissimilarity (C_i) to provide a forewarning of failure and an indicator of failure onset.
A substitution method to improve completeness of events documentation in anesthesia records.
Lamer, Antoine; De Jonckheere, Julien; Marcilly, Romaric; Tavernier, Benoît; Vallet, Benoît; Jeanne, Mathieu; Logier, Régis
2015-12-01
Anesthesia information management systems (AIMS) are optimized to find and display data and curves about one specific intervention, but not for retrospective analysis of a large volume of interventions. Such systems present two main limitations: (1) the transactional database architecture, and (2) the incompleteness of documentation. To solve the architectural problem, data warehouses were developed to provide an architecture suitable for analysis. However, the incompleteness of documentation remains unsolved. In this paper, we describe a method for determining substitution rules in order to detect missing anesthesia events in an anesthesia record. Our method is based on the principle that a missing event can be detected using a substitution event, defined as the nearest documented event. As an example, we focused on the automatic detection of the start and the end of the anesthesia procedure when these events were not documented by the clinicians. We applied our method to a set of records to evaluate (1) the event detection accuracy and (2) the improvement in valid records. For the years 2010-2012, we obtained event detection with a precision of 0.00 (-2.22; 2.00) min for the start of anesthesia and 0.10 (0.00; 0.35) min for the end of anesthesia. We also increased data completeness by 21.1% (from 80.3% to 97.2% of the total database) for the start and end of anesthesia events. This method appears efficient for replacing missing "start and end of anesthesia" events. It could also be used to replace other missing time events in this particular data warehouse, as well as in other kinds of data warehouses.
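The substitution principle described here, replacing a missing event with the nearest documented event plus a typical offset, can be sketched as follows. The event labels and offsets are hypothetical illustrations, not the rules derived in the paper:

```python
def impute_event_time(events, target, substitutes):
    """events: dict of documented event label -> time (minutes).
    If target is missing, fall back to the first available substitute
    event plus its typical offset; return None if none is documented."""
    if target in events:
        return events[target]
    for label, offset in substitutes:
        if label in events:
            return events[label] + offset
    return None

# Hypothetical rule: anesthesia start ~ 5 min before intubation,
# else ~ 20 min before incision (offsets in minutes).
rules = [("intubation", -5.0), ("incision", -20.0)]
events = {"intubation": 12.0, "incision": 25.0}
print(impute_event_time(events, "anesthesia_start", rules))  # 7.0
```

In practice the substitute events and offsets would be learned from records where the target event *was* documented, which is also how the paper's precision intervals could be estimated.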
2012-01-01
Background Adverse consequences of medical interventions are a source of concern, but clinical trials may lack power to detect elevated rates of such events, while observational studies have inherent limitations. Meta-analysis allows the combination of individual studies, which can increase power and provide stronger evidence relating to adverse events. However, meta-analysis of adverse events has associated methodological challenges. The aim of this study was to systematically identify and review the methodology used in meta-analyses where a primary outcome is an adverse or unintended event following a therapeutic intervention. Methods Using a collection of reviews identified previously, 166 references including a meta-analysis were selected for review. At least one of the primary outcomes in each review was an adverse or unintended event. The nature of the intervention, source of funding, number of individual meta-analyses performed, number of primary studies included in the review, and use of meta-analytic methods were all recorded. Specific areas of interest relating to the methods used included the choice of outcome metric, methods of dealing with sparse events, heterogeneity, publication bias, and use of individual patient data. Results The 166 included reviews were published between 1994 and 2006. Interventions included drugs and surgery, among others. Many of the references reviewed included multiple meta-analyses, with 44.6% (74/166) including more than ten. Randomised trials only were included in 42.2% of the reviews (70/166), observational studies only in 33.7% (56/166), and a mix of observational studies and trials in 15.7% (26/166). Sparse data, in the form of zero events in one or both arms where the outcome was a count of events, were found in 64 reviews of two-arm studies, of which 41 (64.1%) had zero events in both arms.
Conclusions Meta-analyses of adverse events data are common and useful in terms of increasing the power to detect an association with an intervention, especially when the events are infrequent. However, with regard to existing meta-analyses, a wide variety of different methods have been employed, often with no evident rationale for using a particular approach. More specifically, the approach to dealing with zero events varies, and guidelines on this issue would be desirable. PMID:22553987
Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nowlen, Steven Patrick; Hyslop, J. S.
2010-04-01
Fire probabilistic risk assessment (PRA) methods utilize data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively by insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper discusses the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas that could benefit from more complete event reporting, and that are discussed in the paper, include fire event frequency analysis; analysis of fire detection and suppression system performance, including incipient detection systems; analysis of manual fire fighting performance; treatment of fire growth from incipient stages to fully involved fires; operator response to fire events; the impact of smoke on plant operations and equipment; and the impact of fire-induced cable failures on plant electrical circuits.
Event time analysis of longitudinal neuroimage data.
Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce
2014-08-15
This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.
Structural system reliability calculation using a probabilistic fault tree analysis method
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.
1992-01-01
The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA method.
Detection of Abnormal Events via Optical Flow Feature Analysis
Wang, Tian; Snoussi, Hichem
2015-01-01
In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of optical flow orientation descriptor and a classification method. The details of the histogram of optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
Job optimization in ATLAS TAG-based distributed analysis
NASA Astrophysics Data System (ADS)
Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.
2010-04-01
The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.
Detection of goal events in soccer videos
NASA Astrophysics Data System (ADS)
Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas
2005-01-01
In this paper, we present an automatic extraction of goal events in soccer videos using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio contents comprises three steps: 1) extraction of audio features from a video sequence, 2) event candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method with the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games comprising eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.
An unjustified benefit: immortal time bias in the analysis of time-dependent events.
Gleiss, Andreas; Oberbauer, Rainer; Heinze, Georg
2018-02-01
Immortal time bias is a problem arising from methodologically flawed analyses of time-dependent events in survival analyses. We illustrate the problem by analysis of a kidney transplantation study. Following patients from transplantation to death, groups defined by the occurrence or nonoccurrence of graft failure during follow-up seemingly had equal overall mortality. Such a naive analysis assumes that patients were assigned to the two groups at the time of transplantation, when group membership is actually a consequence of a time-dependent event occurring later during follow-up. We introduce landmark analysis as the method of choice to avoid immortal time bias. Landmark analysis splits the follow-up time at a common, prespecified time point, the so-called landmark. Groups are then defined by time-dependent events having occurred before the landmark, and outcome events are only considered if occurring after the landmark. Landmark analysis can be easily implemented with common statistical software. In our kidney transplantation example, landmark analyses with landmarks set at 30 and 60 months clearly identified graft failure as a risk factor for overall mortality. We give further typical examples from transplantation research and discuss strengths and limitations of landmark analysis and other methods to address immortal time bias, such as Cox regression with time-dependent covariables. © 2017 Steunstichting ESOT.
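A landmark analysis dataset can be built by dropping subjects whose follow-up ends before the landmark and classifying exposure strictly by events occurring before it. A minimal sketch of this data preparation step; the field names and example records are assumptions, not the study's data:

```python
def landmark_dataset(patients, landmark):
    """patients: dicts with 'followup' (months from transplant to death or
    censoring), 'death' (bool), 'graft_failure_time' (months, or None).
    Returns the landmark analysis set: follow-up clock restarted at the
    landmark, exposure = graft failure strictly before the landmark."""
    out = []
    for p in patients:
        if p["followup"] <= landmark:
            continue  # died or censored before the landmark: excluded
        gf = p["graft_failure_time"]
        out.append({
            "time": p["followup"] - landmark,
            "death": p["death"],
            "failed_by_landmark": gf is not None and gf <= landmark,
        })
    return out

patients = [
    {"followup": 20, "death": True,  "graft_failure_time": 10},   # excluded
    {"followup": 80, "death": True,  "graft_failure_time": 12},   # exposed
    {"followup": 90, "death": False, "graft_failure_time": 45},   # failure after landmark
    {"followup": 70, "death": False, "graft_failure_time": None},
]
ds = landmark_dataset(patients, landmark=30)
print(len(ds), [r["failed_by_landmark"] for r in ds])
```

The survival comparison between the exposed and unexposed groups is then performed on this derived dataset with standard methods, e.g., a log-rank test or Cox model on `time` and `death`.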
Joint Attributes and Event Analysis for Multimedia Event Detection.
Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G
2017-06-15
Semantic attributes have been increasingly used over the past few years for multimedia event detection (MED), with promising results. The motivation is that multimedia events generally consist of lower-level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one can exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since image-inferred attributes do not carry the dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels; we name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower-level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm built on a correlation vector that relates them to a target event. Consequently, we can incorporate video attributes latently as extra information into the event detector learned from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.
Heller, Gabriella T; Zwang, Theodore J; Sarapata, Elizabeth A; Haber, Michael A; Sazinsky, Matthew H; Radunskaya, Ami E; Johal, Malkiat S
2014-05-01
Previous methods for analyzing protein-ligand binding events using the quartz crystal microbalance with dissipation monitoring (QCM-D) fail to account for unintended binding, which inevitably occurs during surface measurements and obscures kinetic information. In this article, we present a system of differential equations that accounts for both reversible and irreversible unintended interactions. This model is tested on three protein-ligand systems, each of which has different features, to establish the feasibility of using the QCM-D for protein binding analysis. Based on this analysis, we were able to obtain kinetic information for the intended interaction that is consistent with values obtained in the literature via bulk-phase methods. In the appendix, we include a method for decoupling these unintended interactions from the intended binding events and extracting the relevant affinity information. Copyright © 2014 Elsevier B.V. All rights reserved.
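The article's exact differential equations are not reproduced in the abstract; as a generic stand-in, the sketch below integrates a reversible (Langmuir-type) intended interaction alongside a slow irreversible unintended term, with invented rate constants and concentration.

```python
# Generic sketch (not the authors' exact model): forward-Euler integration of
# reversible intended binding plus a slow irreversible unintended term.
#   d(theta)/dt   = ka*C*(1 - theta)   - kd*theta      (intended, reversible)
#   d(theta_u)/dt = ki*C*(1 - theta_u)                 (unintended, irreversible)

ka, kd = 2.0, 0.5   # intended association / dissociation rates (illustrative)
ki = 0.05           # irreversible unintended adsorption rate (illustrative)
C = 1.0             # bulk ligand concentration (arbitrary units)
dt, steps = 0.001, 20000  # 20 s of simulated time

theta = 0.0    # fractional coverage from intended (reversible) binding
theta_u = 0.0  # coverage from irreversible unintended binding
for _ in range(steps):
    theta += dt * (ka * C * (1.0 - theta) - kd * theta)
    theta_u += dt * (ki * C * (1.0 - theta_u))

# The reversible term approaches the Langmuir equilibrium ka*C / (ka*C + kd).
eq = ka * C / (ka * C + kd)
```

Fitting such a system to QCM-D frequency/dissipation traces is what allows the intended kinetics to be separated from the unintended signal.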
Phylogenetic inference under varying proportions of indel-induced alignment gaps
Dwivedi, Bhakti; Gadagkar, Sudhindra R
2009-01-01
Background The effect of alignment gaps on phylogenetic accuracy has been the subject of numerous studies. In this study, we investigated the relationship between the total number of gapped sites and phylogenetic accuracy, when the gaps were introduced (by means of computer simulation) to reflect indel (insertion/deletion) events during the evolution of DNA sequences. The resulting (true) alignments were subjected to commonly used gap treatment and phylogenetic inference methods. Results (1) In general, there was a strong – almost deterministic – relationship between the amount of gap in the data and the level of phylogenetic accuracy when the alignments were very "gappy"; (2) gaps resulting from deletions (as opposed to insertions) contributed more to the inaccuracy of phylogenetic inference; (3) the probabilistic methods (Bayesian, PhyML, and "MLε," a method implemented in DNAML in PHYLIP) performed better at most levels of gap percentage when compared to parsimony (MP) and distance (NJ) methods, with Bayesian analysis being clearly the best; (4) methods that treat gapped sites as missing data yielded less accurate trees when compared to those that attribute phylogenetic signal to the gapped sites (by coding them as binary character data – presence/absence – or as in the MLε method); and (5) in general, the accuracy of phylogenetic inference depended upon the amount of available data when the gaps resulted mainly from deletion events, and upon the amount of missing data when insertion events were equally likely to have caused the alignment gaps.
Conclusion When gaps in an alignment are a consequence of indel events in the evolution of the sequences, the accuracy of phylogenetic analysis is likely to improve if: (1) alignment gaps are categorized as arising from insertion events or deletion events and then treated separately in the analysis, (2) the evolutionary signal provided by indels is harnessed in the phylogenetic analysis, and (3) methods that utilize the phylogenetic signal in indels are developed for distance methods too. When the true homology is known and the amount of gaps is 20 percent of the alignment length or less, the methods used in this study are likely to yield trees with 90–100 percent accuracy. PMID:19698168
[Adverse events management. Methods and results of a development project].
Rabøl, Louise Isager; Jensen, Elisabeth Brøgger; Hellebek, Annemarie H; Pedersen, Beth Lilja
2006-11-27
This article describes the methods and results of a project in the Copenhagen Hospital Corporation (H:S) on preventing adverse events. The aim of the project was to raise awareness about patient safety, test a reporting system for adverse events, develop and test methods for analysing events, and propagate ideas about how to prevent adverse events. H:S developed an action plan and a reporting system for adverse events, founded an organization, and developed an educational program on theories and methods of learning from adverse events for both leaders and employees. During the three-year period from 1 January 2002 to 31 December 2004, H:S staff reported 6011 adverse events. In the same period, the organization completed 92 root cause analyses. More than half of these dealt with events whose reporting was optional; the other half dealt with events whose reporting was mandatory. The number of reports and the front-line staff's attitude towards reporting show that H:S succeeded in establishing a safety culture. Future work should be centred on developing and testing methods that will prevent adverse events from happening. The objective is to suggest and complete preventive initiatives which will help increase patient safety.
Eventogram: A Visual Representation of Main Events in Biomedical Signals.
Elgendi, Mohamed
2016-09-22
Biomedical signals carry valuable physiological information, and many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of the main events within the processed signal. This paper attempts to provide an event-related framework to overcome the drawbacks of the traditional visualization methods and describe the main events within the biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to as the "eventogram" in this paper. The proposed method is based on two event-related moving averages that visualize the main time-domain events in the processed biomedical signals. The traditional visualization methods were unable to find dominant events in the processed signals, while the eventogram was able to visualize dominant events in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting the main events in different biomedical signals with a sensitivity and positive predictivity >95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.
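The two-moving-average idea can be sketched as follows: a short average tuned to the event duration is compared against a longer average tuned to the cycle duration, and samples where the short average dominates form candidate event blocks. The window lengths, synthetic signal, and comparison rule below are illustrative choices, not the published algorithm's tuned parameters.

```python
def moving_average(x, w):
    """Centered moving average with (odd) window w, truncated at the edges."""
    h = w // 2
    return [sum(x[max(0, i - h):i + h + 1]) / len(x[max(0, i - h):i + h + 1])
            for i in range(len(x))]

# synthetic quasi-periodic signal: three short "events" on a flat baseline
signal = [0.0] * 300
for start in (50, 150, 250):
    for i in range(start, start + 6):
        signal[i] = 1.0

ma_event = moving_average(signal, 5)    # window ~ expected event duration
ma_cycle = moving_average(signal, 41)   # window ~ expected cycle duration
interest = [ma_event[i] > ma_cycle[i] for i in range(len(signal))]

# count contiguous "blocks of interest" (candidate events)
events = sum(1 for i in range(len(signal))
             if interest[i] and (i == 0 or not interest[i - 1]))
```

On real ECG or PPG data the signal would first be bandpassed and squared, and the detected blocks would be screened by duration and morphology.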
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective on the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
GMDD: a database of GMO detection methods.
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-06-04
Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection method Database (GMDD) collects almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service over the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integrations, which will facilitate the design of PCR primers and probes. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to the database; newly submitted information is released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.
Probability Elicitation Under Severe Time Pressure: A Rank-Based Method.
Jaspersen, Johannes G; Montibeller, Gilberto
2015-07-01
Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that have focused on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert would refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of different event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for the Environment, Farming and Rural Affairs): the prioritization of animal health threats. © 2015 Society for Risk Analysis.
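The paper approximates probabilities from an ordinal ranking via a maximum-entropy algorithm; as a simpler illustrative stand-in, the sketch below computes rank-order centroid (ROC) values, another common way to turn a pure ranking into normalized weights. This is not the authors' algorithm, only a demonstration of the rank-to-probability idea.

```python
from fractions import Fraction

def rank_order_centroid(n):
    """ROC values for events ranked 1 (most likely) .. n (least likely):
    p_i = (1/n) * sum_{k=i..n} 1/k. They are positive, decreasing, and sum to 1."""
    return [sum(Fraction(1, k) for k in range(i, n + 1)) / n
            for i in range(1, n + 1)]

# an expert ranks four threats from most to least likely
probs = [float(p) for p in rank_order_centroid(4)]
```

Any rank-based approximation of this kind preserves only the ordinal information the expert actually provided, which is the point of the fast elicitation protocol.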
Event by event analysis and entropy of multiparticle systems
NASA Astrophysics Data System (ADS)
Bialas, A.; Czyz, W.
2000-04-01
The coincidence method of measuring the entropy of a system, proposed some time ago by Ma, is generalized to include systems out of equilibrium. It is suggested that the method can be adapted to analyze multiparticle states produced in high-energy collisions.
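Ma's coincidence method can be sketched numerically: sample pairs of independently drawn "microstates", estimate the coincidence probability R = Σᵢ pᵢ², and take S ≈ −ln R (the Rényi-2 entropy, which equals ln N for N equally likely states). The uniform 50-state distribution below is an illustrative choice, not a model of any particular multiparticle system.

```python
import math
import random

random.seed(0)

N = 50  # number of equally likely states (illustrative)
samples = [random.randrange(N) for _ in range(200000)]

# pair up consecutive samples and count coincidences
pairs = list(zip(samples[::2], samples[1::2]))
R = sum(1 for a, b in pairs if a == b) / len(pairs)

S = -math.log(R)  # coincidence estimate of the entropy; ln(N) here
```

For a nonuniform (out-of-equilibrium) distribution the same estimator yields the Rényi-2 entropy, a lower bound on the Shannon entropy.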
The FTA Method And A Possibility Of Its Application In The Area Of Road Freight Transport
NASA Astrophysics Data System (ADS)
Poliaková, Adela
2015-06-01
The fault tree process utilizes logic diagrams to portray and analyse potentially hazardous events. Three basic symbols (logic gates) are adequate for diagramming any fault tree; however, additional, more recently developed symbols can be used to reduce the time and effort required for analysis. A fault tree is a graphical representation of the relationship between certain specific events and the ultimate undesired event (2). This paper gives a basic description of the Fault Tree Analysis method and provides a practical view of its possible application to quality improvement in a road freight transport company.
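For independent basic events, the two fundamental gates reduce to simple probability arithmetic: an AND gate multiplies the input probabilities, and an OR gate takes the complement of the product of complements. The sketch below evaluates a hypothetical road-freight tree (the event probabilities are invented, not from the paper).

```python
from math import prod

def gate(kind, probs):
    """Top-event probability of one gate over independent inputs."""
    if kind == "AND":
        return prod(probs)            # all inputs must occur
    if kind == "OR":
        return 1.0 - prod(1.0 - p for p in probs)  # at least one occurs
    raise ValueError(kind)

# hypothetical tree: accident = brake failure AND (driver error OR road hazard)
p_top = gate("AND", [0.01, gate("OR", [0.05, 0.02])])
```

Nesting `gate` calls mirrors the tree diagram directly, which is why three symbols suffice to diagram any fault tree.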
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sezen, Halil; Aldemir, Tunc; Denning, R.
Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.
Activity recognition using Video Event Segmentation with Text (VEST)
NASA Astrophysics Data System (ADS)
Holloway, Hillary; Jones, Eric K.; Kaluzniacki, Andrew; Blasch, Erik; Tierno, Jorge
2014-06-01
Multi-Intelligence (multi-INT) data includes video, text, and signals that require analysis by operators. Analysis methods include information fusion approaches such as filtering, correlation, and association. In this paper, we discuss the Video Event Segmentation with Text (VEST) method, which provides the event boundaries of an activity and compiles the related messages and video clips for later review. VEST infers meaningful activities by clustering multiple streams of time-sequenced multi-INT intelligence data and derived fusion products. We discuss exemplar results that segment raw full-motion video (FMV) data by using extracted commentary-message timestamps, FMV metadata, and user-defined queries.
Element analysis: a wavelet-based method for analysing time-localized events in noisy time series
2017-01-01
A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325
The Frasnian-Famennian mass killing event(s), methods of identification and evaluation
NASA Technical Reports Server (NTRS)
Geldsetzer, H. H. J.
1988-01-01
The absence of an abnormally high number of earlier Devonian taxa from Famennian sediments was repeatedly documented and can hardly be questioned. Primary recognition of the event(s) was based on paleontological data, especially common macrofossils. Most paleontologists place the disappearance of these common forms at the gigas/triangularis contact, and this boundary was recently proposed as the Frasnian-Famennian (F-F) boundary. Not unexpectedly, alternate F-F positions were suggested, prompted by temporary Frasnian survivors or sudden post-event radiations of new forms. Secondary supporting evidence for mass killing event(s) is supplied by trace element and stable isotope geochemistry, but not with the same success as for the K/T boundary, probably owing to an additional 300 Ma of tectonic and diagenetic overprinting. Another tool is microfacies analysis, which is surprisingly rarely used even though it can explain geochemical anomalies or paleontological overlap not detectable by conventional macrofacies analysis. The combination of microfacies analysis and geochemistry was applied at two F-F sections in western Canada and showed how interdependent the two methods are. Additional F-F sections from western Canada, the western United States, France, Germany, and Australia were sampled or re-sampled and await geochemical/microfacies evaluation.
Distributed video data fusion and mining
NASA Astrophysics Data System (ADS)
Chang, Edward Y.; Wang, Yuan-Fang; Rodoplu, Volkan
2004-09-01
This paper presents an event-sensing paradigm for intelligent event analysis in a wireless, ad hoc, multi-camera video surveillance system. In particular, we present statistical methods that we have developed to support three aspects of event sensing: 1) energy-efficient, resource-conserving, and robust sensor data fusion and analysis; 2) intelligent event modeling and recognition; and 3) rapid deployment, dynamic configuration, and continuous operation of the camera networks. We outline our preliminary results and discuss future directions that research might take.
Standardized Analytical Methods for Environmental Restoration Following Homeland Security Events
USDA-ARS?s Scientific Manuscript database
Methodology was formulated for use in the event of a terrorist attack involving any of a variety of chemical, radioactive, biological, and toxic agents. Standardized analysis procedures were determined for use should these events occur. This publication is updated annually.
Investigating cardiorespiratory interaction by cross-spectral analysis of event series
NASA Astrophysics Data System (ADS)
Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen
2000-02-01
The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
Meta-Analysis of Rare Binary Adverse Event Data
Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.
2013-01-01
We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for the analysis of binary adverse-event data. Special attention is paid to the case of rare adverse events, which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of the heterogeneity of treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrapping test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased and that the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
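The simple (unweighted) average idea can be sketched on invented 2×2 tables, using the common 0.5 continuity correction for zero cells that rare-event meta-analyses must contend with; the paper's exact estimator and correction may differ.

```python
import math

# each study: (events_treated, n_treated, events_control, n_control);
# the tables are invented for illustration
studies = [(1, 100, 3, 100), (0, 50, 2, 50), (2, 200, 5, 200)]

def log_odds_ratio(a, n1, c, n2):
    """Log odds ratio of one 2x2 table, with 0.5 added to every cell
    whenever any cell is zero (a common continuity correction)."""
    b, d = n1 - a, n2 - c
    if 0 in (a, b, c, d):
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    return math.log((a * d) / (b * c))

# simple (unweighted) average treatment effect on the log-odds scale
simple_avg = sum(log_odds_ratio(*s) for s in studies) / len(studies)
```

Because every study gets equal weight, this estimator avoids the weight instability that inverse-variance weighting suffers when event counts are near zero.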
Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.
2010-12-01
Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick-slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault elements show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular "earthquake" along the entire fault length.
Results are then tabulated and then differenced with an expected correlation, calculated by assuming a uniform distribution of events in time. We generate a correlation score matrix, which indicates how weakly or strongly correlated each fault element is to every other in the course of the VC simulation. We calculate correlation scores by summing the difference between the actual and expected correlations over all time window lengths and normalizing by the time window size. The correlation score matrix can focus attention on the most interesting areas for more in-depth analysis of event correlation vs. time. The previous study included 59 faults (639 elements) in the model, which included all the faults save the creeping section of the San Andreas. The analysis spanned 40,000 yrs of Virtual California-generated earthquake data. The newly revised VC model includes 70 faults, 8720 fault elements, and spans 110,000 years. Due to computational considerations, we will evaluate the elements comprising the southern California region, which our previous study indicated showed interesting fault interaction and event triggering/quiescence relationships.
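The observed-minus-expected correlation count for a single element pair can be sketched as follows, with synthetic event times and a uniform-in-time expectation; the full method repeats this over all pairs and window lengths to build the correlation score matrix.

```python
# Sketch of the element-pair correlation test: count how often an event on
# element B falls within a time window after an event on element A, and
# compare with the count expected if B's events were uniform in time.
# Event times are synthetic, not Virtual California output.

T = 1000.0     # total catalog duration (years)
window = 10.0  # association window (years)

events_a = [100.0, 300.0, 500.0, 700.0, 900.0]
events_b = [103.0, 304.0, 502.0, 650.0, 906.0]  # mostly follows A

# number of A-events with at least one B-event in (t_a, t_a + window]
observed = sum(1 for ta in events_a
               if any(ta < tb <= ta + window for tb in events_b))

# same count expected for uniformly distributed, independent B events
p_hit = 1.0 - (1.0 - window / T) ** len(events_b)
expected = p_hit * len(events_a)

score = observed - expected  # positive -> stronger-than-chance correlation
```

Normalizing such scores by window size and summing over windows yields one entry of the correlation score matrix described above.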
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
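A p-chart reduces to a centre line p̄ and control limits p̄ ± 3·sqrt(p̄(1−p̄)/n); months outside the limits signal special-cause variation. The monthly counts below are invented for illustration, not the study's data.

```python
import math

# adverse events per month and anesthetics per month (invented numbers)
events = [30, 25, 35, 28, 33, 60, 27, 31]
n = 1000

p_bar = sum(events) / (len(events) * n)          # centre line
sigma = math.sqrt(p_bar * (1 - p_bar) / n)       # binomial standard error
ucl = p_bar + 3 * sigma                          # upper control limit
lcl = max(0.0, p_bar - 3 * sigma)                # lower limit, clipped at 0

# months whose rate falls outside the predictable (common-cause) band
out_of_control = [i for i, e in enumerate(events)
                  if not lcl <= e / n <= ucl]
```

A stable process keeps every point inside the limits; a point above the upper limit (month index 5 here) is the cue to look for a special cause, exactly the logic applied to the four adverse-event categories above.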
Evidence of validity of the Stress-Producing Life Events (SPLE) instrument.
Rizzini, Marta; Santos, Alcione Miranda Dos; Silva, Antônio Augusto Moura da
2018-01-01
OBJECTIVE Evaluate the construct validity of a list of eight Stressful Life Events in pregnant women. METHODS A cross-sectional study was conducted with 1,446 pregnant women in São Luís, MA, and 1,364 pregnant women in Ribeirão Preto, SP (BRISA cohort), from February 2010 to June 2011. In the exploratory factorial analysis, the promax oblique rotation was used and for the calculation of the internal consistency, we used the compound reliability. The construct validity was determined by means of the confirmatory factorial analysis with the method of estimation of weighted least squares adjusted by the mean and variance. RESULTS The model with the best fit in the exploratory analysis was the one that retained three factors with a cumulative variance of 61.1%. The one-factor model did not obtain a good fit in both samples in the confirmatory analysis. The three-factor model called Stress-Producing Life Events presented a good fit (RMSEA < 0.05; CFI/TLI > 0.90) for both samples. CONCLUSIONS The Stress-Producing Life Events constitute a second order construct with three dimensions related to health, personal and financial aspects and violence. This study found evidence that confirms the construct validity of a list of stressor events, entitled Stress-Producing Life Events Inventory.
NASA Astrophysics Data System (ADS)
Matsuda, Takashi S.; Nakamura, Takuji; Ejiri, Mitsumu K.; Tsutsumi, Masaki; Shiokawa, Kazuo
2014-08-01
We have developed a new analysis method for obtaining the power spectrum in the horizontal phase velocity domain from airglow intensity image data to study atmospheric gravity waves. This method can deal with extensive amounts of imaging data obtained on different years and at various observation sites without bias caused by different event extraction criteria for the person processing the data. The new method was applied to sodium airglow data obtained in 2011 at Syowa Station (69°S, 40°E), Antarctica. The results were compared with those obtained from a conventional event analysis in which the phase fronts were traced manually in order to estimate horizontal characteristics, such as wavelengths, phase velocities, and wave periods. The horizontal phase velocity of each wave event in the airglow images corresponded closely to a peak in the spectrum. The statistical results of spectral analysis showed an eastward offset of the horizontal phase velocity distribution. This could be interpreted as the existence of wave sources around the stratospheric eastward jet. Similar zonal anisotropy was also seen in the horizontal phase velocity distribution of the gravity waves by the event analysis. Both methods produce similar statistical results about directionality of atmospheric gravity waves. Galactic contamination of the spectrum was examined by calculating the apparent velocity of the stars and found to be limited for phase speeds lower than 30 m/s. In conclusion, our new method is suitable for deriving the horizontal phase velocity characteristics of atmospheric gravity waves from an extensive amount of imaging data.
Austin, Peter C
2014-03-30
Propensity score methods are increasingly being used to estimate causal treatment effects in observational studies. In medical and epidemiological studies, outcomes are frequently time-to-event in nature. Propensity-score methods are often applied incorrectly when estimating the effect of treatment on time-to-event outcomes. This article describes how two different propensity score methods (matching and inverse probability of treatment weighting) can be used to estimate the measures of effect that are frequently reported in randomized controlled trials: (i) marginal survival curves, which describe survival in the population if all subjects were treated or if all subjects were untreated; and (ii) marginal hazard ratios. The use of these propensity score methods allows one to replicate the measures of effect that are commonly reported in randomized controlled trials with time-to-event outcomes: both absolute and relative reductions in the probability of an event occurring can be determined. We also provide guidance on variable selection for the propensity score model, highlight methods for assessing the balance of baseline covariates between treated and untreated subjects, and describe the implementation of a sensitivity analysis to assess the effect of unmeasured confounding variables on the estimated treatment effect when outcomes are time-to-event in nature. The methods in the paper are illustrated by estimating the effect of discharge statin prescribing on the risk of death in a sample of patients hospitalized with acute myocardial infarction. In this tutorial article, we describe and illustrate all the steps necessary to conduct a comprehensive analysis of the effect of treatment on time-to-event outcomes. © 2013 The authors. Statistics in Medicine published by John Wiley & Sons, Ltd.
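The inverse-probability-of-treatment weighting step can be sketched directly: treated subjects receive weight 1/ps and untreated subjects 1/(1−ps), where ps is the propensity score. The scores below are supplied directly for illustration; in practice they come from a model such as logistic regression on baseline covariates.

```python
# IPTW weight calculation (illustrative subjects and propensity scores).
# Weighted analyses of time-to-event outcomes (e.g., weighted Kaplan-Meier
# curves or a weighted Cox model) would then use these weights.

subjects = [  # (treated?, propensity score)
    (True, 0.8),
    (True, 0.6),
    (False, 0.3),
    (False, 0.5),
]

weights = [1 / ps if treated else 1 / (1 - ps)
           for treated, ps in subjects]
```

The weights create a pseudo-population in which treatment is independent of the measured baseline covariates, so marginal survival curves and hazard ratios can be estimated as in a randomized trial.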
NASA Astrophysics Data System (ADS)
Di Matteo, S.; Villante, U.
2017-05-01
The occurrence of waves at discrete frequencies in the solar wind (SW) parameters has been reported in the scientific literature with some controversial results, mostly concerning the existence (and stability) of favored sets of frequencies. On the other hand, the experimental results might be influenced by the analytical methods adopted for the spectral analysis. We focused attention on the fluctuations of the SW dynamic pressure (PSW) occurring in the leading edges of streams following interplanetary shocks and compared the results of the Welch method (WM) with those of the multitaper method (MTM). The results of a simulation analysis demonstrate that the identification of wave occurrence and the frequency estimate might be strongly influenced by the signal characteristics and the analytical methods, especially in the presence of multicomponent signals. In SW streams, PSW oscillations are routinely detected in the entire range f ≈ 1.2-5.0 mHz; nevertheless, the WM/MTM agreement in identification and frequency estimate occurs in ≈50% of events, and different sets of favored frequencies would be proposed for the same set of events by the WM and MTM analyses. The histogram of the frequency distribution of the events identified by both methods suggests higher percentages in the ranges f ≈ 1.7-1.9, f ≈ 2.7-3.4, and f ≈ 3.9-4.4 mHz (with the most prominent peak at f ≈ 4.2 mHz). Extremely severe thresholds select a small number (14) of remarkable events, with a one-to-one correspondence between WM and MTM; interestingly, these events reveal a tendency for a favored occurrence in bins centered at f ≈ 2.9 and f ≈ 4.2 mHz.
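As a pointer to what the WM step involves, here is a minimal pure-Python sketch of Welch's estimator (Hann-windowed, 50%-overlapping segment averaging, unscaled) applied to a synthetic 3.125 mHz oscillation sampled once per minute. A naive DFT replaces the FFT, so it only suits short illustrative signals; none of this code comes from the paper.

```python
import math

def welch_psd(x, seg_len, overlap=0.5):
    """Unscaled Welch power spectrum: average windowed periodograms."""
    step = int(seg_len * (1 - overlap))
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * n / (seg_len - 1))
            for n in range(seg_len)]
    norm = sum(h * h for h in hann)
    psd = [0.0] * (seg_len // 2 + 1)
    count = 0
    for start in range(0, len(x) - seg_len + 1, step):
        seg = [x[start + n] * hann[n] for n in range(seg_len)]
        for k in range(seg_len // 2 + 1):   # naive DFT of the segment
            re = sum(seg[n] * math.cos(2 * math.pi * k * n / seg_len)
                     for n in range(seg_len))
            im = sum(seg[n] * math.sin(2 * math.pi * k * n / seg_len)
                     for n in range(seg_len))
            psd[k] += (re * re + im * im) / norm
        count += 1
    return [p / count for p in psd]

dt = 60.0           # one sample per minute
f0 = 0.003125       # 3.125 mHz: lands exactly on bin 12 of a 64-point segment
x = [math.sin(2 * math.pi * f0 * dt * n) for n in range(256)]
psd = welch_psd(x, seg_len=64)
peak_bin = max(range(1, len(psd)), key=psd.__getitem__)   # skip DC
peak_freq_mhz = peak_bin / (64 * dt) * 1000.0
print(peak_freq_mhz)
```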
Event-based analysis of free-living behaviour.
Granat, Malcolm H
2012-11-01
The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and the terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, depending on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis of the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group, showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis that we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.
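The core move of the event-based approach, turning a sampled state stream into discrete events and then analysing a research-question-dependent selection of them, can be sketched as follows. The category names, sampling interval, and 10-minute bout threshold are invented for the example.

```python
def to_events(samples, dt=1.0):
    """Run-length encode a sampled state stream into (state, start, duration)."""
    events, start = [], 0
    for i in range(1, len(samples) + 1):
        if i == len(samples) or samples[i] != samples[start]:
            events.append((samples[start], start * dt, (i - start) * dt))
            start = i
    return events

# One posture classification per minute (dt = 60 s)
samples = ["sit"] * 30 + ["walk"] * 10 + ["sit"] * 5 + ["stand"] * 15 + ["walk"] * 20
events = to_events(samples, dt=60.0)

# Research-question-dependent selections, e.g. prolonged sitting bouts
long_sitting = [e for e in events if e[0] == "sit" and e[2] >= 10 * 60]
walking_time = sum(d for s, _, d in events if s == "walk")
print(events)
print(len(long_sitting), walking_time)
```

Outcome measures are then statistics over the selected events (bout counts, durations, patterns) rather than simple time totals.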
A Comprehensive Seismic Characterization of the Cove Fort-Sulphurdale Geothermal Site, Utah
NASA Astrophysics Data System (ADS)
Zhang, H.; Li, J.; Zhang, X.; Liu, Y.; Kuleli, H. S.; Toksoz, M. N.
2012-12-01
The Cove Fort-Sulphurdale geothermal area is located in the transition zone between the extensional Basin and Range Province to the west and the uplifted Colorado Plateau to the east. The region around the geothermal site has the highest heat flow values in Utah, exceeding 260 mW/m². To better understand the structure around the geothermal site, the MIT group deployed 10 seismic stations for a period of one year from August 2010. The local seismic network detected over 500 local earthquakes, from which ~200 events located within the network were selected for further analysis. Our seismic analysis is focused on three aspects: seismic velocity and attenuation tomography, seismic event focal mechanism analysis, and seismic shear wave splitting analysis. First P- and S-wave arrivals are picked manually, and then the waveform cross-correlation technique is applied to obtain more accurate differential times between event pairs observed on common stations. The double-difference tomography method of Zhang and Thurber (2003) is used to simultaneously determine Vp and Vs models and seismic event locations. For the attenuation tomography, we first calculate t* values from spectrum fitting and then invert them to obtain Q models based on known velocity models and seismic event locations. Due to the limited station coverage and relatively low signal-to-noise ratio, many seismic waveforms do not have clear first P arrival polarities, and as a result the conventional focal mechanism determination method relying on polarity information is not applicable. Therefore, we used the full waveform matching method of Li et al. (2010) to determine event focal mechanisms. For the shear wave splitting analysis, we used the cross-correlation method to determine the delay times between fast and slow shear waves and the polarization angles of fast shear waves.
The delay times are further used to image the anisotropy percentage distribution in three dimensions using the shear wave splitting tomography method of Zhang et al. (2007). For the study region, the velocity is overall lower and the attenuation higher in the western part. Correspondingly, the anisotropy is also stronger there, indicating that fractures may be more developed in the western part. The average polarization directions of the fast shear waves at each station mostly point NNE. Focal mechanism analysis of selected events shows that the normal-faulting events strike in the NNE direction, and the events with strike-slip mechanisms have strikes either parallel to the NNE-trending faults or to their conjugates. Assuming the maximum horizontal stress (SHmax) is parallel to the strike of the normal-faulting events and bisects the two fault planes of the strike-slip events, the inverted source mechanisms suggest a NNE-oriented maximum horizontal stress regime. This area is under W-E tensional stress, which implies that the maximum compressional stress should generally be in the N-E or NNE direction. The combination of shear wave splitting and focal mechanism analysis suggests that in this region the faults and fractures are aligned in the NNE direction.
Model-Based Adaptive Event-Triggered Control of Strict-Feedback Nonlinear Systems.
Li, Yuan-Xin; Yang, Guang-Hong
2018-04-01
This paper is concerned with the adaptive event-triggered control problem of nonlinear continuous-time systems in strict-feedback form. By using an event-sampled neural network (NN) to approximate the unknown nonlinear function, an adaptive model and an associated event-triggered controller are designed by exploiting the backstepping method. In the proposed method, the feedback signals and the NN weights are aperiodically updated only when the event-triggered condition is violated. A positive lower bound on the minimum intersample time is guaranteed to avoid an accumulation point of triggering instants (Zeno behavior). The closed-loop stability of the resulting nonlinear impulsive dynamical system is rigorously proved via Lyapunov analysis under an adaptive event sampling condition. Compared with the traditional adaptive backstepping design with a fixed sampling period, the event-triggered method samples the state and updates the NN weights only when necessary. Therefore, the number of transmissions can be significantly reduced. Finally, two simulation examples are presented to show the effectiveness of the proposed control method.
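The triggering idea itself, as distinct from the paper's NN/backstepping design, can be illustrated on a scalar linear plant: the control input is held between events and recomputed only when the gap between the current and last-sampled state exceeds a state-dependent threshold plus a small constant, the constant also preventing an accumulation of events near the origin. All gains and thresholds below are invented.

```python
a, b, k = 1.0, 1.0, 2.0          # plant x' = a*x + b*u, stabilizing gain u = -k*x_hat
dt, sigma = 0.001, 0.05          # Euler integration step, relative trigger threshold

x, x_hat = 1.0, 1.0              # state and last-sampled state
u = -k * x_hat
updates = 0
for _ in range(5000):            # simulate 5 seconds
    if abs(x - x_hat) > sigma * abs(x) + 1e-4:   # event-triggered condition
        x_hat = x                # sample the state and update the control law
        u = -k * x_hat
        updates += 1
    x += dt * (a * x + b * u)    # plant evolves with the held input
print(updates, x)
```

With periodic sampling the controller would update 5000 times; here the state still converges while the update count drops by more than an order of magnitude.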
GMDD: a database of GMO detection methods
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans JP; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-01-01
Background: Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein- and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at a global level is needed. Results: The GMO Detection Method Database (GMDD) collects almost all previously developed and reported GMO detection methods, grouped by different strategies (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integration, which will facilitate PCR primer and probe design. The information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included in this database. Furthermore, registered users can submit new detection methods and sequences to this database, and the newly submitted information will be released soon after being checked. Conclusion: GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier. PMID:18522755
Ju, Lining; Wang, Yijie Dylan; Hung, Ying; Wu, Chien-Fu Jeff; Zhu, Cheng
2013-01-01
Motivation: Abrupt reduction/resumption of thermal fluctuations of a force probe has been used to identify association/dissociation events of protein–ligand bonds. We show that the off-rate of molecular dissociation can be estimated by analysis of the bond lifetime, while the on-rate of molecular association can be estimated by analysis of the waiting time between two neighboring bond events. However, the analysis relies heavily on subjective judgments and is time-consuming. To automate the process of mapping out bond events from thermal fluctuation data, we develop a hidden Markov model (HMM)-based method. Results: The HMM method represents the bond state by a hidden variable with two values: bound and unbound. The bond association/dissociation is visualized and pinpointed. We apply the method to analyze a key receptor–ligand interaction in the early stage of hemostasis and thrombosis: the von Willebrand factor (VWF) binding to platelet glycoprotein Ibα (GPIbα). The numbers of bond lifetime and waiting time events estimated by the HMM are much larger than those estimated by a descriptive statistical method from the same set of raw data. The kinetic parameters estimated by the HMM are in excellent agreement with those from the descriptive statistical analysis, but have much smaller errors for both wild-type and two mutant VWF-A1 domains. Thus, the computerized analysis allows us to speed up the analysis and improve the quality of the estimates of receptor–ligand binding kinetics. Contact: jeffwu@isye.gatech.edu or cheng.zhu@bme.gatech.edu PMID:23599504
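A stripped-down version of the two-state decoding at the heart of such an HMM analysis: a hidden bound/unbound state with Gaussian-distributed fluctuation amplitudes, decoded with the Viterbi algorithm. The emission parameters, transition probability, and data are invented; the paper's actual model and estimation procedure are more elaborate.

```python
import math

def viterbi(obs, means, sd, p_stay=0.95):
    """Most likely unbound(0)/bound(1) path for Gaussian emissions."""
    def logn(x, mu):
        return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))
    log_stay, log_switch = math.log(p_stay), math.log(1 - p_stay)
    v = [[logn(obs[0], means[s]) for s in (0, 1)]]   # uniform prior implied
    back = []
    for x in obs[1:]:
        row, ptr = [], []
        for s in (0, 1):
            cands = [v[-1][p] + (log_stay if p == s else log_switch) for p in (0, 1)]
            best = max((0, 1), key=lambda p: cands[p])
            row.append(cands[best] + logn(x, means[s]))
            ptr.append(best)
        v.append(row)
        back.append(ptr)
    path = [max((0, 1), key=lambda s: v[-1][s])]     # backtrack
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# Fluctuation amplitude drops when a bond forms (unbound mean 1.0, bound 0.4)
obs = [1.1, 0.9, 1.0, 0.45, 0.4, 0.35, 0.42, 1.05, 0.95, 1.0]
path = viterbi(obs, means=(1.0, 0.4), sd=0.15)
print(path)
```

Bond lifetimes are the run lengths of state 1 in the decoded path; waiting times are the runs of state 0 between them.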
An evaluation of computer-aided disproportionality analysis for post-marketing signal detection.
Lehman, H P; Chen, J; Gould, A L; Kassekert, R; Beninger, P R; Carney, R; Goldberg, M; Goss, M A; Kidos, K; Sharrar, R G; Shields, K; Sweet, A; Wiholm, B E; Honig, P K
2007-08-01
To understand the value of computer-aided disproportionality analysis (DA) in relation to current pharmacovigilance signal detection methods, four products were retrospectively evaluated by applying an empirical Bayes method to Merck's post-marketing safety database. Findings were compared with the prior detection of labeled post-marketing adverse events. Disproportionality ratios (empirical Bayes geometric mean lower 95% bounds for the posterior distribution, EBGM05) were generated for product-event pairs. Overall (1993-2004 data, EBGM05 ≥ 2, individual terms) results of signal detection using DA compared to standard methods were: sensitivity, 31.1%; specificity, 95.3%; and positive predictive value, 19.9%. Using groupings of synonymous labeled terms, sensitivity improved (40.9%). More of the adverse events detected by both methods were detected earlier using DA and grouped (versus individual) terms. With 1939-2004 data, diagnostic properties were similar to those from 1993 to 2004. DA methods using Merck's safety database demonstrate sufficient sensitivity and specificity to be considered for use as an adjunct to conventional signal detection methods.
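For orientation, the simplest disproportionality statistic, the proportional reporting ratio, is a ratio of report proportions from a 2x2 table; the EBGM05 used in the study additionally shrinks such ratios toward the null with an empirical Bayes model. The counts below are invented.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio.
    a: target drug & event   b: target drug, other events
    c: other drugs & event   d: other drugs, other events"""
    return (a / (a + b)) / (c / (c + d))

# Invented counts: 20 of 520 reports for the drug mention the event,
# versus 40 of 10040 reports for all other products in the database.
signal = prr(20, 500, 40, 10000)
print(round(signal, 2))
```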
Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias
2017-07-15
Recent technological advances have allowed the development of portable functional Near-Infrared Spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on general linear model (GLM) least-squares fit analysis for the Automatic IDentification of functional Events (AIDE) directly from real-world fNIRS neuroimaging data. In order to investigate the accuracy and feasibility of this method, as a proof of principle we applied the algorithm to (i) synthetic fNIRS data simulating block-, event-related, and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97%, and 91% for the simulated block-, event-related, and mixed-design experiments, respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the experimentally measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two conditions, respectively (condition 1, social: interact with a person; condition 2, non-social: interact with an object). AIDE managed to recover 3/4 events and 3/6 events for conditions 1 and 2, respectively. The identified functional events were then matched against behavioural data from the video recordings of the movements and actions of the participant.
Our results suggest that "brain-first" rather than "behaviour-first" analysis is possible and that the present method can provide a novel solution to analyse real-world fNIRS data, filling the gap between real-life testing and functional neuroimaging. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
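The essence of recovering event onsets from the measured signal can be caricatured as a sliding model fit: regress the trace on a candidate event regressor at each possible onset and keep the best-fitting onset. AIDE proper uses a GLM with haemodynamic response functions and handles multiple events; the boxcar regressor, function name, and data here are invented.

```python
def best_onset(signal, duration):
    """Return the onset whose boxcar regressor best fits the trace
    (simple least-squares regression, closed form for one regressor)."""
    n = len(signal)
    best = (float("inf"), None)
    for onset in range(n - duration):
        box = [1.0 if onset <= i < onset + duration else 0.0 for i in range(n)]
        mx = sum(box) / n
        my = sum(signal) / n
        sxx = sum((b - mx) ** 2 for b in box)
        sxy = sum((b - mx) * (y - my) for b, y in zip(box, signal))
        beta = sxy / sxx
        resid = sum((y - my - beta * (b - mx)) ** 2
                    for b, y in zip(box, signal))
        if resid < best[0]:
            best = (resid, onset)
    return best[1]

# Noise-free toy trace with an event starting at sample 12
signal = [0.0] * 12 + [1.0] * 8 + [0.0] * 20
print(best_onset(signal, duration=8))
```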
NASA Astrophysics Data System (ADS)
Bialas, A.
2004-02-01
It is shown that the method of eliminating the statistical fluctuations from event-by-event analysis proposed recently by Fu and Liu can be rewritten in a compact form involving the generalized factorial moments.
LHCb trigger streams optimization
NASA Astrophysics Data System (ADS)
Derkach, D.; Kazeev, N.; Neychev, R.; Panin, A.; Trofimov, I.; Ustyuzhanin, A.; Vesterinen, M.
2017-10-01
The LHCb experiment stores around 10^11 collision events per year. A typical physics analysis deals with a final sample of up to 10^7 events. Event preselection algorithms (lines) are used for data reduction. Since the data are stored in a format that requires sequential access, the lines are grouped into several output file streams in order to increase the efficiency of user analysis jobs that read these data. The scheme's efficiency heavily depends on the stream composition. By putting similar lines together and balancing the stream sizes it is possible to reduce the overhead. We present a method for finding an optimal stream composition. The method is applied to a part of the LHCb data (Turbo stream) at the stage where it is prepared for user physics analysis. This results in an expected improvement of 15% in the speed of user analysis jobs, and will be applied to data to be recorded in 2017.
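A toy sketch of the stream-composition idea: treating each preselection line as a set of event ids, lines with large overlap are merged into the same output stream, so a job reading one line's events skips fewer unrelated events. This greedy pairwise merging by Jaccard similarity is an illustration only; the actual optimization described in the paper also balances stream sizes.

```python
def jaccard(a, b):
    return len(a & b) / len(a | b)

def merge_streams(lines, n_streams):
    """Greedily merge the most similar pair of streams until n_streams remain."""
    streams = [set(s) for s in lines]
    while len(streams) > n_streams:
        i, j = max(((i, j) for i in range(len(streams))
                    for j in range(i + 1, len(streams))),
                   key=lambda ij: jaccard(streams[ij[0]], streams[ij[1]]))
        streams[i] |= streams.pop(j)
    return streams

# Invented line -> event-id sets
lines = [{1, 2, 3}, {2, 3, 4}, {10, 11}, {11, 12}, {20}]
merged = sorted(sorted(s) for s in merge_streams(lines, 3))
print(merged)
```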
NASA Astrophysics Data System (ADS)
Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko
2002-05-01
The target of our study is to analyze the level of necessary security requirements, to search for suitable security measures, and to optimize the distribution of security across every portion of the medical practice. Where possible, quantitative expression is introduced to enable simplified follow-up security procedures and easy evaluation of security outcomes or results. Using fault tree analysis (FTA), system analysis showed that subdividing system elements into detailed groups results in a much more accurate analysis. Such subdivided composition factors depend greatly on the behavior of staff, interactive terminal devices, the kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security, and proposed security measures for each medical information system together with the basic events and combinations of events that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, the number of elements for each composition factor, and potential security measures were identified. Methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors over basic events.
ERIC Educational Resources Information Center
Ball, Anna L.; Bowling, Amanda M.; Sharpless, Justin D.
2016-01-01
School Based Agricultural Education (SBAE) teachers can use coaching behaviors, along with their agricultural content knowledge to help their Career Development Event (CDE) teams succeed. This mixed methods, collective case study observed three SBAE teachers preparing multiple CDEs throughout the CDE season. The teachers observed had a previous…
Nevo, Daniel; Nishihara, Reiko; Ogino, Shuji; Wang, Molin
2017-08-04
In the analysis of time-to-event data with multiple causes using a competing risks Cox model, often the cause of failure is unknown for some of the cases. The probability of a missing cause is typically assumed to be independent of the cause given the time of the event and covariates measured before the event occurred. In practice, however, the underlying missing-at-random assumption does not necessarily hold. Motivated by colorectal cancer molecular pathological epidemiology analysis, we develop a method to conduct valid analysis when additional auxiliary variables are available for cases only. We consider a weaker missing-at-random assumption, with missing pattern depending on the observed quantities, which include the auxiliary covariates. We use an informative likelihood approach that will yield consistent estimates even when the underlying model for missing cause of failure is misspecified. The superiority of our method over naive methods in finite samples is demonstrated by simulation study results. We illustrate the use of our method in an analysis of colorectal cancer data from the Nurses' Health Study cohort, where, apparently, the traditional missing-at-random assumption fails to hold.
Using Boosting Decision Trees in Gravitational Wave Searches triggered by Gamma-ray Bursts
NASA Astrophysics Data System (ADS)
Zuraw, Sarah; LIGO Collaboration
2015-04-01
The search for gravitational wave bursts requires the ability to distinguish weak signals from background detector noise. Gravitational wave bursts are characterized by their transient nature, making them particularly difficult to detect as they are similar to non-Gaussian noise fluctuations in the detector. The Boosted Decision Tree method is a powerful machine learning algorithm which uses Multivariate Analysis techniques to explore high-dimensional data sets in order to distinguish between gravitational wave signal and background detector noise. It does so by training with known noise events and simulated gravitational wave events. The method is tested using waveform models and compared with the performance of the standard gravitational wave burst search pipeline for Gamma-ray Bursts. It is shown that the method is able to effectively distinguish between signal and background events under a variety of conditions and over multiple Gamma-ray Burst events. This example demonstrates the usefulness and robustness of the Boosted Decision Tree and Multivariate Analysis techniques as a detection method for gravitational wave bursts. LIGO, UMass, PREP, NEGAP.
Chu, Catherine. J.; Chan, Arthur; Song, Dan; Staley, Kevin J.; Stufflebeam, Steven M.; Kramer, Mark A.
2017-01-01
Background: High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time-consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. New Method: The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation, approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. Results: We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. Comparison with Existing Method: The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Conclusions: Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. PMID:27988323
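The detector's two-step structure can be sketched as below: flag intervals where a moving RMS of the (assumed already band-passed) trace is elevated, then keep candidates with at least three oscillation cycles, counted by zero crossings. The window, threshold, and test trace are invented, and the paper's full seven-feature set (sinusoidal shape, co-occurring discharge) is not reproduced.

```python
import math

def candidates(x, win, thresh):
    """Step 1: intervals where a moving RMS exceeds a threshold."""
    rms = [math.sqrt(sum(v * v for v in x[i:i + win]) / win)
           for i in range(len(x) - win + 1)]
    intervals, start = [], None
    for i, r in enumerate(rms):
        if r > thresh and start is None:
            start = i
        elif r <= thresh and start is not None:
            intervals.append((start, i + win))
            start = None
    if start is not None:
        intervals.append((start, len(x)))
    return intervals

def n_cycles(seg):
    """Step 2 (partial): oscillation cycles counted via zero crossings."""
    crossings = sum(1 for a, b in zip(seg, seg[1:]) if a * b < 0)
    return crossings / 2.0

# Quiet trace with a 6-cycle ripple in the middle (arbitrary units)
ripple = [math.sin(2 * math.pi * 6 * (t + 0.5) / 60) for t in range(60)]
x = [0.01] * 50 + ripple + [0.01] * 50
events = [(s, e) for s, e in candidates(x, win=10, thresh=0.2)
          if n_cycles(x[s:e]) >= 3]
print(events)
```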
Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.
Lee, Seong-Hun
2014-11-01
About 80 biotech crop events have been approved through safety assessment in Korea. They are controlled by genetically modified organism (GMO) and living modified organism (LMO) labelling systems. DNA-based detection has been used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for 12 events from the four crops: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip are efficient detection methods for screening, gene-specific and event-specific analysis of biotech crops, saving workload and time. © 2014 Society of Chemical Industry.
Pappachan, Bobby K; Caesarendra, Wahyu; Tjahjowidodo, Tegoeh; Wijaya, Tomi
2017-01-01
Process monitoring using indirect methods relies on the use of sensors. Using sensors to acquire vital process-related information also presents the problem of big-data management and analysis. Due to uncertainty in the frequency of events occurring, a higher sampling rate is often used in real-time monitoring applications to increase the chances of capturing and understanding all possible events related to the process. Advanced signal processing methods are used to further decipher meaningful information from the acquired data. In this research work, the power spectral density (PSD) of sensor data acquired at sampling rates between 40–51.2 kHz was calculated, and the correlation between PSD and the completed number of cycles/passes is presented. Here, progress in the number of cycles/passes is the event this research work intends to classify, and the algorithm used to compute the PSD is Welch's estimate method. A comparison between Welch's estimate method and statistical methods is also discussed. A clear correlation was observed when using Welch's estimate to classify the number of cycles/passes. The paper also succeeds in separating the vibration signal generated by the spindle from the vibration signal acquired during the finishing process. PMID:28556809
Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo
2017-07-01
This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
Formal analysis of imprecise system requirements with Event-B.
Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan
2016-01-01
Formal analysis of functional properties of system requirements needs precise descriptions. However, the stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. The Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for specification and verification of such requirements. First, we introduce a representation of imprecise requirements in the set theory. Then we make use of Event-B refinement providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of Crane Controller.
Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis
NASA Astrophysics Data System (ADS)
Kumar, Ranjan; Ghosh, Achyuta Krishna
2017-04-01
Mine systems such as the ventilation system, strata support system, and flame-proof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is preferably done during the planning and design stage. However, existing safety analysis methods do not explicitly handle accident initiation and progression in mine systems. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top-hazard probability is introduced for identifying the system failure probability, and redundancy is allocated to the system either at the component or the system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
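A minimal sketch of how the FT and ET calculations combine, assuming independent basic events: the fault tree yields a top-hazard probability, which then serves as a branch probability multiplying an initiating-event frequency in the event tree. The gate structure and all numbers are invented and are not from the paper's case study.

```python
def ft_prob(gate):
    """Evaluate a fault tree given as nested ('AND'|'OR', children) tuples,
    with floats as basic-event probabilities (independence assumed)."""
    if isinstance(gate, float):
        return gate
    op, children = gate
    ps = [ft_prob(c) for c in children]
    out = 1.0
    if op == "AND":
        for p in ps:
            out *= p
        return out
    for p in ps:               # OR of independent events: 1 - prod(1 - p)
        out *= 1.0 - p
    return 1.0 - out

# Top hazard: methane accumulation AND ignition source present
top = ("AND", [("OR", [0.05, 0.02]),   # ventilation failure OR sensor failure
               0.1])                    # ignition source present
p_top = ft_prob(top)

# Event tree branch: initiating-event frequency x P(barrier fails)
f_initiator = 0.5                       # initiating events per year
f_accident = f_initiator * p_top        # accident-sequence frequency
print(round(p_top, 5), round(f_accident, 5))
```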
NASA Astrophysics Data System (ADS)
Haruki, W.; Iseri, Y.; Takegawa, S.; Sasaki, O.; Yoshikawa, S.; Kanae, S.
2016-12-01
Natural disasters caused by heavy rainfall occur every year in Japan, and effective countermeasures against such events are important. In 2015, a catastrophic flood occurred in the Kinu river basin, located in the northern part of the Kanto region. The remarkable feature of this flood event lay not only in the intensity of rainfall but also in the spatial characteristics of the heavy rainfall area. The flood was caused by the continuous overlapping of a heavy rainfall area over the Kinu river basin, suggesting that considering spatial extent is quite important when assessing the impacts of heavy rainfall events. However, the spatial extent of heavy rainfall events cannot be properly captured by rain gauges at individual observation points. Radar observations, on the other hand, provide spatially and temporally high-resolution rainfall data, which are useful for capturing the characteristics of heavy rainfall events. For effective long-term countermeasures, extreme heavy rainfall scenarios that account for rainfall area and distribution are required. In this study, a new method for generating extreme heavy rainfall events using Monte Carlo simulation has been developed in order to produce such scenarios. This study used AMeDAS analyzed precipitation data, a high-resolution gridded precipitation dataset produced by the Japan Meteorological Agency. Depth-area-duration (DAD) analysis was conducted to extract past extreme rainfall events, considering temporal and spatial scales. In the Monte Carlo simulation, extreme rainfall events are generated based on the events extracted by the DAD analysis. Extreme heavy rainfall events are generated in a specific region of Japan, and the types of generated events can be changed by varying the parameters. For the application of this method, we focused on the Kanto region of Japan. As a result, 3000 years of rainfall data were generated.
The 100-year probable rainfall and the return period of the 2015 Kinu River flood are obtained using the generated data. We compared the 100-year probable rainfall calculated by this method with that calculated by traditional methods. The newly developed method enables us to generate extreme rainfall events considering temporal and spatial scales and to produce extreme rainfall scenarios.
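One typical downstream use of such a generated long series, estimating the T-year rainfall, can be sketched with a method-of-moments Gumbel fit to annual maxima. The synthetic data below stand in for the generated 3000-year series, and the Gumbel distributional choice is an assumption for illustration, not something the abstract specifies.

```python
import math
import random
import statistics

random.seed(0)
# Synthetic annual maxima (mm), drawn from Gumbel(loc=200, scale=40)
annual_max = [200 - 40 * math.log(-math.log(random.random()))
              for _ in range(3000)]

mean = statistics.fmean(annual_max)
sd = statistics.stdev(annual_max)
scale = sd * math.sqrt(6) / math.pi     # method-of-moments estimates
loc = mean - 0.5772 * scale             # 0.5772: Euler-Mascheroni constant

T = 100.0                               # return period in years
x_T = loc - scale * math.log(-math.log(1 - 1 / T))
print(round(x_T, 1))                    # ~384 mm for these true parameters
```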
Benefit-risk analysis : a brief review and proposed quantitative approaches.
Holden, William L
2003-01-01
Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
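The decision rule stated in the abstract (treatment is worthwhile when the NNT is below the relative-value adjusted NNH) reduces to simple arithmetic once the event probabilities and a utility weight are in hand. The numbers below are invented, and the exact form of the relative-value adjustment is an assumption for illustration, not Holden's full method.

```python
def nnt(p_control, p_treated):
    """Number needed to treat = 1 / absolute risk reduction."""
    return 1.0 / (p_control - p_treated)

def rv_nnh(p_ae_treated, p_ae_control, relative_value):
    """Relative-value adjusted number needed to harm: the NNH scaled by the
    utility weight (0..1) patients assign to the adverse event (assumed form)."""
    return 1.0 / ((p_ae_treated - p_ae_control) * relative_value)

NNT = nnt(0.40, 0.25)            # 15% absolute benefit -> NNT ~ 6.7
NNH = rv_nnh(0.08, 0.03, 0.5)    # 5% excess adverse events, half-weighted
print(round(NNT, 1), round(NNH, 1), NNT < NNH)
```

Here NNT < RV-NNH, so under the stated rule patients would accept the treatment's risk for its benefit.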
Markov chains and semi-Markov models in time-to-event analysis.
Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J
2013-10-25
A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
Comparing data mining methods on the VAERS database.
Banks, David; Woo, Emily Jane; Burwen, Dale R; Perucci, Phil; Braun, M Miles; Ball, Robert
2005-09-01
Data mining may enhance traditional surveillance of vaccine adverse events by identifying events that are reported more commonly after administering one vaccine than other vaccines. Data mining methods find signals as the proportion of times a condition or group of conditions is reported soon after the administration of a vaccine; thus it is a relative proportion compared across vaccines, and not an absolute rate for the condition. The Vaccine Adverse Event Reporting System (VAERS) contains approximately 150 000 reports of adverse events that are possibly associated with vaccine administration. We studied four data mining techniques: empirical Bayes geometric mean (EBGM), lower-bound of the EBGM's 90% confidence interval (EB05), proportional reporting ratio (PRR), and screened PRR (SPRR). We applied these to the VAERS database and compared the agreement among methods and other performance properties, particularly focusing on the vaccine-event combinations with the highest numerical scores in the various methods. The vaccine-event combinations with the highest numerical scores varied substantially among the methods. Not all combinations representing known associations appeared in the top 100 vaccine-event pairs for all methods. The four methods differ in their ranking of vaccine-COSTART pairs. A given method may be superior in certain situations but inferior in others. This paper examines the statistical relationships among the four estimators. Determining which method is best for public health will require additional analysis that focuses on the true alarm and false alarm rates using known vaccine-event associations. Evaluating the properties of these data mining methods will help determine the value of such methods in vaccine safety surveillance.
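Of the four techniques, the PRR has the simplest closed form on the standard 2x2 disproportionality table. A sketch with made-up counts (not VAERS values):

```python
# Proportional reporting ratio (PRR) from a 2x2 disproportionality table.
# a: target vaccine & target event, b: target vaccine & other events,
# c: other vaccines & target event, d: other vaccines & other events.
def prr(a, b, c, d):
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: the target event makes up 4% of reports for the
# target vaccine but only ~1% of reports for all other vaccines.
print(round(prr(a=20, b=480, c=100, d=9400), 2))  # → 3.8
```

A PRR near 1 means the event is reported no more often for the target vaccine than for others; screening (SPRR) additionally requires minimum counts and a significance criterion.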
A Hierarchical Convolutional Neural Network for vesicle fusion event classification.
Li, Haohan; Mao, Yunxiang; Yin, Zhaozheng; Xu, Yingke
2017-09-01
Quantitative analysis of vesicle exocytosis and classification of different modes of vesicle fusion from fluorescence microscopy are of primary importance for biomedical research. In this paper, we propose a novel Hierarchical Convolutional Neural Network (HCNN) method to automatically identify vesicle fusion events in time-lapse Total Internal Reflection Fluorescence Microscopy (TIRFM) image sequences. First, a detection and tracking method is developed to extract image patch sequences containing potential fusion events. Then, a Gaussian Mixture Model (GMM) is applied to each image patch of the patch sequence, with outliers rejected for robust Gaussian fitting. By utilizing the high-level time-series intensity-change features introduced by the GMM and the visual appearance features embedded in key moments of the fusion process, the proposed HCNN architecture is able to classify each candidate patch sequence into three classes: full fusion event, partial fusion event, and non-fusion event. Finally, we validate the performance of our method on 9 challenging datasets annotated by cell biologists; our method outperforms three previous methods.
SEISMIC SOURCE SCALING AND DISCRIMINATION IN DIVERSE TECTONIC ENVIRONMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, R E; Mayeda, K; Walter, W R
2007-07-10
The objectives of this study are to improve low-magnitude regional seismic discrimination by performing a thorough investigation of earthquake source scaling using diverse, high-quality datasets from varied tectonic regions. Local-to-regional high-frequency discrimination requires an estimate of how earthquakes scale with size. Walter and Taylor (2002) developed the MDAC (Magnitude and Distance Amplitude Corrections) method to empirically account for these effects through regional calibration. The accuracy of these corrections has a direct impact on our ability to identify clandestine explosions in the broad regional areas characterized by low seismicity. Unfortunately, our knowledge of source scaling at small magnitudes (i.e., m_b < ~4.0) is poorly resolved. It is not clear whether different studies obtain contradictory results because they analyze different earthquakes, or because they use different methods. Even in regions that are well studied, such as test sites or areas of high seismicity, we still rely on empirical scaling relations derived from studies of inter-plate regions half-way around the world. We investigate earthquake sources and scaling from different tectonic settings, comparing direct and coda wave analysis methods. We begin by developing and improving the two different methods, and in future years we will apply them both to each set of earthquakes. Analysis of locally recorded, direct waves from events is intuitively the simplest way of obtaining accurate source parameters, as these waves have been least affected by travel through the earth. But only a limited number of earthquakes are recorded locally by enough stations to give good azimuthal coverage and have very closely located smaller earthquakes that can be used as an empirical Green's function (EGF) to remove path effects.
In contrast, coda waves average radiation from all directions, so single-station records should be adequate, and previous work suggests that the requirements for the EGF event are much less stringent. We can study more earthquakes using the coda-wave methods, while using direct-wave methods for the best-recorded subset of events so as to investigate any differences between the results of the two approaches. Finding 'perfect' EGF events for direct wave analysis is difficult, as is ascertaining the quality of a particular EGF event. We develop a multi-taper method to obtain time-domain source-time functions by frequency division. If an earthquake and EGF event pair produce a clear, time-domain source pulse, then we accept the EGF event. We then model the spectral (amplitude) ratio to determine source parameters from both direct P and S waves. We use the well-recorded sequence of aftershocks of the M5 Au Sable Forks, NY, earthquake to test the method and also to obtain some of the first accurate source parameters for small earthquakes in eastern North America. We find that the stress drops are high, confirming previous work suggesting that intraplate continental earthquakes have higher stress drops than events at plate boundaries. We simplify and improve the coda wave analysis method by calculating spectral ratios between different-sized earthquakes. We first compare spectral ratio performance between local and near-regional S and coda waves in the San Francisco Bay region for moderate-sized events. The average spectral ratio standard deviations using coda are ~0.05 to 0.12, roughly a factor of 3 smaller than direct S-waves for 0.2 < f < 15.0 Hz. Also, direct wave analysis requires collocated pairs of earthquakes, whereas the event pairs (Green's function and target events) can be separated by ~25 km for coda amplitudes without any appreciable degradation.
We then apply the coda spectral ratio method to the 1999 Hector Mine mainshock (Mw 7.0, Mojave Desert) and its larger aftershocks. We observe a clear departure from self-similarity, consistent with previous studies using similar regional datasets.
NASA Technical Reports Server (NTRS)
Acker, James G.; Uz, Stephanie Schollaert; Shen, Suhung; Leptoukh, Gregory G.
2010-01-01
Application of appropriate spatial averaging techniques is crucial to the correct evaluation of ocean color radiometric data, due to the common log-normal or mixed log-normal distribution of these data. The averaging method is particularly important for data acquired in coastal regions. Its effect was markedly demonstrated for a precipitation-driven event on the U.S. Northeast coast in October-November 2005, which exported high concentrations of riverine colored dissolved organic matter (CDOM) to New York and New Jersey coastal waters over a period of several days. Use of the arithmetic mean created an inaccurate representation of the magnitude of this event in SeaWiFS global mapped chl a data, causing it to be visualized as a very large chl a anomaly. The apparent anomaly was enhanced by the known incomplete discrimination of CDOM and phytoplankton chlorophyll in SeaWiFS data; other data sources enable an improved characterization. Analysis using the geometric mean did not indicate this event to be statistically anomalous. Our results indicate the need to provide the geometric mean averaging method for ocean color radiometric data in the Goddard Earth Sciences DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni).
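The contrast between the two averaging methods is easy to reproduce: for log-normally distributed data, a few extreme values inflate the arithmetic mean, while the geometric mean tracks the distribution's median. A sketch with synthetic (not SeaWiFS) values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Chlorophyll-like data: log-normally distributed, as is typical for
# ocean color radiometry (synthetic values, not SeaWiFS data).
chl = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)

arith = chl.mean()
geom = np.exp(np.log(chl).mean())   # geometric mean = exp(mean of logs)

# The heavy upper tail inflates the arithmetic mean well above the
# geometric mean, which sits near the median of the distribution.
print(round(arith, 2), round(geom, 2))
```

For this distribution the arithmetic mean converges to exp(0.5) ≈ 1.65 while the geometric mean converges to 1, so an event that looks anomalous under one average need not under the other.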
ERIC Educational Resources Information Center
Brookhart, Susan M.; And Others
1997-01-01
Process Analysis is described as a method for identifying and measuring the probability of events that could cause the failure of a program, resulting in a cause-and-effect tree structure of events. The method is illustrated through the evaluation of a pilot instructional program at an elementary school. (SLD)
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications, beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required by the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude); (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events); (3) the development of bounding critical scenarios for each class, based on the solution of an optimization problem; and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method over traditional PSHA are (1) its flexibility, allowing different probabilistic models of earthquake occurrence as well as the incorporation of advanced physical models into the analysis; (2) its mathematically consistent treatment of uncertainties; and (3) its explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
The method was applied to evaluate the risk of production-interruption losses for a nuclear power plant over its residual lifetime.
Parameterization of synoptic weather systems in the South Atlantic Bight for modeling applications
NASA Astrophysics Data System (ADS)
Wu, Xiaodong; Voulgaris, George; Kumar, Nirnimesh
2017-10-01
An event-based, long-term climatological analysis is presented that allows the creation of atmospheric forcing for the coastal ocean that preserves both the frequency of occurrence and the time history of events. An algorithm is developed that identifies individual storm events (cold fronts, warm fronts, and tropical storms) from meteorological records. The algorithm has been applied to a location along the South Atlantic Bight, off South Carolina, an area prone to cyclogenesis and passages of atmospheric fronts. Comparison against daily weather maps confirms that the algorithm is efficient in identifying cold fronts and warm fronts, while the identification of tropical storms is less successful. The average state of the storm events and their variability are represented by the temporal evolution of atmospheric pressure, air temperature, wind velocity, and wave directional spectral energy. The use of uncorrected, algorithm-detected events provides climatologies that show little deviation from those derived using corrected events. The effectiveness of this analysis method is further verified by numerically simulating the wave conditions driven by the characteristic wind forcing and comparing the results with the wave climatology corresponding to each storm type. The high level of consistency found in the comparison indicates that this analysis method can be used to accurately characterize event-based oceanic processes and long-term storm-induced morphodynamic processes on wind-dominated coasts.
NASA Astrophysics Data System (ADS)
Solano, ErickaAlinne; Hjorleifsdottir, Vala; Perez-Campos, Xyoli
2015-04-01
A large subset of seismic events do not have impulsive arrivals: low-frequency events in volcanoes, earthquakes in the shallow part of the subduction interface and further down-dip from the traditional seismogenic zone, glacial events, volcanic and non-volcanic tremor, and landslides. A suite of methods can be used to detect these non-impulsive events. One of these methods is full-waveform detection based on time-reversal methods (Solano et al., submitted to GJI). The method uses continuous observed seismograms, together with Green's functions and moment tensor responses calculated for an arbitrary 3D structure. This method was applied to the 2012 Ometepec-Pinotepa Nacional earthquake sequence in Guerrero, Mexico. During the time span of the study, we encountered three previously unknown events. One of these events was an impulsive earthquake in the Ometepec area that has clear arrivals on only three stations and was therefore not located and reported by the SSN. The other two events, very depleted in high frequencies, occurred far outside the search area. A rough estimate places these two events on the portion of the East Pacific Rise around 9°N. These two events are detected despite their distance from the search area, thanks to favorable move-out on the array of the Mexican National Seismological Service (SSN) network. We are expanding the study area to the EPR and to a longer period of time, with the objective of finding more events in that region. We will present an analysis of the newly detected events, as well as any further findings, at the meeting.
Du, Xiuquan; Hu, Changlin; Yao, Yu; Sun, Shiwei; Zhang, Yanping
2017-12-12
In bioinformatics, exon skipping (ES) event prediction is an essential part of alternative splicing (AS) event analysis. Although many methods have been developed to predict ES events, no definitive solution has been found. In this study, given the limitations of machine learning algorithms applied to RNA-Seq data or genome sequences alone, a new feature set, called RS (RNA-Seq and sequence) features, was constructed. These features include RNA-Seq features derived from RNA-Seq data and sequence features derived from genome sequences. We propose a novel Rotation Forest classifier to predict ES events with the RS features (RotaF-RSES). To validate the efficacy of RotaF-RSES, a dataset from two human tissues was used, and RotaF-RSES achieved an accuracy of 98.4%, a specificity of 99.2%, a sensitivity of 94.1%, and an area under the curve (AUC) of 98.6%. Compared to the other available methods, these results indicate that RotaF-RSES is efficient and can predict ES events with RS features.
A comparative critical study between FMEA and FTA risk analysis methods
NASA Astrophysics Data System (ADS)
Cristea, G.; Constantinescu, DM
2017-10-01
Today an overwhelming number of different risk analysis techniques is in use, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industries are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. FMEA is used to capture potential failures/risks and impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method: a general system state is decomposed into chains of more basic component events. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure that can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, which is why this is often not done. As a consequence, possible failure modes may not be identified. To address these shortcomings, a combination of FTA and FMEA is proposed.
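The RPN prioritization described above is simply a product of three 1-10 ratings. A sketch with hypothetical failure modes and ratings (severity, occurrence, detection):

```python
# Risk Priority Number for FMEA: severity x occurrence x detection,
# each rated 1-10, giving an RPN in the range 1-1000.
# The failure modes and ratings below are made up for illustration.
failure_modes = {
    "seal leak":       (7, 5, 4),
    "sensor drift":    (5, 6, 7),
    "connector break": (9, 2, 3),
}

rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)
print(ranked)  # highest-priority failure mode first
```

Note how the multiplicative scale can rank a frequent, hard-to-detect moderate failure ("sensor drift", RPN 210) above a severe but rare one ("connector break", RPN 54), one of the known criticisms of raw RPN ranking.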
A Bayes linear Bayes method for estimation of correlated event rates.
Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim
2013-12-01
Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates.
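The paper's Bayes linear Bayes model is not reproduced here, but the flavor of its empirical, method-of-moments alternative can be sketched with a simpler conjugate gamma-Poisson shrinkage estimator. All counts and exposures below are made up, and correlation between rates is ignored:

```python
import numpy as np

# Empirical-Bayes (gamma-Poisson) estimation of Poisson event rates,
# with the gamma prior fit by the method of moments. This is a
# simplification in the spirit of the paper's empirical alternative,
# not its Bayes linear Bayes model.
counts = np.array([3, 0, 5, 2, 1, 4])          # observed event counts
exposure = np.array([2.0, 1.0, 3.0, 2.0, 1.5, 2.5])  # exposure times

raw = counts / exposure                        # raw per-unit rates
m, v = raw.mean(), raw.var(ddof=1)
beta = m / v                                   # gamma rate parameter
alpha = m * beta                               # gamma shape parameter

# Conjugate update: posterior mean shrinks each raw rate toward the
# pooled mean, with more shrinkage for smaller exposures.
post = (alpha + counts) / (beta + exposure)
print(np.round(post, 3))
```

The zero-count unit gets a strictly positive estimate, and every posterior mean lies between its raw rate and the pooled mean, which is the shrinkage behavior such models are used for.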
NASA Astrophysics Data System (ADS)
Sciotto, M.; Rowe, C. A.; Cannata, A.; Arrowsmith, S.; Privitera, E.; Gresta, S.
2011-12-01
The current eruption of Mount Etna, which began in January 2011, has produced numerous energetic episodes of lava fountaining, which have been recorded by the INGV seismic and acoustic sensors located on and around the volcano. The source of these events was the pit crater on the east flank of the Southeast Crater of Etna. Simultaneously, small levels of activity were noted in the Bocca Nuova as well, prior to its lava fountaining activity. We will present an analysis of seismic and acoustic signals related to the 2011 activity, in which we apply the method of subspace detection to determine whether the source exhibits a temporal evolution within or between fountaining events, or otherwise produces repeating, classifiable events through the continuous explosive degassing. We will examine not only the raw waveforms but also spectral variations in time, as well as time-varying statistical functions such as signal skewness and kurtosis. These results will be compared to straightforward cross-correlation analysis. In addition to its classification performance, the subspace method promises to outperform standard STA/LTA methods for real-time event detection in cases where similar events can be expected.
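A subspace detector of the kind referenced here projects sliding data windows onto a low-rank basis learned from a library of similar events. A sketch with synthetic waveforms; the rank, library size, and detection statistic are assumptions for illustration, not the authors' settings:

```python
import numpy as np

rng = np.random.default_rng(3)

# Subspace detection sketch: build an orthonormal basis from a library
# of similar event waveforms via SVD, then scan a continuous record
# with a fraction-of-energy statistic. Waveforms are synthetic
# stand-ins for the Etna recordings.
n = 200
template = np.sin(np.linspace(0, 6 * np.pi, n)) * np.hanning(n)
library = np.stack([template + 0.2 * rng.normal(size=n) for _ in range(8)])

U, _, _ = np.linalg.svd(library.T, full_matrices=False)
basis = U[:, :2]                        # rank-2 signal subspace

record = rng.normal(scale=0.3, size=2000)
record[1200:1400] += template           # hidden event at sample 1200

def stat(x):
    # fraction of window energy captured by the signal subspace
    return np.sum((basis.T @ x) ** 2) / np.sum(x ** 2)

scores = np.array([stat(record[i:i + n]) for i in range(record.size - n)])
print(int(scores.argmax()))             # detection near sample 1200
```

Unlike a single-template cross-correlator, the basis spans the variability of the library, which is what lets the detector classify repeating but non-identical events.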
Using social media for disaster emergency management
NASA Astrophysics Data System (ADS)
Wang, Y. D.; Wang, T.; Ye, X. Y.; Zhu, J. Q.; Lee, J.
2016-06-01
Social media have become a universal phenomenon in our society (Wang et al., 2012). As a new data source, social media have been widely used in knowledge discovery in fields related to health (Jackson et al., 2014), human behaviour (Lee, 2014), social influence (Hong, 2013), and market analysis (Hanna et al., 2011). In this paper, we report a case study of the 2012 Beijing Rainstorm to investigate how emergency information was timely distributed using social media during emergency events. We present a classification and location model for social media text streams during emergency events. This model classifies social media text streams based on their topical contents. Integrated with a trend analysis, we show how Sina-Weibo fluctuated during emergency events. Using a spatial statistical analysis method, we found that the distribution patterns of Sina-Weibo were related to the emergency events but varied among different topics. This study helps us to better understand emergency events so that decision-makers can act on emergencies in a timely manner. In addition, this paper presents the tools, methods, and models developed in this study that can be used to work with text streams from social media in the context of disaster management.
NASA Astrophysics Data System (ADS)
Schaefli, B.; Maraun, D.; Holschneider, M.
2007-12-01
Extreme hydrological events are often triggered by exceptional co-variations of the relevant hydrometeorological processes, and in particular by exceptional co-oscillations at various temporal scales. Wavelet and cross-wavelet spectral analysis offer promising time-scale-resolved methods to detect and analyze such exceptional co-oscillations. This paper presents state-of-the-art methods of wavelet spectral analysis, discusses related subtleties, potential pitfalls, and recently developed solutions to overcome them, and shows how wavelet spectral analysis, when combined with a rigorous significance test, can lead to reliable new insights into hydrometeorological processes for real-world applications. The presented methods are applied to detect potentially flood-triggering situations in a high Alpine catchment for which a recent re-estimation of design floods encountered significant problems simulating the observed high flows. For this case study, wavelet spectral analysis of precipitation, temperature, and discharge offers a powerful tool to help detect potentially flood-producing meteorological situations and to distinguish between different types of floods with respect to the prevailing critical hydrometeorological conditions. This opens new perspectives for the analysis of model performance focusing on the occurrence and non-occurrence of different types of high-flow events. Based on the obtained results, the paper summarizes important recommendations for future applications of wavelet spectral analysis in hydrology.
Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A
2017-02-01
High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable.
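The first step of such a detector, flagging intervals of elevated high-frequency power, can be sketched as follows. This is not the authors' algorithm: the crude high-pass filter, window length, and threshold are all simplifying assumptions, and the signal is synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 1000                                  # Hz; synthetic "EEG", not real data
t = np.arange(0, 10, 1 / fs)
x = rng.normal(0, 1, t.size)               # background activity

# Inject a ripple-band burst (120 Hz, 50 ms) at t = 4 s.
burst = (t >= 4.0) & (t < 4.05)
x[burst] += 5 * np.sin(2 * np.pi * 120 * t[burst])

# Step 1 of a two-step detector: flag intervals of elevated
# high-frequency power (crude high-pass = first difference,
# then a 50 ms moving average of the squared signal).
hf = np.diff(x, prepend=x[0])
win = int(0.05 * fs)
power = np.convolve(hf ** 2, np.ones(win) / win, mode="same")

threshold = power.mean() + 4 * power.std()
candidates = t[power > threshold]
print(candidates.min(), candidates.max())  # interval flagged near the 4 s burst
```

The second step of the published method would then score each flagged interval against shape features (sinusoidal oscillation, at least three cycles, co-occurring discharge) before storing it for visual validation.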
[Impact of water pollution risk in water transfer project based on fault tree analysis].
Liu, Jian-Chang; Zhang, Wei; Wang, Li-Min; Li, Dai-Qing; Fan, Xiu-Ying; Deng, Hong-Bing
2009-09-15
Methods to assess water pollution risk for medium-scale water transfer projects are gradually being explored. The event-nature-proportion method was developed to evaluate the probability of a single event. Fault tree analysis, building on the single-event calculations, was employed to evaluate the overall water pollution risk to the channel water body. The results indicate that the risk posed by pollutants from towns and villages along the water transfer route is high, with a probability of 0.373, which would increase pollution in the channel water body by 64.53 mg/L COD, 4.57 mg/L NH4(+)-N, and 0.066 mg/L volatile hydroxybenzene, respectively. The measurement of fault probability on the basis of the proportion method proves useful for assessing water pollution risk under considerable uncertainty.
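The fault-tree combination of single-event probabilities uses the standard independence formulas for OR and AND gates. The inputs below are hypothetical, not the study's values (its 0.373 result is quoted only as context):

```python
# Minimal fault-tree gate arithmetic for independent basic events.
def or_gate(*ps):
    # top event occurs if ANY basic event occurs: 1 - prod(1 - p_i)
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(*ps):
    # top event occurs only if ALL basic events occur: prod(p_i)
    out = 1.0
    for p in ps:
        out *= p
    return out

# Hypothetical single-event probabilities feeding one OR gate:
print(round(or_gate(0.20, 0.15, 0.10), 3))  # → 0.388
```

Nesting these two functions reproduces any coherent fault tree over independent basic events; dependence between events requires the more elaborate treatments discussed elsewhere in this collection.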
Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis
Gong, Xiajing; Hu, Meng
2018-01-01
Additional value can potentially be created by applying big data tools to pharmacometric problems. The performance of machine learning (ML) methods and the Cox regression model was evaluated on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featuring a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by the concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performance of ML-based methods is also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function.
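The concordance index used above to compare methods can be computed directly from its definition: the fraction of comparable pairs whose predicted risk ordering matches the observed event ordering. A minimal O(n^2) sketch with made-up data (ties in event times are not handled):

```python
import numpy as np

# Harrell's concordance index for right-censored time-to-event data.
def c_index(time, event, risk):
    num = den = 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            # pair (i, j) is comparable if i's event is observed
            # strictly before subject j's (possibly censored) time
            if event[i] == 1 and time[i] < time[j]:
                den += 1
                if risk[i] > risk[j]:
                    num += 1          # concordant
                elif risk[i] == risk[j]:
                    num += 0.5        # tied risk scores
    return num / den

time  = np.array([2.0, 4.0, 3.0, 5.0, 1.0])
event = np.array([1,   1,   0,   1,   1])     # 0 = censored
risk  = np.array([0.9, 0.3, 0.5, 0.1, 0.8])  # higher = predicted earlier event

print(c_index(time, event, risk))  # → 0.875
```

A value of 0.5 corresponds to random ordering and 1.0 to perfect discrimination, which is why the index serves as a common yardstick across Cox and ML-based models.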
Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido
2016-09-01
Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed.
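Under the constant cause-specific hazards assumed in the paper, the CIFs have a closed form: CIF_k(t) = h_k / (h_1 + h_2) * (1 - exp(-(h_1 + h_2) t)). A sketch with illustrative (not study-derived) hazards:

```python
import numpy as np

# Cumulative incidence functions under constant cause-specific hazards,
# the paper's working assumption for poorly reported aggregate data.
h1, h2 = 0.10, 0.05          # illustrative hazards per unit time
t = np.linspace(0, 20, 201)

h = h1 + h2                  # all-cause hazard
surv = np.exp(-h * t)        # event-free survival
cif1 = h1 / h * (1 - surv)   # cumulative incidence of cause 1
cif2 = h2 / h * (1 - surv)   # cumulative incidence of cause 2

# The decomposition is coherent: incidences and survival sum to 1.
print(round(cif1[-1] + cif2[-1] + surv[-1], 6))  # = 1.0
```

A CIF ratio between treatment arms would then compare two such curves cause by cause; note that under this model the within-arm ratio cif1/cif2 equals h1/h2 at every time point.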
Interpreting Significant Discrete-Time Periods in Survival Analysis.
ERIC Educational Resources Information Center
Schumacker, Randall E.; Denson, Kathleen B.
Discrete-time survival analysis is a new method for educational researchers to employ when looking at the timing of certain educational events. Previous continuous-time methods do not allow for the flexibility inherent in a discrete-time method. Because both time-invariant and time-varying predictor variables can now be used, the interaction of…
Multi-Phenomenological Analysis of the 12 August 2015 Tianjin, China Chemical Explosion
NASA Astrophysics Data System (ADS)
Pasyanos, M.; Kim, K.; Park, J.; Stump, B. W.; Hayward, C.; Che, I. Y.; Zhao, L.; Myers, S. C.
2016-12-01
We perform a multi-phenomenological analysis of the massive near-surface chemical explosions that occurred in Tianjin, China on 12 August 2015. A recent assessment of these events was performed by Zhao et al. (2016) using local (< 100 km) seismic data. This study considers a regional assessment of the same sequence in the absence of any local data. We provide additional insight by combining regional seismic analysis with the use of infrasound signals and an assessment of the event crater. Event locations using infrasound signals recorded at Korean and IMS arrays are estimated with the Bayesian Infrasonic Source Location (BISL) method (Modrak et al., 2010) and improved with azimuthal corrections using raytracing (Blom and Waxler, 2012) and the Ground-to-Space (G2S) atmospheric models (Drob et al., 2003). The location information provided by the infrasound signals is then merged with the regional seismic arrivals to produce a joint event location. The yields of the events are estimated from seismic and infrasonic observations. A seismic waveform envelope method (Pasyanos et al., 2012), including the free-surface effect (Pasyanos and Ford, 2015), is applied to regional seismic signals, and a waveform inversion method (Kim and Rodgers, 2016) is used for the infrasound signals. Combining the seismic and acoustic signals can provide insight into the energy partitioning and break the tradeoffs between the yield and the depth/height of explosions, resulting in a more robust estimate of event yield. The yield information from the different phenomenologies is combined through the use of likelihood functions.
Photographic Analysis Technique for Assessing External Tank Foam Loss Events
NASA Technical Reports Server (NTRS)
Rieckhoff, T. J.; Covan, M.; OFarrell, J. M.
2001-01-01
A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first concerns the likelihood values of input events, and the second concerns the interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; to deal with uncertainties, probability distributions of the input event likelihoods are assumed instead. These probability distributions are often hard to come by, and even when available they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework for a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods, and a method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
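The crisp, independence-assuming baseline that the article generalizes can be sketched as follows (a minimal illustration, not the authors' fuzzy/evidential formulation; the function names are ours):

```python
def and_gate(probs):
    """Fault-tree AND gate: all basic events occur (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Fault-tree OR gate: at least one basic event occurs (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def event_tree_outcomes(init_freq, branch_probs):
    """Enumerate end states of a simple event tree. branch_probs[i] is the
    success probability of safety function i; each outcome is a tuple of
    success/failure flags paired with its frequency."""
    outcomes = [((), init_freq)]
    for p in branch_probs:
        nxt = []
        for path, f in outcomes:
            nxt.append((path + (True,), f * p))
            nxt.append((path + (False,), f * (1.0 - p)))
        outcomes = nxt
    return outcomes
```

The article's contribution is precisely to replace the crisp `probs` above with fuzzy or evidential quantities and to relax the independence built into these gate formulas.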
Statistical analysis of hydrodynamic cavitation events
NASA Astrophysics Data System (ADS)
Gimenez, G.; Sommer, R.
1980-10-01
The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency follows a normal distribution whose parameters do not evolve over time.
ERIC Educational Resources Information Center
Maunder, Robert G.; Peladeau, Nathalie; Savage, Diane; Lancee, William J.
2010-01-01
Objective: We investigated the prevalence of childhood adversity among healthcare workers and whether such experiences affect responses to adult life stress. Methods: A secondary analysis was conducted of a 2003 study of 176 hospital-based healthcare workers, which surveyed lifetime traumatic events, recent life events, psychological distress, coping,…
Reverse translation of adverse event reports paves the way for de-risking preclinical off-targets.
Maciejewski, Mateusz; Lounkine, Eugen; Whitebread, Steven; Farmer, Pierre; DuMouchel, William; Shoichet, Brian K; Urban, Laszlo
2017-08-08
The Food and Drug Administration Adverse Event Reporting System (FAERS) remains the primary source for post-marketing pharmacovigilance. The system is largely un-curated and unstandardized, and it lacks a method for linking drugs to the chemical structures of their active ingredients, increasing noise and artefactual trends. To address these problems, we mapped drugs to their ingredients and used natural language processing to classify and correlate drug events. Our analysis exposed key idiosyncrasies in FAERS: for example, reports of thalidomide causing a deadly ADR when used against myeloma, a likely result of the disease itself; duplications of the same report, unjustifiably increasing its apparent importance; and correlation of reported ADRs with public events, regulatory announcements, and publications. Comparing the pharmacological, pharmacokinetic, and clinical ADR profiles of methylphenidate, aripiprazole, and risperidone, and of kinase drugs targeting the VEGF receptor, demonstrates how underlying molecular mechanisms can emerge from ADR co-analysis. The precautions and methods we describe may enable investigators to avoid confounding chemistry-based associations and reporting biases in FAERS, and illustrate how comparative analysis of ADRs can reveal underlying mechanisms.
Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari
2002-01-01
The goal of our study is to establish a methodology for analyzing the level of security requirements, searching for suitable security measures, and optimizing the distribution of security measures across every portion of medical practice. Quantitative expression is introduced wherever possible so that security procedures are easy to follow up and security outcomes are easy to evaluate. System analysis by fault tree analysis (FTA) clarified that subdividing system elements in detail contributes to much more accurate analysis; such subdivided composition factors depend strongly on the behavior of staff, interactive terminal devices, kinds of service, and network routes. In conclusion, we found methods to analyze the level of security requirements for each medical information system employing FTA, identifying basic events for each composition factor and combinations of basic events. Methods for searching for suitable security measures were also found: risk factors for each basic event, the number of elements for each composition factor, and candidate security measure elements were identified. Finally, a method to optimize the security measures for each medical information system was proposed, in which the optimum distribution of risk factors in terms of basic events was worked out, making comparison between medical information systems possible.
NASA Astrophysics Data System (ADS)
Folesky, J.; Kummerow, J.; Shapiro, S. A.; Asanuma, H.; Häring, M. O.
2015-12-01
The Empirical Green's Function (EGF) method uses pairs of events with highly similar waveforms and adjacent hypocenters to decompose the influences of source time function, ray path, instrument site, and instrument response. The seismogram of the smaller event is treated as the Green's function, which can then be deconvolved from the other seismogram. The result provides a reconstructed relative source time function (RSTF) of the larger event of that event pair. Comparing the RSTFs at different stations of the observation system yields information on the rupture process of the larger event, based on the directivity effect and on changes in RSTF complexity. The Basel EGS dataset of 2006-2007 consists of about 2800 localized events of magnitudes between 0.0
Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents.
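The paper's intuitionistic trapezoidal treatment is more elaborate, but the core idea of propagating fuzzy basic-event probabilities through a gate can be sketched with ordinary trapezoidal fuzzy numbers (an illustrative simplification assuming independent basic events; the helper names and the crude corner-average defuzzification are ours, not the authors' method):

```python
def fuzzy_or(events):
    """OR gate on trapezoidal fuzzy probabilities (a, b, c, d) with
    a <= b <= c <= d. Because 1 - prod(1 - p) is monotone increasing
    in each p, the four trapezoid corners propagate directly
    (independence of basic events assumed)."""
    corners = []
    for k in range(4):
        prod = 1.0
        for ev in events:
            prod *= (1.0 - ev[k])
        corners.append(1.0 - prod)
    return tuple(corners)

def defuzzify(tz):
    """Crude defuzzification: average of the trapezoid corners."""
    return sum(tz) / 4.0
```

With crisp inputs (all four corners equal) the gate reduces to the classical formula, which is a useful sanity check on the fuzzy extension.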
Application of artificial neural network to fMRI regression analysis.
Misaki, Masaya; Miyauchi, Satoru
2006-01-15
We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN can approximate any continuous function, so the method can detect any fMRI signal that correlates with the corresponding events. Because of the flexible nature of ANNs, fitting to autocorrelation noise is a problem in fMRI analyses; we avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of the ANN over non-parametric methods in detecting parametrically modulated responses: it can detect various types of parametric modulation without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.
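The early-stopping idea used here (monitor held-out error, keep the best parameters, stop once it no longer improves) can be sketched with a far simpler model than the paper's ANN; the function below fits a line by gradient descent, and both the function and its defaults are illustrative, not the authors' procedure:

```python
def fit_with_early_stopping(x_tr, y_tr, x_va, y_va,
                            lr=0.01, patience=20, max_epochs=5000):
    """Least-squares fit y ~ w*x + b by gradient descent, stopping when
    validation error has not improved for `patience` epochs: the same
    early-stopping idea that keeps a flexible model from fitting noise.
    Returns the parameters with the best validation error seen."""
    w, b = 0.0, 0.0
    best = (float("inf"), w, b)
    bad = 0
    n = len(x_tr)
    for _ in range(max_epochs):
        # gradients of mean squared error
        gw = sum((w * x + b - y) * x for x, y in zip(x_tr, y_tr)) / n
        gb = sum((w * x + b - y) for x, y in zip(x_tr, y_tr)) / n
        w -= lr * gw
        b -= lr * gb
        va = sum((w * x + b - y) ** 2 for x, y in zip(x_va, y_va)) / len(x_va)
        if va < best[0] - 1e-12:
            best, bad = (va, w, b), 0
        else:
            bad += 1
            if bad >= patience:
                break  # validation error stopped improving
    return best[1], best[2]
```

The same loop structure applies unchanged when the model is a feed-forward network: only the parameter update step differs.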
Insertion of coherence requests for debugging a multiprocessor
Blumrich, Matthias A.; Salapura, Valentina
2010-02-23
A method and system are disclosed to insert coherence events in a multiprocessor computer system, and to present those coherence events to the processors of the multiprocessor computer system for analysis and debugging purposes. The coherence events are inserted in the computer system by adding one or more special insert registers. By writing into the insert registers, coherence events are inserted in the multiprocessor system as if they were generated by the normal coherence protocol. Once these coherence events are processed, the processing of coherence events can continue in the normal operation mode.
Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics
NASA Astrophysics Data System (ADS)
Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu
2007-11-01
In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammar by learning scene semantics. The framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically, and abnormal behaviors that disobey scene semantics or event grammar rules are detected. By this method, an approach to understanding video scenes is achieved. Furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.
A Bayesian technique for improving the sensitivity of the atmospheric neutrino L/E analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blake, A. S. T.; Chapman, J. D.; Thomson, M. A.
This paper outlines a method for improving the precision of atmospheric neutrino oscillation measurements. One experimental signature for these oscillations is an observed deficit in the rate of νμ charged-current interactions with an oscillatory dependence on Lν/Eν, where Lν is the neutrino propagation distance and Eν is the neutrino energy. For contained-vertex atmospheric neutrino interactions, the Lν/Eν resolution varies significantly from event to event. The precision of the oscillation measurement can be improved by incorporating information on Lν/Eν resolution into the oscillation analysis. In the analysis presented here, a Bayesian technique is used to estimate the Lν/Eν resolution of observed atmospheric neutrinos on an event-by-event basis. By separating the events into bins of Lν/Eν resolution in the oscillation analysis, a significant improvement in oscillation sensitivity can be achieved.
Adaptive Sampling using Support Vector Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Mandelli; C. Smith
2012-11-01
Reliability/safety analysis of stochastic dynamic systems (e.g., nuclear power plants, airplanes, chemical plants) is currently performed through a combination of event trees and fault trees. However, these conventional methods suffer from certain drawbacks: the timing of events is not explicitly modeled; the ordering of events is preset by the analyst; and the modeling of complex accident scenarios is driven by expert judgment. For these reasons, there is currently increasing interest in the development of dynamic PRA methodologies, since they can be used to address the deficiencies of conventional methods listed above.
Dynamic Analysis and Research on Environmental Pollution in China from 1992 to 2014
NASA Astrophysics Data System (ADS)
Sun, Fei; Yuan, Peng; Li, Huiting; Zhang, Moli
2018-01-01
The pattern of development of environmental pollution events in recent years was analyzed from the perspective of statistical analysis of pollution events. Moran's I and the spatial center-of-gravity shift curve of China's environmental emergencies were calculated with ARCGIS software, using global spatial analysis and the spatial center-of-gravity shift method. The results showed that the number of China's environmental pollution events from 1992 to 2014 first grew dynamically and then gradually declined. Environmental pollution events showed a spatially aggregated distribution in 1992-1994, 2001-2006, and 2008-2014, and a spatially random distribution in the remaining years. There were two stages in the movement of China's environmental pollution events: a transition to the southwest from 1992 to 2006 and a transition to the northeast from 2006 to 2014.
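Global Moran's I, the clustering statistic used above, is straightforward to compute; here is a minimal pure-Python sketch (binary contiguity weights assumed, function name ours):

```python
def morans_i(values, weights):
    """Global Moran's I for observations x_i with spatial weights
    w[i][j] (w[i][i] = 0). Positive values indicate spatial clustering
    (aggregation), negative values dispersion, values near the
    expectation -1/(n-1) a random spatial pattern."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    return (n / w_sum) * (num / den)
```

On a chain of four regions, clustered values such as (1, 1, 5, 5) give a positive I, while an alternating pattern such as (1, 5, 1, 5) gives a negative I, matching the aggregated-versus-random interpretation used in the abstract.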
NASA Technical Reports Server (NTRS)
Hejduk, M. D.
2016-01-01
Provide a response to MOWG action item 1410-01: analyze close approaches that have required mission team action on short notice, and determine why these approaches were identified later in the process than most other events. Method: an analysis was performed to determine whether there is any correlation between late-notice event identification and space weather, sparse tracking, or high-drag objects, which would allow preventive action to be taken. Specific late-notice events identified by missions as problematic were then examined to identify root causes and relate them to the correlation analysis.
Simulating Mission Command for Planning and Analysis
2015-06-01
mission plan. 14. SUBJECT TERMS Mission Planning, CPM , PERT, Simulation, DES, Simkit, Triangle Distribution, Critical Path 15. NUMBER OF...Battalion Task Force CO Company CPM Critical Path Method DES Discrete Event Simulation FA BAT Field Artillery Battalion FEL Future Event List FIST...management tools that can be utilized to find the critical path in military projects. These are the Critical Path Method ( CPM ) and the Program Evaluation and
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data with latent cluster-level random effects, which are ignored in the conventional Cox model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, correct type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
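The cluster-bootstrap principle, resampling whole clusters so that within-cluster correlation is carried into every replicate, can be sketched for a simple estimator; this illustrates only the resampling scheme, not the paper's Cox-model implementation, and the function name is ours:

```python
import random

def cluster_bootstrap_se(clusters, estimator, n_boot=500, seed=0):
    """Standard error of `estimator` (a function of a flat list of
    observations) obtained by resampling whole clusters with
    replacement, preserving within-cluster correlation in each
    bootstrap replicate."""
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        sample = []
        for _ in range(len(clusters)):
            sample.extend(rng.choice(clusters))  # draw an entire cluster
        stats.append(estimator(sample))
    m = sum(stats) / n_boot
    var = sum((s - m) ** 2 for s in stats) / (n_boot - 1)
    return var ** 0.5
```

With perfectly correlated observations inside each cluster, this SE is much larger than the naive i.i.d. formula, which is exactly the under-estimation the paper's methods are designed to avoid.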
Scientific Visualization to Study Flux Transfer Events at the Community Coordinated Modeling Center
NASA Technical Reports Server (NTRS)
Rastatter, Lutz; Kuznetsova, Maria M.; Sibeck, David G.; Berrios, David H.
2011-01-01
In this paper we present results of modeling of reconnection at the dayside magnetopause with subsequent development of flux transfer event signatures. The tools used include new methods that have been added to the suite of visualization methods used at the Community Coordinated Modeling Center (CCMC). Flux transfer events result from localized reconnection that connects magnetosheath magnetic field and plasma with magnetospheric field and plasma, producing flux rope structures that span the dayside magnetopause. The onset of flux rope formation and the three-dimensional structure of flux ropes are studied as modeled by high-resolution magnetohydrodynamic simulations of the dayside magnetosphere of the Earth. We show that flux transfer events are complex three-dimensional structures that require modern visualization and analysis techniques. Two suites of visualization methods are presented, and we demonstrate their usefulness through the CCMC web site to the general science user.
Cross-National Analysis of Islamic Fundamentalism
2016-01-20
attitudes, and was fully involved in activities concerning questionnaire design including a new experimental design in the survey, pilot testing, and...possible collaboration with the research design of the panel survey in Tunisia. • Data analysis: Analyses of religious fundamentalism, women’s dress, trust...the Event History Calendar and the best methods to ask about knowledge and experience of past events. The group designed a series of cognitive
Results from the First Two Flights of the Static Computer Memory Integrity Testing Experiment
NASA Technical Reports Server (NTRS)
Hancock, Thomas M., III
1999-01-01
This paper details the scientific objectives, experiment design, data collection method, and post-flight analysis following the first two flights of the Static Computer Memory Integrity Testing (SCMIT) experiment. SCMIT is designed to detect soft-event upsets in passive magnetic memory. A soft-event upset is a change in the logic state of active or passive forms of magnetic memory, commonly referred to as a "bitflip". In its mildest form, a soft-event upset can cause software exceptions, unexpected events, spacecraft safing (ending data collection), or corrupted fault protection and error recovery capabilities. In its most severe form, loss of mission or spacecraft can occur. Analysis after the first flight (in 1991 during STS-40) identified possible soft-event upsets in 25% of the experiment detectors. Post-flight analysis after the second flight (in 1997 on STS-87) failed to find any evidence of soft-event upsets. The SCMIT experiment is currently scheduled for a third flight in December 1999 on STS-101.
NASA Technical Reports Server (NTRS)
Totman, Peter D. (Inventor); Everton, Randy L. (Inventor); Egget, Mark R. (Inventor); Macon, David J. (Inventor)
2007-01-01
A method and apparatus for detecting and determining event characteristics such as, for example, the material failure of a component, in a manner which significantly reduces the amount of data collected. A sensor array, including a plurality of individual sensor elements, is coupled to a programmable logic device (PLD) configured to operate in a passive state and an active state. A triggering event is established such that the PLD records information only upon detection of the occurrence of the triggering event which causes a change in state within one or more of the plurality of sensor elements. Upon the occurrence of the triggering event, the change in state of the one or more sensor elements causes the PLD to record in memory which sensor element detected the event and at what time the event was detected. The PLD may be coupled with a computer for subsequent downloading and analysis of the acquired data.
A Statistical Discrimination Experiment for Eurasian Events Using a Twenty-Seven-Station Network
1980-07-08
to test the effectiveness of a multivariate method of analysis for distinguishing earthquakes from explosions. The data base for the experiment...the weight assigned to each variable whenever a new one is added. Jennrich, R. I. (1977). Stepwise discriminant analysis, in Statistical Methods for
Joint Modeling Approach for Semicompeting Risks Data with Missing Nonterminal Event Status
Hu, Chen; Tsodikov, Alex
2014-01-01
Semicompeting risks data, where a subject may experience sequential non-terminal and terminal events, and the terminal event may censor the non-terminal event but not vice versa, are widely available in many biomedical studies. We consider the situation when a proportion of subjects’ non-terminal events is missing, such that the observed data become a mixture of “true” semicompeting risks data and partially observed terminal event only data. An illness-death multistate model with proportional hazards assumptions is proposed to study the relationship between non-terminal and terminal events, and provide covariate-specific global and local association measures. Maximum likelihood estimation based on semiparametric regression analysis is used for statistical inference, and asymptotic properties of proposed estimators are studied using empirical process and martingale arguments. We illustrate the proposed method with simulation studies and data analysis of a follicular cell lymphoma study. PMID:24430204
Rogers, Jennifer K; Pocock, Stuart J; McMurray, John J V; Granger, Christopher B; Michelson, Eric L; Östergren, Jan; Pfeffer, Marc A; Solomon, Scott D; Swedberg, Karl; Yusuf, Salim
2014-01-01
Heart failure is characterized by recurrent hospitalizations, but often only the first event is considered in clinical trial reports. In chronic diseases, such as heart failure, analysing all events gives a more complete picture of treatment benefit. We describe methods of analysing repeat hospitalizations, and illustrate their value in one major trial. The Candesartan in Heart failure Assessment of Reduction in Mortality and morbidity (CHARM)-Preserved study compared candesartan with placebo in 3023 patients with heart failure and preserved systolic function. The heart failure hospitalization rates were 12.5 and 8.9 per 100 patient-years in the placebo and candesartan groups, respectively. The repeat hospitalizations were analysed using the Andersen-Gill, Poisson, and negative binomial methods. Death was incorporated into analyses by treating it as an additional event. The win ratio method and a method that jointly models hospitalizations and mortality were also considered. Using repeat events gave larger treatment benefits than time to first event analysis. The negative binomial method for the composite of recurrent heart failure hospitalizations and cardiovascular death gave a rate ratio of 0.75 [95% confidence interval (CI) 0.62-0.91, P = 0.003], whereas the hazard ratio for time to first heart failure hospitalization or cardiovascular death was 0.86 (95% CI 0.74-1.00, P = 0.050). In patients with preserved EF, candesartan reduces the rate of admissions for worsening heart failure, to a greater extent than apparent from analysing only first hospitalizations. Recurrent events should be routinely incorporated into the analysis of future clinical trials in heart failure. © 2013 The Authors. European Journal of Heart Failure © 2013 European Society of Cardiology.
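The gain from counting all events rather than only the first can be seen in a small sketch of a recurrent-event rate ratio with a Wald confidence interval (a plain Poisson-style variance; the trial's negative binomial analysis additionally allows between-patient overdispersion, which widens the CI — the function name and the numbers in the test are illustrative, not the trial data):

```python
import math

def rate_ratio(events_trt, py_trt, events_ctl, py_ctl, z=1.96):
    """Rate ratio for recurrent events, with a Wald CI on the log scale.
    Uses the Poisson-style variance 1/e1 + 1/e0 of the log rate ratio;
    a negative binomial model would inflate this to allow
    between-patient overdispersion."""
    rr = (events_trt / py_trt) / (events_ctl / py_ctl)
    se = math.sqrt(1.0 / events_trt + 1.0 / events_ctl)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi
```

Because every admission contributes to the event counts, the same follow-up yields many more events than a first-event analysis, so the log-scale variance 1/e1 + 1/e0 shrinks and treatment effects are estimated more precisely.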
NASA Astrophysics Data System (ADS)
Guo, Enliang; Zhang, Jiquan; Si, Ha; Dong, Zhenhua; Cao, Tiehua; Lan, Wu
2017-10-01
Environmental changes have brought significant changes and challenges to water resources and their management worldwide, including increasing climate variability, land use change, intensive agriculture, rapid urbanization and industrial development, and especially much more frequent extreme precipitation events, all of which greatly affect water resources and socio-economic development. In this study, we take extreme precipitation events in the midwest of Jilin Province as an example, using daily precipitation data from 1960-2014. The threshold of extreme precipitation events is defined by the multifractal detrended fluctuation analysis (MF-DFA) method. Extreme precipitation (EP), extreme precipitation ratio (EPR), and intensity of extreme precipitation (EPI) are selected as the extreme precipitation indicators, and the Kolmogorov-Smirnov (K-S) test is employed to determine the optimal probability distribution function for each indicator. On this basis, a nonparametric estimation method for copulas and the Akaike Information Criterion (AIC) are adopted to determine the bivariate copula function. Finally, we analyze the single-variable extremes and the bivariate joint probability distribution of the extreme precipitation events. The results show that the threshold of extreme precipitation events in semi-arid areas is far lower than that in subhumid areas. The extreme precipitation frequency shows a significant decline while the extreme precipitation intensity shows a growing trend, and there are significant spatiotemporal differences in extreme precipitation events. The joint return period becomes shorter from west to east, while the spatial distribution of the co-occurrence return period shows the opposite pattern and is longer than the joint return period.
It's Deja Vu All over Again: Using Multiple-Spell Discrete-Time Survival Analysis.
ERIC Educational Resources Information Center
Willett, John B.; Singer, Judith D.
1995-01-01
The multiple-spell discrete-time survival analysis method is introduced and illustrated using longitudinal data on exit from and reentry into the teaching profession. The method is applicable to many educational problems involving the sequential occurrence of disparate events or episodes. (SLD)
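The core data manipulation behind discrete-time survival analysis, expanding each spell into person-period records and then modeling the hazard of the event in each period, can be sketched as follows (a minimal illustration using raw hazard proportions instead of the logistic regression usually fitted; function names are ours):

```python
def person_period(records):
    """Expand person-level spells (time, event) into the person-period
    rows used by discrete-time survival models. A subject observed for
    `time` periods contributes one row per period; the event indicator
    is set only in the final period of an uncensored spell."""
    rows = []
    for pid, (time, event) in enumerate(records):
        for t in range(1, time + 1):
            rows.append((pid, t, 1 if (event and t == time) else 0))
    return rows

def hazard_by_period(rows):
    """Sample discrete-time hazard: events / subjects still at risk,
    computed separately for each period."""
    periods = sorted({t for _, t, _ in rows})
    return {t: sum(e for _, tt, e in rows if tt == t) /
               sum(1 for _, tt, _ in rows if tt == t)
            for t in periods}
```

In practice one fits a logistic regression of the event indicator on period dummies plus time-invariant and time-varying predictors over these rows, which is exactly the flexibility the articles above highlight.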
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
Estimating Rupture Directivity of Aftershocks of the 2014 Mw8.1 Iquique Earthquake, Northern Chile
NASA Astrophysics Data System (ADS)
Folesky, Jonas; Kummerow, Jörn; Timann, Frederik; Shapiro, Serge
2017-04-01
The 2014 Mw8.1 Iquique earthquake was accompanied by numerous fore- and aftershocks of magnitudes up to M ˜ 7.6. While the rupture processes of the main event and its largest aftershock have already been analysed in great detail, this study focuses on the rupture processes of about 230 smaller aftershocks that occurred during the first two days after the main event. Since the events are of magnitudes 4.0 ≤ M ≤ 6.5, it is not obvious which method is most suitable, so we apply and compare three different approaches to extract a possible rupture directivity for each single event. The seismic broadband recordings of the Integrated Plate Boundary Observatory Chile (IPOC) provide an excellent database for our analysis: their high sampling rate (100 Hz) and a well-distributed station selection covering an aperture of about 180° are a great advantage for a thorough directivity analysis. First, we apply a P-wave polarization analysis (PPA), in which we reconstruct the direction of the incoming wavefield by covariance analysis of the first particle motions. Combined with a sliding time window, the results from different stations can identify the hypocentre of an event and, if the event ruptured unilaterally, the migration of its rupture front. The second approach is the back projection imaging (BPI) technique, which illuminates the rupture path by back-projecting the recorded seismic energy to its source; a propagating rupture front is reconstructed from the migration of the zone of high constructive amplitude stacks. In a third step we apply the empirical Green's function (EGF) method, in which events of high waveform similarity, hence co-located and of similar mechanism, are selected so that the smaller event can be used as the Green's function of the larger event. This approach yields an estimated source time function, which is compared station by station and whose azimuthal variations are analysed for complexities and directivity.
NASA Astrophysics Data System (ADS)
Folesky, Jonas; Kummerow, Jörn
2015-04-01
The Empirical Green's Function (EGF) method uses pairs of events of high waveform similarity and adjacent hypocenters to decompose the influences of source time function, ray path, instrument site, and instrument response. The seismogram of the smaller event is considered as the Green's function, which can then be deconvolved from the other seismogram. The result provides a reconstructed relative source time function (RSTF) of the larger event of that event pair. The comparison of the RSTFs at all stations of the observation system produces information on the rupture process of the event based on an apparent directivity effect and possible changes in RSTF complexity. The Basel EGS dataset of 2006-2007 consists of about 2800 localized events of magnitudes between 0.0 < ML < 3.5, with event pairs of adequate magnitude difference for EGF analysis. The data are of sufficient quality to analyse events with magnitudes down to ML = 0.0 for an apparent directivity effect, although the approximate rupture duration of those events is only a few milliseconds. The dataset shows a number of multiplets for which fault plane solutions are known from earlier studies. Using the EGF method we compute rupture orientations for about 190 event pairs and compare them to the fault plane solutions of the multiplets. For the majority of events we observe good consistency between the rupture direction found here and one of the previously determined nodal planes from fault plane solutions; in combination, this resolves the fault plane ambiguity. Furthermore, fitting the rupture direction yields estimates of the projection of the rupture velocity onto the horizontal plane, which appears to vary between the multiplets in the reservoir from 0.3 to 0.7 times the S-wave velocity. To our knowledge, source characterization by EGF analysis has not previously been applied to microseismic reservoirs with the data quality found in Basel.
Our results show that EGF analysis can provide valuable additional insights on the distribution of rupture properties within the reservoir.
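The core spectral-division step of the EGF method can be sketched as follows. This is a minimal, illustrative implementation (not the authors' code), using a water-level stabilization whose value is an assumption; the synthetic "large" event is built by circularly convolving a random "small" event with a boxcar source time function so the recovery can be checked.

```python
import numpy as np

def egf_deconvolve(large, small, water_level=1e-6):
    """Spectral division of the larger event's seismogram by the smaller
    (empirical Green's function) event's seismogram, stabilized with a
    water level; returns the relative source time function (RSTF)."""
    n = len(large)
    big = np.fft.rfft(large, n)
    green = np.fft.rfft(small, n)
    power = np.abs(green) ** 2
    floor = water_level * power.max()          # water-level stabilization
    return np.fft.irfft(big * np.conj(green) / np.maximum(power, floor), n)

# Synthetic check: the "large" event is the small event circularly convolved
# with a 10-sample boxcar source time function.
rng = np.random.default_rng(0)
small = rng.standard_normal(256)
stf = np.zeros(256)
stf[:10] = 0.1
large = np.fft.irfft(np.fft.rfft(small) * np.fft.rfft(stf), 256)

rstf = egf_deconvolve(large, small)
print(np.corrcoef(rstf, stf)[0, 1])  # close to 1: the boxcar STF is recovered
```

In practice the division is done on windowed phases and the water level trades resolution against noise amplification.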
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and also through the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology both at data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.
Gene context analysis in the Integrated Microbial Genomes (IMG) data management system.
Mavromatis, Konstantinos; Chu, Ken; Ivanova, Natalia; Hooper, Sean D; Markowitz, Victor M; Kyrpides, Nikos C
2009-11-24
Computational methods for determining the function of genes in newly sequenced genomes have been traditionally based on sequence similarity to genes whose function has been identified experimentally. Function prediction methods can be extended using gene context analysis approaches such as examining the conservation of chromosomal gene clusters, gene fusion events and co-occurrence profiles across genomes. Context analysis is based on the observation that functionally related genes often have similar gene contexts, and it relies on the identification of such events across a phylogenetically diverse collection of genomes. We have used the data management system of the Integrated Microbial Genomes (IMG) as the framework to implement and explore the power of gene context analysis methods because it provides one of the largest available genome integrations. Visualization and search tools to facilitate gene context analysis have been developed and applied across all publicly available archaeal and bacterial genomes in IMG. These computations are now maintained as part of IMG's regular genome content update cycle. IMG is available at: http://img.jgi.doe.gov.
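The co-occurrence-profile idea mentioned above can be sketched with presence/absence vectors across genomes; genes with similar profiles are candidate functional partners. Gene names and profiles below are illustrative toys, not IMG data.

```python
import numpy as np

# Toy phylogenetic profiles: rows = genes, columns = genomes (1 = present).
# Genes with similar presence/absence patterns across genomes are candidate
# functional partners (names are illustrative, not IMG identifiers).
profiles = {
    "trpA": np.array([1, 1, 0, 1, 0, 1, 1, 0]),
    "trpB": np.array([1, 1, 0, 1, 0, 1, 1, 0]),   # co-occurs with trpA
    "rpoB": np.array([1, 1, 1, 1, 1, 1, 1, 1]),   # universal, uninformative
}

def cooccurrence(a, b):
    """Jaccard similarity of two presence/absence profiles."""
    both = np.sum(a & b)
    either = np.sum(a | b)
    return both / either

print(cooccurrence(profiles["trpA"], profiles["trpB"]))  # identical profiles -> 1.0
```

Real pipelines additionally correct for phylogenetic relatedness of the genomes, which simple Jaccard similarity ignores.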
Risk Analysis of Earth-Rock Dam Failures Based on Fuzzy Event Tree Method
Fu, Xiao; Gu, Chong-Shi; Su, Huai-Zhi; Qin, Xiang-Nan
2018-01-01
Earth-rock dams make up a large proportion of the dams in China, and their failures can induce great risks. In this paper, the risks associated with earth-rock dam failure are analyzed from two aspects: the probability of a dam failure and the resulting life loss. An event tree analysis method based on fuzzy set theory is proposed to calculate the dam failure probability. The life loss associated with dam failure is summarized from previous studies and refined to be suitable for Chinese dams. The proposed method and model are applied to one reservoir dam in Jiangxi province. Both engineering and non-engineering measures are proposed to reduce the risk. Risk analysis of dam failure is essential for reducing the probability of failure and improving dam risk management. PMID:29710824
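The fuzzy event-tree idea can be illustrated with triangular fuzzy numbers propagated along one branch of a tree. This is a simplified sketch (vertex-wise multiplication rather than full alpha-cut arithmetic), and all probabilities are invented for illustration, not taken from the paper.

```python
# Triangular fuzzy numbers are (low, mode, high) triples.
def fuzzy_mul(a, b):
    """Multiply two triangular fuzzy numbers (approximate: vertex-wise)."""
    return tuple(x * y for x, y in zip(a, b))

flood = (0.001, 0.002, 0.004)      # assumed annual probability of extreme flood
overtop = (0.05, 0.10, 0.20)       # assumed P(overtopping | flood)
breach = (0.2, 0.4, 0.6)           # assumed P(breach | overtopping)

# One event-tree branch: flood -> overtopping -> breach
p_failure = fuzzy_mul(fuzzy_mul(flood, overtop), breach)
defuzzified = sum(p_failure) / 3   # simple centroid of the triangle
print(p_failure, defuzzified)
```

A full analysis would sum such branch probabilities over all failure paths and carry the fuzziness through to the risk estimate before defuzzifying.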
Post-blasting seismicity in Rudna copper mine, Poland - source parameters analysis.
NASA Astrophysics Data System (ADS)
Caputa, Alicja; Rudziński, Łukasz; Talaga, Adam
2017-04-01
A major hazard in Polish copper mines is high seismicity and the associated rockbursts. Many methods are used to reduce the seismic hazard. Among the most effective is preventive blasting in potentially hazardous mining panels. The method is expected to provoke small to moderate tremors (up to M2.0) and thereby reduce stress accumulation in the rock mass. This work presents an analysis of post-blasting events in the Rudna copper mine, Poland. Using full moment tensor (MT) inversion and seismic spectra analysis, we try to identify characteristic features of post-blasting seismic sources. Source parameters estimated for post-blasting events are compared with the parameters of non-provoked mining events that occurred in the vicinity of the provoked sources. Our studies show that focal mechanisms of events that occurred after blasts have similar MT decompositions, namely a quite strong isotropic component compared with that of non-provoked events. Source parameters obtained from spectral analysis also show that provoked seismicity has a specific source physics. Among other indicators, this is visible in the S- to P-wave energy ratio, which is higher for non-provoked events. The comparison of all our results reveals three possible groups of sources: a) events occurring just after blasts, b) events occurring from 5 min to 24 h after blasts, and c) non-provoked seismicity (more than 24 h after blasting). Acknowledgements: This work was supported within statutory activities No. 3841/E-41/S/2016 of the Ministry of Science and Higher Education of Poland.
Austin, Peter C; Schuster, Tibor; Platt, Robert W
2015-10-15
Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power than an analysis of a similarly structured RCT. The difference in statistical power increased as the strength of the treatment-selection process increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly structured RCT.
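One intuition for this power loss (a sketch, not the authors' simulation design) is that stronger treatment selection produces more extreme IPT weights, shrinking the effective sample size. The assignment model and coefficients below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
x = rng.standard_normal(n)                           # a single confounder

def ess(beta):
    """Kish effective sample size under stabilized IPT weights when treatment
    is assigned with true log-odds beta * x; beta = 0 mimics an RCT."""
    p = 1 / (1 + np.exp(-beta * x))                  # true propensity score
    z = rng.binomial(1, p)                           # treatment indicator
    w = np.where(z == 1, z.mean() / p, (1 - z.mean()) / (1 - p))  # stabilized
    return w.sum() ** 2 / np.sum(w ** 2)

print(ess(0.0), ess(2.0))  # the confounded setting has a much smaller ESS
```

With no selection the ESS is essentially the full sample; with strong selection it drops to a fraction of it, which mirrors the reduced power of the IPTW analysis relative to an RCT of the same size.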
Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof
2009-04-01
Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
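The Monte Carlo treatment of a fault tree can be sketched on a toy system. The tree, its failure probabilities and the downtime below are invented for illustration, not values from the Swedish case study.

```python
import numpy as np

# Toy fault tree: quantity failure (top event, no water delivered) occurs if
# the raw-water source fails OR both redundant pumps fail (an AND gate).
rng = np.random.default_rng(1)
n = 100_000                              # simulated system-years
source = rng.random(n) < 0.01            # assumed P(source failure)
pump_a = rng.random(n) < 0.05            # assumed P(pump A failure)
pump_b = rng.random(n) < 0.05            # assumed P(pump B failure)
top = source | (pump_a & pump_b)         # OR / AND gate logic

downtime_min = 12 * 60                   # assumed minutes lost per failure
cml = top.mean() * downtime_min          # Customer Minutes Lost per customer-year
print(top.mean(), cml)
```

In the study, probabilities come from hard data and expert judgement with uncertainty distributions, so each Monte Carlo draw would also sample the probabilities themselves rather than treat them as fixed.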
An energy ratio feature extraction method for optical fiber vibration signal
NASA Astrophysics Data System (ADS)
Sheng, Zhiyong; Zhang, Xinyan; Wang, Yanping; Hou, Weiming; Yang, Dan
2018-03-01
The intrusion events in the optical fiber pre-warning system (OFPS) are divided into two types: harmful intrusion events and harmless interference events. At present, the signal feature extraction methods for these two types of events are usually designed from the view of the time domain. However, the differences in time-domain characteristics between different harmful intrusion events are not obvious, and cannot reflect their diversity in detail. We find that the spectrum distribution of different intrusion signals has obvious differences. For this reason, the intrusion signal is transformed into the frequency domain. In this paper, an energy-ratio feature extraction method for harmful intrusion events is proposed. Firstly, the intrusion signals are pre-processed and the power spectral density (PSD) is calculated. Then, the energy ratio of different frequency bands is calculated, and the corresponding feature vector of each type of intrusion event is formed. The linear discriminant analysis (LDA) classifier is used to identify the harmful intrusion events. Experimental results show that the algorithm improves the recognition rate of the intrusion signal, and further verifies the feasibility and validity of the algorithm.
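The band-energy-ratio feature can be sketched as below. The sampling rate, test signal and band edges are illustrative assumptions, not those of the paper, and a simple periodogram stands in for the PSD estimate.

```python
import numpy as np

# Synthetic "vibration" signal: a strong 50 Hz component plus a weak 220 Hz one.
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.2 * np.sin(2 * np.pi * 220 * t)

spectrum = np.abs(np.fft.rfft(signal)) ** 2          # periodogram-style PSD
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

def band_energy_ratio(psd, freqs, bands):
    """Fraction of total spectral energy falling in each frequency band."""
    total = psd.sum()
    return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() / total
                     for lo, hi in bands])

features = band_energy_ratio(spectrum, freqs, [(0, 100), (100, 300), (300, 500)])
print(features)  # most of the energy sits in the 0-100 Hz band
```

The resulting feature vector (one ratio per band) is what would then be fed to an LDA classifier.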
Defining Human Failure Events for Petroleum Risk Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Knut Øien
2014-06-01
In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.
NASA Astrophysics Data System (ADS)
Bonnet, José; Fradet, Thibault; Traversa, Paola; Tuleau-Malot, Christine; Reynaud-Bouret, Patricia; Laloe, Thomas; Manchuel, Kevin
2014-05-01
In metropolitan France the deformation rates are slow, implying low to moderate seismic activity. Therefore, earthquakes observed during the instrumental period (since 1962), and the associated catalogs, cannot be representative of the seismic cycle for the French metropolitan territory. In this context, it is necessary, when performing seismic hazard studies, to consider historical seismic data in order to extend the observation period and to be more representative of the seismogenic behavior of geological structures. The French macroseismic database SisFrance is jointly developed by EDF (Electricité de France), BRGM (Bureau de Recherche Géologique et Minière) and IRSN (Institut de Radioprotection et Sureté Nucléaire). It contains more than 6,000 events inventoried between 217 BC and 2007 and more than 100,000 macroseismic observations. SisFrance is the reference macroseismic database for metropolitan France. The aim of this study is to determine, over the whole catalog, the completeness periods for different epicentral intensity (Iepc) classes ≥ IV. Two methods have been used: 1) the method of Albarello et al. [2001], which has been adapted to best suit the French catalog, and 2) a mathematical method based on change-point estimation, proposed by Muggeo et al. [2003], which has been adapted to the analysis of seismic datasets. After a brief theoretical description, both methods are tested and validated using synthetic catalogs, before being applied to the French catalog. The results show that completeness periods estimated using these two methods are coherent with each other for events with Iepc ≥ IV (1876 using the Albarello et al. [2001] method and 1872 using the Muggeo et al. [2003] method) and events with Iepc ≥ V (1852 using the Albarello et al. [2001] method and 1855 using the Muggeo et al. [2003] method).
Larger differences in estimated completeness period appear when considering events with Iepc ≥ VI (around 30 years difference) and events with Iepc ≥ VII (around 50 years difference). These could be explained (1) by the differences in the way each method approaches the data; the Muggeo et al. [2003] method estimates all change points within the data series, whereas the Albarello et al. [2001] method focuses on the last one, and (2) by a more limited number of data for these epicentral intensity classes (2056 events with Iepc ≥ IV and 1252 events with Iepc ≥ V vs. 486 events with Iepc ≥ VI and 199 events with Iepc ≥ VII). Results obtained for epicentral intensity classes greater than VIII are considered not reliable due to the small number of existing data (around 30 events). The completeness periods determined in this study are discussed in the light of their contemporary historical context, and in particular of the evolution of the information available from historical archives since the 17th century.
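The change-point idea behind such completeness estimates can be sketched on a synthetic catalog. This generic two-segment fit is a simplification of both cited methods; the rates, years and under-reporting factor are invented for illustration.

```python
import numpy as np

# Synthetic catalog: a true annual rate of 2 events that is only 20% complete
# (recorded rate 0.4) before an assumed completeness year of 1850.
rng = np.random.default_rng(7)
years = np.arange(1700, 2000)
rate = np.where(years < 1850, 0.4, 2.0)          # recorded events per year
counts = rng.poisson(rate)

def change_point(years, counts):
    """Two-segment constant-rate fit; return the year minimizing total SSE."""
    best, best_sse = None, np.inf
    for k in range(10, len(years) - 10):
        sse = counts[:k].var() * k + counts[k:].var() * (len(years) - k)
        if sse < best_sse:
            best, best_sse = years[k], sse
    return best

print(change_point(years, counts))  # close to the assumed change year, 1850
```

The period after the detected change point is then taken as the completeness period for that intensity class.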
NASA Astrophysics Data System (ADS)
Labrador, A. W.; Sollitt, L. S.; Cohen, C.; Cummings, A. C.; Leske, R. A.; Mason, G. M.; Mewaldt, R. A.; Stone, E. C.; von Rosenvinge, T. T.; Wiedenbeck, M. E.
2017-12-01
We have estimated mean high-energy ionic charge states of solar energetic particles (SEPs) using the Sollitt et al. (2008) method. The method applies to abundant elements (e.g. N, O, Ne, Mg, Si, and Fe) in SEP events at the energy ranges covered by the STEREO/LET instrument (e.g. 2.7-70 MeV/nuc for Fe) and the ACE/SIS instrument (e.g. 11-168 MeV/nuc for Fe). The method starts by fitting SEP time-intensity profiles during the decay phase of a given, large SEP event in order to obtain energy-dependent decay times. The mean charge state for each element is estimated from the relationship between the energy dependence of its decay times and that of selected calibration references. For simultaneous estimates among multiple elements, we assume a common rigidity dependence across all elements. Earlier calculations by Sollitt et al. incorporated helium time-intensity profile fits with an assumed charge state of 2. Subsequent analysis dropped helium as a reference element, for simplicity, but we have recently reincorporated He for calibration, from either STEREO/LET or ACE/SIS data, combined with C as an additional reference element with an assumed mean charge state of 5.9. In this presentation, we will show validation of the reanalysis using data from the 8 March 2012 SEP event in ACE data and the 28 September 2012 event in STEREO data. We will also introduce additional low-energy He from publicly available ACE/ULEIS and STEREO/SIT data, which should further constrain the charge state calibration. Better charge state calibration could yield more robust convergence to physical solutions for SEP events for which this method has not previously yielded results. Therefore, we will also present analysis for additional SEP events from 2005 to 2017, and we will investigate conditions for which this method yields or does not yield charge states.
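A toy version of the common-rigidity-dependence argument can be sketched as follows. The power-law form, its exponent, the non-relativistic rigidity approximation and all numbers are assumptions for illustration only; the actual method fits measured decay times.

```python
import numpy as np

# Assume decay time follows a common power law in rigidity, tau = k * R^-alpha,
# with R proportional to sqrt(E/nuc) * A/Q (non-relativistic sketch). Then the
# ratio of two elements' decay times at equal E/nuc isolates their A/Q ratio.
alpha, k = 0.5, 10.0

def tau(E, A, Q):
    """Synthetic decay time (arbitrary units); E in MeV/nuc."""
    return k * (np.sqrt(E) * A / Q) ** -alpha

E = np.array([5.0, 10.0, 20.0, 40.0])
tau_he = tau(E, 4, 2.0)       # He reference, assumed Q = 2
tau_fe = tau(E, 56, 14.0)     # Fe with an "unknown" charge to recover

ratio = (tau_fe / tau_he).mean()        # = ((56/Q_fe) / (4/2)) ** -alpha
q_fe = 56 / (2 * ratio ** (-1 / alpha))
print(q_fe)  # recovers the input mean charge, 14
```

In the real analysis the rigidity dependence is fitted rather than assumed, and relativistic kinematics are used.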
Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Woo, Gordon
2017-04-01
For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Starting from the Amatrice earthquake of 24 August 2016, the subsequent damaging earthquakes in Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task, it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard analysis at critical installations in Europe. In this paper, various salient European applications are given.
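The kernel-smoothing alternative to polygonal source zones can be sketched in a few lines: each epicentre contributes a smooth kernel to the activity-rate density. The Gaussian kernel, fixed bandwidth and coordinates below are illustrative assumptions (practical implementations often use adaptive, magnitude-dependent bandwidths).

```python
import numpy as np

def kernel_rate(x, y, epicentres, bandwidth=10.0):
    """Kernel-smoothed event-rate density at (x, y), coordinates in km.
    Each epicentre contributes an isotropic 2-D Gaussian kernel."""
    d2 = [(x - ex) ** 2 + (y - ey) ** 2 for ex, ey in epicentres]
    norm = 1.0 / (2 * np.pi * bandwidth ** 2 * len(epicentres))
    return norm * sum(np.exp(-d / (2 * bandwidth ** 2)) for d in d2)

cluster = [(0, 0), (2, 1), (-1, 3), (1, -2)]       # toy epicentre cluster
print(kernel_rate(0, 0, cluster) > kernel_rate(50, 50, cluster))  # True
```

The smoothed rate replaces the uniform-zone assumption: hazard at a site integrates this density, with no polygon boundaries to draw.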
Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.
Gong, Xiajing; Hu, Meng; Zhao, Liang
2018-05-01
Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
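The concordance index used above to compare the ML methods with the Cox model can be sketched with a naive O(n²) implementation over the usable pairs; the toy data are invented for illustration.

```python
def concordance_index(time, event, risk):
    """Fraction of usable pairs in which the subject with the higher
    predicted risk experiences the event earlier (ties in risk count 0.5).
    A pair is usable when subject i has an observed event before time j."""
    num = den = 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] == 1 and time[i] < time[j]:
                den += 1
                if risk[i] > risk[j]:
                    num += 1
                elif risk[i] == risk[j]:
                    num += 0.5
    return num / den

time = [2, 4, 6, 8]
event = [1, 1, 0, 1]          # 0 = censored
risk = [0.9, 0.7, 0.4, 0.1]   # perfectly anti-ranked with survival time
print(concordance_index(time, event, risk))  # 1.0
```

A c-index of 0.5 corresponds to random ranking and 1.0 to perfect discrimination, which is the scale on which the ML and Cox models were compared.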
1990-02-01
transform the waveforms of this event to those of Titania1 must be a band-limited representation of the firing sequence. Therefore, we decided to ... design a Wiener filter to transform Pn waveforms of Event Titania4 into those of Event Titania1 at all sensors of NORESS. Prior to applying this technique ... for transforming the Pn phases of Event Titania4 into those of Event Titania1.
Tonin, Fernanda S; Piazza, Thais; Wiens, Astrid; Fernandez-Llimos, Fernando; Pontarolo, Roberto
2015-12-01
Objective: We aimed to gather evidence of the discontinuation rates owing to adverse events or treatment failure for four recently approved antipsychotics (asenapine, blonanserin, iloperidone, and lurasidone). Methods: A systematic review followed by pairwise meta-analysis and mixed treatment comparison meta-analysis (MTC) was performed, including randomized controlled trials (RCTs) that compared the use of the above-mentioned drugs versus placebo in patients with schizophrenia. An electronic search was conducted in PubMed, Scopus, Science Direct, Scielo, the Cochrane Library, and International Pharmaceutical Abstracts (January 2015). The included trials were at least single-blinded. The main outcome measures extracted were discontinuation owing to adverse events and discontinuation owing to treatment failure. Results: Fifteen RCTs were identified (n = 5400 participants) and 13 of them were amenable for use in our meta-analyses. No significant differences were observed between any of the four drugs and placebo as regards discontinuation owing to adverse events, whether in pairwise meta-analysis or in MTC. All drugs presented a better profile than placebo on discontinuation owing to treatment failure, both in pairwise meta-analysis and MTC. Asenapine was found to be the best therapy in terms of tolerability owing to failure, while lurasidone was the worst treatment in terms of adverse events. The evidence around blonanserin is weak. Conclusion: MTCs allowed the creation of two different rank orders of these four antipsychotic drugs in two outcome measures. This evidence-generating method allows direct and indirect comparisons, supporting approval and pricing decisions when lacking sufficient, direct, head-to-head trials.
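The pairwise (fixed-effect, inverse-variance) step of such a meta-analysis can be sketched by pooling log odds ratios of discontinuation across trials. All event counts below are invented for illustration, not data from the review.

```python
import numpy as np

trials = [  # (events_drug, n_drug, events_placebo, n_placebo), invented
    (10, 120, 14, 118),
    (8, 200, 15, 195),
    (12, 150, 16, 149),
]

log_ors, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = np.log(a * d / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of the log OR
    log_ors.append(log_or)
    weights.append(1 / var)               # inverse-variance weight

pooled = np.average(log_ors, weights=weights)
se = 1 / np.sqrt(sum(weights))
print(np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))
```

An MTC extends this by linking all drug-placebo comparisons in one network model, so that indirect evidence also contributes to each pooled estimate.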
Sarntivijai, Sirarat; Xiang, Zuoshuang; Shedden, Kerby A.; Markel, Howard; Omenn, Gilbert S.; Athey, Brian D.; He, Yongqun
2012-01-01
Vaccine adverse events (VAEs) are adverse bodily changes occurring after vaccination. Understanding the adverse event (AE) profiles is a crucial step to identify serious AEs. Two different types of seasonal influenza vaccines have been used on the market: trivalent (killed) inactivated influenza vaccine (TIV) and trivalent live attenuated influenza vaccine (LAIV). Different adverse event profiles induced by these two groups of seasonal influenza vaccines were studied based on the data drawn from the CDC Vaccine Adverse Event Reporting System (VAERS). Extracted from VAERS were 37,621 AE reports for four TIVs (Afluria, Fluarix, Fluvirin, and Fluzone) and 3,707 AE reports for the only LAIV (FluMist). The AE report data were analyzed by a novel combinatorial, ontology-based detection of AE method (CODAE). CODAE detects AEs using the Proportional Reporting Ratio (PRR), the Chi-square significance test, and base-level filtration, and groups identified AEs by ontology-based hierarchical classification. In total, 48 TIV-enriched and 68 LAIV-enriched AEs were identified (PRR > 2, Chi-square score > 4, and the number of cases > 0.2% of total reports). These AE terms were classified using the Ontology of Adverse Events (OAE), MedDRA, and SNOMED-CT. The OAE method provided better classification results than the two other methods. Thirteen out of 48 TIV-enriched AEs were related to neurological and muscular processing such as paralysis, movement disorders, and muscular weakness. In contrast, 15 out of 68 LAIV-enriched AEs were associated with inflammatory response and respiratory system disorders. There was evidence of two severe adverse events (Guillain-Barré syndrome and paralysis) in TIV reports. Although these severe adverse events occurred at low incidence rates, they were significantly more enriched in TIV-vaccinated patients than in LAIV-vaccinated patients.
Therefore, our novel combinatorial bioinformatics analysis discovered that LAIV had a lower chance of inducing these two severe adverse events than TIV. In addition, our meta-analysis found that all previously reported positive correlations between GBS and influenza vaccination were based on trivalent influenza vaccines rather than monovalent influenza vaccines. PMID:23209624
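The PRR and Chi-square screening step described above can be sketched from a 2x2 table of report counts. The counts below are invented for illustration, not VAERS values; only the PRR > 2 and Chi-square > 4 thresholds come from the abstract.

```python
# 2x2 disproportionality table (illustrative counts):
a = 30      # reports of the AE of interest for the vaccine of interest
b = 9_970   # other AE reports for that vaccine
c = 10      # reports of the AE for the comparator vaccines
d = 9_990   # other AE reports for the comparators

prr = (a / (a + b)) / (c / (c + d))     # Proportional Reporting Ratio

# Chi-square statistic (1 df, no continuity correction) for the same table
n = a + b + c + d
chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

signal = prr > 2 and chi2 > 4           # thresholds used in the abstract
print(prr, chi2, signal)
```

An AE passing both thresholds (plus the case-count filter) would then be carried forward to the ontology-based classification step.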
Wang, Jianming; Ke, Chunlei; Yu, Zhinuan; Fu, Lei; Dornseif, Bruce
2016-05-01
For clinical trials with time-to-event endpoints, predicting the accrual of the events of interest with precision is critical in determining the timing of interim and final analyses. For example, overall survival (OS) is often chosen as the primary efficacy endpoint in oncology studies, with planned interim and final analyses at a pre-specified number of deaths. Often, correlated surrogate information, such as time-to-progression (TTP) and progression-free survival, are also collected as secondary efficacy endpoints. It would be appealing to borrow strength from the surrogate information to improve the precision of the analysis time prediction. Currently available methods in the literature for predicting analysis timings do not consider utilizing the surrogate information. In this article, using OS and TTP as an example, a general parametric model for OS and TTP is proposed, with the assumption that disease progression could change the course of the overall survival. Progression-free survival, related both to OS and TTP, will be handled separately, as it can be derived from OS and TTP. The authors seek to develop a prediction procedure using a Bayesian method and provide detailed implementation strategies under certain assumptions. Simulations are performed to evaluate the performance of the proposed method. An application to a real study is also provided. Copyright © 2015 John Wiley & Sons, Ltd.
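The basic event-accrual prediction that this article refines can be sketched under a simple exponential-survival assumption (a deliberate simplification of the joint OS/TTP model; all numbers are invented for illustration).

```python
import numpy as np

n_enrolled = 500          # assumed patients, all enrolled at t = 0
target_events = 300       # deaths triggering the final analysis
hazard = 0.03             # assumed death hazard per patient-month

def expected_events(t):
    """Expected number of deaths by month t under exponential survival."""
    return n_enrolled * (1 - np.exp(-hazard * t))

# Invert expected_events(t) = target_events analytically
t_target = -np.log(1 - target_events / n_enrolled) / hazard
print(t_target)  # about 30.5 months to the expected final analysis
```

The proposed Bayesian method improves on this by modelling OS jointly with TTP, letting the surrogate endpoint sharpen the prediction and propagate parameter uncertainty.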
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.
Elgendi, Mohamed
2016-11-02
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to here for the first time as two event-related moving averages ("TERMA"), uses two event-related moving averages to detect events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. Our results recommend that the window sizes for the two moving averages (W1 and W2) follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
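The two-moving-averages idea can be sketched as below: a short (event) average of the squared signal is compared with a long (cycle) average to form blocks of interest, and each block contributes one peak. The window sizes respect the stated inequality, but the squaring step, the offset and the synthetic signal are simplifying assumptions of this sketch, not the full six-block TERMA pipeline.

```python
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def terma_peaks(x, w1=5, w2=15):
    """Samples where the short (event) moving average of the squared signal
    exceeds the long (cycle) moving average plus an offset form blocks of
    interest; the maximum of each block is reported as a peak. The windows
    satisfy (2 * w1) <= w2 <= (8 * w1)."""
    squared = x ** 2
    ma_event = moving_average(squared, w1)
    thr = moving_average(squared, w2) + squared.mean()  # simplified offset
    blocks = ma_event > thr
    peaks, start = [], None
    for i, inside in enumerate(blocks):
        if inside and start is None:
            start = i
        elif not inside and start is not None:
            peaks.append(start + int(np.argmax(x[start:i])))
            start = None
    return peaks

sig = 0.05 * np.sin(np.linspace(0, 20, 300))   # slow baseline
for p in (50, 150, 250):
    sig[p] += 1.0                              # three synthetic events
print(terma_peaks(sig))  # [50, 150, 250]
```

The same skeleton applies to ECG QRS detection or PPG systolic-peak detection once the window lengths are chosen to match the expected event and cycle durations.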
Haneuse, Sebastien; Lee, Kyu Ha
2016-05-01
Hospital readmission is a key marker of quality of health care. Notwithstanding its widespread use, however, it remains controversial in part because statistical methods used to analyze readmission, primarily logistic regression and related models, may not appropriately account for patients who die before experiencing a readmission event within the time frame of interest. Toward resolving this, we describe and illustrate the semi-competing risks framework, which refers to the general setting where scientific interest lies with some nonterminal event (eg, readmission), the occurrence of which is subject to a terminal event (eg, death). Although several statistical analysis methods have been proposed for semi-competing risks data, we describe in detail the use of illness-death models primarily because of their relation to well-known methods for survival analysis and the availability of software. We also describe and consider in detail several existing approaches that could, in principle, be used to analyze semi-competing risks data, including composite end point and competing risks analyses. Throughout we illustrate the ideas and methods using data on N=49 763 Medicare beneficiaries hospitalized between 2011 and 2013 with a principal discharge diagnosis of heart failure. © 2016 American Heart Association, Inc.
On the equivalence of case-crossover and time series methods in environmental epidemiology.
Lu, Yun; Zeger, Scott L
2007-04-01
The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
Analysis methods for wind turbine control and electrical system dynamics
NASA Technical Reports Server (NTRS)
Hinrichsen, E. N.
1995-01-01
The integration of new energy technologies into electric power systems requires methods which recognize the full range of dynamic events in both the new generating unit and the power system. Since new energy technologies are initially perceived as small contributors to large systems, little attention is generally paid to system integration, i.e. dynamic events in the power system are ignored. As a result, most new energy sources are only capable of base-load operation, i.e. they have no load following or cycling capability. Wind turbines are no exception. Greater awareness of this implicit (and often unnecessary) limitation is needed. Analysis methods are recommended which include very low penetration (infinite bus) as well as very high penetration (stand-alone) scenarios.
Tai, Bee-Choo; Grundy, Richard G; Machin, David
2010-04-01
In trials designed to delay or avoid irradiation among children with malignant brain tumor, although irradiation after disease progression is an important event, patients who have disease progression may decline radiotherapy (RT), or those without disease progression may opt for elective RT. To accurately describe the cumulative need for RT in such instances, it is crucial to account for these distinct events and to evaluate how each contributes to the delay or advancement of irradiation via a competing risks analysis. We describe the summary of competing events in such trials using competing risks methods based on cumulative incidence functions and Gray's test. The results obtained are contrasted with standard survival methods based on Kaplan-Meier curves, cause-specific hazard functions and log-rank test. The Kaplan-Meier method overestimates all event-specific rates. The cause-specific hazard analysis showed reduction in hazards for all events (A: RT after progression; B: no RT after progression; C: elective RT) among children with ependymoma. For event A, a higher cumulative incidence was reported for ependymoma. Although Gray's test failed to detect any difference (p = 0.331) between histologic subtypes, the log-rank test suggested marginal evidence (p = 0.057). Similarly, for event C, the log-rank test found stronger evidence of reduction in hazard among those with ependymoma (p = 0.005) as compared with Gray's test (p = 0.086). To evaluate treatment differences, failing to account for competing risks using appropriate methodology may lead to incorrect interpretations.
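The contrast the abstract draws between Kaplan-Meier complements and cumulative incidence functions can be sketched in pure Python; the event series and event-type codes below are hypothetical, and Gray's test is not attempted here:

```python
def cumulative_incidence(data, cause):
    """Aalen-Johansen cumulative incidence of one event type at the
    last observed time. data: (time, status) pairs with status 0 =
    censored, otherwise an event-type code."""
    data = sorted(data)
    at_risk = len(data)
    surv = 1.0   # all-cause Kaplan-Meier survival
    cif = 0.0
    i = 0
    while i < len(data):
        t, d_cause, d_all, seen = data[i][0], 0, 0, 0
        while i < len(data) and data[i][0] == t:
            seen += 1
            if data[i][1] != 0:
                d_all += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / at_risk   # S(t-) * dN_cause / n
        surv *= 1.0 - d_all / at_risk
        at_risk -= seen
    return cif

def one_minus_km(data, cause):
    """1 - Kaplan-Meier, (incorrectly) censoring competing events;
    with a single cause the CIF reduces to exactly this quantity."""
    recoded = [(t, 1 if s == cause else 0) for t, s in data]
    return cumulative_incidence(recoded, 1)

# Hypothetical series with two competing event types (1 and 2):
series = [(1, 1), (2, 2), (3, 1), (4, 2), (5, 1)]
```

On this toy series the 1-KM estimate for event type 1 is strictly larger than its cumulative incidence, illustrating the abstract's observation that "the Kaplan-Meier method overestimates all event-specific rates", while the type-specific cumulative incidences sum to the all-cause event probability.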
Sudell, Maria; Kolamunnage-Dona, Ruwanthi; Tudur-Smith, Catrin
2016-12-05
Joint models for longitudinal and time-to-event data are commonly used to simultaneously analyse correlated data in single study cases. Synthesis of evidence from multiple studies using meta-analysis is a natural next step but its feasibility depends heavily on the standard of reporting of joint models in the medical literature. During this review we aim to assess the current standard of reporting of joint models applied in the literature, and to determine whether current reporting standards would allow or hinder future aggregate data meta-analyses of model results. We undertook a literature review of non-methodological studies that involved joint modelling of longitudinal and time-to-event medical data. Study characteristics were extracted and an assessment of whether separate meta-analyses for longitudinal, time-to-event and association parameters were possible was made. The 65 studies identified used a wide range of joint modelling methods in a selection of software. Identified studies concerned a variety of disease areas. The majority of studies reported adequate information to conduct a meta-analysis (67.7% for longitudinal parameter aggregate data meta-analysis, 69.2% for time-to-event parameter aggregate data meta-analysis, 76.9% for association parameter aggregate data meta-analysis). In some cases model structure was difficult to ascertain from the published reports. Whilst extraction of sufficient information to permit meta-analyses was possible in a majority of cases, the standard of reporting of joint models should be maintained and improved. Recommendations for future practice include clear statement of model structure, of values of estimated parameters, of software used and of statistical methods applied.
Wen, Shiping; Zeng, Zhigang; Chen, Michael Z Q; Huang, Tingwen
2017-10-01
This paper addresses the issue of synchronization of switched delayed neural networks with communication delays via event-triggered control. For synchronizing coupled switched neural networks, we propose a novel event-triggered control law which could greatly reduce the number of control updates for synchronization tasks of coupled switched neural networks involving embedded microprocessors with limited on-board resources. The control signals are driven by properly defined events, which depend on the measurement errors and current-sampled states. By using a delay system method, a novel model of synchronization error system with delays is proposed with the communication delays and event-triggered control in the unified framework for coupled switched neural networks. The criteria are derived for the event-triggered synchronization analysis and control synthesis of switched neural networks via the Lyapunov-Krasovskii functional method and free weighting matrix approach. A numerical example is elaborated on to illustrate the effectiveness of the derived results.
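The triggering rule the abstract describes, control updates driven by the measurement error relative to the current state, can be sketched on a scalar plant; the plant, gains, and threshold below are illustrative assumptions, and the paper's actual setting (coupled switched delayed networks analyzed via Lyapunov-Krasovskii functionals) is far richer:

```python
def event_triggered_sim(a=0.5, k_gain=2.0, sigma=0.3, eps=1e-3,
                        dt=0.01, steps=2000, x0=1.0):
    """Scalar sketch of event-triggered feedback: the controller holds
    the last transmitted state x_hat, and a new transmission fires only
    when the measurement error |x_hat - x| exceeds the state-dependent
    threshold sigma*|x| + eps."""
    x, x_hat, updates = x0, x0, 1
    for _ in range(steps):
        if abs(x_hat - x) > sigma * abs(x) + eps:   # triggering rule
            x_hat = x                               # transmit new state
            updates += 1
        u = -k_gain * x_hat                         # held control input
        x += dt * (a * x + u)                       # open loop unstable (a > 0)
    return x, updates

x_final, n_updates = event_triggered_sim()
```

The run stabilizes the state to a small neighborhood of the origin while transmitting far fewer than one update per time step, which is the resource-saving property motivating event-triggered control for embedded microprocessors.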
Reverse translation of adverse event reports paves the way for de-risking preclinical off-targets
Maciejewski, Mateusz; Lounkine, Eugen; Whitebread, Steven; Farmer, Pierre; DuMouchel, William; Shoichet, Brian K; Urban, Laszlo
2017-01-01
The Food and Drug Administration Adverse Event Reporting System (FAERS) remains the primary source for post-marketing pharmacovigilance. The system is largely uncurated, unstandardized, and lacks a method for linking drugs to the chemical structures of their active ingredients, increasing noise and artefactual trends. To address these problems, we mapped drugs to their ingredients and used natural language processing to classify and correlate drug events. Our analysis exposed key idiosyncrasies in FAERS, for example reports of thalidomide causing a deadly ADR when used against myeloma, a likely result of the disease itself; duplications of the same report, which unjustifiably increase its importance; and correlation of reported ADRs with public events, regulatory announcements, and publications. Comparing the pharmacological, pharmacokinetic, and clinical ADR profiles of methylphenidate, aripiprazole, and risperidone, and of kinase drugs targeting the VEGF receptor, demonstrates how underlying molecular mechanisms can emerge from ADR co-analysis. The precautions and methods we describe may enable investigators to avoid confounding chemistry-based associations and reporting biases in FAERS, and illustrate how comparative analysis of ADRs can reveal underlying mechanisms. DOI: http://dx.doi.org/10.7554/eLife.25818.001 PMID:28786378
NASA Technical Reports Server (NTRS)
Berg, Melanie; Label, Kenneth; Campola, Michael; Xapsos, Michael
2017-01-01
We propose a method for the application of single event upset (SEU) data towards the analysis of complex systems using transformed reliability models (from the time domain to the particle fluence domain) and space environment data.
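The time-to-fluence transformation mentioned in this one-sentence abstract can be illustrated with the simplest exponential reliability law; the cross-section and fluence values below are hypothetical, and the authors' transformed models are certainly more detailed than this sketch:

```python
import math

def seu_survival(sigma_cm2, fluence, n_bits=1):
    """Probability that a device sees no single event upset over a given
    particle fluence (particles/cm^2), assuming independent upsets with
    a constant per-bit SEU cross-section sigma_cm2 (cm^2/bit). This is
    the exponential reliability law R(t) = exp(-lambda*t) rewritten in
    the fluence domain: under a constant flux phi,
    lambda*t = n_bits * sigma * phi * t = n_bits * sigma * fluence."""
    return math.exp(-n_bits * sigma_cm2 * fluence)
```

Working in the fluence domain decouples the reliability statement from any particular mission timeline: the space environment data then enter only through the fluence accumulated over the mission.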
NASA Astrophysics Data System (ADS)
Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi
2016-04-01
Within the last decade, extreme weather event attribution has emerged as a new field of science and garnered increasing attention from the wider scientific community and the public. Numerous methods have been put forward to determine the contribution of anthropogenic climate change to individual extreme weather events. So far, nearly all such analyses have been carried out months after an event happened. First, we present our newly established method, which can assess the fraction of attributable risk (FAR) of a severe weather event due to an external driver in real time. The method builds on a large ensemble of atmosphere-only GCM/RCM simulations forced by seasonal forecast sea surface temperatures (SSTs). Taking the UK 2013/14 winter floods as an example, we demonstrate that the change in risk for heavy rainfall during the England floods due to anthropogenic climate change is of similar magnitude using either observed or seasonal forecast SSTs. While FAR is assumed, as a first approximation, to be independent of event-specific dynamic contributions due to anomalous circulation patterns, the risk of an event occurring under current conditions is clearly a function of the state of the atmosphere. The shorter the event, the more it is a result of chaotic internal weather variability. Hence we are interested in (1) attributing the event to thermodynamic and dynamic causes and (2) establishing a sensible time-scale for which we can make a useful and potentially robust attribution statement with regard to event-specific dynamics. Having tested the dynamic response of our model to SST conditions in January 2014, we find that observed SSTs are required to establish a discernible link between anomalous ocean temperatures and the atmospheric circulation over the North Atlantic in general and the UK in particular.
However, for extreme events occurring under strongly anomalous SST patterns, associated with known low-frequency climate modes such as El Niño or La Niña, forecast SSTs can provide sufficient guidance to determine the dynamic contribution to the event on the basis of monthly mean values. No such link can be made (North Atlantic/Western Europe region) for shorter time-scales, unless the observed state of the circulation is taken as reference for the model analysis (e.g. Christidis et al. 2014). We present results from our most recent attribution analysis for the December 2015 UK floods (Storm Desmond and Eva), during which we find a robust teleconnection link between Pacific SSTs and North Atlantic Jetstream anomalies. This is true for both experiments, with forecast and observed SSTs. We propose a fast and simple analysis method based on the comparison of current climatological circulation patterns with actual and natural conditions. Alternative methods are discussed and analysed regarding their potential for fast-track attribution of the role of dynamics. Also, we briefly revisit the issue of internal vs forced dynamic contributions.
40 CFR 63.805 - Performance test methods.
Code of Federal Regulations, 2011 CFR
2011-07-01
... alternative method for determining the VHAP content of the coating. In the event of any inconsistency between... Collection of Coating and Ink Samples for VOC Content Analysis by Reference Method 24 and Reference Method... (see § 63.801); (iii) Use any alternative protocol and test method provided they meet either the...
Otwombe, Kennedy N.; Petzold, Max; Martinson, Neil; Chirwa, Tobias
2014-01-01
Background: Research on the predictors of all-cause mortality in HIV-infected people has been widely reported in the literature. Making an informed decision requires understanding the methods used. Objectives: We present a review of study designs, statistical methods and their appropriateness in original articles reporting on predictors of all-cause mortality in HIV-infected people between January 2002 and December 2011. Statistical methods were compared between 2002–2006 and 2007–2011. Time-to-event analysis techniques were considered appropriate. Data Sources: Pubmed/Medline. Study Eligibility Criteria: Original English-language articles were abstracted. Letters to the editor, editorials, reviews, systematic reviews, meta-analyses, case reports and any other ineligible articles were excluded. Results: A total of 189 studies were identified (n = 91 in 2002–2006 and n = 98 in 2007–2011) out of which 130 (69%) were prospective and 56 (30%) were retrospective. One hundred and eighty-two (96%) studies described their sample using descriptive statistics while 32 (17%) made comparisons using t-tests. Kaplan-Meier methods for time-to-event analysis were commonly used in the earlier period (n = 69, 76% vs. n = 53, 54%, p = 0.002). Predictors of mortality in the two periods were commonly determined using Cox regression analysis (n = 67, 75% vs. n = 63, 64%, p = 0.12). Only 7 (4%) used advanced survival analysis methods of Cox regression analysis with frailty, of which 6 (3%) were used in the later period. Thirty-two (17%) used logistic regression while 8 (4%) used other methods. There were significantly more articles from the first period using appropriate methods compared to the second (n = 80, 88% vs. n = 69, 70%, p-value = 0.003). Conclusion: Descriptive statistics and survival analysis techniques remain the most common methods of analysis in publications on predictors of all-cause mortality in HIV-infected cohorts, while prospective research designs are favoured.
Sophisticated techniques of time-dependent Cox regression and Cox regression with frailty are scarce. This motivates further training in the use of advanced time-to-event methods. PMID:24498313
Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko
2016-08-15
Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected during monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database of papaya genome sequence. Transgenic construct- and event-specific sequences were identified as belonging to a GM papaya developed to resist infection by Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for the GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled the identification of unknown transgenic construct- and event-specific sequences in GM papaya and the development of a reliable method for detecting them in papaya food commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Migration velocity analysis using residual diffraction moveout: a real-data example
NASA Astrophysics Data System (ADS)
Gonzalez, Jaime A. C.; de Figueiredo, José J. S.; Coimbra, Tiago A.; Schleicher, Jörg; Novais, Amélia
2016-08-01
Unfocused seismic diffraction events carry direct information about errors in the migration-velocity model. The residual-diffraction-moveout (RDM) migration-velocity-analysis (MVA) method is a recent technique that extracts this information by means of adjusting ellipses or hyperbolas to uncollapsed migrated diffractions. In this paper, we apply this method, which has been tested so far only on synthetic data, to a real data set from the Viking Graben. After application of a plane-wave-destruction (PWD) filter to attenuate the reflected energy, the diffractions in the real data become interpretable and can be used for the RDM method. Our analysis demonstrates that the reflections need not be completely removed for this purpose. Beyond the need to identify and select diffraction events in post-stack migrated sections in the depth domain, the method has a very low computational cost and processing time. To reach an acceptable velocity model of comparable quality as one obtained with common-midpoint (CMP) processing, only two iterations were necessary.
NASA Astrophysics Data System (ADS)
Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo
2017-10-01
This study presents a storm-event based bivariate frequency analysis approach to determine design rainfalls, in which the number, intensity, and duration of actual rainstorm events are considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach could give a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. As a result, the design rainfall quantities derived from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This difference stems from the occurrence probability assigned to each rainstorm event and from a different angle on rainfall frequency analysis; the approach could offer an alternative way of describing extreme rainfall properties and potentially help improve the hydrologic design of stormwater management facilities in urban areas.
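The joint intensity-duration probability at the core of this approach can be sketched with a one-parameter copula. The abstract does not name the copula family used, so the Gumbel family, the marginal quantiles, and the events-per-year rate below are assumptions for illustration only:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u,v) = exp(-[(-ln u)^th + (-ln v)^th]^(1/th));
    theta >= 1 controls the (upper-tail) dependence between the storm
    intensity and duration marginals, both on [0, 1]."""
    if min(u, v) == 0.0:
        return 0.0
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))

def joint_exceedance(u, v, theta):
    """P(U > u, V > v) by inclusion-exclusion."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

def return_period(u, v, theta, events_per_year):
    """'AND' return period: mean years between rainstorm events whose
    intensity and duration both exceed the u- and v-quantiles."""
    return 1.0 / (events_per_year * joint_exceedance(u, v, theta))
```

At theta = 1 the copula reduces to independence (C = u*v), while large theta approaches the comonotone bound min(u, v); fitting theta to paired event intensities and durations is what ties the design storm to the observed dependence structure.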
Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A
2018-04-15
For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact or addressing errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of 2 common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced into the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies are presented to examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
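The simulate-then-extrapolate idea behind SIMEX can be shown on the problem it was originally designed for, classical error in a covariate of a linear model; this is only a generic sketch with simulated data, not the paper's extension to error in the failure-time outcome of a Cox model:

```python
import random

def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def quad_extrapolate(zs, bs, target=-1.0):
    """Least-squares quadratic b(z) = c0 + c1*z + c2*z^2 through the
    (zeta, estimate) pairs, evaluated at zeta = target."""
    s = [sum(z ** k for z in zs) for k in range(5)]
    t = [sum(b * z ** k for z, b in zip(zs, bs)) for k in range(3)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    for i in range(3):                       # Gaussian elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            t[j] -= f * t[i]
    c = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        c[i] = (t[i] - sum(A[i][k] * c[k] for k in range(i + 1, 3))) / A[i][i]
    return c[0] + c[1] * target + c[2] * target ** 2

def simex_slope(w, y, sigma_u, zetas=(0.0, 0.5, 1.0, 1.5, 2.0), b=40, seed=2):
    """SIMEX: deliberately add extra error of variance zeta*sigma_u^2 to
    the mismeasured covariate, watch the naive estimate degrade as zeta
    grows, and extrapolate the trend back to zeta = -1 (no error)."""
    rng = random.Random(seed)
    means = []
    for z in zetas:
        reps = []
        for _ in range(b):
            wz = [wi + rng.gauss(0.0, z ** 0.5 * sigma_u) for wi in w]
            reps.append(ols_slope(wz, y))
        means.append(sum(reps) / b)
    return quad_extrapolate(list(zetas), means)

# Hypothetical data: true slope 2, covariate observed with error.
rng = random.Random(7)
x = [rng.gauss(0.0, 1.0) for _ in range(500)]
y = [2.0 * xi + rng.gauss(0.0, 0.5) for xi in x]
w = [xi + rng.gauss(0.0, 0.7) for xi in x]   # sigma_u = 0.7 assumed known
naive = ols_slope(w, y)          # attenuated towards zero
corrected = simex_slope(w, y, 0.7)
```

The quadratic extrapolant typically undercorrects slightly, but the corrected slope lands much closer to the truth than the naive, attenuated estimate, which is the behavior the paper exploits when extending SIMEX to hazard ratios.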
NASA Astrophysics Data System (ADS)
Arling, J.-H.; Gerhardt, M.; Gößling, C.; Gehre, D.; Klingenberg, R.; Kröninger, K.; Nitsch, C.; Quante, T.; Rohatsch, K.; Tebrügge, J.; Temminghoff, R.; Theinert, R.; Zatschler, S.; Zuber, K.
2017-11-01
The COBRA collaboration searches for neutrinoless double beta-decay (0νββ-decay) using CdZnTe semiconductor detectors with a coplanar-grid readout and a surrounding guard-ring structure. The operation of the COBRA demonstrator at the Gran Sasso underground laboratory (LNGS) indicates that alpha-induced lateral surface events are the dominant source of background events. By instrumenting the guard-ring electrode it is possible to suppress this type of background. In laboratory measurements this method achieved a suppression factor for alpha-induced lateral surface events of 5300 (+2660/−1380), while retaining (85.3 ± 0.1)% of gamma events occurring in the entire detector volume. This suppression is three orders of magnitude better than that achieved by the pulse-shape analysis methods used so far in COBRA.
Takeuchi, Yoshinori; Shinozaki, Tomohiro; Matsuyama, Yutaka
2018-01-08
Despite the frequent use of self-controlled methods in pharmacoepidemiological studies, the factors that may bias the estimates from these methods have not been adequately compared in real-world settings. Here, we comparatively examined the impact of a time-varying confounder and its interactions with time-invariant confounders, time trends in exposures and events, restrictions, and misspecification of risk period durations on the estimators from three self-controlled methods. This study analyzed self-controlled case series (SCCS), case-crossover (CCO) design, and sequence symmetry analysis (SSA) using simulated and actual electronic medical records datasets. We evaluated the performance of the three self-controlled methods in simulated cohorts for the following scenarios: 1) time-invariant confounding with interactions between the confounders, 2) time-invariant and time-varying confounding without interactions, 3) time-invariant and time-varying confounding with interactions among the confounders, 4) time trends in exposures and events, 5) restricted follow-up time based on event occurrence, and 6) patient restriction based on event history. The sensitivity of the estimators to misspecified risk period durations was also evaluated. As a case study, we applied these methods to evaluate the risk of macrolides on liver injury using electronic medical records. In the simulation analysis, time-varying confounding produced bias in the SCCS and CCO design estimates, which aggravated in the presence of interactions between the time-invariant and time-varying confounders. The SCCS estimates were biased by time trends in both exposures and events. Erroneously short risk periods introduced bias to the CCO design estimate, whereas erroneously long risk periods introduced bias to the estimates of all three methods. Restricting the follow-up time led to severe bias in the SSA estimates. The SCCS estimates were sensitive to patient restriction. 
The case study showed that although macrolide use was significantly associated with increased liver injury occurrence in all methods, the value of the estimates varied. The estimations of the three self-controlled methods depended on various underlying assumptions, and the violation of these assumptions may cause non-negligible bias in the resulting estimates. Pharmacoepidemiologists should select the appropriate self-controlled method based on how well the relevant key assumptions are satisfied with respect to the available data.
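The crude sequence ratio at the heart of sequence symmetry analysis (SSA), one of the three methods compared above, is simple enough to sketch; the patient records below are hypothetical, and real SSA applications additionally adjust this ratio for the temporal trends in prescribing and events that the abstract shows can bias the estimate:

```python
def crude_sequence_ratio(records):
    """records: one (t_exposure, t_event) pair per patient who
    experienced both, e.g. first macrolide dispensing and first
    liver-injury diagnosis. SSA asks whether exposure -> event
    orderings outnumber event -> exposure orderings; absent any
    association (and any temporal trend), the ratio is about 1."""
    exposure_first = sum(1 for te, tv in records if te < tv)
    event_first = sum(1 for te, tv in records if tv < te)
    return exposure_first / event_first

# Hypothetical timing pairs (arbitrary time units):
pairs = [(1, 4), (2, 9), (5, 3), (4, 8), (7, 2), (3, 6)]
ratio = crude_sequence_ratio(pairs)
```

A ratio well above 1, after trend adjustment, is the SSA signal of an adverse drug effect; its appeal is that each patient serves as their own control, which removes time-invariant confounding but, as the simulations above show, not time-varying confounding.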
NASA Astrophysics Data System (ADS)
Saikia, C. K.; Roman-nieves, J. I.; Woods, M. T.
2013-12-01
Source parameters of nuclear and chemical explosions are often estimated by matching either the corner frequency and spectral level of a single event or the spectral ratio when spectra from two events are available with known source parameters for one. In this study, we propose an alternative method in which waveforms from two or more events can be simultaneously equalized by setting the differential of the processed seismograms at one station from any two individual events to zero. The method involves convolving the equivalent Mueller-Murphy displacement source time function (MMDSTF) of one event with the seismogram of the second event and vice versa, and then computing their difference seismogram. The MMDSTF is computed at the elastic radius, including both near-field and far-field terms. For this method to yield accurate source parameters, an inherent assumption is that the Green's functions for any paired events from the source to a receiver are the same. In the frequency limit of the seismic data, this is a reasonable assumption, as concluded from a comparison of Green's functions computed for flat-earth models at source depths ranging from 100 m to 1 km. Frequency domain analysis of the initial P wave is, however, sensitive to the depth phase interaction, and if tracked meticulously can help estimate the event depth. We applied this method to the local waveforms recorded from the three SPE shots and precisely determined their yields. These high-frequency seismograms exhibit significant lateral path effects in spectrogram analysis and 3D numerical computations, but the source equalization technique is independent of such variations as long as the instrument characteristics are well preserved. We are currently estimating the uncertainty in the derived source parameters assuming the yields of the SPE shots as unknown. We also collected regional waveforms from 95 NTS explosions at regional stations ALQ, ANMO, CMB, COR, JAS, LON, PAS, PFO and RSSD.
We are currently employing a station-based analysis using the equalization technique to estimate the depths and yields of many events relative to those of the announced explosions, and to develop their relationship with Mw and M0 for the NTS explosions.
Explosive Yield Estimation using Fourier Amplitude Spectra of Velocity Histories
NASA Astrophysics Data System (ADS)
Steedman, D. W.; Bradley, C. R.
2016-12-01
The Source Physics Experiment (SPE) is a series of explosive shots of various size detonated at varying depths in a borehole in jointed granite. The testbed includes an extensive array of accelerometers for measuring the shock environment close-in to the explosive source. One goal of SPE is to develop greater understanding of the explosion phenomenology in all regimes: from near-source, non-linear response to the far-field linear elastic region, and connecting the analyses from the respective regimes. For example, near-field analysis typically involves review of kinematic response (i.e., acceleration, velocity and displacement) in the time domain and looks at various indicators (e.g., peaks, pulse duration) to facilitate comparison among events. Review of far-field data more often is based on study of response in the frequency domain to facilitate comparison of event magnitudes. To try to "bridge the gap" between approaches, we have developed a scaling law for Fourier amplitude spectra of near-field velocity histories that successfully collapses data from a wide range of yields (100 kg to 5000 kg) and range to sensors in jointed granite. Moreover, we show that we can apply this scaling law to data from a new event to accurately estimate the explosive yield of that event. This approach presents a new way of working with near-field data that will be more compatible with traditional methods of analysis of seismic data and should serve to facilitate end-to-end event analysis. The goal is that this new approach to data analysis will eventually result in improved methods for discrimination of event type (i.e., nuclear or chemical explosion, or earthquake) and magnitude.
Optimal filter parameters for low SNR seismograms as a function of station and event location
NASA Astrophysics Data System (ADS)
Leach, Richard R.; Dowla, Farid U.; Schultz, Craig A.
1999-06-01
Global seismic monitoring requires deployment of seismic sensors worldwide, in many areas that have not been studied or have few useable recordings. Using events with lower signal-to-noise ratios (SNR) would increase the amount of data from these regions. Lower SNR events can add significant numbers to data sets, but recordings of these events must be carefully filtered. For a given region, conventional methods of filter selection can be quite subjective and may require intensive analysis of many events. To reduce this laborious process, we have developed an automated method to provide optimal filters for low SNR regional or teleseismic events. As seismic signals are often localized in frequency and time with distinct time-frequency characteristics, our method is based on the decomposition of a time series into a set of subsignals, each representing a band with f/Δf constant (constant Q). The SNR is calculated from the pre-event noise window and the signal window. The band-pass signals with high SNR are used to indicate the cutoff limits for the optimized filter. Results indicate a significant improvement in SNR, particularly for low SNR events. The method provides an optimum filter which can be immediately applied to unknown regions. The filtered signals are used to map the seismic frequency response of a region and may provide improvements in travel-time picking, azimuth estimation, regional characterization, and event detection. For example, when an event is detected and a preliminary location is determined, the computer could automatically select optimal filter bands for data from non-reporting stations. Results are shown for a set of low SNR events as well as 379 regional and teleseismic events recorded at stations ABKT, KIV, and ANTO in the Middle East.
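The band-by-band SNR comparison described above can be sketched as follows; the octave (constant-Q) band layout, the 6 dB threshold, and the synthetic test signal are illustrative assumptions, and a practical implementation would use an FFT rather than a naive DFT:

```python
import math

def band_power(x, f_lo, f_hi, fs):
    """Spectral power of x in [f_lo, f_hi) Hz via a naive DFT
    (fine for short windows; an FFT would be used in practice)."""
    n, p = len(x), 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            p += re * re + im * im
    return p

def select_passband(noise_win, signal_win, fs, f_min=0.5, octaves=6,
                    min_snr_db=6.0):
    """Constant-Q scan: compare signal-window to pre-event noise-window
    power octave by octave, and return (low, high) cutoff frequencies
    spanning the bands whose SNR exceeds the threshold."""
    kept, f_lo = [], f_min
    for _ in range(octaves):
        f_hi = 2.0 * f_lo
        snr_db = 10.0 * math.log10(
            band_power(signal_win, f_lo, f_hi, fs)
            / (band_power(noise_win, f_lo, f_hi, fs) + 1e-12) + 1e-12)
        if snr_db >= min_snr_db:
            kept.append((f_lo, f_hi))
        f_lo = f_hi
    return (kept[0][0], kept[-1][1]) if kept else None

# Hypothetical check: a 4 Hz arrival over an identical background.
n = 128
bg = [0.05 * math.cos(0.1 * i) for i in range(n)]          # pre-event noise
arrival = [bg[i] + math.sin(2 * math.pi * 8.0 * i / n) for i in range(n)]
band = select_passband(bg, arrival, fs=64.0)
```

Only the octave containing the arrival's energy passes the threshold, so the selected cutoffs bracket the useful signal band; on real seismograms the noise and signal windows differ, and the kept bands define the optimized band-pass filter.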
Semiparametric Time-to-Event Modeling in the Presence of a Latent Progression Event
Rice, John D.; Tsodikov, Alex
2017-01-01
Summary In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood–based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set. PMID:27556886
Regression Analysis of Mixed Panel Count Data with Dependent Terminal Events
Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L.
2017-01-01
Event history studies are commonly conducted in many fields and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types above, may occur; furthermore, as with the first two types, there may exist a dependent terminal event, which may preclude the occurrences of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to the childhood cancer study that motivated this work. PMID:28098397
A study of various methods for calculating locations of lightning events
NASA Technical Reports Server (NTRS)
Cannon, John R.
1995-01-01
This article reports the results of numerical experiments on finding the location of lightning events using different numerical methods. The methods include linear least squares, nonlinear least squares, statistical estimation, cluster analysis, angular filters, and combinations of these techniques. The experiments involved investigations of methods for excluding fake solutions: solutions that appear to be reasonable but are in fact several kilometers distant from the actual location. Some of the conclusions derived from the study are that bad data produce fakes; that no fool-proof method of excluding fakes was found; and that a short base-line interferometer under development at Kennedy Space Center to measure the direction cosines of an event shows promise as a filter for excluding fakes. The experiments generated a number of open questions, some of which are discussed at the end of the report.
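A minimal sketch of the least-squares location idea: with the (unknown) emission time profiled out analytically, a plain grid search over candidate source positions stands in for the paper's more elaborate solvers. The station layout, propagation speed, and function names are illustrative assumptions, not the report's actual setup.

```python
import math

C = 3.0e5  # assumed propagation speed, km/s (illustrative)

def locate_grid(stations, times, xr, yr, step):
    """Grid-search a 2D source location minimizing arrival-time residuals.

    stations: list of (x, y) positions in km; times: arrival times in s.
    For each trial point the emission time t0 has a closed-form optimum,
    t0 = mean(t_i - d_i / C), so only (x, y) needs to be searched.
    """
    best = None
    x = xr[0]
    while x <= xr[1]:
        y = yr[0]
        while y <= yr[1]:
            d = [math.hypot(x - sx, y - sy) for sx, sy in stations]
            t0 = sum(t - di / C for t, di in zip(times, d)) / len(times)
            # sum of squared arrival-time residuals at this trial point
            sse = sum((t - (t0 + di / C)) ** 2 for t, di in zip(times, d))
            if best is None or sse < best[0]:
                best = (sse, x, y, t0)
            y += step
        x += step
    return best[1], best[2], best[3]
```

A "fake" in the report's sense would appear here as a secondary local minimum of the residual surface; the grid search makes such minima visible, which pick-based iterative solvers can silently converge to.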
Workshop on Algorithms for Time-Series Analysis
NASA Astrophysics Data System (ADS)
Protopapas, Pavlos
2012-04-01
Summary: This Workshop covered the four major subjects listed below in two 90-minute sessions. Each talk or tutorial allowed questions, and concluded with a discussion. Classification: Automatic classification using machine-learning methods is becoming a standard in surveys that generate large datasets. Ashish Mahabal (Caltech) reviewed various methods, and presented examples of several applications. Time-Series Modelling: Suzanne Aigrain (Oxford University) discussed autoregressive models and multivariate approaches such as Gaussian Processes. Meta-classification/mixture of expert models: Karim Pichara (Pontificia Universidad Católica, Chile) described the substantial promise which machine-learning classification methods are now showing in automatic classification, and discussed how the various methods can be combined together. Event Detection: Pavlos Protopapas (Harvard) addressed methods of fast identification of events with low signal-to-noise ratios, enlarging on the characterization and statistical issues of low signal-to-noise ratios and rare events.
Doğramac, Sera N; Watsford, Mark L; Murphy, Aron J
2011-03-01
Subjective notational analysis can be used to track players and analyse movement patterns during match-play of team sports such as futsal. The purpose of this study was to establish the validity and reliability of the Event Recorder for subjective notational analysis. A course was designed, replicating ten minutes of futsal match-play movement patterns, where ten participants undertook the course. The course allowed a comparison of data derived from subjective notational analysis, to the known distances of the course, and to GPS data. The study analysed six locomotor activity categories, focusing on total distance covered, total duration of activities and total frequency of activities. The values between the known measurements and the Event Recorder were similar, whereas the majority of significant differences were found between the Event Recorder and GPS values. The reliability of subjective notational analysis was established with all ten participants being analysed on two occasions, as well as analysing five random futsal players twice during match-play. Subjective notational analysis is a valid and reliable method of tracking player movements, and may be a preferred and more effective method than GPS, particularly for indoor sports such as futsal, and field sports where short distances and changes in direction are observed.
Fractal analysis of GPS time series for early detection of disastrous seismic events
NASA Astrophysics Data System (ADS)
Filatov, Denis M.; Lyubushin, Alexey A.
2017-03-01
A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start by analysing both methods from the physical point of view and demonstrate the difference between them, which results in a higher accuracy of the new method compared to the conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system - the Earth's crust - we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism is already known from numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate the efficiency of the developed method in identifying critical states of the Earth's crust. Finally, we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events, and provide a detailed discussion of the numerical results, which are shown to be consistent with the outcomes of other research on the topic.
Regression analysis of mixed recurrent-event and panel-count data
Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L.
2014-01-01
In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1–42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. PMID:24648408
NASA Technical Reports Server (NTRS)
Chilingaryan, A. A.; Galfayan, S. K.; Zazyan, M. Z.; Dunaevsky, A. M.
1985-01-01
Nonparametric statistical methods are used to carry out a quantitative comparison of the model and the experimental data. The same methods enable one to select the events initiated by heavy nuclei and to calculate the proportion of the corresponding events. For this purpose it is necessary to have data on artificial events that describe the experiment sufficiently well. At present, the model with small scaling violation in the fragmentation region is the closest to the experiments. Therefore, the treatment of gamma families obtained in the 'Pamir' experiment is currently being carried out with these models.
Adjusting for multiple prognostic factors in the analysis of randomised trials
2013-01-01
Background: When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata formed by all combinations of the prognostic factors (stratified analysis) when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) which method of adjustment is best in terms of type I error rate and power, irrespective of the randomisation method. Methods: We used simulation to (1) determine whether a stratified analysis is necessary after stratified randomisation, and (2) compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model depending on the outcome. Results: A stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs. unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods led to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions: It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios.
Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not generally need to depend on the method of randomisation used. Most methods of analysis work well with large sample sizes; however, treating strata as random effects should be the analysis method of choice with binary or time-to-event outcomes and a small sample size. PMID:23898993
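As a concrete illustration of one of the adjustment methods compared above, the Mantel-Haenszel common odds ratio across strata can be computed as follows. This is the generic textbook estimator for 2x2 tables, not the simulation code used in the trial analyses.

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel common odds ratio across 2x2 strata.

    tables: list of (a, b, c, d) per stratum, where rows are
    treatment/control and columns are event/no-event:
        a = treated with event,    b = treated without event,
        c = control with event,    d = control without event.
    OR_MH = sum_i(a_i d_i / n_i) / sum_i(b_i c_i / n_i).
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den
```

Unlike an unstratified (pooled) odds ratio, this estimator weights each stratum's evidence by its size, so it is not distorted by confounding between stratum membership and treatment allocation.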
Event identification by acoustic signature recognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dress, W.B.; Kercel, S.W.
1995-07-01
Many events of interest to the security community produce acoustic emissions that are, in principle, identifiable as to cause. Some obvious examples are gunshots, breaking glass, takeoffs and landings of small aircraft, vehicular engine noises, footsteps (high frequencies when on gravel, very low frequencies when on soil), and voices (whispers to shouts). We are investigating wavelet-based methods to extract unique features of such events for classification and identification. We also discuss methods of classification and pattern recognition specifically tailored for acoustic signatures obtained by wavelet analysis. The paper is divided into three parts: completed work, work in progress, and future applications. The completed phase has led to the successful recognition of aircraft types on landing and takeoff. Both small aircraft (twin-engine turboprop) and large (commercial airliners) were included in the study. The project considered the design of a small, field-deployable, inexpensive device. The techniques developed during the aircraft identification phase were then adapted to a multispectral electromagnetic interference monitoring device now deployed in a nuclear power plant. This is a general-purpose wavelet analysis engine, spanning 14 octaves, and can be adapted for other specific tasks. Work in progress is focused on applying the methods previously developed to speaker identification. Some of the problems to be overcome include recognition of sounds as voice patterns distinct from possible background noises (e.g., music), as well as identification of the speaker from a short-duration voice sample. A generalization of the completed work and the work in progress is a device capable of classifying any number of acoustic events, particularly quasi-stationary events such as engine noises and voices, and singular events such as gunshots and breaking glass. We will show examples of both kinds of events and discuss their recognition likelihood.
ANTARES constrains a blazar origin of two IceCube PeV neutrino events
NASA Astrophysics Data System (ADS)
ANTARES Collaboration; Adrián-Martínez, S.; Albert, A.; André, M.; Anton, G.; Ardid, M.; Aubert, J.-J.; Baret, B.; Barrios, J.; Basa, S.; Bertin, V.; Biagi, S.; Bogazzi, C.; Bormuth, R.; Bou-Cabo, M.; Bouwhuis, M. C.; Bruijn, R.; Brunner, J.; Busto, J.; Capone, A.; Caramete, L.; Carr, J.; Chiarusi, T.; Circella, M.; Coniglione, R.; Costantini, H.; Coyle, P.; Creusot, A.; De Rosa, G.; Dekeyser, I.; Deschamps, A.; De Bonis, G.; Distefano, C.; Donzaud, C.; Dornic, D.; Dorosti, Q.; Drouhin, D.; Dumas, A.; Eberl, T.; Enzenhöfer, A.; Escoffier, S.; Fehn, K.; Felis, I.; Fermani, P.; Folger, F.; Fusco, L. A.; Galatà, S.; Gay, P.; Geißelsöder, S.; Geyer, K.; Giordano, V.; Gleixner, A.; Gómez-González, J. P.; Gracia-Ruiz, R.; Graf, K.; van Haren, H.; Heijboer, A. J.; Hello, Y.; Hernández-Rey, J. J.; Herrero, A.; Hößl, J.; Hofestädt, J.; Hugon, C.; James, C. W.; de Jong, M.; Kalekin, O.; Katz, U.; Kießling, D.; Kooijman, P.; Kouchner, A.; Kulikovskiy, V.; Lahmann, R.; Lattuada, D.; Lefèvre, D.; Leonora, E.; Loehner, H.; Loucatos, S.; Mangano, S.; Marcelin, M.; Margiotta, A.; Martínez-Mora, J. A.; Martini, S.; Mathieu, A.; Michael, T.; Migliozzi, P.; Neff, M.; Nezri, E.; Palioselitis, D.; Păvălaş, G. E.; Perrina, C.; Piattelli, P.; Popa, V.; Pradier, T.; Racca, C.; Riccobene, G.; Richter, R.; Roensch, K.; Rostovtsev, A.; Saldaña, M.; Samtleben, D. F. E.; Sánchez-Losa, A.; Sanguineti, M.; Sapienza, P.; Schmid, J.; Schnabel, J.; Schulte, S.; Schüssler, F.; Seitz, T.; Sieger, C.; Spies, A.; Spurio, M.; Steijger, J. J. M.; Stolarczyk, Th.; Taiuti, M.; Tamburini, C.; Tayalati, Y.; Trovato, A.; Tselengidou, M.; Tönnis, C.; Vallage, B.; Vallée, C.; Van Elewyck, V.; Visser, E.; Vivolo, D.; Wagner, S.; de Wolf, E.; Yepes, H.; Zornoza, J. D.; Zúñiga, J.; TANAMI Collaboration; Krauß, F.; Kadler, M.; Mannheim, K.; Schulz, R.; Trüstedt, J.; Wilms, J.; Ojha, R.; Ros, E.; Baumgartner, W.; Beuchert, T.; Blanchard, J.; Bürkel, C.; Carpenter, B.; Edwards, P. G.; Eisenacher Glawion, D.; Elsässer, D.; Fritsch, U.; Gehrels, N.; Gräfe, C.; Großberger, C.; Hase, H.; Horiuchi, S.; Kappes, A.; Kreikenbohm, A.; Kreykenbohm, I.; Langejahn, M.; Leiter, K.; Litzinger, E.; Lovell, J. E. J.; Müller, C.; Phillips, C.; Plötz, C.; Quick, J.; Steinbring, T.; Stevens, J.; Thompson, D. J.; Tzioumis, A. K.
2015-04-01
Context. The source(s) of the neutrino excess reported by the IceCube Collaboration is unknown. The TANAMI Collaboration recently reported on the multiwavelength emission of six bright, variable blazars which are positionally coincident with two of the most energetic IceCube events. Objects like these are prime candidates to be the source of the highest-energy cosmic rays, and thus of associated neutrino emission. Aims: We present an analysis of neutrino emission from the six blazars using observations with the ANTARES neutrino telescope. Methods: The standard methods of the ANTARES candidate list search are applied to six years of data to search for an excess of muons - and hence their neutrino progenitors - from the directions of the six blazars described by the TANAMI Collaboration, and which are possibly associated with two IceCube events. Monte Carlo simulations of the detector response to both signal and background particle fluxes are used to estimate the sensitivity of this analysis for different possible source neutrino spectra. A maximum-likelihood approach, using the reconstructed energies and arrival directions of through-going muons, is used to identify events with properties consistent with a blazar origin. Results: Both blazars predicted to be the most neutrino-bright in the TANAMI sample (1653-329 and 1714-336) have a signal flux fitted by the likelihood analysis corresponding to approximately one event. This observation is consistent with the blazar-origin hypothesis of the IceCube event IC14 for a broad range of blazar spectra, although an atmospheric origin cannot be excluded. No ANTARES events are observed from any of the other four blazars, including the three associated with IceCube event IC20. This excludes at a 90% confidence level the possibility that this event was produced by these blazars unless the neutrino spectrum is flatter than -2.4. Figures 2, 3 and Appendix A are available in electronic form at http://www.aanda.org
Systematic inference of functional phosphorylation events in yeast metabolism.
Chen, Yu; Wang, Yonghong; Nielsen, Jens
2017-07-01
Protein phosphorylation is a post-translational modification that affects proteins by changing their structure and conformation in a rapid and reversible way, and it is an important mechanism for metabolic regulation in cells. Phosphoproteomics enables high-throughput identification of phosphorylation events on metabolic enzymes, but identifying functional phosphorylation events still requires more detailed biochemical characterization. Therefore, development of computational methods for investigating the unknown functions of the large number of phosphorylation events identified by phosphoproteomics has received increased attention. We developed a mathematical framework that describes the relationship between the phosphorylation level of a metabolic enzyme and the corresponding flux through the enzyme. Using this framework, it is possible to quantitatively estimate the contribution of phosphorylation events to flux changes. We showed that phosphorylation regulation analysis, combined with a systematic workflow and correlation analysis, can be used for inference of functional phosphorylation events in steady and dynamic conditions, respectively. Using this analysis, we assigned functionality to phosphorylation events of 17 metabolic enzymes in the yeast Saccharomyces cerevisiae, among which 10 are novel. Phosphorylation regulation analysis can not only be extended for inference of other functional post-translational modifications but can also serve as a promising scaffold for multi-omics data integration in systems biology. Matlab codes for flux balance analysis in this study are available in the Supplementary material. Contact: yhwang@ecust.edu.cn or nielsenj@chalmers.se. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
Streamflow characterization using functional data analysis of the Potomac River
NASA Astrophysics Data System (ADS)
Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.
2013-12-01
Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding have caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics in order to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach
Elgendi, Mohamed
2016-01-01
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks in order to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method, referred to here for the first time as two event-related moving averages (TERMA), uses event-related moving averages to detect events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high-accuracy detection of biomedical events. Results indicate that the window sizes of the two moving averages (W1 and W2) should satisfy the inequality (8×W1)≥W2≥(2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852
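A minimal sketch of the TERMA idea, assuming a simple centered moving average and an offset parameter fixed at zero (the published framework has additional building bricks not reproduced here): blocks where the short event-related average exceeds the long cycle-related average are candidate events, and each block's maximum is reported as the event peak.

```python
def moving_average(x, w):
    """Centered moving average with half-width w // 2, truncated at edges."""
    out, h = [], w // 2
    for i in range(len(x)):
        lo, hi = max(0, i - h), min(len(x), i + h + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def terma_peaks(x, w1, w2, beta=0.0):
    """Two event-related moving averages peak detector (sketch).

    w1: short (event) window, w2: long (cycle) window, chosen to
    respect the W2 >= 2 * W1 guideline. Blocks where the short average
    exceeds the long average plus an offset beta * mean(x) are events;
    the index of each block's maximum sample is returned as its peak.
    """
    ma1, ma2 = moving_average(x, w1), moving_average(x, w2)
    mean = sum(x) / len(x)
    thr = [m2 + beta * mean for m2 in ma2]
    peaks, i = [], 0
    while i < len(x):
        if ma1[i] > thr[i]:
            j = i
            while j < len(x) and ma1[j] > thr[j]:
                j += 1
            peaks.append(max(range(i, j), key=lambda k: x[k]))
            i = j
        else:
            i += 1
    return peaks
```

Using the long average as an adaptive threshold is what distinguishes this from a fixed-threshold peak finder: the threshold rises and falls with the local cycle, so events of different amplitude are still isolated.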
NASA Astrophysics Data System (ADS)
Mendes, Odim; Oliveira Domingues, Margarete; Echer, Ezequiel; Hajra, Rajkumar; Everton Menconi, Varlei
2017-08-01
Considering magnetic reconnection and viscous interaction as the fundamental mechanisms for transferring particles and energy into the magnetosphere, we study the dynamical characteristics of the auroral electrojet (AE) index during high-intensity, long-duration continuous auroral activity (HILDCAA) events, using a long-term geomagnetic database (1975-2012), and under other distinct interplanetary conditions (geomagnetically quiet intervals, co-rotating interaction regions (CIRs)/high-speed streams (HSSs) not followed by HILDCAAs, and events of AE comprised in global intense geomagnetic disturbances). It is worth noting that we also study active but non-HILDCAA intervals. Examining the geomagnetic AE index, we apply a dynamics analysis composed of the phase space, recurrence plot (RP), and recurrence quantification analysis (RQA) methods. As a result, the quantification finds two distinct clusterings of dynamical behaviour occurring in the interplanetary medium: one corresponding to a geomagnetically quiet regime and the other to an interplanetary activity regime. Furthermore, the HILDCAAs seem to be unique events with respect to their visible, intense manifestations of interplanetary Alfvénic waves; however, they are similar to the other kinds of conditions in their dynamical signature (based on RQA), because the same complex mechanism of generating geomagnetic disturbances is involved. Also, by characterizing the conditions of transition from quiescent conditions to weaker geomagnetic disturbances inside the magnetosphere and ionosphere system, the RQA method clearly indicates the two fundamental dynamics (geomagnetically quiet intervals and HILDCAA events) to be evaluated with magneto-hydrodynamic simulations to better understand the critical processes related to energy and particle transfer into the magnetosphere-ionosphere system.
Finally, with this work, we have also reinforced the potential applicability of the RQA method for characterizing nonlinear geomagnetic processes related to the magnetic reconnection and the viscous interaction affecting the magnetosphere.
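The RP/RQA machinery used above can be illustrated with a minimal sketch: time-delay embed the scalar series, build the binary recurrence matrix under a Chebyshev-norm threshold, and compute the recurrence rate (the study's full RQA uses further measures such as determinism, which are omitted here).

```python
def recurrence_matrix(x, dim, delay, eps):
    """Binary recurrence matrix of a time-delay embedded scalar series.

    Embedding vectors v_i = (x[i], x[i+delay], ..., x[i+(dim-1)*delay]);
    R[i][j] = 1 when the Chebyshev distance between v_i and v_j is <= eps.
    """
    n = len(x) - (dim - 1) * delay
    vecs = [[x[i + k * delay] for k in range(dim)] for i in range(n)]
    def dist(a, b):
        return max(abs(p - q) for p, q in zip(a, b))
    return [[1 if dist(vecs[i], vecs[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """RQA recurrence rate: fraction of recurrent points in the matrix."""
    n = len(R)
    return sum(sum(row) for row in R) / (n * n)
</n>```

Quiet, regular dynamics produce a densely structured recurrence matrix, while stochastic activity scatters the recurrent points; quantifiers like the recurrence rate are what separate the two regimes identified in the study.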
Efficient method for events detection in phonocardiographic signals
NASA Astrophysics Data System (ADS)
Martinez-Alajarin, Juan; Ruiz-Merino, Ramon
2005-06-01
The auscultation of the heart is still the first basic analysis tool used to evaluate the functional state of the heart, as well as the first indicator used to refer a patient to a cardiologist. In order to improve the diagnostic capabilities of auscultation, signal processing algorithms are currently being developed to assist physicians at primary care centers for adult and pediatric populations. A basic task in diagnosis from the phonocardiogram is to detect the events (main and additional sounds, murmurs and clicks) present in the cardiac cycle. This is usually done by applying a threshold and detecting the events that exceed it. However, this method usually does not allow detection of the main sounds when additional sounds and murmurs exist, and it may merge several events into a single one. In this paper we present a reliable method to detect the events present in the phonocardiogram, even in the presence of heart murmurs or additional sounds. The method detects relative maxima in the amplitude envelope of the phonocardiogram and computes a set of parameters associated with each event. Finally, a set of characteristics is extracted from each event to aid in its identification. In addition, the morphology of murmurs is also detected, which aids in differentiating diseases that can occur at the same temporal localization. The algorithms have been applied to real normal heart sounds and murmurs, achieving satisfactory results.
Methods for consistent forewarning of critical events across multiple data channels
Hively, Lee M.
2006-11-21
This invention teaches further method improvements to forewarn of critical events via phase-space dissimilarity analysis of data from biomedical equipment, mechanical devices, and other physical processes. One improvement involves conversion of time-serial data into equiprobable symbols. A second improvement is a method to maximize the channel-consistent total-true rate of forewarning from a plurality of data channels over multiple data sets from the same patient or process. This total-true rate requires resolution of the forewarning indications into true positives, true negatives, false positives and false negatives. A third improvement is the use of various objective functions, as derived from the phase-space dissimilarity measures, to give the best forewarning indication. A fourth improvement uses various search strategies over the phase-space analysis parameters to maximize said objective functions. A fifth improvement shows the usefulness of the method for various biomedical and machine applications.
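The first improvement, conversion of time-serial data into equiprobable symbols, can be sketched with a rank-based quantile partition: boundaries are placed at empirical quantiles so that each symbol occurs (approximately) equally often. This is an illustrative reading of the claim, not the patented implementation.

```python
def equiprobable_symbols(x, n_sym):
    """Convert time-serial data into n_sym equiprobable symbols.

    Each sample is replaced by the index of the empirical-quantile bin
    its rank falls into; by construction the symbol counts differ by at
    most one, so every symbol is (approximately) equally probable.
    """
    order = sorted(range(len(x)), key=lambda i: x[i])
    sym = [0] * len(x)
    for rank, i in enumerate(order):
        sym[i] = rank * n_sym // len(x)
    return sym
```

Equiprobable symbolization flattens the amplitude distribution before phase-space analysis, so dissimilarity measures respond to changes in dynamics rather than to changes in signal amplitude statistics.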
Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R
2016-03-01
This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
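The benefit of pooling multiple events can be illustrated with a simple sketch: a trial-level decision that averages single-event error scores, plus an idealized binomial model (assuming independent single-event detections, which the real ErrP data need not satisfy) showing why pooling raises accuracy even when each single-event detector is weak.

```python
from math import comb

def me_decision(scores, threshold=0.5):
    """ME-style decision: average the single-event error scores of one
    MI trial and flag the trial as erroneous if the mean crosses the
    threshold (True -> trial discarded and repeated)."""
    return sum(scores) / len(scores) >= threshold

def majority_accuracy(p, k):
    """Probability that pooling k independent single-event detections
    (majority vote, ties counted as wrong) decides the trial correctly,
    given per-event accuracy p."""
    return sum(comb(k, m) * p ** m * (1 - p) ** (k - m)
               for m in range(k // 2 + 1, k + 1))
```

With p = 0.65 per event, five pooled events already push trial-level accuracy above 0.75 under the independence assumption, which mirrors the paper's finding that feasible trial accuracies are reachable despite low single-event ErrP detection rates.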
Visual search of cyclic spatio-temporal events
NASA Astrophysics Data System (ADS)
Gautier, Jacques; Davoine, Paule-Annick; Cunty, Claire
2018-05-01
The analysis of spatio-temporal events, and especially of relationships between their different dimensions (space, time, thematic attributes), can be done with geovisualization interfaces. But few geovisualization tools integrate the cyclic dimension of spatio-temporal event series (natural or social events). Time Coil and Time Wave diagrams represent both linear time and cyclic time. By introducing a cyclic temporal scale, these diagrams can highlight the cyclic characteristics of spatio-temporal events. However, the settable cyclic temporal scales are limited to usual durations like days or months. Because of that, these diagrams cannot be used to visualize cyclic events that reappear with an unusual period, and they do not allow a visual search for cyclic events. Nor do they make it possible to identify relationships between the cyclic behavior of events and their spatial features, and more especially to identify localised cyclic events. The lack of means to represent cyclic time outside the temporal diagram of multi-view geovisualization interfaces limits the analysis of relationships between the cyclic reappearance of events and their other dimensions. In this paper, we propose a method and a geovisualization tool, based on an extension of Time Coil and Time Wave, to provide a visual search for cyclic events by allowing any duration to be set as the diagram's cyclic temporal scale. We also propose a symbology approach to push the representation of cyclic time into the map itself, in order to improve the analysis of relationships between space and the cyclic behavior of events.
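The core operation behind an arbitrary cyclic temporal scale can be sketched as a modular mapping of timestamps onto a user-chosen period (the function and its signature are illustrative, not the tool's API):

```python
from datetime import datetime, timedelta

def cyclic_position(events, origin, period):
    """Map event timestamps onto a cyclic temporal scale of arbitrary
    period: returns (cycle index, phase in [0, 1)) for each event.

    An unusual period (e.g. 11 days) works exactly like a usual one,
    which is the extension over day/month-only cyclic scales."""
    out = []
    psec = period.total_seconds()
    for t in events:
        cycle, rem = divmod((t - origin).total_seconds(), psec)
        out.append((int(cycle), rem / psec))
    return out
```

Events that recur with the chosen period collapse onto the same phase, so a visual search for cyclicity reduces to scanning phase clusters while varying the period.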
DARHT Multi-intelligence Seismic and Acoustic Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.
The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein are obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected outside the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two-stage method, first utilizing signatures in the frequency domain to identify outliers and second extracting short-duration events of interest among these outliers by evaluating residuals of an autoregressive exogenous time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection and provide a comparison of the two methods.
Finally, the concept of decision robustness is presented through a preliminary analysis where uncertainty is added to the system through noise in the measurements.
Automated seismic waveform location using Multichannel Coherency Migration (MCM)-I. Theory
NASA Astrophysics Data System (ADS)
Shi, Peidong; Angus, Doug; Rost, Sebastian; Nowacki, Andy; Yuan, Sanyi
2018-03-01
With the proliferation of dense seismic networks sampling the full seismic wavefield, recorded seismic data volumes are getting bigger and automated analysis tools to locate seismic events are essential. Here, we propose a novel Multichannel Coherency Migration (MCM) method to locate earthquakes in continuous seismic data and reveal the location and origin time of seismic events directly from recorded waveforms. By continuously calculating the coherency between waveforms from different receiver pairs, MCM greatly expands the available information which can be used for event location. MCM does not require phase picking or phase identification, which allows fully automated waveform analysis. By migrating the coherency between waveforms, MCM leads to improved source energy focusing. We have tested and compared MCM to other migration-based methods in noise-free and noisy synthetic data. The tests and analysis show that MCM is noise resistant and can achieve more accurate results compared with other migration-based methods. MCM is able to suppress strong interference from other seismic sources occurring at a similar time and location. It can be used with arbitrary 3D velocity models and is able to obtain reasonable location results with smooth but inaccurate velocity models. MCM exhibits excellent location performance and can be easily parallelized giving it large potential to be developed as a real-time location method for very large datasets.
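The core MCM idea, migrating inter-trace coherency over candidate source locations, can be sketched with a toy 1-D example: three stations on a line, unit velocity, and a known origin time. The station geometry, pulse shape, and window length are illustrative assumptions, not values from the paper.

```python
import math

def pairwise_coherency(traces, shifts, win):
    """Mean zero-lag correlation over all station pairs after aligning each
    trace by its predicted travel time (the coherency that MCM migrates)."""
    segs = []
    for tr, s in zip(traces, shifts):
        seg = tr[s:s + win]
        m = sum(seg) / win
        segs.append([v - m for v in seg])
    total, pairs = 0.0, 0
    for i in range(len(segs)):
        for j in range(i + 1, len(segs)):
            a, b = segs[i], segs[j]
            den = math.sqrt(sum(v * v for v in a) * sum(v * v for v in b))
            total += (sum(p * q for p, q in zip(a, b)) / den) if den else 0.0
            pairs += 1
    return total / pairs

# toy 1-D setting: stations on a line, unit velocity, known origin time t0
stations = [0, 2, 5]
src, t0 = 3, 10
traces = []
for x in stations:
    tr = [0.0] * 60
    arr = t0 + abs(src - x)                            # arrival sample
    tr[arr - 1], tr[arr], tr[arr + 1] = 0.2, 1.0, 0.2  # simple pulse
    traces.append(tr)

# migrate coherency over candidate locations; the maximum marks the source
scores = {xc: pairwise_coherency(traces,
                                 [t0 + abs(xc - x) - 1 for x in stations], 5)
          for xc in range(7)}
best = max(scores, key=scores.get)   # recovers the true location, x = 3
```

The real method also scans origin time and depth with a 3D velocity model; the point here is only that summing waveform coherency over receiver pairs peaks when the candidate location aligns all traces.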
NASA Astrophysics Data System (ADS)
Moser, Stefan; Nau, Siegfried; Salk, Manfred; Thoma, Klaus
2014-02-01
The in situ investigation of dynamic events, ranging from car crashes to ballistics, is often key to the understanding of dynamic material behavior. In many cases the important processes and interactions happen on the scale of milli- to microseconds at speeds of 1000 m s-1 or more. Often, 3D information is necessary to fully capture and analyze all relevant effects. High-speed 3D-visualization techniques are thus required for the in situ analysis. 3D-capable optical high-speed methods are often impaired by luminous effects and dust, while flash x-ray based methods usually deliver only 2D data. In this paper, a novel 3D-capable flash x-ray based method, in situ flash x-ray high-speed computed tomography (HSCT), is presented. The method is capable of producing 3D reconstructions of high-speed processes based on an undersampled dataset consisting of only a few (typically 3 to 6) x-ray projections. The major challenges are identified and discussed, and the chosen solution is outlined. The approach is illustrated with an exemplary application to a 1000 m s-1 high-speed impact event on the scale of microseconds. A quantitative analysis of the in situ measurement of the material fragments with a 3D reconstruction at 1 mm voxel size is presented and the results are discussed. The results show that the HSCT method allows gaining valuable visual and quantitative mechanical information for the understanding and interpretation of high-speed events.
Automated Electroglottographic Inflection Events Detection. A Pilot Study.
Codino, Juliana; Torres, María Eugenia; Rubin, Adam; Jackson-Menaldi, Cristina
2016-11-01
Vocal-fold vibration can be analyzed in a noninvasive way by registering impedance changes within the glottis, through electroglottography. The morphology of the electroglottographic (EGG) signal is related to different vibratory patterns. In the literature, a characteristic knee in the descending portion of the signal has been reported. Some EGG signals do not exhibit this particular knee and have other types of events (inflection events) throughout the ascending and/or descending portion of the vibratory cycle. The goal of this work is to propose an automatic method to identify and classify these events. A computational algorithm was developed based on the mathematical properties of the EGG signal, which detects and reports events throughout the contact phase. Retrospective analysis of EGG signals obtained during routine voice evaluation of adult individuals with a variety of voice disorders was performed using the algorithm as well as human raters. Two judges, both experts in clinical voice analysis, and three general speech pathologists performed manual and visual evaluation of the sample set. The results obtained by the automatic method were compared with those of the human raters. Statistical analysis revealed a significant level of agreement. This automatic tool could allow professionals in the clinical setting to obtain an automatic quantitative and qualitative report of such events present in a voice sample, without having to manually analyze the whole EGG signal. In addition, it might provide the speech pathologist with more information that would complement the standard voice evaluation. It could also be a valuable tool in voice research. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
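A curvature-based detector of the kind described, which flags points where the second difference of the signal changes sign sharply, can be sketched as follows. The threshold and the synthetic "knee" signal are illustrative; the published algorithm is richer than this.

```python
def inflection_events(x, thresh=0.5):
    """Report sample indices where the curvature (second difference) of the
    EGG signal flips sign with a jump of at least `thresh` -- candidate
    knee/inflection events in the vibratory cycle."""
    d2 = [x[i - 1] - 2 * x[i] + x[i + 1] for i in range(1, len(x) - 1)]
    events = []
    for p in range(1, len(d2)):
        if d2[p - 1] * d2[p] < 0 and abs(d2[p] - d2[p - 1]) >= thresh:
            events.append(p + 1)   # index in the original signal
    return events

# synthetic contact-phase segment: convex rise, then a concave "knee"
x = [0, 1, 4, 9, 16, 25, 32, 37, 40, 41, 40]
events = inflection_events(x)   # -> [5]: the sample where curvature flips
```

On real EGG data the signal would be smoothed first and events would be classified by where they fall in the ascending or descending portion of the cycle; this sketch shows only the detection core.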
Analysis Methods and Models for Small Unit Operations
2006-07-01
...is used in other studies to indicate how operational effectiveness can be qualified and quantified... the node 'Prediction' is called a child of the node 'Success', and the node 'Success' is called a parent of the node 'Prediction'. Figure C.2 A simple... event A is a child of event B, and event B is a child of event C (C -- B -- A). The belief network or influence diagram has to be a directed network
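The parent/child structure described in the excerpt, a chain C -- B -- A, can be made concrete with a minimal numerical example; the probabilities below are invented purely for illustration.

```python
# chain C -> B -> A of binary events: each node depends only on its parent
p_c = 0.3                                 # P(C)
p_b_given_c = {True: 0.9, False: 0.2}     # P(B | C)
p_a_given_b = {True: 0.8, False: 0.1}     # P(A | B)

# marginalize down the chain: P(B) = sum_c P(B|c) P(c), then P(A) likewise
p_b = p_c * p_b_given_c[True] + (1 - p_c) * p_b_given_c[False]
p_a = p_b * p_a_given_b[True] + (1 - p_b) * p_a_given_b[False]
# p_b = 0.41, p_a = 0.387
```

The directedness requirement in the text is what makes this marginalization well defined: evidence propagates along the arrows from parents to children.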
High fold computer disk storage DATABASE for fast extended analysis of γ-rays events
NASA Astrophysics Data System (ADS)
Stézowski, O.; Finck, Ch.; Prévost, D.
1999-03-01
Recently, spectacular technical developments have been achieved to increase the resolving power of large γ-ray spectrometers. With these new eyes, physicists are able to study the intricate nature of atomic nuclei. Concurrently, more and more complex multidimensional analyses are needed to investigate very weak phenomena. In this article, we first present a software package (DATABASE) allowing high-fold coincidence γ-ray events to be stored on hard disk. Then, a non-conventional method of analysis, the anti-gating procedure, is described. Two physical examples are given to explain how it can be used, and Monte Carlo simulations have been performed to test the validity of this method.
SPACE PROPULSION SYSTEM PHASED-MISSION PROBABILITY ANALYSIS USING CONVENTIONAL PRA METHODS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis Smith; James Knudsen
As part of a series of papers on the topic of advanced probabilistic methods, a benchmark phased-mission problem has been suggested. This problem consists of modeling a space mission using an ion propulsion system, where the mission consists of seven mission phases. The mission requires that the propulsion system operate for several phases, where the configuration changes as a function of phase. The ion propulsion system itself consists of five thruster assemblies and a single propellant supply, where each thruster assembly has one propulsion power unit and two ion engines. In this paper, we evaluate the probability of mission failure using the conventional methodology of event tree/fault tree analysis. The event tree and fault trees are developed and analyzed using Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE). While the benchmark problem is nominally a "dynamic" problem, in our analysis the mission phases are modeled in a single event tree to show the progression from one phase to the next. The propulsion system is modeled in fault trees to account for the operation, or in this case the failure, of the system. Specifically, the propulsion system is decomposed into each of the five thruster assemblies and fed into the appropriate N-out-of-M gate to evaluate mission failure. A separate fault tree for the propulsion system is developed to account for the different success criteria of each mission phase. Common-cause failure modeling is treated using traditional (i.e., parametric) methods. As part of this paper, we discuss the overall results in addition to the positive and negative aspects of modeling dynamic situations with non-dynamic modeling techniques. One insight from the use of this conventional method for analyzing the benchmark problem is that it requires significant manual manipulation of the fault trees and of how they are linked into the event tree.
The conventional method also requires editing the resultant cut sets to obtain the correct results. While conventional methods may be used to evaluate a dynamic system like that in the benchmark, the level of effort required may preclude their use on real-world problems.
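The N-out-of-M evaluation of the five thruster assemblies can be sketched with a simple binomial model. The failure probabilities and the 3-of-5 success criterion below are assumptions for illustration, not values from the benchmark.

```python
from math import comb

def assembly_failure(p_ppu, p_engine):
    # an assembly fails if its propulsion power unit fails,
    # or both of its ion engines fail (independent failures assumed)
    return p_ppu + (1 - p_ppu) * p_engine ** 2

def phase_failure(p_fail, n=5, k=3):
    # a phase fails when fewer than k of the n assemblies survive
    q = 1 - p_fail
    return sum(comb(n, m) * q ** m * p_fail ** (n - m) for m in range(k))

p_asm = assembly_failure(p_ppu=0.05, p_engine=0.1)   # 0.0595
p_phase = phase_failure(p_asm)                        # P(phase fails)
```

In the full benchmark each phase has its own success criterion and duration, and common-cause failures couple the assemblies, which is why the fault-tree treatment in the paper is considerably more involved.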
Assessment of the integrity of concrete bridge structures by acoustic emission technique
NASA Astrophysics Data System (ADS)
Yoon, Dong-Jin; Park, Philip; Jung, Juong-Chae; Lee, Seung-Seok
2002-06-01
This study was aimed at developing a new method for assessing the integrity of concrete structures. In particular, the acoustic emission (AE) technique was used in carrying out both laboratory experiments and field applications. From the previous laboratory study, we confirmed that AE analysis provided a promising approach for estimating the level of damage and distress in concrete structures. The Felicity ratio, one of the key parameters for assessing damage, exhibits a favorable correlation with the overall damage level. The total number of AE events under stepwise cyclic loading also showed good agreement with the damage level. In this study, the newly suggested technique was applied to several concrete bridges in Korea in order to verify its applicability in the field. The AE response was analyzed to obtain key parameters such as the total number and rate of AE events, AE parameter analysis for each event, and the characteristic features of the waveform, as well as Felicity ratio analysis. A stepwise loading-unloading procedure for AE generation was introduced in the field tests by using vehicles of different weights. Depending on the condition of the bridge (for instance, new or old), the AE event rate and AE generation behavior showed many different aspects. The results showed that the suggested analysis method would be a promising approach for assessing the integrity of concrete structures.
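The Felicity-ratio damage indicator used above is simply the load at which AE resumes on reloading divided by the previous maximum load; a sketch with invented load values:

```python
def felicity_ratios(peaks, onsets):
    """peaks[i]: maximum load reached in cycle i; onsets[i]: load at which
    AE restarts during cycle i (undefined for the first cycle).
    A ratio below 1 (Kaiser effect violated) suggests accumulating damage."""
    ratios = []
    for i in range(1, len(peaks)):
        prev_max = max(peaks[:i])
        ratios.append(onsets[i] / prev_max)
    return ratios

peaks  = [100.0, 120.0, 140.0]   # stepwise loading cycles (kN, illustrative)
onsets = [None, 95.0, 110.0]     # AE onset loads observed on reloading
ratios = felicity_ratios(peaks, onsets)   # [0.95, ~0.917]: damage growing
```

A declining ratio across cycles, as in this toy series, is the pattern the study correlates with overall damage level.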
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Gentili, Stefania
2017-04-01
Identification and statistical characterization of seismic clusters may provide useful insights about the features of seismic energy release and their relation to physical properties of the crust within a given region. Moreover, a number of studies based on spatio-temporal analysis of main-shock occurrence require preliminary declustering of the earthquake catalogs. Since various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we aim to investigate the classification differences among different declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely to events that occurred in the region struck by the recent Central Italy destructive earthquakes, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for the detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalog, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are well consistent with those reported in earlier studies, which were aimed at detailed manual aftershock identification.
The study shows that the data-driven approach, based on nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, also in areas characterized by moderate seismic activity, where the standard declustering techniques may turn out to be rather gross approximations. Building on these results, the main statistical features of seismic clusters are explored, including the complex interdependence of related events, with the aim of characterizing the space-time patterns of earthquake occurrence in North-Eastern Italy and capturing their basic differences with respect to the Central Italy sequences.
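The nearest-neighbor cluster identification mentioned above rescales the time and distance between event pairs by the magnitude of the earlier event, in the style of Zaliapin-type analyses. A 1-D toy sketch, with an invented catalog and assumed parameters b and df:

```python
import math

def nn_distance(parent, child, b=1.0, df=1.6):
    """Space-time-magnitude nearest-neighbor distance
    eta = dt * r**df * 10**(-b * m_parent); a smaller eta means the child
    is more plausibly a related event of that parent."""
    tp, xp, mp = parent
    tc, xc, _ = child
    dt = tc - tp
    if dt <= 0:
        return math.inf              # a parent must precede its child
    r = max(abs(xc - xp), 0.1)       # 1-D epicentral distance, floored (km)
    return dt * r ** df * 10 ** (-b * mp)

# toy catalog: (time in days, position in km, magnitude)
catalog = [(0.0, 0.0, 6.0),      # mainshock
           (50.0, 100.0, 4.0),   # distant moderate event
           (50.5, 1.0, 3.0)]     # candidate aftershock near the mainshock
child = catalog[2]
etas = [nn_distance(p, child) for p in catalog[:2]]
parent = etas.index(min(etas))   # 0: linked to the mainshock, not the distant event
```

In the full analysis every event is assigned its minimum-eta parent and a threshold on eta separates clustered events from background seismicity.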
Flood hazard assessment in areas prone to flash flooding
NASA Astrophysics Data System (ADS)
Kvočka, Davor; Falconer, Roger A.; Bray, Michaela
2016-04-01
Contemporary climate projections suggest that there will be an increase in the occurrence of high-intensity rainfall events in the future. These precipitation extremes are usually the main cause for the emergence of extreme flooding, such as flash flooding. Flash floods are among the most unpredictable, violent and fatal natural hazards in the world. Furthermore, it is expected that flash flooding will occur even more frequently in the future due to the more frequent development of extreme weather events, which will greatly increase the danger to people caused by flash flooding. This being the case, there will be a need for high-resolution flood hazard maps in areas susceptible to flash flooding. This study investigates what type of flood hazard assessment methods should be used for assessing the flood hazard to people caused by flash flooding. Two different types of flood hazard assessment methods were tested: (i) a widely used method based on an empirical analysis, and (ii) a new, physically based and experimentally calibrated method. Two flash flood events were considered herein, namely the 2004 Boscastle flash flood and the 2007 Železniki flash flood. The results obtained in this study suggest that in areas susceptible to extreme flooding, the flood hazard assessment should be conducted using mechanics-based methods. In comparison to standard flood hazard assessment methods, these physically based methods: (i) take into account all of the physical forces which act on a human body in floodwater, (ii) successfully adapt to abrupt changes in the flow regime, which often occur during flash flood events, and (iii) rapidly assess a flood hazard index in a relatively short period of time.
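For contrast with the mechanics-based approach the study favors, the widely used empirical hazard rating (of the UK DEFRA/Environment Agency type) combines depth, velocity, and a debris factor. A sketch follows; the formula is the familiar HR = d(v + 0.5) + DF, and the class thresholds are quoted from memory, so they should be checked against the guidance before use.

```python
def hazard_rating(depth_m, velocity_ms, debris_factor=0.0):
    # empirical flood hazard to people: HR = d * (v + 0.5) + DF
    return depth_m * (velocity_ms + 0.5) + debris_factor

def hazard_class(hr):
    if hr < 0.75:
        return "low"
    if hr < 1.25:
        return "moderate"       # danger for some
    if hr < 2.5:
        return "significant"    # danger for most
    return "extreme"            # danger for all

hr = hazard_rating(depth_m=1.0, velocity_ms=2.0, debris_factor=0.5)  # 3.0
```

The study's criticism is that a depth-velocity product of this kind ignores the actual forces (drag, buoyancy, friction) acting on a body, which is what the mechanics-based index accounts for.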
Characterize kinematic rupture history of large earthquakes with Multiple Haskell sources
NASA Astrophysics Data System (ADS)
Jia, Z.; Zhan, Z.
2017-12-01
Earthquakes are often regarded as continuous rupture along a single fault, but the occurrence of complex large events involving multiple faults and dynamic triggering challenges this view. Such rupture complexities cause difficulties for existing finite fault inversion algorithms, because these rely on specific parameterizations and regularizations to obtain physically meaningful solutions. Furthermore, it is difficult to assess the reliability and uncertainty of the obtained rupture models. Here we develop a Multi-Haskell Source (MHS) method to estimate the rupture process of large earthquakes as a series of sub-events of varying location, timing and directivity. Each sub-event is characterized by a Haskell rupture model with uniform dislocation and constant unilateral rupture velocity. This flexible yet simple source parameterization allows us to constrain the first-order rupture complexity of large earthquakes robustly. Additionally, the relatively small number of parameters in the inverse problem yields improved uncertainty analysis based on Markov chain Monte Carlo sampling in a Bayesian framework. Synthetic tests and applications of the MHS method to real earthquakes show that our method can capture the major features of large earthquake rupture processes and provide information for more detailed rupture history analysis.
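The Bayesian sampling behind the MHS uncertainty analysis can be illustrated with a minimal random-walk Metropolis sampler for a single rupture parameter. The Gaussian target below merely stands in for a real posterior; the actual study samples many coupled sub-event parameters.

```python
import math
import random
import statistics

def metropolis(log_post, x0, steps, prop_sd, seed=0):
    """Minimal random-walk Metropolis sampler; real finite-fault studies use
    far richer samplers, but the accept/reject core is the same."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, prop_sd)
        lp_new = log_post(x_new)
        if lp_new - lp > math.log(rng.random()):  # accept w.p. min(1, ratio)
            x, lp = x_new, lp_new
        chain.append(x)
    return chain

# stand-in posterior for, say, a sub-event rupture velocity: N(2.0, 0.5^2)
log_post = lambda v: -((v - 2.0) ** 2) / (2 * 0.5 ** 2)
chain = metropolis(log_post, x0=0.0, steps=20000, prop_sd=1.0)
posterior_mean = statistics.mean(chain[2000:])   # close to 2.0 after burn-in
```

The chain's spread, not just its mean, is the payoff: it is what gives the credible intervals on sub-event location, timing, and directivity.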
Developing future precipitation events from historic events: An Amsterdam case study.
NASA Astrophysics Data System (ADS)
Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen
2016-04-01
Due to climate change, the frequency and intensity of extreme precipitation events are expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers in order to develop efficient adaptation strategies to reduce the risks from extreme weather events. Current approaches to tailoring climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern to policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis and modifies the observed data for these events so that the outcome shows how the same event would occur in a warmer climate. This concept is introduced as 'Future Weather', and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event, based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands, in the summer of 2014, and resulted in blocked highways, disrupted air transportation, and flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a 5- to 15-year return period. The method of projecting a future event follows a non-linear delta transformation that is applied directly to the observed event, assuming a warmer climate, to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of the precipitation intensity as a function of the dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, where the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to indicate a warmer climate.
The two methodologies are statistically compared and evaluated. The comparison between the historic event generated by the model and the observed event will give information on the realism of the model for this event. The comparison between the delta transformation method and the future simulation will provide information on how the dynamics would affect the precipitation field, as compared to the statistical method.
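A simplified version of such a non-linear delta transformation scales each observed intensity by a temperature-dependent rate, with heavier (convective) intensities scaled at roughly twice the Clausius-Clapeyron rate. The rates and the intensity threshold below are illustrative assumptions, not the study's calibrated values.

```python
def future_intensity(p_obs, dT, cc_rate=0.07, super_cc=0.14, p_heavy=10.0):
    """Scale an observed precipitation intensity p_obs (mm/h) for a climate
    dT degrees warmer: ~7 %/K (Clausius-Clapeyron) for ordinary intensities,
    ~14 %/K (super-CC) for heavy convective intensities."""
    rate = super_cc if p_obs > p_heavy else cc_rate
    return p_obs * (1.0 + rate) ** dT

light = future_intensity(5.0, dT=2.0)    # 5 * 1.07**2  ~ 5.72 mm/h
heavy = future_intensity(20.0, dT=2.0)   # 20 * 1.14**2 ~ 25.99 mm/h
```

The non-linearity is the point: the heaviest parts of the observed rainfall field grow proportionally more than the light rain, which a uniform delta change would miss.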
Taheriyoun, Masoud; Moradinejad, Saber
2015-01-01
The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. The main factors affecting the performance of a wastewater treatment plant are the variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment, and operational failures. Thus, meeting the established reuse/discharge criteria requires assessment of plant reliability. Among the many techniques developed in system reliability analysis, fault tree analysis (FTA) is one of the most popular and efficient methods. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, the problem of reliability was studied for the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with the violation of the allowable effluent BOD as the top event, and the deficiencies of the system were identified based on the developed model. Some basic events are operator mistakes, physical damage, and design problems. The analytical methods used are minimal cut sets (based on numerical probabilities) and Monte Carlo simulation. Basic event probabilities were calculated according to available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence. The mechanical, climate, and sewer system factors were in the subsequent tiers. The literature shows that FTA has seldom been applied in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the FTA model developed in this study considerably improves the insight into causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.
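With minimal cut sets in hand, the top-event probability for independent basic events follows by inclusion-exclusion over the cut sets. A sketch with invented basic-event probabilities loosely echoing the failure causes above (not the study's actual model or values):

```python
from itertools import combinations

def top_event_prob(cut_sets, p):
    """Exact top-event probability from minimal cut sets via
    inclusion-exclusion, assuming independent basic events."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            union = set().union(*combo)      # events in this intersection term
            term = 1.0
            for event in union:
                term *= p[event]
            total += term if k % 2 == 1 else -term
    return total

# illustrative basic events and minimal cut sets
p = {"operator_error": 0.05, "pump_fail": 0.02, "sensor_fail": 0.01}
cuts = [{"operator_error"}, {"pump_fail", "sensor_fail"}]
p_top = top_event_prob(cuts, p)   # 0.05 + 0.0002 - 0.05*0.0002 = 0.05019
```

The dominance of the single-event cut set in this toy example mirrors the study's finding that human error drives the top event; the Monte Carlo alternative simply samples basic-event states and counts top-event occurrences.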
3D Simulation of External Flooding Events for the RISMC Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated physics simulation toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the analysis of flooding.
Jusufovic, Mirza; Sandset, Else Charlotte; Bath, Philip M; Berge, Eivind
2016-08-01
Early blood pressure-lowering treatment appears to be beneficial in patients with acute intracerebral haemorrhage and potentially in ischaemic stroke. We used a new method for the analysis of vascular events in the Scandinavian Candesartan Acute Stroke Trial to see if the effect was dependent on the timing of treatment. The Scandinavian Candesartan Acute Stroke Trial was a randomized, placebo-controlled trial of candesartan given within 30 h of ischaemic or haemorrhagic stroke. Of 2029 patients, 231 (11.4%) had a vascular event (vascular death, nonfatal stroke or nonfatal myocardial infarction) during the first 6 months. The modified Rankin Scale (mRS) score following a vascular event was used to categorize vascular events in order of severity: no event (n = 1798), minor (mRS 0-2, n = 59), moderately severe (mRS 3-4, n = 57) and major event (mRS 5-6, n = 115). We used ordinal logistic regression for the analysis and adjusted for predefined prognostic variables. Candesartan had no overall effect on vascular events (adjusted common odds ratio 1.11, 95% confidence interval 0.84-1.47, P = 0.48), and the effects were the same in ischaemic and haemorrhagic stroke. Among the patients treated within 6 h, the adjusted common odds ratio for vascular events was 0.37 (95% confidence interval 0.16-0.84, P = 0.02), and there was no heterogeneity of effect between ischaemic and haemorrhagic strokes. Ordinal analysis of vascular events showed no overall effect of candesartan in the subacute phase of stroke. The effect of treatment given within 6 h of stroke onset appears promising and will be addressed in ongoing trials. Ordinal analysis of vascular events is feasible and can be used in future trials.
NASA Astrophysics Data System (ADS)
Dibike, Y. B.; Eum, H. I.; Prowse, T. D.
2017-12-01
Flows originating from alpine-dominated cold-region watersheds typically experience extended winter low flows followed by spring snowmelt and summer rainfall-driven high flows. In a warmer climate, there will be a temperature-induced shift in precipitation from snow towards rain, as well as changes in snowmelt timing, affecting the frequency of extreme high- and low-flow events, which could significantly alter ecosystem services. This study examines the potential changes in the frequency and severity of hydrologic extremes in the Athabasca River watershed in Alberta, Canada, based on the Variable Infiltration Capacity (VIC) hydrologic model and selected, statistically downscaled climate change scenario data from the latest Coupled Model Intercomparison Project (CMIP5). The sensitivity of these projected changes is also examined by applying different extreme flow analysis methods. The hydrological model projections show an overall increase in mean annual streamflow in the watershed and a corresponding shift of the freshet timing to an earlier period. Most of the streams are projected to experience increases during the winter and spring seasons and decreases during the summer and early fall seasons, with overall projected increases in extreme high flows, especially for low-frequency events. While the middle and lower parts of the watershed are characterised by projected increases in extreme high flows, the high-elevation alpine region is mainly characterised by corresponding decreases in extreme low-flow events. However, the magnitude of the projected changes in extreme flows varies over a wide range, especially for low-frequency events, depending on the climate scenario and period of analysis, and sometimes in a nonlinear way. Nonetheless, the sensitivity of the projected changes to the statistical method of analysis is found to be relatively small compared to the inter-model variability.
NASA Astrophysics Data System (ADS)
White, Joshua S.; Hall, Robert T.; Fields, Jeremy; White, Holly M.
2016-05-01
Utilization of traditional sentiment analysis (SA) for predicting the outcome of an event on a social network depends on: precise understanding of which topics relate to the event, selective elimination of trends that don't fit, and in most cases, expert knowledge of the major players in the event. Sentiment analysis has traditionally taken one of two approaches to derive a quantitative value from qualitative text: the "bag of words" model, and the use of natural language processing (NLP) to attempt a real understanding of the text. The two methods yield very similar accuracy results, with the exception of some special use cases. Both, however, impose a large computational burden on the analytic system. Newer approaches have this same problem. No matter what approach is used, SA typically caps out around 80% in accuracy. However, accuracy is the result of both polarity and degree of polarity, nothing else. In this paper we present a method for hybridizing traditional SA methods to better determine shifts in opinion over time within social networks. This hybridization process involves augmenting traditional SA measurements with contextual understanding and knowledge about writers' demographics. Our goal is not only to improve accuracy, but to do so with minimal impact on computational requirements.
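A minimal form of the hybridization described, a bag-of-words polarity score augmented with a per-writer weight standing in for demographic or contextual knowledge, might look like this (lexicon and weights invented for illustration):

```python
POSITIVE = {"good", "great", "win", "support"}
NEGATIVE = {"bad", "fail", "lose", "oppose"}

def polarity(text):
    # bag-of-words polarity in [-1, 1]: (positive - negative) / word count
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return score / max(len(words), 1)

def weighted_opinion(posts):
    """posts: (text, writer_weight) pairs; the weight stands in for what is
    known about the writer (demographics, credibility, context)."""
    num = sum(polarity(text) * w for text, w in posts)
    den = sum(w for _, w in posts)
    return num / den if den else 0.0

posts = [("great win", 2.0), ("bad fail", 1.0)]
shift = weighted_opinion(posts)   # (1.0*2 - 1.0*1) / 3 = 1/3
```

Tracking this weighted score over time windows, rather than the raw polarity alone, is the cheap augmentation the paper argues for: the lexicon lookup stays O(words), and the writer weights are computed once per author.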
Flood Frequency Analyses Using a Modified Stochastic Storm Transposition Method
NASA Astrophysics Data System (ADS)
Fang, N. Z.; Kiani, M.
2015-12-01
Research shows that areas with similar topography and climatic environment have comparable precipitation occurrences. Reproduction and realization of historical rainfall events provide foundations for frequency analysis and the advancement of meteorological studies. Stochastic Storm Transposition (SST) is a method for such a purpose and enables us to perform hydrologic frequency analyses by transposing observed historical storm events to the sites of interest. However, many previous SST studies reveal drawbacks arising from simplified Probability Density Functions (PDFs) that do not consider restrictions on transposing rainfall. The goal of this study is to stochastically examine the impacts of extreme events on all locations in a homogeneity zone. Since storms with the same probability of occurrence in homogeneous areas do not have identical hydrologic impacts, the authors utilize detailed precipitation parameters, including the probability of occurrence of a certain depth and the number of occurrences of extreme events, which are both incorporated into a joint probability function. The new approach can reduce the bias from uniformly transposing storms, which erroneously increases the probability of occurrence of storms in areas with higher rainfall depths. This procedure is iterated to simulate storm events for one thousand years as the basis for updating frequency analysis curves such as intensity-duration-frequency (IDF) and flood frequency analysis (FFA) curves. The study area is the Upper Trinity River watershed, including the Dallas-Fort Worth metroplex, with a total area of 6,500 mi2. It is the first time that the SST method has been examined at such a wide scale, with 20 years of radar rainfall data.
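The simulation loop at the heart of such an SST procedure, resampling historical storms, transposing them with some variability, and building a long synthetic record of annual maxima for frequency analysis, can be sketched as follows. The depths, storm counts, and transposition jitter are invented, and a real implementation would transpose full spatial rainfall fields under the joint probability function rather than scalar depths.

```python
import random

def simulate_annual_maxima(storm_depths, years, storms_per_year=3, seed=42):
    """Toy stochastic storm transposition: each simulated year draws a few
    historical storms, perturbs their depth to mimic transposition across
    the homogeneous zone, and keeps the annual maximum depth (mm)."""
    rng = random.Random(seed)
    maxima = []
    for _ in range(years):
        depths = [rng.choice(storm_depths) * rng.uniform(0.8, 1.2)
                  for _ in range(storms_per_year)]
        maxima.append(max(depths))
    return sorted(maxima)

def return_level(sorted_maxima, T):
    # empirical T-year depth via a Weibull plotting position
    n = len(sorted_maxima)
    i = min(max(int((n + 1) * (1.0 - 1.0 / T)), 1), n)
    return sorted_maxima[i - 1]

observed = [25.0, 40.0, 55.0, 70.0, 90.0]   # historical storm depths (mm)
maxima = simulate_annual_maxima(observed, years=1000)
q2, q100 = return_level(maxima, 2), return_level(maxima, 100)
```

The thousand-year synthetic record is what lets the frequency curves (IDF, FFA) be read off empirically instead of extrapolated from a short gauge record.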
Dual Method Use Among a Sample of First-Year College Women
Walsh, Jennifer L.; Fielder, Robyn L.; Carey, Kate B.; Carey, Michael P.
2014-01-01
CONTEXT Dual method use—using one type of contraceptive to reduce the risk of STDs and another to prevent pregnancy—is effective but understudied. No prior studies have employed an event-level approach to examining characteristics associated with dual method use among college women. METHODS In 12 consecutive monthly surveys conducted in 2009–2010, data on 1,843 vaginal intercourse events were collected from 296 first-year college women. Women reported on their use of condoms and hormonal contraceptives during all events. Multilevel regression analysis was used to assess associations between event-, month- and person-level characteristics and hormonal use and dual method use. RESULTS Women used hormonal contraceptives during 53% of events and condoms during 63%. Dual method use was reported 28% of the time, and only 14% of participants were consistent users of both methods. The likelihood of dual method use was elevated when sex partners were friends as opposed to romantic partners or ex-boyfriends, and among women who had received an STD diagnosis prior to college (odds ratios, 2.5–2.9); it also increased with level of religiosity (coefficient, 0.8). Dual use was less likely when less reliable methods were used (odds ratio, 0.2) and when women reported more months of hormonal use (0.8), were older (coefficient, −4.7) and had had a greater number of partners before college (−0.3). CONCLUSIONS A better understanding of the characteristics associated with dual method use may help in the design of potential intervention efforts. PMID:24684480
Regression analysis of mixed panel count data with dependent terminal events.
Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L
2017-05-10
Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types described earlier, may occur; furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrence of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for the estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to the childhood cancer study that motivated this work. Copyright © 2017 John Wiley & Sons, Ltd.
Validation of Rapid Radiochemical Method for Californium ...
Technical Brief In the event of a radiological/nuclear contamination event, the response community would need tools and methodologies to rapidly assess the nature and the extent of contamination. To characterize a radiologically contaminated outdoor area and to inform risk assessment, large numbers of environmental samples would be collected and analyzed over a short period of time. To address the challenge of quickly providing analytical results to the field, the U.S. EPA developed a robust analytical method. This method allows response officials to characterize contaminated areas and to assess the effectiveness of remediation efforts, both rapidly and accurately, in the intermediate and late phases of environmental cleanup. Improvement in sample processing and analysis leads to increased laboratory capacity to handle the analysis of a large number of samples following the intentional or unintentional release of a radiological/nuclear contaminant.
Analysis of selected microflares observed by SphinX over the last minimum of solar activity
NASA Astrophysics Data System (ADS)
Siarkowski, Marek; Sylwester, Janusz; Sylwester, Barbara; Gryciuk, Magdalena
The Solar Photometer in X-rays (SphinX) was designed to observe soft X-ray solar emission in the energy range between 1 keV and 15 keV with a resolution better than 0.5 keV. The instrument operated from February until November 2009 aboard the CORONAS-Photon satellite, during an exceptionally low minimum of solar activity. Here we use SphinX data for analysis of selected microflare-class events, chosen for their unusual light curves or locations. Our study involves determination of temporal characteristics (start, maximum, and end times of flares) and analysis of physical conditions in the flaring plasma (temperature, emission measure). A dedicated method has been used to remove emission not related to the flares. Supplementary information about the morphology and evolution of the investigated events has been derived from the analysis of XRT/Hinode and SECCHI/STEREO images.
Extreme values in the Chinese and American stock markets based on detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Cao, Guangxi; Zhang, Minjia
2015-10-01
This paper presents a comparative analysis of extreme values in the Chinese and American stock markets based on the detrended fluctuation analysis (DFA) algorithm, using daily data of the Shanghai composite index and the Dow Jones Industrial Average. The empirical results indicate that the multifractal detrended fluctuation analysis (MF-DFA) method is more objective than the traditional percentile method. The range of extreme values of the Dow Jones Industrial Average is smaller than that of the Shanghai composite index, and the extreme values of the Dow Jones Industrial Average show stronger temporal clustering. Extreme values in both the Chinese and American stock markets are concentrated in 2008, consistent with the financial crisis of that year. Moreover, we investigate whether extreme events affect the cross-correlation between the Chinese and American stock markets using the multifractal detrended cross-correlation analysis algorithm. The results show that extreme events have no significant effect on the cross-correlation between the two markets.
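As a rough illustration of the DFA step underlying this kind of comparison, the sketch below computes the fluctuation function and scaling exponent for a synthetic return series. The window sizes, first-order detrending, and white-noise input are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def dfa(signal, scales):
    """First-order DFA: root-mean-square fluctuation F(s) of the
    locally detrended integrated profile, one value per window size s."""
    y = np.cumsum(signal - np.mean(signal))        # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear fit
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        F.append(np.mean(rms))
    return np.array(F)

rng = np.random.default_rng(0)
returns = rng.standard_normal(4096)                # stand-in for daily returns
scales = np.array([16, 32, 64, 128, 256])
F = dfa(returns, scales)
# Scaling exponent from the log-log slope; ~0.5 for uncorrelated noise
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

For uncorrelated noise the slope is close to 0.5; persistent (trend-following) series give larger exponents, which is what makes the exponent useful for characterizing market series.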
Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)
NASA Astrophysics Data System (ADS)
Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko
2016-07-01
A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue.
Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation of the induced microseismicity and novel insights into dynamic rupture processes based on the average temporal (foreshock-aftershock) relationship of child events to parents.
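The detection core of a matched-filtering approach of this kind, normalized cross-correlation of a parent template against continuous data, can be sketched as below. The template waveform, noise level, and 0.7 threshold are illustrative assumptions, not the authors' processing parameters, and real implementations stack correlations over many channels.

```python
import numpy as np

def matched_filter_detect(continuous, template, threshold=0.7):
    """Sliding normalized cross-correlation of a parent template against
    continuous data; returns sample indices where the correlation
    coefficient exceeds the detection threshold."""
    nt = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(continuous) - nt + 1):
        w = continuous[i:i + nt]
        sd = w.std()
        if sd == 0.0:
            continue
        cc = np.dot((w - w.mean()) / sd, t) / nt   # Pearson correlation
        if cc >= threshold:
            hits.append(i)
    return hits

rng = np.random.default_rng(1)
parent = np.sin(np.linspace(0, 6 * np.pi, 100)) * np.hanning(100)
data = 0.1 * rng.standard_normal(2000)             # continuous noise record
data[500:600] += parent                            # strong "child" event
data[1400:1500] += 0.5 * parent                    # weaker child event
hits = matched_filter_detect(data, parent)
```

Because the correlation is normalized, the weaker child event is detected as readily as the strong one; this amplitude insensitivity is what lets template matching recover low-magnitude events missed by energy-based detectors.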
DOE Office of Scientific and Technical Information (OSTI.GOV)
Espinosa-Paredes, Gilberto; Prieto-Guerrero, Alfonso; Nunez-Carrera, Alejandro
This paper introduces a wavelet-based method to analyze instability events in a boiling water reactor (BWR) during transient phenomena. The methodology to analyze BWR signals includes the following: (a) the short-time Fourier transform (STFT) analysis, (b) decomposition using the continuous wavelet transform (CWT), and (c) application of multiresolution analysis (MRA) using discrete wavelet transform (DWT). STFT analysis permits the study, in time, of the spectral content of analyzed signals. The CWT provides information about ruptures, discontinuities, and fractal behavior. To detect these important features in the signal, a mother wavelet has to be chosen and applied at several scales to obtain optimum results. MRA allows fast implementation of the DWT. Features like important frequencies, discontinuities, and transients can be detected with analysis at different levels of detail coefficients. The STFT was used to provide a comparison between a classic method and the wavelet-based method. The damping ratio, which is an important stability parameter, was calculated as a function of time. The transient behavior can be detected by analyzing the maximum contained in detail coefficients at different levels in the signal decomposition. This method allows analysis of both stationary signals and highly nonstationary signals in the timescale plane. This methodology has been tested with the benchmark power instability event of Laguna Verde nuclear power plant (NPP) Unit 1, which is a BWR-5 NPP.
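As a minimal illustration of the MRA idea (using a hand-rolled Haar DWT, not the authors' wavelet choice), the sketch below shows how an abrupt discontinuity in an otherwise smooth signal stands out as a large level-1 detail coefficient; the synthetic signal and step location are invented for the example.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: pairwise averages and differences."""
    x = x[:len(x) // 2 * 2]
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def detail_maxima(signal, levels=3):
    """Largest |detail coefficient| per MRA level; abrupt transients
    stand out against a smooth background."""
    maxima, a = [], np.asarray(signal, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        maxima.append(np.abs(d).max())
    return maxima

t = np.arange(1024)
smooth = np.sin(2 * np.pi * t / 200.0)   # slow oscillation (stationary part)
sig = smooth.copy()
sig[601:] += 2.0                         # abrupt step, a stand-in transient
m_with = detail_maxima(sig)
m_without = detail_maxima(smooth)
```

The step inflates the level-1 detail maximum by orders of magnitude relative to the smooth signal, which is the mechanism the abstract describes for localizing transients in the timescale plane.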
Wang, Cai-Lin; Riedel, Richard A.
2016-01-14
A 6Li-glass scintillator (GS20) based neutron Anger camera was developed for time-of-flight single-crystal diffraction instruments at SNS. Traditional pulse-height analysis (PHA) for neutron-gamma discrimination (NGD) resulted in a neutron-gamma efficiency ratio (defined as the NGD ratio) on the order of 10^4. The NGD ratios of Anger cameras need to be improved for broader applications including neutron reflectometers. For this purpose, five digital signal analysis methods of individual waveforms from PMTs were proposed using: (i) pulse-amplitude histogram; (ii) power spectrum analysis combined with the maximum pulse amplitude; (iii) two event parameters (a1, b0) obtained from a Wiener filter; (iv) an effective amplitude (m) obtained from an adaptive least-mean-square (LMS) filter; and (v) a cross-correlation (CC) coefficient between an individual waveform and a reference. The NGD ratios can be 1-10^2 times those from the traditional PHA method. A brighter scintillator, GS2, has a better NGD ratio than GS20, but lower neutron detection efficiency. The ultimate NGD ratio is related to the ambient, high-energy background events. Moreover, our results indicate the NGD capability of neutron Anger cameras can be improved using digital signal analysis methods and brighter neutron scintillators.
Evaluating the Performance of the IEEE Standard 1366 Method for Identifying Major Event Days
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eto, Joseph H.; LaCommare, Kristina Hamachi; Sohn, Michael D.
IEEE Standard 1366 offers a method for segmenting reliability performance data to isolate the effects of major events from the underlying year-to-year trends in reliability. Recent analysis by the IEEE Distribution Reliability Working Group (DRWG) has found that reliability performance of some utilities differs from the expectations that helped guide the development of the Standard 1366 method. This paper proposes quantitative metrics to evaluate the performance of the Standard 1366 method in identifying major events and in reducing year-to-year variability in utility reliability. The metrics are applied to a large sample of utility-reported reliability data to assess performance of the method with alternative specifications that have been considered by the DRWG. We find that none of the alternatives perform uniformly 'better' than the current Standard 1366 method. That is, none of the modifications uniformly lowers the year-to-year variability in System Average Interruption Duration Index without major events. Instead, for any given alternative, while it may lower the value of this metric for some utilities, it also increases it for other utilities (sometimes dramatically). Thus, we illustrate some of the trade-offs that must be considered in using the Standard 1366 method and highlight the usefulness of the metrics we have proposed in conducting these evaluations.
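For context, the segmentation rule under evaluation is the Standard 1366 "2.5 beta" method: daily SAIDI values are log-transformed, and any day exceeding T_MED = exp(alpha + 2.5*beta) is classified as a major event day. A minimal sketch, with a made-up daily SAIDI history rather than utility-reported data:

```python
import math

def major_event_threshold(daily_saidi):
    """IEEE Std 1366 '2.5 beta' method: fit a log-normal to the daily
    SAIDI history and flag days above T_MED = exp(alpha + 2.5*beta)."""
    logs = [math.log(v) for v in daily_saidi if v > 0]   # zero days excluded
    n = len(logs)
    alpha = sum(logs) / n                                # mean of ln(SAIDI)
    beta = math.sqrt(sum((x - alpha) ** 2 for x in logs) / (n - 1))
    return math.exp(alpha + 2.5 * beta)

# Made-up daily SAIDI history: mostly ordinary days plus three storm days
history = [1.0] * 300 + [1.5] * 50 + [2.0] * 10 + [40.0] * 3
t_med = major_event_threshold(history)
meds = [v for v in history if v > t_med]                 # major event days
```

Because beta is a standard deviation of log-SAIDI, a few very large storm days raise the threshold itself, which is one source of the utility-to-utility variability the paper's metrics are designed to expose.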
Regression analysis of mixed recurrent-event and panel-count data.
Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L
2014-07-01
In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Supanich, M; Chu, J; Wehmeyer, A
2014-06-15
Purpose: This work offers as a teaching example a reported high dose fluoroscopy case and the workflow the institution followed to self-report a radiation overdose sentinel event to the Joint Commission. Methods: Following the completion of a clinical case in a hybrid OR room with a reported air kerma of >18 Gy at the Interventional Reference Point (IRP), the physicians involved in the case referred the study to the institution's Radiation Safety Committee (RSC) for review. The RSC assigned a Diagnostic Medical Physicist (DMP) to estimate the patient's Peak Skin Dose (PSD) and analyze the case. Following the DMP's analysis and estimate of a PSD of >15 Gy, the institution's adverse event committee was convened to discuss the case and to self-report the case as a radiation overdose sentinel event to the Joint Commission. The committee assigned a subgroup to perform the root cause analysis and develop institutional responses to the event. Results: The self-reporting of the sentinel event and the associated root cause analysis resulted in several institutional action items designed to improve process and safety. A formal reporting and analysis mechanism was adopted to review fluoroscopy cases with air kerma greater than 6 Gy at the IRP. An improved and formalized radiation safety training program for physicians using fluoroscopy equipment was implemented. Additionally, efforts already under way to monitor radiation exposure in the Radiology department were expanded to include all fluoroscopy equipment capable of automated dose reporting. Conclusion: The adverse event review process and the root cause analysis following the self-reporting of the sentinel event resulted in policies and procedures that are expected to improve the quality and safe usage of fluoroscopy throughout the institution.
NASA Astrophysics Data System (ADS)
Gnyp, Andriy
2009-06-01
Based on the results of applying correlation analysis to records of the 2005 Mukacheve group of recurrent events and their subsequent relocation relative to the reference event of 7 July 2005, a conclusion has been drawn that all the events most likely occurred on the same rupture plane. Station terms have been estimated for seismic stations of the Transcarpathians, accounting for variation of seismic velocities beneath their locations as compared to the travel time tables used in the study. From a methodological standpoint, the potential and usefulness of correlation analysis of seismic records for a more detailed study of seismic processes, tectonics, and geodynamics of the Carpathian region have been demonstrated.
Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin
2016-10-01
Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
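A simplified sketch of the jerk-peak idea described above: differentiate the acceleration signal, then pick prominent jerk peaks. The threshold rule, refractory period, and synthetic accelerometer trace are assumptions for illustration, not the authors' time-frequency algorithm.

```python
import numpy as np

def detect_gait_events(accel, fs, min_interval=0.5):
    """Jerk-based event detection sketch: differentiate the acceleration
    signal, then pick prominent jerk peaks with a simple threshold and a
    refractory period (at most one event per min_interval seconds)."""
    jerk = np.gradient(accel) * fs                 # time derivative of accel
    thresh = jerk.mean() + 2.0 * jerk.std()        # assumed peak-height rule
    refractory = int(min_interval * fs)
    events, last = [], -refractory
    for i in range(1, len(jerk) - 1):
        is_peak = (jerk[i] > thresh
                   and jerk[i] >= jerk[i - 1]
                   and jerk[i] >= jerk[i + 1])
        if is_peak and i - last >= refractory:
            events.append(i)
            last = i
    return events

fs = 100.0
t = np.arange(0, 10, 1 / fs)
accel = 0.2 * np.sin(2 * np.pi * 1.0 * t)          # smooth gait oscillation
impulse_idx = (np.arange(10) * fs + 25).astype(int)
accel[impulse_idx] += 3.0                          # sharp heel-strike-like spikes
events = detect_gait_events(accel, fs)
```

The refractory period is the "peak heuristic" in miniature: it encodes the physiological fact that two heel strikes of the same kind cannot occur within a fraction of a stride.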
Murray-Davis, Beth; McDonald, Helen; Cross-Sudworth, Fiona; Ahmed, Rashid; Simioni, Julia; Dore, Sharon; Marrin, Michael; DeSantis, Judy; Leyland, Nicholas; Gardosi, Jason; Hutton, Eileen; McDonald, Sarah
2015-08-01
Adverse events occur in up to 10% of obstetric cases, and up to one half of these could be prevented. Case reviews and root cause analysis using a structured tool may help health care providers to learn from adverse events and to identify trends and recurring systems issues. We sought to establish the reliability of a root cause analysis computer application called Standardized Clinical Outcome Review (SCOR). We designed a mixed methods study to evaluate the effectiveness of the tool. We conducted qualitative content analysis of five charts reviewed by both the traditional obstetric quality assurance methods and the SCOR tool. We also determined inter-rater reliability by having four health care providers review the same five cases using the SCOR tool. The comparative qualitative review revealed that the traditional quality assurance case review process used inconsistent language and made serious, personalized recommendations for those involved in the case. In contrast, the SCOR review provided a consistent format for recommendations, a list of action points, and highlighted systems issues. The mean percentage agreement between the four reviewers for the five cases was 75%. The different health care providers completed data entry and assessment of the case in a similar way. Missing data from the chart and poor wording of questions were identified as issues affecting percentage agreement. The SCOR tool provides a standardized, objective, obstetric-specific tool for root cause analysis that may improve identification of risk factors and dissemination of action plans to prevent future events.
Ontology-supported research on vaccine efficacy, safety and integrative biological networks.
He, Yongqun
2014-07-01
While vaccine efficacy and safety research has dramatically progressed with the methods of in silico prediction and data mining, many challenges still exist. A formal ontology is a human- and computer-interpretable set of terms and relations that represent entities in a specific domain and how these terms relate to each other. Several community-based ontologies (including Vaccine Ontology, Ontology of Adverse Events and Ontology of Vaccine Adverse Events) have been developed to support vaccine and adverse event representation, classification, data integration, literature mining of host-vaccine interaction networks, and analysis of vaccine adverse events. The author further proposes minimal vaccine information standards and their ontology representations, ontology-based linked open vaccine data and meta-analysis, an integrative One Network ('OneNet') Theory of Life, and ontology-based approaches to study and apply the OneNet theory. In the Big Data era, these proposed strategies provide a novel framework for advanced data integration and analysis of fundamental biological networks including vaccine immune mechanisms.
Probabilistic analysis on the failure of reactivity control for the PWR
NASA Astrophysics Data System (ADS)
Sony Tjahyani, D. T.; Deswandri; Sunaryo, G. R.
2018-02-01
The fundamental safety functions of a power reactor are to control reactivity, to remove heat from the reactor, and to confine radioactive material. Safety analysis is used to ensure that each function is fulfilled by the design, and is done by deterministic and probabilistic methods. The analysis of reactivity control is important because its failure affects the other fundamental safety functions. The purpose of this research is to determine the failure probability of reactivity control and the contributors to its failure for a PWR design. The analysis is carried out by determining the intermediate events that cause the failure of reactivity control. The basic events are then determined by a deductive method using fault tree analysis. The AP1000 is used as the object of the research. The probability data for component failure and human error used in the analysis are collected from IAEA, Westinghouse, NRC, and other published documents. The results show that there are six intermediate events that can cause the failure of reactivity control: uncontrolled rod bank withdrawal at low power or at full power, malfunction of boron dilution, misalignment of control rod withdrawal, improper positioning of a fuel assembly, and ejection of a control rod. The failure probability of reactivity control is 1.49E-03 per year. The failure causes affected by human factors are boron dilution, misalignment of control rod withdrawal, and improper positioning of a fuel assembly. Based on the assessment, it is concluded that the failure probability of reactivity control for the PWR is still within the IAEA criteria.
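The top-event quantification in a fault tree of this shape reduces to combining event probabilities through the tree's gates; a minimal OR-gate sketch under an independence assumption is below. The six event names mirror the abstract, but the probabilities are illustrative placeholders, not the AP1000 values from the study.

```python
# Sketch: failure of reactivity control modeled as an OR gate over
# independent intermediate events. Probabilities are illustrative only.
intermediate_events = {
    "uncontrolled rod bank withdrawal at low power": 4.0e-4,
    "uncontrolled rod bank withdrawal at full power": 4.0e-4,
    "malfunction of boron dilution": 3.0e-4,
    "misalignment of control rod withdrawal": 2.0e-4,
    "improper position of fuel assembly": 5.0e-5,
    "ejection of control rod": 4.0e-5,
}

def or_gate(probabilities):
    """P(top) = 1 - prod(1 - p_i) for independent input events."""
    q = 1.0
    for p in probabilities:
        q *= 1.0 - p
    return 1.0 - q

p_top = or_gate(intermediate_events.values())
```

For rare events the OR-gate result is close to the simple sum of the inputs (the cross terms are negligible), which is why per-year frequencies of intermediate events can be read almost directly as contributions to the top event.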
Wide coverage biomedical event extraction using multiple partially overlapping corpora
2013-01-01
Background Biomedical events are key to understanding physiological processes and disease, and wide coverage extraction is required for comprehensive automatic analysis of statements describing biomedical systems in the literature. In turn, the training and evaluation of extraction methods requires manually annotated corpora. However, as manual annotation is time-consuming and expensive, any single event-annotated corpus can only cover a limited number of semantic types. Although combined use of several such corpora could potentially allow an extraction system to achieve broad semantic coverage, there has been little research into learning from multiple corpora with partially overlapping semantic annotation scopes. Results We propose a method for learning from multiple corpora with partial semantic annotation overlap, and implement this method to improve our existing event extraction system, EventMine. An evaluation using seven event annotated corpora, including 65 event types in total, shows that learning from overlapping corpora can produce a single, corpus-independent, wide coverage extraction system that outperforms systems trained on single corpora and exceeds previously reported results on two established event extraction tasks from the BioNLP Shared Task 2011. Conclusions The proposed method allows the training of a wide-coverage, state-of-the-art event extraction system from multiple corpora with partial semantic annotation overlap. The resulting single model makes broad-coverage extraction straightforward in practice by removing the need to either select a subset of compatible corpora or semantic types, or to merge results from several models trained on different individual corpora. Multi-corpus learning also allows annotation efforts to focus on covering additional semantic types, rather than aiming for exhaustive coverage in any single annotation effort, or extending the coverage of semantic types annotated in existing corpora. PMID:23731785
Contamination in the MACHO data set and the puzzle of Large Magellanic Cloud microlensing
NASA Astrophysics Data System (ADS)
Griest, Kim; Thomas, Christian L.
2005-05-01
In a recent series of three papers, Belokurov, Evans & Le Du and Evans & Belokurov reanalysed the MACHO collaboration data and gave alternative sets of microlensing events and an alternative optical depth to microlensing towards the Large Magellanic Cloud (LMC). Although these authors examined less than 0.2 per cent of the data, they reported that by using a neural net program they had reliably selected a better (and smaller) set of microlensing candidates. Estimating the optical depth from this smaller set, they claimed that the MACHO collaboration overestimated the optical depth by a significant factor and that the MACHO microlensing experiment is consistent with lensing by known stars in the Milky Way and LMC. As we show below, the analysis by these authors contains several errors, and as a result their conclusions are incorrect. Their efficiency analysis is in error, and since they did not search through the entire MACHO data set, they do not know how many microlensing events their neural net would find in the data nor what optical depth their method would give. Examination of their selected events suggests that their method misses low signal-to-noise ratio events and thus would have lower efficiency than the MACHO selection criteria. In addition, their method is likely to give many more false positives (non-lensing events identified as lensing). Both effects would increase their estimated optical depth. Finally, we note that the EROS discovery that LMC event 23 is a variable star reduces the MACHO collaboration estimates of optical depth and the Macho halo fraction by around 8 per cent, and does open the question of additional contamination.
NASA Astrophysics Data System (ADS)
Costa, Justin A.
The translocation of nucleic acid polymers across cell membranes is a fundamental requirement for complex life and has greatly contributed to genomic molecular evolution. The diversity of pathways that have evolved to transport DNA and RNA across membranes include protein receptors, active and passive transporters, endocytic and pinocytic processes, and various types of nucleic acid conducting channels known as nanopores. We have developed a series of experimental techniques, collectively known as "
Poisson Regression Analysis of Illness and Injury Surveillance Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frome E.L., Watkins J.P., Ellis E.D.
2012-12-12
The Department of Energy (DOE) uses illness and injury surveillance to monitor morbidity and assess the overall health of the work force. Data collected from each participating site include health events and a roster file with demographic information. The source data files are maintained in a relational data base, and are used to obtain stratified tables of health event counts and person time at risk that serve as the starting point for Poisson regression analysis. The explanatory variables that define these tables are age, gender, occupational group, and time. Typical response variables of interest are the number of absences due to illness or injury, i.e., the response variable is a count. Poisson regression methods are used to describe the effect of the explanatory variables on the health event rates using a log-linear main effects model. Results of fitting the main effects model are summarized in a tabular and graphical form and interpretation of model parameters is provided. An analysis of deviance table is used to evaluate the importance of each of the explanatory variables on the event rate of interest and to determine if interaction terms should be considered in the analysis. Although Poisson regression methods are widely used in the analysis of count data, there are situations in which over-dispersion occurs. This could be due to lack-of-fit of the regression model, extra-Poisson variation, or both. A score test statistic and regression diagnostics are used to identify over-dispersion. A quasi-likelihood method of moments procedure is used to evaluate and adjust for extra-Poisson variation when necessary. Two examples are presented using respiratory disease absence rates at two DOE sites to illustrate the methods and interpretation of the results. In the first example the Poisson main effects model is adequate.
In the second example, the score test indicates considerable over-dispersion, and a more detailed analysis attributes the over-dispersion to extra-Poisson variation. The R open source software environment for statistical computing and graphics is used for analysis. Additional details about R and the data that were used in this report are provided in an Appendix. Information on how to obtain R and utility functions that can be used to duplicate results in this report is also provided.
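A minimal sketch of the modeling step described above: a Poisson log-linear model with a person-time offset fit by iteratively reweighted least squares, plus the Pearson dispersion statistic commonly used to screen for extra-Poisson variation. The simulated data and single covariate are illustrative assumptions; the report itself fits its models in R.

```python
import numpy as np

def poisson_glm(X, y, offset, iters=30):
    """Poisson log-linear model fit by IRLS (Newton-Raphson).
    offset = log person-time; returns coefficients and the Pearson
    dispersion estimate (values well above 1 suggest over-dispersion)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta + offset)      # fitted event counts
        z = X @ beta + (y - mu) / mu        # working response
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    mu = np.exp(X @ beta + offset)
    dispersion = np.sum((y - mu) ** 2 / mu) / (len(y) - X.shape[1])
    return beta, dispersion

rng = np.random.default_rng(2)
n = 1000
age = rng.uniform(20, 60, n)
pt = rng.uniform(0.5, 2.0, n)                    # person-years at risk
true_rate = np.exp(-1.5 + 0.03 * (age - 40.0))   # absences per person-year
y = rng.poisson(true_rate * pt)
X = np.column_stack([np.ones(n), age - 40.0])    # centered covariate
beta, disp = poisson_glm(X, y, np.log(pt))
```

With truly Poisson counts the dispersion estimate sits near 1; substantially larger values are the signal that a quasi-likelihood adjustment of the standard errors, as in the report, is warranted.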
Method of Testing and Predicting Failures of Electronic Mechanical Systems
NASA Technical Reports Server (NTRS)
Iverson, David L.; Patterson-Hine, Frances A.
1996-01-01
A method employing a knowledge base of human expertise comprising a reliability model analysis implemented for diagnostic routines is disclosed. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting system operation. The reliability analysis contains a wealth of human expertise information that is used to build automatic diagnostic routines and provides a knowledge base that can be used to solve other artificial intelligence problems.
Modeling Eye Gaze Patterns in Clinician-Patient Interaction with Lag Sequential Analysis
Montague, E; Xu, J; Asan, O; Chen, P; Chewning, B; Barrett, B
2011-01-01
Objective The aim of this study was to examine whether lag-sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multi-user health care settings where trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Background Nonverbal communication patterns are important aspects of clinician-patient interactions and may impact patient outcomes. Method Eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag-sequential method to identify significant behavior sequences. Lag-sequential analysis included both event-based lag and time-based lag. Results Results from event-based lag analysis showed that the patients’ gaze followed that of clinicians, while clinicians did not follow patients. Time-based sequential analysis showed that responses from the patient usually occurred within two seconds after the initial behavior of the clinician. Conclusion Our data suggest that the clinician’s gaze significantly affects the medical encounter but not the converse. Application Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs. PMID:22046723
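As a rough illustration of the event-based lag-sequential logic, the sketch below counts lag-1 transitions on a hypothetical coded gaze stream and computes an Allison-Liker-style z-score for whether one behavior follows another more often than chance. The codes and data are invented for illustration and are not drawn from the study:

```python
from collections import Counter

# Hypothetical gaze codes: C = clinician looks at patient,
# P = patient looks at clinician, O = other (chart, screen, ...)
stream = list("CPCPOCPCPPOCPCPOCPCP")

n = len(stream) - 1                           # number of lag-1 pairs
pairs = Counter(zip(stream[:-1], stream[1:])) # observed transition counts
p_first = Counter(stream[:-1])                # marginal of pair-initial codes
p_second = Counter(stream[1:])                # marginal of pair-final codes

def lag1_z(a, b):
    """Allison-Liker z-score: is the observed a->b count above chance?"""
    pa, pb = p_first[a] / n, p_second[b] / n
    expected = n * pa * pb
    variance = n * pa * pb * (1 - pa) * (1 - pb)
    return (pairs[(a, b)] - expected) / variance ** 0.5

z_cp = lag1_z("C", "P")   # does patient gaze tend to follow clinician gaze?
```

A large positive z for a given ordered pair is the kind of evidence behind the finding that patients' gaze followed clinicians'.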
Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies
NASA Astrophysics Data System (ADS)
Brune, Ryan Carl
Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved with respect to output pressure events in both magnitude and distribution. In order to address this need, a novel pressure measurement technique, the Profile Indentation Pressure Evaluation (PIPE) method, has been developed that systematically analyzes indentation patterns created with impulse events. Correlation with quasi-static test data and use of software-assisted analysis techniques allows for colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided the introduction of a design method for electromagnetic path actuator systems, where key geometrical variables are considered using a newly developed analysis method, which is called the Path Actuator Proximal Array (PAPA) pressure model. This model considers key current distribution and proximity effects and interprets generated pressure by considering the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow for pressure requirements to be calculated for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, the effect of geometry is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram.
Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Overall, these studies comprehensively explore the effects of geometrical parameters on magnitude and distribution of impulse manufacturing generated pressure, establishing key guidelines and models for continued development and implementation in commercial applications.
Extended time-interval analysis
NASA Astrophysics Data System (ADS)
Fynbo, H. O. U.; Riisager, K.
2014-01-01
Several extensions of the half-life analysis method recently suggested by Horvat and Hardy are put forward. Goodness-of-fit testing is included, and the method is extended to cases where more information is available for each decay event, which also allows applications to, e.g., γ-decay data. The results are tested with Monte Carlo simulations and are applied to the decays of 64Cu and 56Mn.
Risk Analysis of Return Support Material on Gas Compressor Platform Project
NASA Astrophysics Data System (ADS)
Silvianita; Aulia, B. U.; Khakim, M. L. N.; Rosyid, Daniel M.
2017-07-01
Fixed platform projects are carried out not by a single contractor but by two or more contractors. Cooperation in the construction of fixed platforms often does not go according to plan, due to several factors. Good synergy between the contractors is needed to avoid miscommunication that may cause problems on the project. One example concerns the support material (sea fastening, skid shoe, and shipping support) used when shipping a jacket structure to its operating site, which often is not returned to the contractor. A systematic method is needed to address this support material problem. This paper analyses the causes and effects of unreturned support material on a gas compressor platform project, using Fault Tree Analysis (FTA) and Event Tree Analysis (ETA). From the fault tree analysis, the probability of the top event is 0.7783. From the event tree analysis diagram, the contractors lose between Rp 350,000,000 and Rp 10,000,000,000.
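The FTA/ETA machinery itself is simple to sketch. The gate probabilities, tree structure, and branch weights below are invented placeholders, since the paper's basic-event data are not reproduced in the abstract:

```python
def p_or(*ps):
    """OR gate: union of independent basic events, 1 - prod(1 - p_i)."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

def p_and(*ps):
    """AND gate: intersection of independent basic events, prod(p_i)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Hypothetical fault tree for "support material not returned":
# either no contractual return clause, or (poor coordination AND
# material scrapped on site) -- probabilities are illustrative only.
top = p_or(0.5, p_and(0.4, 0.3))   # top-event probability

# Event tree: given the top event, branch on partial vs total loss (Rp),
# weighting each outcome by an assumed branch probability.
expected_loss = top * (0.7 * 350e6 + 0.3 * 10e9)
```

With real basic-event probabilities, the same two gate functions reproduce a top-event figure like the paper's 0.7783, and the event tree converts it into an expected monetary consequence.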
NASA Astrophysics Data System (ADS)
Šilhán, Karel
2016-01-01
Knowledge of past landslide activity is crucial for understanding landslide behaviour and for modelling potential future landslide occurrence. Dendrogeomorphic approaches represent the most precise methods of landslide dating (where trees annually create tree-rings in the timescale of up to several hundred years). Despite the advantages of these methods, many open questions remain. One of the less researched uncertainties, and the focus of this study, is the impact of two common methods of geomorphic signal extraction on the spatial and temporal results of landslide reconstruction. In total, 93 Norway spruce (Picea abies (L.) Karst.) trees were sampled at one landslide location dominated by block-type movements in the forefield of the Orlické hory Mts., Bohemian Massif. Landslide signals were examined by the classical subjective method based on reaction (compression) wood analysis and by a numerical method based on eccentric growth analysis. The chronology of landslide movements obtained by the mathematical method resulted in twice the number of events detected compared to the subjective method. This finding indicates that eccentric growth is a more accurate indicator for landslide movements than the classical analysis of reaction wood. The reconstructed spatial activity of landslide movements shows a similar distribution of recurrence intervals (Ri) for both methods. The differences (maximally 30% of the total Ri ranges) in results obtained by both methods may be caused by differences in the ability of trees to react to tilting of their stems by a specific growth response (reaction wood formation or eccentric growth). Finally, the ability of trees to record tilting events (by both growth responses) in their tree-ring series was analysed for different decades of tree life. The highest sensitivity to external tilting events occurred at tree ages from 70 to 80 years for reaction wood formation and from 80 to 90 years for eccentric growth response. 
This means that the ability of P. abies to record geomorphic signals varies not only with the type of growth response but also with tree age.
NASA Astrophysics Data System (ADS)
Nasution, A. H.; Rachmawan, Y. A.
2018-04-01
Fashion trends around the world change extremely fast. Fashion has become part of people’s lifestyle worldwide, and fashion week events in several regions can serve as a measure of current fashion trends. Jakarta Fashion Week (JFW), a fashion week event in Indonesia, aims to show fashion trends to people who want to improve their fashion style. People will join events that involve them, and will then come to those events again and again. An annual, continuous event is important for creating loyalty among the people involved, supporting the organizer in staging the next event, saving a large share of the marketing budget, and creating a higher-quality event. This study aims to determine the effect of the five brand personality dimensions on event involvement and loyalty at Jakarta Fashion Week (JFW). The study uses a quantitative confirmatory method with Structural Equation Modeling (SEM) analysis. The sample comprises 150 respondents who participated in Jakarta Fashion Week 2017. Results show a significant effect of the five brand personality dimensions on three dimensions of event involvement and on loyalty. However, one dimension of event involvement, personal self-expression, had no effect on loyalty.
Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain
NASA Astrophysics Data System (ADS)
Krauß, Thomas; Fischer, Peter
2016-08-01
In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News coverage shows that the majority of occurring natural hazards are flood events, so many flood prediction systems have already been developed. However, most of these existing systems for deriving areas endangered by flooding events are based only on horizontal and vertical distances to existing rivers and lakes; typically, such systems do not take into account dangers arising directly from heavy rain events. In a study we conducted together with a German insurance company, a new approach for detecting areas endangered by heavy rain was shown to give a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for classification of digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and analyze the results using the available insurance data.
Classification of speech dysfluencies using LPC based parameterization techniques.
Hariharan, M; Chee, Lim Sin; Ai, Ooi Chia; Yaacob, Sazali
2012-06-01
The goal of this paper is to discuss and compare three feature extraction methods: Linear Predictive Coefficients (LPC), Linear Prediction Cepstral Coefficients (LPCC) and Weighted Linear Prediction Cepstral Coefficients (WLPCC) for recognizing stuttered events. Speech samples from the University College London Archive of Stuttered Speech (UCLASS) were used for our analysis. The stuttered events were identified through manual segmentation and were used for feature extraction. Two simple classifiers, namely k-nearest neighbour (kNN) and Linear Discriminant Analysis (LDA), were employed for speech dysfluency classification. A conventional validation method was used for testing the reliability of the classifier results. The effects of different frame lengths, overlap percentages, the value of the coefficient in a first-order pre-emphasizer, and different prediction orders p were discussed. The speech dysfluency classification accuracy was found to improve when statistical normalization was applied before feature extraction. The experimental investigation showed that LPC, LPCC and WLPCC features can be used for identifying stuttered events, with WLPCC features slightly outperforming LPCC and LPC features.
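A minimal sketch of the LPC-to-cepstrum pipeline follows: autocorrelation, the Levinson-Durbin recursion, and the standard LPC-to-LPCC conversion, with a first-order pre-emphasis step shown for completeness. The signal, orders, and pre-emphasis coefficient are arbitrary choices for illustration, not the values tuned in the paper:

```python
import numpy as np

def lpc(frame, order):
    """LPC coefficients a[0..order] via Levinson-Durbin on autocorrelations."""
    r = np.correlate(frame, frame, "full")[len(frame) - 1:len(frame) + order]
    a = np.zeros(order + 1)
    a[0], err = 1.0, r[0]
    for i in range(1, order + 1):
        k = -(r[i] + a[1:i] @ r[i - 1:0:-1]) / err   # reflection coefficient
        a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1]   # order-update of a
        err *= 1.0 - k * k                           # prediction error power
    return a, err

def lpcc(a, n_ceps):
    """Cepstral coefficients from LPC via the standard recursion."""
    c = np.zeros(n_ceps + 1)
    for m in range(1, n_ceps + 1):
        acc = a[m] if m < len(a) else 0.0
        for k in range(1, m):
            if m - k < len(a):
                acc += (k / m) * c[k] * a[m - k]
        c[m] = -acc
    return c[1:]

# Synthetic 'frame': an AR(1) signal with coefficient 0.9, so an order-2
# LPC fit should recover a[1] close to -0.9.
rng = np.random.default_rng(1)
x = np.zeros(4096)
for t in range(1, len(x)):
    x[t] = 0.9 * x[t - 1] + rng.normal()

# First-order pre-emphasis (coefficient 0.97), as typically applied before
# framing; not used in the fit below, shown for completeness.
emph = np.append(x[0], x[1:] - 0.97 * x[:-1])

a, err = lpc(x, order=2)
feats = lpcc(a, n_ceps=12)
```

Weighted LPCC (WLPCC) would additionally apply a lifter (weighting window) to the cepstral coefficients before classification.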
2011-01-01
Background Readthrough fusions across adjacent genes in the genome, or transcription-induced chimeras (TICs), have been estimated using expressed sequence tag (EST) libraries to involve 4-6% of all genes. Deep transcriptional sequencing (RNA-Seq) now makes it possible to study the occurrence and expression levels of TICs in individual samples across the genome. Methods We performed single-end RNA-Seq on three human prostate adenocarcinoma samples and their corresponding normal tissues, as well as brain and universal reference samples. We developed two bioinformatics methods to specifically identify TIC events: a targeted alignment method using artificial exon-exon junctions within 200,000 bp from adjacent genes, and genomic alignment allowing splicing within individual reads. We performed further experimental verification and characterization of selected TIC and fusion events using quantitative RT-PCR and comparative genomic hybridization microarrays. Results Targeted alignment against artificial exon-exon junctions yielded 339 distinct TIC events, including 32 gene pairs with multiple isoforms. The false discovery rate was estimated to be 1.5%. Spliced alignment to the genome was less sensitive, finding only 18% of those found by targeted alignment in 33-nt reads and 59% of those in 50-nt reads. However, spliced alignment revealed 30 cases of TICs with intervening exons, in addition to distant inversions, scrambled genes, and translocations. Our findings increase the catalog of observed TIC gene pairs by 66%. We verified 6 of 6 predicted TICs in all prostate samples, and 2 of 5 predicted novel distant gene fusions, both private events among 54 prostate tumor samples tested. 
Expression of TICs correlates with that of the upstream gene, which can explain the prostate-specific pattern of some TIC events and the restriction of the SLC45A3-ELK4 e4-e2 TIC to ERG-negative prostate samples, as confirmed in 20 matched prostate tumor and normal samples and 9 lung cancer cell lines. Conclusions Deep transcriptional sequencing and analysis with targeted and spliced alignment methods can effectively identify TIC events across the genome in individual tissues. Prostate and reference samples exhibit a wide range of TIC events, involving more genes than estimated previously using ESTs. Tissue specificity of TIC events is correlated with expression patterns of the upstream gene. Some TIC events, such as MSMB-NCOA4, may play functional roles in cancer. PMID:21261984
Kuss, O
2015-03-30
Meta-analyses with rare events, especially those that include studies with no event in one ('single-zero') or even both ('double-zero') treatment arms, are still a statistical challenge. In the case of double-zero studies, researchers in general delete these studies or use continuity corrections to avoid them. A number of arguments against both options have been given, and statistical methods that use the information from double-zero studies without using continuity corrections have been proposed. In this paper, we collect them and compare them by simulation. This simulation study tries to mirror real-life situations as completely as possible by deriving true underlying parameters from empirical data on actually performed meta-analyses. It is shown that for each of the commonly encountered effect estimators, valid statistical methods are available that use the information from double-zero studies without using continuity corrections. Interestingly, all of them are truly random effects models, and so even the current standard method for very sparse data recommended by the Cochrane collaboration, the Yusuf-Peto odds ratio, can be improved on. For actual analysis, we recommend using beta-binomial regression methods to arrive at summary estimates for the odds ratio, the relative risk, or the risk difference. Methods that ignore information from double-zero studies or use continuity corrections should no longer be used. We illustrate the situation with an example where the original analysis ignores 35 double-zero studies, and a superior analysis discovers a clinically relevant advantage of off-pump surgery in coronary artery bypass grafting. Copyright © 2014 John Wiley & Sons, Ltd.
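The beta-binomial building block behind the recommended approach is easy to write down, and it makes concrete why a double-zero arm is informative rather than ignorable. The sketch below (all parameter values invented) implements the beta-binomial log-pmf and shows that a zero count in 50 patients shifts the likelihood toward low-event-rate parameters:

```python
from math import lgamma, exp

def log_betabinom(k, n, a, b):
    """log P(K = k) for a Beta-Binomial(n, a, b) distribution."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + lgamma(k + a) + lgamma(n - k + b) - lgamma(n + a + b)
            + lgamma(a + b) - lgamma(a) - lgamma(b))

# A double-zero arm (0 events in 50 patients) still moves the likelihood:
ll_low = log_betabinom(0, 50, 0.5, 20.0)   # low underlying event rate
ll_high = log_betabinom(0, 50, 2.0, 5.0)   # higher underlying event rate
# ll_low > ll_high: zero counts favor low-rate parameter values, so a
# double-zero study contributes information without any continuity correction.
```

A full beta-binomial regression would maximize the sum of such log-pmf terms over all studies and arms, with treatment effect entering through the arm-specific parameters.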
ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains
Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz
2016-01-01
With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
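The core intersection-matrix idea is easy to sketch: with a binary neurons-by-bins matrix, the overlap between every pair of time bins is a single matrix product, and a repeated SSE shows up as an off-diagonal band. The toy data below are invented, not the simulated test sets used in the paper:

```python
import numpy as np

# Binary spike matrix: rows = neurons, cols = time bins. A three-step
# sequence of synchronous events occurs in bins 1-3 and repeats in 6-8.
B = np.zeros((6, 10), dtype=int)
B[[0, 1], 1] = 1; B[[2, 3], 2] = 1; B[[4, 5], 3] = 1   # first SSE
B[[0, 1], 6] = 1; B[[2, 3], 7] = 1; B[[4, 5], 8] = 1   # repetition

# Intersection matrix: entry (i, j) = number of neurons active in both
# bin i and bin j. The repeated SSE appears as the diagonal structure
# running through entries (1,6), (2,7), (3,8).
I = B.T @ B
```

ASSET's contribution is to replace visual inspection of such diagonal structures with statistical assessment of each entry, clustering of significant entries, and extraction of the participating neurons.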
Posner, Kelly; Oquendo, Maria A.; Gould, Madelyn; Stanley, Barbara; Davies, Mark
2013-01-01
Objective To evaluate the link between antidepressants and suicidal behavior and ideation (suicidality) in youth, adverse events from pediatric clinical trials were classified in order to identify suicidal events. The authors describe the Columbia Classification Algorithm for Suicide Assessment (C-CASA), a standardized suicidal rating system that provided data for the pediatric suicidal risk analysis of antidepressants conducted by the Food and Drug Administration (FDA). Method Adverse events (N=427) from 25 pediatric antidepressant clinical trials were systematically identified by pharmaceutical companies. Randomly assigned adverse events were evaluated by three of nine independent expert suicidologists using the Columbia classification algorithm. Reliability of the C-CASA ratings and agreement with pharmaceutical company classification were estimated. Results Twenty-six new, possibly suicidal events (behavior and ideation) that were not originally identified by pharmaceutical companies were identified in the C-CASA, and 12 events originally labeled as suicidal by pharmaceutical companies were eliminated, which resulted in a total of 38 discrepant ratings. For the specific label of “suicide attempt,” a relatively low level of agreement was observed between the C-CASA and pharmaceutical company ratings, with the C-CASA reporting a 50% reduction in ratings. Thus, although the C-CASA resulted in the identification of more suicidal events overall, fewer events were classified as suicide attempts. Additionally, the C-CASA ratings were highly reliable (intraclass correlation coefficient [ICC]=0.89). Conclusions Utilizing a methodical, anchored approach to categorizing suicidality provides an accurate and comprehensive identification of suicidal events. The FDA’s audit of the C-CASA demonstrated excellent transportability of this approach.
The Columbia algorithm was used to classify suicidal adverse events in the recent FDA adult antidepressant safety analyses and has also been mandated to be applied to all anticonvulsant trials and other centrally acting agents and nonpsychotropic drugs. PMID:17606655
Lee, Chi Hyun; Luo, Xianghua; Huang, Chiung-Yu; DeFor, Todd E; Brunstein, Claudio G; Weisdorf, Daniel J
2016-06-01
Infection is one of the most common complications after hematopoietic cell transplantation. Many patients experience infectious complications repeatedly after transplant. Existing statistical methods for recurrent gap time data typically assume that patients are enrolled due to the occurrence of an event of interest, and subsequently experience recurrent events of the same type; moreover, for one-sample estimation, the gap times between consecutive events are usually assumed to be identically distributed. Applying these methods to analyze the post-transplant infection data will inevitably lead to incorrect inferential results because the time from transplant to the first infection has a different biological meaning than the gap times between consecutive recurrent infections. Some unbiased yet inefficient methods include univariate survival analysis methods based on data from the first infection or bivariate serial event data methods based on the first and second infections. In this article, we propose a nonparametric estimator of the joint distribution of time from transplant to the first infection and the gap times between consecutive infections. The proposed estimator takes into account the potentially different distributions of the two types of gap times and better uses the recurrent infection data. Asymptotic properties of the proposed estimators are established. © 2015, The International Biometric Society.
Lee, Chi Hyun; Huang, Chiung-Yu; DeFor, Todd E.; Brunstein, Claudio G.; Weisdorf, Daniel J.
2015-01-01
Summary Infection is one of the most common complications after hematopoietic cell transplantation. Many patients experience infectious complications repeatedly after transplant. Existing statistical methods for recurrent gap time data typically assume that patients are enrolled due to the occurrence of an event of interest, and subsequently experience recurrent events of the same type; moreover, for one-sample estimation, the gap times between consecutive events are usually assumed to be identically distributed. Applying these methods to analyze the post-transplant infection data will inevitably lead to incorrect inferential results because the time from transplant to the first infection has a different biological meaning than the gap times between consecutive recurrent infections. Some unbiased yet inefficient methods include univariate survival analysis methods based on data from the first infection or bivariate serial event data methods based on the first and second infections. In this paper, we propose a nonparametric estimator of the joint distribution of time from transplant to the first infection and the gap times between consecutive infections. The proposed estimator takes into account the potentially different distributions of the two types of gap times and better uses the recurrent infection data. Asymptotic properties of the proposed estimators are established. PMID:26575402
Visualization of Traffic Accidents
NASA Technical Reports Server (NTRS)
Wang, Jie; Shen, Yuzhong; Khattak, Asad
2010-01-01
Traffic accidents have tremendous impact on society. Annually, approximately 6.4 million vehicle accidents are reported by police in the US and nearly half of them result in catastrophic injuries. Visualizations of traffic accidents using geographic information systems (GIS) greatly facilitate handling and analysis of traffic accidents in many aspects. Environmental Systems Research Institute (ESRI), Inc. is the world leader in GIS research and development. ArcGIS, a software package developed by ESRI, has the capabilities to display events associated with a road network, such as accident locations and pavement quality. But when event locations related to a road network are processed, the existing algorithm used by ArcGIS does not utilize all the information related to the routes of the road network and produces erroneous visualization results of event locations. This software bug causes serious problems for applications in which accurate location information is critical for emergency responses, such as traffic accidents. This paper aims to address this problem and proposes an improved method that utilizes all relevant information of traffic accidents, namely, route number, direction, and mile post, and extracts correct event locations for accurate traffic accident visualization and analysis. The proposed method generates a new shape file for traffic accidents and displays them on top of the existing road network in ArcGIS. Visualization of traffic accidents along Hampton Roads Bridge Tunnel is included to demonstrate the effectiveness of the proposed method.
ERIC Educational Resources Information Center
Kelly, Nick; Montenegro, Maximiliano; Gonzalez, Carlos; Clasing, Paula; Sandoval, Augusto; Jara, Magdalena; Saurina, Elvira; Alarcón, Rosa
2017-01-01
Purpose: The purpose of this paper is to demonstrate the utility of combining event-centred and variable-centred approaches when analysing big data for higher education institutions. It uses a large, university-wide data set to demonstrate the methodology for this analysis by using the case study method. It presents empirical findings about…
Krajewska, Maryla; Smith, Layton H.; Rong, Juan; Huang, Xianshu; Hyer, Marc L.; Zeps, Nikolajs; Iacopetta, Barry; Linke, Steven P.; Olson, Allen H.; Reed, John C.; Krajewski, Stan
2009-01-01
Cell death is of broad physiological and pathological importance, making quantification of biochemical events associated with cell demise a high priority for experimental pathology. Fibrosis is a common consequence of tissue injury involving necrotic cell death. Using tissue specimens from experimental mouse models of traumatic brain injury, cardiac fibrosis, and cancer, as well as human tumor specimens assembled in tissue microarray (TMA) format, we undertook computer-assisted quantification of specific immunohistochemical and histological parameters that characterize processes associated with cell death. In this study, we demonstrated the utility of image analysis algorithms for color deconvolution, colocalization, and nuclear morphometry to characterize cell death events in tissue specimens: (a) subjected to immunostaining for detecting cleaved caspase-3, cleaved poly(ADP-ribose)-polymerase, cleaved lamin-A, phosphorylated histone H2AX, and Bcl-2; (b) analyzed by terminal deoxyribonucleotidyl transferase–mediated dUTP nick end labeling assay to detect DNA fragmentation; and (c) evaluated with Masson's trichrome staining. We developed novel algorithm-based scoring methods and validated them using TMAs as a high-throughput format. The proposed computer-assisted scoring methods for digital images by brightfield microscopy permit linear quantification of immunohistochemical and histochemical stainings. Examples are provided of digital image analysis performed in automated or semiautomated fashion for successful quantification of molecular events associated with cell death in tissue sections. (J Histochem Cytochem 57:649–663, 2009) PMID:19289554
Event-based image recognition applied in tennis training assistance
NASA Astrophysics Data System (ADS)
Wawrzyniak, Zbigniew M.; Kowalski, Adam
2016-09-01
This paper presents a concept for a real-time system for individual tennis training assistance. The system is intended to provide the user (player) with information on stroke accuracy as well as other training quality parameters, such as the velocity and rotation of the ball during its flight. The method is based on image processing techniques combined with an exploratory analysis of events and their description by movement parameters. The concept is presented for further development into a complete system that could assist a tennis player during individual training.
NASA Astrophysics Data System (ADS)
Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrence, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” with a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing, and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression, and a copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach makes it possible to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
Tang, Zhongwen
2015-01-01
An analytical way to compute predictive probability of success (PPOS) together with credible interval at interim analysis (IA) is developed for big clinical trials with time-to-event endpoints. The method takes account of the fixed data up to IA, the amount of uncertainty in future data, and uncertainty about parameters. Predictive power is a special type of PPOS. The result is confirmed by simulation. An optimal design is proposed by finding optimal combination of analysis time and futility cutoff based on some PPOS criteria.
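The simulation the paper uses for confirmation can be sketched with a simpler binomial analogue (this is not the paper's analytical time-to-event derivation; the interim counts, prior, and test below are all invented): draw the event probability of each arm from its posterior, simulate the remaining patients, and count how often the final analysis succeeds.

```python
import numpy as np

rng = np.random.default_rng(7)

# Interim data (invented): responders/patients per arm at IA,
# and the final number of patients per arm.
x_t, n_t, x_c, n_c, n_final = 18, 40, 10, 40, 100

def final_success(xt, xc):
    """Final analysis: pooled two-proportion z-test at one-sided 2.5%."""
    p1, p0 = xt / n_final, xc / n_final
    p = (xt + xc) / (2 * n_final)
    se = (2 * p * (1 - p) / n_final) ** 0.5
    return (p1 - p0) / se > 1.96

# PPOS: average the final success indicator over the posterior predictive
# distribution of the future data (uniform Beta(1,1) priors assumed).
sims, hits = 4000, 0
for _ in range(sims):
    pt = rng.beta(1 + x_t, 1 + n_t - x_t)        # posterior draw, treatment
    pc = rng.beta(1 + x_c, 1 + n_c - x_c)        # posterior draw, control
    ft = x_t + rng.binomial(n_final - n_t, pt)   # future responders, treatment
    fc = x_c + rng.binomial(n_final - n_c, pc)   # future responders, control
    hits += final_success(ft, fc)
ppos = hits / sims
```

Repeating this over a grid of interim times and futility cutoffs is the kind of search the proposed optimal design performs with the analytical PPOS instead of simulation.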
Monitoring Seismo-volcanic and Infrasonic Signals at Volcanoes: Mt. Etna Case Study
NASA Astrophysics Data System (ADS)
Cannata, Andrea; Di Grazia, Giuseppe; Aliotta, Marco; Cassisi, Carmelo; Montalto, Placido; Patanè, Domenico
2013-11-01
Volcanoes generate a broad range of seismo-volcanic and infrasonic signals, whose features and variations are often closely related to volcanic activity. The study of these signals is hence very useful in the monitoring and investigation of volcano dynamics. The analysis of seismo-volcanic and infrasonic signals requires specifically developed techniques due to their unique characteristics, which are generally quite distinct compared with tectonic and volcano-tectonic earthquakes. In this work, we describe analysis methods used to detect and locate seismo-volcanic and infrasonic signals at Mt. Etna. Volcanic tremor sources are located using a method based on spatial seismic amplitude distribution, assuming propagation in a homogeneous medium. The tremor source is found by calculating the goodness of the linear regression fit (R²) of the log-linearized equation of the seismic amplitude decay with distance. The location method for long-period events is based on the joint computation of semblance and R² values, and the location method of very long-period events is based on the application of radial semblance. Infrasonic events and tremor are located by semblance-brightness- and semblance-based methods, respectively. The techniques described here can also be applied to other volcanoes and do not require particular network geometries (such as arrays) but rather simple sparse networks. Using the source locations of all the considered signals, we were able to reconstruct the shallow plumbing system (above sea level) during 2011.
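As a toy sketch of the amplitude-decay location idea, the code below generates noise-free synthetic amplitudes from an assumed power-law decay A = A0/r^b and grid-searches for the node maximizing the R² of the log-linearized fit. The station geometry, source position, and decay exponent are invented; real applications must also deal with noise and site effects:

```python
import numpy as np

# Invented station coordinates (km) and a hidden tremor source
stations = np.array([[0., 0., 2.], [5., 1., 2.5], [1., 6., 2.], [6., 6., 3.],
                     [3., 0., 1.5], [0., 4., 2.8]])
true_src = np.array([2., 3., -1.])
amps = 100.0 / np.linalg.norm(stations - true_src, axis=1) ** 2  # A = A0 / r^b

def r2_at(src):
    """R^2 of the log-linearized decay law: log A = log A0 - b log r."""
    r = np.linalg.norm(stations - src, axis=1)
    if np.any(r < 0.25):          # skip nodes (numerically) at a station
        return -np.inf
    x, y = np.log(r), np.log(amps)
    b, a = np.polyfit(x, y, 1)    # slope, intercept of the linear fit
    resid = y - (a + b * x)
    tot = y - y.mean()
    return 1.0 - (resid @ resid) / (tot @ tot)

# Grid search over candidate sources: the node maximizing R^2 is the
# tremor location estimate.
grid = [np.array([gx, gy, gz])
        for gx in np.arange(0.0, 6.5, 0.5)
        for gy in np.arange(0.0, 6.5, 0.5)
        for gz in np.arange(-3.0, 2.5, 0.5)]
best = max(grid, key=r2_at)
```

With noiseless data the R² at the true source is essentially 1, so the grid node coinciding with it wins the search.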
Event-triggered Kalman-consensus filter for two-target tracking sensor networks.
Su, Housheng; Li, Zhenghao; Ye, Yanyan
2017-11-01
This paper is concerned with the problem of the event-triggered Kalman-consensus filter for two-target tracking sensor networks. According to the event-triggered protocol and a mean-square analysis, a suboptimal Kalman gain matrix is derived and a suboptimal event-triggered distributed filter is obtained. Based on the Kalman-consensus filter protocol, each sensor, relying only on its neighbors' information, can track its corresponding target. Furthermore, utilizing the Lyapunov method and matrix theory, some sufficient conditions are presented for ensuring the stability of the system. Finally, a simulation example is presented to verify the effectiveness of the proposed event-triggered protocol. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
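A minimal scalar sketch of the event-triggered idea: the measurement is fused only when the innovation exceeds a send-on-delta threshold, so communication is saved while the estimate stays bounded. This illustrates the triggering mechanism only, not the paper's suboptimal-gain derivation or the consensus term:

```python
import numpy as np

rng = np.random.default_rng(1)
a, q, r = 1.0, 0.01, 0.25    # x_{k+1} = a x_k + w,  y_k = x_k + v
delta = 0.5                  # send-on-delta threshold on the innovation
n_steps = 200

x, x_hat, p = 0.0, 0.0, 1.0
sent, errs = 0, []
for _ in range(n_steps):
    x = a * x + rng.normal(0.0, np.sqrt(q))   # true target state
    y = x + rng.normal(0.0, np.sqrt(r))       # local sensor measurement
    x_hat, p = a * x_hat, a * p * a + q       # time update
    innov = y - x_hat
    if abs(innov) > delta:                    # event: transmit and fuse
        sent += 1
        k = p / (p + r)
        x_hat += k * innov
        p *= (1.0 - k)
    errs.append(abs(x - x_hat))
```

Typically only a fraction of the 200 samples is transmitted, while the tracking error remains on the order of the trigger threshold.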
Estimating the probability of rare events: addressing zero failure data.
Quigley, John; Revie, Matthew
2011-07-01
Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where there are no events realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
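The closing approximation is simple enough to state directly. The rule-of-three comparison below is added for context (it is a classical 95% upper bound, not part of the paper):

```python
def zero_failure_estimate(n_trials):
    """Closed-form approximation to the minimax point estimate of the
    event probability when 0 events occur in n trials: 1/(2.5 n)."""
    return 1.0 / (2.5 * n_trials)

def rule_of_three_upper(n_trials):
    """Classical 95% upper confidence bound for zero events: 3/n."""
    return 3.0 / n_trials
```

For n = 100 trials with no failures the point estimate is 0.004, well below the 0.03 rule-of-three upper bound, as a point estimate should be.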
Security Event Recognition for Visual Surveillance
NASA Astrophysics Data System (ADS)
Liao, W.; Yang, C.; Yang, M. Ying; Rosenhahn, B.
2017-05-01
With the rapidly increasing deployment of surveillance cameras, reliable methods for automatically analyzing surveillance video and recognizing special events are in demand across different practical applications. This paper proposes a novel, effective framework for security event analysis in surveillance videos. First, a convolutional neural network (CNN) framework is used to detect objects of interest in the given videos. Second, the owners of the objects are recognized and monitored in real time as well. If anyone moves an object, this person is verified as to whether he/she is its owner. If not, the event is further analyzed and classified into one of two scenarios: moving the object away or stealing it. To validate the proposed approach, a new video dataset consisting of various scenarios is constructed for more complex tasks. For comparison purposes, experiments are also carried out on benchmark databases related to the task of abandoned luggage detection. The experimental results show that the proposed approach outperforms state-of-the-art methods and is effective in recognizing complex security events.
Bayesian analysis of caustic-crossing microlensing events
NASA Astrophysics Data System (ADS)
Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.
2010-06-01
Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of detecting an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework, and investigate interesting options. Methods: We develop a mathematical formulation that allows us to analytically compute the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases, and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.
The cost of conservative synchronization in parallel discrete event simulations
NASA Technical Reports Server (NTRS)
Nicol, David M.
1990-01-01
The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approaches the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.
Neural network approach in multichannel auditory event-related potential analysis.
Wu, F Y; Slater, J D; Ramsay, R E
1994-04-01
Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, statistical analysis strongly indicates that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment, such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria for P300 ERP assessment and facilitates computer-aided analysis of neuropsychological functions.
Arano, Ichiro; Sugimoto, Tomoyuki; Hamasaki, Toshimitsu; Ohno, Yuko
2010-04-23
Survival analysis methods such as the Kaplan-Meier method, log-rank test, and Cox proportional hazards regression (Cox regression) are commonly used to analyze data from randomized withdrawal studies in patients with major depressive disorder. Unfortunately, such common methods may be inappropriate when a long-term censored relapse-free time appears in the data, as these methods assume that, if complete follow-up were possible for all individuals, each would eventually experience the event of interest. In this paper, to analyze data including such long-term censored relapse-free times, we discuss a semi-parametric cure regression (Cox cure regression), which combines a logistic formulation for the probability of occurrence of an event with a Cox proportional hazards specification for the time of occurrence of the event. In specifying the treatment's effect on disease-free survival, we consider the fraction of long-term survivors and the risks associated with a relapse of the disease. In addition, we develop a tree-based method for time-to-event data to identify groups of patients with differing prognoses (cure survival CART). Although analysis methods typically adapt the log-rank statistic for recursive partitioning procedures, the method applied here uses a likelihood ratio (LR) test statistic from fitting a cure survival regression assuming exponential and Weibull distributions for the latency time of relapse. The method is illustrated using data from a sertraline randomized withdrawal study in patients with major depressive disorder.
We concluded that Cox cure regression reveals who may be cured and how the treatment and other factors affect both the cure incidence and the relapse time of uncured patients, and that the cure survival CART output provides easily understandable and interpretable information, useful both in identifying groups of patients with differing prognoses and in building Cox cure regression models that lead to meaningful interpretations.
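The mixture cure formulation above can be sketched with an exponential latency distribution fitted by maximum likelihood on simulated data (π is the uncured fraction; the sample size, rates, and censoring time are illustrative, not the sertraline data):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(2)
n, pi_true, lam_true, tau = 2000, 0.4, 0.5, 10.0
uncured = rng.random(n) < pi_true
t_rel = rng.exponential(1.0 / lam_true, size=n)          # latency time if uncured
t_obs = np.where(uncured, np.minimum(t_rel, tau), tau)   # cured: censored at tau
event = uncured & (t_rel < tau)

def negll(theta):
    """Negative log-likelihood: S(t) = (1 - pi) + pi * exp(-lam * t)."""
    pi, lam = expit(theta[0]), np.exp(theta[1])
    ll_event = np.log(pi * lam) - lam * t_obs                  # relapse density
    ll_cens = np.log((1.0 - pi) + pi * np.exp(-lam * t_obs))   # cured or not yet relapsed
    return -(ll_event[event].sum() + ll_cens[~event].sum())

fit = minimize(negll, x0=[0.0, 0.0], method="Nelder-Mead")
pi_hat, lam_hat = float(expit(fit.x[0])), float(np.exp(fit.x[1]))
```

The logit/log transforms keep the optimizer unconstrained; with a long enough follow-up relative to the latency scale, the cured fraction and relapse rate are both well identified.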
Liu, Chengyu; Zhao, Lina; Tang, Hong; Li, Qiao; Wei, Shoushui; Li, Jianqing
2016-08-01
False alarm (FA) rates as high as 86% have been reported in intensive care unit monitors. High FA rates decrease the quality of care by slowing staff response times while increasing patient burden and stress. In this study, we proposed a rule-based, multi-channel information fusion method for accurately classifying true and false alarms for five life-threatening arrhythmias: asystole (ASY), extreme bradycardia (EBR), extreme tachycardia (ETC), ventricular tachycardia (VTA) and ventricular flutter/fibrillation (VFB). The proposed method consisted of five steps: (1) signal pre-processing, (2) feature detection and validation, (3) true/false alarm determination for each channel, (4) 'real-time' true/false alarm determination and (5) 'retrospective' true/false alarm determination (if needed). Up to four signal channels, that is, two electrocardiogram signals, one arterial blood pressure signal and/or one photoplethysmogram signal, were included in the analysis. Two events were set for method validation: event 1 for 'real-time' and event 2 for 'retrospective' alarm classification. The results showed that a 100% true positive ratio (i.e. sensitivity) on the training set was obtained for the ASY, EBR, ETC and VFB types, and 94% for the VTA type, accompanied by corresponding true negative ratio (i.e. specificity) results of 93%, 81%, 78%, 85% and 50%, respectively, resulting in score values of 96.50, 90.70, 88.89, 92.31 and 64.90, with a final score of 80.57 for event 1 and 79.12 for event 2. For the test set, the proposed method obtained scores of 88.73 for ASY, 77.78 for EBR, 89.92 for ETC, 67.74 for VFB and 61.04 for VTA, with final scores of 71.68 for event 1 and 75.91 for event 2.
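The per-channel-then-fuse structure of step (3)-(4) can be sketched as a quality-weighted vote across the available channels. The weighting rule and the fail-safe default are hypothetical illustrations, not the paper's actual decision rules:

```python
def fuse_alarm(channel_verdicts, channel_quality, threshold=0.5):
    """Weighted vote across channels for a true/false alarm decision.

    channel_verdicts : per-channel booleans (True = channel supports the alarm)
    channel_quality  : per-channel signal-quality weights in [0, 1]
                       (hypothetical weighting; 0 means the channel is ignored)
    """
    num = sum(v * w for v, w in zip(channel_verdicts, channel_quality))
    den = sum(channel_quality)
    if den == 0:
        return True   # no reliable channel at all: keep the alarm (fail-safe)
    return num / den >= threshold
```

A fail-safe default matters here: suppressing a true life-threatening alarm is far costlier than tolerating a false one, so ambiguous evidence keeps the alarm on.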
Vigan, Marie; Stirnemann, Jérôme; Mentré, France
2014-05-01
Analysis of repeated time-to-event data is increasingly performed in pharmacometrics using parametric frailty models. The aims of this simulation study were (1) to assess the estimation performance of the Stochastic Approximation Expectation Maximization (SAEM) algorithm in MONOLIX and the Adaptive Gaussian Quadrature (AGQ) and Laplace algorithms in PROC NLMIXED of SAS, and (2) to evaluate the properties of tests of a dichotomous covariate on the occurrence of events. The simulation setting is inspired by an analysis of the occurrence of bone events after the initiation of treatment with imiglucerase in patients with Gaucher Disease (GD). We simulated repeated events with an exponential model and various dropout rates: none, low, or high. Several values of the baseline hazard, variability, number of subjects, and effect of the covariate were studied. For each scenario, 100 datasets were simulated for estimation performance and 500 for test performance. We evaluated estimation performance through relative bias and relative root mean square error (RRMSE). We studied the properties of the Wald and likelihood ratio (LRT) tests. We used these methods to analyze the occurrence of bone events in patients with GD after starting an enzyme replacement therapy. SAEM with three chains and the AGQ algorithm provided good parameter estimates, much better than SAEM with one chain and Laplace, which often provided poor estimates. Despite a small number of repeated events, SAEM with three chains and AGQ gave small biases and RRMSE. Type I errors were close to 5%, and power varied as expected for SAEM with three chains and AGQ. The probability of having at least one event under treatment was 19.1%.
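The simulation setting can be sketched directly: repeated events with a constant baseline hazard multiplied by a per-subject gamma frailty of mean 1. The rates and frailty variance below are illustrative, not the Gaucher-disease values:

```python
import numpy as np

rng = np.random.default_rng(3)
n, follow_up = 500, 2.0
lam0, frailty_var = 1.0, 0.5
# Gamma frailty with mean 1 and variance frailty_var:
frailty = rng.gamma(shape=1.0 / frailty_var, scale=frailty_var, size=n)

counts = []
for z in frailty:
    t, k = 0.0, 0
    while True:
        t += rng.exponential(1.0 / (lam0 * z))   # gap time, hazard lam0 * z
        if t > follow_up:
            break
        k += 1
    counts.append(k)

mean_events = float(np.mean(counts))   # expected lam0 * follow_up * E[z] = 2.0
```

Because the frailty is shared within subject, the event counts are overdispersed relative to a plain Poisson process, which is exactly what the mixed-effects estimation algorithms in the abstract must handle.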
Prins, Theo W; Scholtens, Ingrid M J; Bak, Arno W; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Laurensse, Emile J; Kok, Esther J
2016-12-15
During routine monitoring for GMOs in food in the Netherlands, papaya-containing food supplements were found positive for the genetically modified (GM) elements P-35S and T-nos. The goal of this study was to identify the unknown, EU-unauthorised GM papaya event(s). A screening strategy was applied using additional GM screening elements, including a newly developed PRSV coat protein PCR. The detected PRSV coat protein PCR product was sequenced, and the nucleotide sequence showed identity to PRSV YK strains indigenous to China and Taiwan. The GM events 16-0-1 and 18-2-4 could be identified by amplifying and sequencing event-specific sequences. Further analyses showed that both papaya event 16-0-1 and event 18-2-4 were transformed with the same construct. For use in routine analysis, derived TaqMan qPCR methods for events 16-0-1 and 18-2-4 were developed. Event 16-0-1 was detected in all samples tested, whereas event 18-2-4 was detected in one sample. This study presents a strategy for combining information from different sources (literature, patent databases) and novel sequence data to identify unknown GM papaya events. Copyright © 2016 Elsevier Ltd. All rights reserved.
[Causal analysis approaches in epidemiology].
Dumas, O; Siroux, V; Le Moual, N; Varraso, R
2014-02-01
Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains debated. Several causal analysis methods have been developed in epidemiology. This paper presents an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible sources of confounding in a study. Structural equation models, the main extension of path analysis, combine a system of equations and a path diagram representing a set of possible causal relationships. They allow the quantification of direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one actually received). This theoretical approach has revealed the limits of traditional methods in addressing some causality questions. In particular, in longitudinal studies with time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though they were developed partly independently, are based on equivalent logical foundations.
A crucial step in the application of these models is the formulation of causal hypotheses, which will be a basis for all methodological choices. Beyond this step, statistical analysis tools recently developed offer new possibilities to delineate complex relationships, in particular in life course epidemiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
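The marginal-structural-model idea can be sketched in its simplest point-treatment form with inverse-probability weighting on simulated confounded data: the naive treated-vs-untreated contrast is biased, while reweighting by the treatment probability given the confounder recovers the true causal effect. All numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
l = rng.binomial(1, 0.5, n)                      # confounder L
p_a = np.where(l == 1, 0.8, 0.2)                 # P(A = 1 | L): treatment depends on L
a = rng.binomial(1, p_a)                         # treatment A
y = 2.0 * a + 3.0 * l + rng.normal(0.0, 1.0, n)  # true causal effect of A is 2

naive = float(y[a == 1].mean() - y[a == 0].mean())   # confounded contrast
# Inverse-probability-weighted estimates of E[Y^1] and E[Y^0]:
ey1 = float(np.mean(a * y / p_a))
ey0 = float(np.mean((1 - a) * y / (1 - p_a)))
ipw_effect = ey1 - ey0
```

Weighting creates a pseudo-population in which treatment is independent of L, which is the core trick marginal structural models extend to time-varying treatments and confounders.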
Conversion events in gene clusters
2011-01-01
Background Gene clusters containing multiple similar genomic regions in close proximity are of great interest for biomedical studies because of their associations with inherited diseases. However, such regions are difficult to analyze due to their structural complexity and their complicated evolutionary histories, reflecting a variety of large-scale mutational events. In particular, conversion events can mislead inferences about the relationships among these regions, as traced by traditional methods such as construction of phylogenetic trees or multi-species alignments. Results To correct the distorted information generated by such methods, we have developed an automated pipeline called CHAP (Cluster History Analysis Package) for detecting conversion events. We used this pipeline to analyze the conversion events that affected two well-studied gene clusters (α-globin and β-globin) and three gene clusters for which comparative sequence data were generated from seven primate species: CCL (chemokine ligand), IFN (interferon), and CYP2abf (part of cytochrome P450 family 2). CHAP is freely available at http://www.bx.psu.edu/miller_lab. Conclusions These studies reveal the value of characterizing conversion events in the context of studying gene clusters in complex genomes. PMID:21798034
Social media analysis during political turbulence
Antonakaki, Despoina; Spiliotopoulos, Dimitris; V. Samaras, Christos; Pratikakis, Polyvios; Ioannidis, Sotiris; Fragopoulou, Paraskevi
2017-01-01
Today, a considerable proportion of the public political discourse on nationwide elections proceeds in Online Social Networks. Through analyzing this content, we can discover the major themes that prevailed during the discussion, investigate the temporal variation of positive and negative sentiment and examine the semantic proximity of these themes. According to existing studies, the results of similar tasks are heavily dependent on the quality and completeness of dictionaries for linguistic preprocessing, entity discovery and sentiment analysis. Additionally, noise reduction is achieved with methods for sarcasm detection and correction. Here we report on the application of these methods on the complete corpus of tweets regarding two local electoral events of worldwide impact: the Greek referendum of 2015 and the subsequent legislative elections. To this end, we compiled novel dictionaries for sentiment and entity detection for the Greek language tailored to these events. We subsequently performed volume analysis, sentiment analysis, sarcasm correction and topic modeling. Results showed that there was a strong anti-austerity sentiment accompanied with a critical view on European and Greek political actions. PMID:29088263
Sport events and climate for visitors—the case of FIFA World Cup in Qatar 2022
NASA Astrophysics Data System (ADS)
Matzarakis, Andreas; Fröhlich, Dominik
2015-04-01
The effect of weather on sport events is not well studied. It requires special attention if the event takes place at a time and place with extreme weather conditions. For the world soccer championship in Qatar (Doha 2022), a human-biometeorological analysis has been performed in order to identify the time of the year that is most suitable, in terms of thermal comfort, for visitors attending the event. The analysis is based on thermal indices such as the Physiologically Equivalent Temperature (PET). The results show that this kind of event may not be appropriate for visitors if it is scheduled during months with extreme conditions. For Doha, this is the period from May to September, when conditions during a large majority of the hours of the day cause strong heat stress for visitors. A more appropriate time would be the months of November to February, when thermally comfortable conditions are much more frequent. The methods applied here can quantify the thermal conditions and show limitations and possibilities for specific events and locations.
Assessing dry weather flow contribution in TSS and COD storm events loads in combined sewer systems.
Métadier, M; Bertrand-Krajewski, J L
2011-01-01
Continuous high-resolution, long-term turbidity measurements along with continuous discharge measurements are now recognised as an appropriate technique for the estimation of in-sewer total suspended solids (TSS) and Chemical Oxygen Demand (COD) loads during storm events. In the combined system of the Ecully urban catchment (Lyon, France), this technique has been implemented since 2003, with more than 200 storm events monitored. This paper presents a method for the estimation of the dry weather (DW) contribution to measured total TSS and COD event loads, with special attention devoted to uncertainty assessment. The method accounts for the dynamics of both discharge and turbidity time series at a two-minute time step. The study is based on 180 DW days monitored in 2007-2008. Three distinct classes of DW days were identified. Variability analysis and quantification showed that no seasonal effect and no trend over the year were detectable. The law of propagation of uncertainties is applicable for uncertainty estimation. The method has then been applied to all measured storm events. This study confirms the interest of long-term continuous discharge and turbidity time series in sewer systems, especially in the perspective of wet weather quality modelling.
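The load computation and the law of propagation of uncertainties can be sketched for a short hypothetical event at the paper's two-minute time step. The 5% and 20% standard uncertainties on discharge and concentration are assumptions for illustration, not the study's calibrated values:

```python
import numpy as np

dt = 120.0                                  # 2-minute time step (s)
q = np.array([0.05, 0.08, 0.12, 0.07])      # discharge (m3/s), hypothetical event
c = np.array([150.0, 220.0, 310.0, 180.0])  # TSS concentration (g/m3)
u_q = 0.05 * q                              # assumed 5 % standard uncertainty
u_c = 0.20 * c                              # assumed 20 % standard uncertainty

load = float(np.sum(q * c * dt))            # event load (g): sum of Q_i * C_i * dt
# Law of propagation of uncertainties, assuming independent inputs:
u_load = float(np.sqrt(np.sum((c * dt * u_q) ** 2 + (q * dt * u_c) ** 2)))
```

With these assumed input uncertainties the relative load uncertainty lands near 20%, dominated by the concentration term, which is why turbidity-to-TSS calibration quality drives the final uncertainty budget.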
Renfro, Lindsay A.; Grothey, Axel M.; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J.
2015-01-01
Purpose Clinical trials are expensive and lengthy, and the success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. Patients and Methods In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in the projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Results Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events off by only eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months.
Conclusions Real-time projection of the final analysis time during a trial, or the overall analysis time during a trial collaborative such as IDEA, has practical implications for trial feasibility when these projections are translated into additional time and resources required. PMID:26989447
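The projection machinery can be sketched by simulation: draw patient entry times from an accrual model, add event times from a parametric survival model fitted elsewhere, and read off the calendar time at which the target event count is reached. All rates below are hypothetical, not the IDEA/ACCENT values:

```python
import numpy as np

rng = np.random.default_rng(5)
n_pat, accrual_rate, target_events = 1000, 25.0, 300   # hypothetical design
median_dfs = 36.0                                       # months, "historical" median
lam = np.log(2.0) / median_dfs                          # exponential DFS hazard

def project_analysis_time():
    """One simulated calendar time (months) at which the event target is met."""
    entry = rng.uniform(0.0, n_pat / accrual_rate, n_pat)   # uniform accrual
    event_cal = entry + rng.exponential(1.0 / lam, n_pat)   # entry + event time
    return np.sort(event_cal)[target_events - 1]            # time of 300th event

times = [project_analysis_time() for _ in range(200)]
projected = float(np.median(times))
```

Repeating the simulation gives a distribution of analysis dates rather than a point guess, which is what makes the projection useful for planning resources.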
Risk analysis of chemical, biological, or radionuclear threats: implications for food security.
Mohtadi, Hamid; Murshid, Antu Panini
2009-09-01
If the food sector is attacked, the likely agents will be chemical, biological, or radionuclear (CBRN). We compiled a database of international terrorist/criminal activity involving such agents. Based on these data, we calculate the likelihood of a catastrophic event using extreme value methods. At present, the probability of an event leading to 5,000 casualties (fatalities and injuries) is between 0.1 and 0.3. However, pronounced, nonstationary patterns within our data suggest that the "reoccurrence period" for such attacks is decreasing every year. Similarly, disturbing trends are evident in a broader data set, which is nonspecific as to the methods or means of attack. While at present the likelihood of CBRN events is quite low, given an attack, the probability that it involves CBRN agents increases with the number of casualties. This is consistent with evidence of "heavy tails" in the distribution of casualties arising from CBRN events.
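The extreme-value step can be sketched with a generalized Pareto fit to exceedances over a casualty threshold, the standard peaks-over-threshold approach. The data below are synthetic and heavy-tailed, not the authors' database:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(6)
threshold = 100.0                  # casualties above which we model the tail
xi_true, scale_true = 0.5, 200.0   # xi > 0: heavy (Pareto-type) tail
excess = genpareto.rvs(xi_true, scale=scale_true, size=2000, random_state=rng)

c, loc, scale = genpareto.fit(excess, floc=0.0)   # MLE with location pinned at 0
# P(casualties > 5000 | casualties exceed the threshold):
p_tail = float(genpareto.sf(5000.0 - threshold, c, loc=0.0, scale=scale))
```

A positive fitted shape parameter is precisely the "heavy tails" evidence the abstract refers to: tail probabilities then decay polynomially rather than exponentially, so catastrophic casualty counts are far more likely than a thin-tailed model would suggest.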
Event-Triggered Adaptive Dynamic Programming for Continuous-Time Systems With Control Constraints.
Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo
2016-08-31
In this paper, an event-triggered near-optimal control structure is developed for nonlinear continuous-time systems with control constraints. Due to the saturating actuators, a nonquadratic cost function is introduced and the Hamilton-Jacobi-Bellman (HJB) equation for constrained nonlinear continuous-time systems is formulated. In order to solve the HJB equation, an actor-critic framework is presented. The critic network is used to approximate the cost function and the action network is used to estimate the optimal control law. In addition, in the proposed method, the control signal is transmitted in an aperiodic manner to reduce the computational and transmission costs. Both networks are updated only at the trigger instants decided by the event-triggered condition. A detailed Lyapunov analysis is provided to guarantee that the closed-loop event-triggered system is ultimately bounded. Three case studies are used to demonstrate the effectiveness of the proposed method.
Analysis and visualization of single-trial event-related potentials
NASA Technical Reports Server (NTRS)
Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.
2001-01-01
In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. 
Applying ICA and ERP image visualization to the analysis of sets of single trials from event-related EEG (or MEG) experiments can increase the information available from ERP (or ERF) data. Copyright 2001 Wiley-Liss, Inc.
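The blind separation step can be illustrated on two synthetic "channels" mixing an ongoing oscillation with a slow artifact-like source, a toy stand-in for the multichannel EEG case. FastICA recovers the sources up to sign and order, so recovery is checked by absolute correlation:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 1000)
s1 = np.sin(2 * np.pi * 7 * t)              # ongoing oscillation ("alpha"-like)
s2 = np.sign(np.sin(2 * np.pi * 1 * t))     # slow square wave ("blink"-like artifact)
S = np.c_[s1, s2] + 0.02 * rng.normal(size=(1000, 2))
A = np.array([[1.0, 0.6], [0.4, 1.0]])      # mixing matrix: two scalp "channels"
X = S @ A.T                                 # observed channel data

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                # recovered components
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])   # |corr|: true vs recovered
```

In the artifact-removal workflow described above, the component identified as the artifact would be zeroed and the mixing inverted to rebuild cleaned channel data.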
NASA Astrophysics Data System (ADS)
Soulis, K. X.; Valiantzas, J. D.; Dercas, N.; Londra, P. A.
2009-01-01
The Soil Conservation Service Curve Number (SCS-CN) method is widely used for predicting direct runoff volume for a given rainfall event. The applicability of the SCS-CN method and the runoff generation mechanism were thoroughly analysed in a Mediterranean experimental watershed in Greece. The region is characterized by a Mediterranean semi-arid climate. A detailed land cover and soil survey using remote sensing and GIS techniques, showed that the watershed is dominated by coarse soils with high hydraulic conductivities, whereas a smaller part is covered with medium textured soils and impervious surfaces. The analysis indicated that the SCS-CN method fails to predict runoff for the storm events studied, and that there is a strong correlation between the CN values obtained from measured runoff and the rainfall depth. The hypothesis that this correlation could be attributed to the existence of an impermeable part in a very permeable watershed was examined in depth, by developing a numerical simulation water flow model for predicting surface runoff generated from each of the three soil types of the watershed. Numerical runs were performed using the HYDRUS-1D code. The results support the validity of this hypothesis for most of the events examined where the linear runoff formula provides better results than the SCS-CN method. The runoff coefficient of this formula can be taken equal to the percentage of the impervious area. However, the linear formula should be applied with caution in case of extreme events with very high rainfall intensities. In this case, the medium textured soils may significantly contribute to the total runoff and the linear formula may significantly underestimate the runoff produced.
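The two competing formulas discussed above are simple enough to state directly (metric form of the SCS-CN equation with the standard initial abstraction Ia = 0.2S; the CN value used below is illustrative):

```python
def scs_cn_runoff(p_mm, cn):
    """Direct runoff depth Q (mm) from event rainfall P (mm), SCS-CN method."""
    s = 25400.0 / cn - 254.0      # potential maximum retention S (mm), metric form
    ia = 0.2 * s                  # initial abstraction
    if p_mm <= ia:
        return 0.0                # all rainfall abstracted, no direct runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

def linear_runoff(p_mm, impervious_fraction):
    """The simple linear alternative discussed above: runoff coefficient
    taken equal to the impervious share of the watershed."""
    return impervious_fraction * p_mm
```

The contrast the study draws is visible here: the linear formula scales runoff with rainfall at a fixed coefficient, while the CN formula's effective runoff ratio itself grows with rainfall depth, which is why a fixed CN fails when the true response is dominated by a small impervious fraction.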
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, James J.; Kreuzer, Helen W.; Carman, April J.
Acid scavengers are frequently used as stabilizer compounds in a variety of applications. When used to stabilize volatile compounds such as nerve agents, the lower volatility and higher stability of acid scavengers make them more persistent in a post-event forensic setting. We are employing compound-specific stable isotope analysis of the carbon, nitrogen, and hydrogen components of three acid scavenging compounds (N,N-diethylaniline, tributylamine, and triethylamine) as a tool for distinguishing between different samples of the stabilizers. Combined analysis of three stable isotopes in these samples improves the technique's resolving potential, enhancing sample matching capabilities. The compound-specific methods developed here can be applied to instances where these compounds are not pure, such as when mixed with an agent or when found as a residue at an event site. Effective sample matching can be crucial for linking compounds at multiple event sites or linking a supply inventory to an event.
Decision Trajectories in Dementia Care Networks: Decisions and Related Key Events.
Groen-van de Ven, Leontine; Smits, Carolien; Oldewarris, Karen; Span, Marijke; Jukema, Jan; Eefsting, Jan; Vernooij-Dassen, Myrra
2017-10-01
This prospective multiperspective study provides insight into the decision trajectories of people with dementia by studying the decisions made and related key events. This study includes three waves of interviews, conducted between July 2010 and July 2012, with 113 purposefully selected respondents (people with beginning to advanced stages of dementia and their informal and professional caregivers), completed in 12 months (285 interviews). Our multilayered qualitative analysis consists of content analysis, timeline methods, and constant comparison. Four decision themes emerged: managing daily life, arranging support, community living, and preparing for the future. Eight key events delineate the decision trajectories of people with dementia. Decisions and key events differ between people with dementia living alone and living with a caregiver. Our study clarifies that decisions relate not only to the disease but also to living with the dementia. Individual differences in decision content and sequence may affect shared decision-making and advance care planning.
NASA Astrophysics Data System (ADS)
Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne
2017-04-01
The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on a regional and continental scale. The identification of major aerosol constituents for basic source apportionment and air quality issues requires a high analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on a time scale of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time and related sample amount necessary to apply the full range of bulk analytical methods needed for chemical characterization. Additionally, morphological and single particle features are hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a complementary technique to bulk analytical methods, chemical imaging offers a new way to study air pollution events, resolving major aerosol constituents with single particle features at high temporal resolution and from small sample volumes. The analysis of the chemical imaging datasets is assisted by multivariate statistics, with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel approach in chemical imaging is combined chemical imaging, or so-called multisensor hyperspectral imaging, involving elemental imaging (electron microscopy-based energy dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy) and mass spectrometric imaging (Time-of-Flight Secondary Ion Mass Spectrometry) with subsequent combined multivariate analytics.
Combined chemical imaging of precipitated aerosol particles will be demonstrated with the following examples of air pollution events in Vienna: exceptional episodic events, such as the transformation of Saharan dust by the impact of the city of Vienna, will be discussed and compared to samples obtained at a high alpine background site (Sonnblick Observatory, Saharan dust event from April 2016). Further, chemical imaging of biological aerosol constituents of an autumnal pollen breakout in Vienna, with background samples from nearby locations from November 2016, will demonstrate the advantages of the chemical imaging approach. Additionally, the chemical fingerprint of an exceptional air pollution event from a local emission source, caused by the demolition of a building in Vienna, will illustrate the need for multisensor imaging and, especially, for the combined approach. Obtained chemical images will be correlated with bulk analytical results. Benefits of the overall methodical approach, combining bulk analytics and combined chemical imaging of exceptional episodic air pollution events, will be discussed.
NASA Astrophysics Data System (ADS)
Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi
2015-04-01
Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events and droughts or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal forecast prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in the world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. 
The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations from a variety of CMIP5 model ensembles. Here, we present results for the UK 2013/14 winter floods as proof of concept and we show validation and testing results that demonstrate the robustness of our method. We also revisit the record temperatures over Europe in 2014 and present a detailed analysis of this attribution exercise as it is one of the events to demonstrate that we can make a sensible statement of how the odds for such a year to occur have changed while it still unfolds.
NASA Astrophysics Data System (ADS)
Letort, Jean; Guilbert, Jocelyn; Cotton, Fabrice; Bondár, István; Cano, Yoann; Vergoz, Julien
2015-06-01
The depth of an earthquake is difficult to estimate because of the trade-off between depth and origin time estimations, and because it can be biased by lateral Earth heterogeneities. To address this challenge, we have developed a new, blind and fully automatic teleseismic depth analysis. The results of this new method do not depend on epistemic uncertainties due to depth-phase picking and identification. The method consists of a modification of the cepstral analysis from Letort et al. and Bonner et al., which aims to detect surface reflected (pP, sP) waves in a signal at teleseismic distances (30°-90°) through the study of the spectral holes in the shape of the signal spectrum. The ability of our automatic method to improve depth estimations is shown by relocation of the recent moderate seismicity of the Guerrero subduction area (Mexico). We have therefore estimated the depth of 152 events using teleseismic data from the IRIS stations and arrays. One advantage of this method is that it can be applied to single stations (from IRIS) as well as to classical arrays. In the Guerrero area, our new cepstral analysis efficiently clusters event locations and provides an improved view of the geometry of the subduction. Moreover, we have also validated our method through relocation of the same events using the new International Seismological Centre (ISC)-locator algorithm, as well as comparing our cepstral depths with the available Harvard-Centroid Moment Tensor (CMT) solutions and the three available ground truth (GT5) events (where lateral localization is assumed to be well constrained with uncertainty <5 km) for this area. These comparisons indicate an overestimation of focal depths in the ISC catalogue for deeper parts of the subduction, and they show a systematic bias between the estimated cepstral depths and the ISC-locator depths. 
Using information from the CMT catalogue on the predominant focal mechanism in this area, this bias can be explained as a misidentification of sP phases as pP phases, which underscores the value of this new automatic cepstral analysis, as it is less sensitive to phase identification.
NASA Astrophysics Data System (ADS)
Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming
2017-07-01
Microseismic monitoring is an effective means for providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection, but low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper presents permutation entropy and a support vector machine to detect low-SNR microseismic events. First, an extraction method of signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, the detection model of low-SNR microseismic events based on the least squares support vector machine is built by computing multi-scale permutation entropy for the collected vibration signals and constructing a feature vector set of signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment shows that the differing characteristics of the two can be fully expressed using multi-scale permutation entropy. The detection model of microseismic events combined with the support vector machine, which offers high classification accuracy and fast real-time performance, can meet the requirements of online, real-time extraction of microseismic events.
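The multi-scale permutation entropy feature this method relies on is compact to compute. The sketch below implements the standard Bandt-Pompe definition with non-overlapping coarse-graining for the scale factor; it is an illustration of the general technique, not the authors' exact implementation.

```python
from math import log, factorial

def coarse_grain(x, scale):
    """Non-overlapping moving average used to build the multi-scale series."""
    return [sum(x[i:i + scale]) / scale for i in range(0, len(x) - scale + 1, scale)]

def permutation_entropy(x, m=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy of order m.
    Returns a value in [0, 1]: 0 for a perfectly regular (monotone) series,
    approaching 1 for a series whose ordinal patterns are equiprobable."""
    n = len(x) - (m - 1) * delay
    counts = {}
    for i in range(n):
        window = [x[i + j * delay] for j in range(m)]
        # ordinal pattern: argsort of the window (stable for ties)
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum((c / n) * log(c / n) for c in counts.values())
    return h / log(factorial(m))
```

A noise-like signal spreads its windows over many ordinal patterns (entropy near 1), while a structured microseismic waveform concentrates them (entropy well below 1), which is what makes the feature usable for SVM classification.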
Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng
2011-10-17
Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, termed extreme events. The implemented simulation method is currently confined to simulating steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning that at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper demonstrates a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.
Automatic classification of seismic events within a regional seismograph network
NASA Astrophysics Data System (ADS)
Tiira, Timo; Kortström, Jari; Uski, Marja
2015-04-01
A fully automatic method for seismic event classification within a sparse regional seismograph network is presented. The tool is based on a supervised pattern recognition technique, the Support Vector Machine (SVM), trained here to distinguish weak local earthquakes from a bulk of human-made or spurious seismic events. The classification rules rely on differences in signal energy distribution between natural and artificial seismic sources. Seismic records are divided into four windows, P, P coda, S, and S coda. For each signal window, the short-term average (STA) is computed in 20 narrow frequency bands between 1 and 41 Hz. The 80 discrimination parameters are used as training data for the SVM. The SVM models are calculated for 19 on-line seismic stations in Finland. The event data are compiled mainly from fully automatic event solutions that are manually classified after the automatic location process. The station-specific SVM training events include 11-302 positive (earthquake) and 227-1048 negative (non-earthquake) examples. The best voting rules for combining results from different stations are determined during an independent testing period. Finally, the network processing rules are applied to an independent evaluation period comprising 4681 fully automatic event determinations, of which 98 % have been manually identified as explosions or noise and 2 % as earthquakes. The SVM method correctly identifies 94 % of the non-earthquakes and all the earthquakes. The results imply that the SVM tool can identify and filter out blasts and spurious events from fully automatic event solutions with a high level of confidence. The tool helps to reduce the workload in manual seismic analysis by leaving only ~5 % of the automatic event determinations, i.e. the probable earthquakes, for more detailed seismological analysis. The approach presented is easy to adjust to the requirements of a denser or wider high-frequency network, once enough training examples for building a station-specific data set are available.
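As a rough illustration of the classification step, a plain linear SVM trained by hinge-loss sub-gradient descent on toy two-feature "band energy" vectors might look like the sketch below. The data, dimensionality, and hyperparameters are hypothetical; the actual tool uses 80 STA-based features per event and station-specific models, which this sketch does not reproduce.

```python
def train_linear_svm(xs, ys, lam=0.01, eta=0.1, epochs=500):
    """Full-batch sub-gradient descent on the L2-regularized hinge loss.
    xs: list of feature vectors; ys: labels in {-1, +1}."""
    d, n = len(xs[0]), len(xs)
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [lam * wj for wj in w], 0.0       # regularization gradient
        for x, y in zip(xs, ys):
            if y * (sum(wj * xj for wj, xj in zip(w, x)) + b) < 1.0:
                for j in range(d):                 # margin violation: hinge term
                    gw[j] -= y * x[j] / n
                gb -= y / n
        w = [wj - eta * gj for wj, gj in zip(w, gw)]
        b -= eta * gb
    return w, b

def classify(w, b, x):
    """Sign of the decision function: +1 (earthquake) or -1 (non-earthquake)."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0.0 else -1
```

In the paper's setting, one such model per station produces a vote, and the votes are combined by network-level voting rules tuned on an independent testing period.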
Event-scale power law recession analysis: quantifying methodological uncertainty
NASA Astrophysics Data System (ADS)
Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.
2017-01-01
The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. 
In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship between the power-law recession scale parameter and catchment antecedent wetness varies depending on recession definition and fitting choices. Considering study results, we recommend a combination of four key methodological decisions to maximize the quality of fitted recession curves, and to minimize bias in the related populations of fitted recession parameters.
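The event-scale power-law model in question, -dQ/dt = aQ^b, is commonly fitted by ordinary least squares in log-log space. The sketch below makes one particular set of the methodological choices the study shows can matter (daily constant-time-step differences, midpoint discharge, OLS fitting); it is an illustration of the model, not the authors' recommended combination.

```python
from math import log, exp

def fit_power_law_recession(q):
    """Fit -dQ/dt = a * Q^b to a single recession time series q (daily flows)
    by ordinary least squares of log(-dQ/dt) against log(Q)."""
    xs, ys = [], []
    for q0, q1 in zip(q, q[1:]):
        dq = q0 - q1                        # -dQ/dt with dt = 1 day
        if dq > 0:                          # keep strictly receding steps only
            xs.append(log((q0 + q1) / 2))   # midpoint discharge
            ys.append(log(dq))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)      # OLS slope = exponent b
    a = exp(my - b * mx)                    # intercept gives scale parameter a
    return a, b
```

On a synthetic exponential recession Q(t) = Q0 e^(-kt), this recovers b close to 1 and a close to k, a useful sanity check before applying any fitting-choice combination to real hydrographs.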
NASA Astrophysics Data System (ADS)
Muszynski, G.; Kashinath, K.; Wehner, M. F.; Prabhat, M.; Kurlin, V.
2017-12-01
We investigate novel approaches to detecting, classifying and characterizing extreme weather events, such as atmospheric rivers (ARs), in large high-dimensional climate datasets. ARs are narrow filaments of concentrated water vapour in the atmosphere that bring much of the precipitation in many mid-latitude regions. The precipitation associated with ARs is also responsible for major flooding events in many coastal regions of the world, including the west coast of the United States and western Europe. In this study we combine ideas from Topological Data Analysis (TDA) with Machine Learning (ML) for detecting, classifying and characterizing extreme weather events, like ARs. TDA is a new field that sits at the interface between topology and computer science, and that studies "shape", that is, hidden topological structure, in raw data. It has been applied successfully in many areas of applied sciences, including complex networks, signal processing and image recognition. Using TDA we provide ARs with a shape characteristic as a new feature descriptor for the task of AR classification. In particular, we track the change in topology in precipitable water (integrated water vapour) fields using the Union-Find algorithm. We use the generated feature descriptors with ML classifiers to establish the reliability and classification performance of our approach. We utilize the parallel toolkit for extreme climate events analysis (TECA: Petascale Pattern Recognition for Climate Science, Prabhat et al., Computer Analysis of Images and Patterns, 2015) for comparison (events identified by TECA are treated as ground truth). Preliminary results indicate that our approach brings new insight into the study of ARs and provides quantitative information about the relevance of topological feature descriptors in analyses of large climate datasets. We illustrate this method on climate model output and NCEP reanalysis datasets. 
Further, our method outperforms existing methods on detection and classification of ARs. This work illustrates that TDA combined with ML may provide a uniquely powerful approach for detection, classification and characterization of extreme weather phenomena.
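The Union-Find tracking of topological structure mentioned above can be illustrated on a toy thresholded 2D field: cells above the threshold are merged into 4-connected components, and the component count is one of the simplest topological summaries one can extract. This is a minimal sketch; real AR detection operates on global precipitable-water fields and applies additional geometric criteria beyond what is shown here.

```python
def find(parent, x):
    """Root lookup with path halving."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def count_components(grid, threshold):
    """Count 4-connected components of cells exceeding threshold,
    e.g. candidate moisture filaments in a precipitable-water field."""
    rows, cols = len(grid), len(grid[0])
    parent = {}
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold:
                parent[(r, c)] = (r, c)       # each wet cell starts alone
    for (r, c) in list(parent):
        for nr, nc in ((r + 1, c), (r, c + 1)):   # down and right neighbours
            if (nr, nc) in parent:
                ra, rb = find(parent, (r, c)), find(parent, (nr, nc))
                if ra != rb:
                    parent[ra] = rb           # union the two components
    return len({find(parent, p) for p in parent})
```

Sweeping the threshold and recording how the component count changes yields exactly the kind of shape descriptor that can then be handed to an ML classifier.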
Lu, Yan; Li, Tao
2014-03-01
To explore the effect of Chinese drugs for activating blood circulation and removing blood stasis (CDABCRBS) on carotid atherosclerotic plaque and long-term ischemic cerebrovascular events. Using an open, controlled design, the effects of 4 regimens (platelet antagonists; platelet antagonists + CDABCRBS; platelet antagonists + atorvastatin; platelet antagonists + atorvastatin + CDABCRBS) on carotid atherosclerotic plaque and long-term ischemic cerebrovascular events in 90 cerebral infarction patients were analyzed. By survival analysis, there was no statistically significant difference in the effect of the 4 interventions on the variation of carotid stenosis rates or ischemic cerebrovascular events (P > 0.05). The occurrence of ischemic cerebrovascular events could be postponed by about 4 months in those treated with platelet antagonists + CDABCRBS and platelet antagonists + atorvastatin + CDABCRBS. By multivariate logistic analysis, age, hypertension, and clopidogrel were associated with stenosis of extracranial carotid arteries (P < 0.05). Age, diabetes, aspirin, clopidogrel, and CDABCRBS were correlated with cerebrovascular accidents (P < 0.05). Comorbid hypertension influences carotid stenosis but does not affect the occurrence of ischemic cerebrovascular events. CDABCRBS could effectively delay the onset of ischemic cerebrovascular events.
Ford, Alexander C; Malfertheiner, Peter; Giguère, Monique; Santana, José; Khan, Mostafizur; Moayyedi, Paul
2008-01-01
AIM: To assess the safety of bismuth used in Helicobacter pylori (H pylori) eradication therapy regimens. METHODS: We conducted a systematic review and meta-analysis. MEDLINE and EMBASE were searched (up to October 2007) to identify randomised controlled trials comparing bismuth with placebo or no treatment, or bismuth salts in combination with antibiotics as part of eradication therapy with the same dose and duration of antibiotics alone or, in combination, with acid suppression. Total numbers of adverse events were recorded. Data were pooled and expressed as relative risks with 95% confidence intervals (CI). RESULTS: We identified 35 randomised controlled trials containing 4763 patients. There were no serious adverse events occurring with bismuth therapy. There was no statistically significant difference detected in total adverse events with bismuth [relative risk (RR) = 1.01; 95% CI: 0.87-1.16], specific individual adverse events, with the exception of dark stools (RR = 5.06; 95% CI: 1.59-16.12), or adverse events leading to withdrawal of therapy (RR = 0.86; 95% CI: 0.54-1.37). CONCLUSION: Bismuth for the treatment of H pylori is safe and well-tolerated. The only adverse event occurring significantly more commonly was dark stools. PMID:19109870
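Pooled relative risks with 95% CIs, such as the RR = 1.01 (95% CI: 0.87-1.16) reported above, are typically computed on the log scale. The sketch below shows a fixed-effect inverse-variance pooling; it illustrates one standard pooling method (the review may have used a different model), and the study counts in the test are hypothetical.

```python
from math import log, exp, sqrt

def pooled_rr(studies):
    """Fixed-effect inverse-variance pooled relative risk with 95% CI.
    studies: list of (events_trt, n_trt, events_ctl, n_ctl) tuples."""
    num = den = 0.0
    for a, n1, c, n2 in studies:
        rr = (a / n1) / (c / n2)
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # variance of log RR
        w = 1.0 / var                           # inverse-variance weight
        num += w * log(rr)
        den += w
    lrr = num / den                             # pooled log relative risk
    se = sqrt(1.0 / den)
    return exp(lrr), exp(lrr - 1.96 * se), exp(lrr + 1.96 * se)
```

A confidence interval that straddles 1 (as for total adverse events here) indicates no statistically significant excess risk, whereas an interval entirely above 1 (as for dark stools) does.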
Homaeinezhad, M R; Erfanianmoshiri-Nejad, M; Naseri, H
2014-01-01
The goal of this study is to introduce a simple, standard and safe procedure to detect and to delineate P and T waves of the electrocardiogram (ECG) signal in real conditions. The proposed method consists of four major steps: (1) a secure QRS detection and delineation algorithm, (2) a pattern recognition algorithm designed for distinguishing various ECG clusters which take place between consecutive R-waves, (3) extracting a template of the dominant events of each cluster waveform and (4) application of correlation analysis in order to delineate automatically the P- and T-waves in noisy conditions. The performance characteristics of the proposed P and T detection-delineation algorithm are evaluated against various ECG signals whose quality is degraded from the best to the worst cases based on random-walk noise theory. Also, the method is applied to the MIT-BIH Arrhythmia and the QT databases for comparing some parts of its performance characteristics with a number of P and T detection-delineation algorithms. The conducted evaluations indicate that in a signal with a low quality value of about 0.6, the proposed method detects the P and T events with sensitivity Se=85% and positive predictive value P+=89%, respectively. In addition, at the same quality, the average delineation errors associated with those ECG events are 45 and 63 ms, respectively. Stable delineation error, high detection accuracy and high noise tolerance were the most important aspects considered during development of the proposed method. © 2013 Elsevier Ltd. All rights reserved.
Whiteley, William N; Adams, Harold P; Bath, Philip MW; Berge, Eivind; Sandset, Per Morten; Dennis, Martin; Murray, Gordon D; Wong, Ka-Sing Lawrence; Sandercock, Peter AG
2013-01-01
Background Many international guidelines on the prevention of venous thromboembolism recommend targeting heparin treatment at patients with stroke who have a high risk of venous thrombotic events or a low risk of haemorrhagic events. We sought to identify reliable methods to target anticoagulant treatment and so improve the chance of avoiding death or dependence after stroke. Methods We obtained individual patient data from the five largest randomised controlled trials in acute ischaemic stroke that compared heparins (unfractionated heparin, heparinoids, or low-molecular-weight heparin) with aspirin or placebo. We developed and evaluated statistical models for the prediction of thrombotic events (myocardial infarction, stroke, deep vein thrombosis, or pulmonary embolism) and haemorrhagic events (symptomatic intracranial or significant extracranial) in the first 14 days after stroke. We calculated the absolute risk difference for the outcome “dead or dependent” in patients grouped by quartiles of predicted risk of thrombotic and haemorrhagic events with random effect meta-analysis. Findings Patients with ischaemic stroke who were of advanced age, had increased neurological impairment, or had atrial fibrillation had a high risk of both thrombotic and haemorrhagic events after stroke. Additionally, patients with CT-visible evidence of recent cerebral ischaemia were at increased risk of thrombotic events. In evaluation datasets, the area under the receiver operating characteristic curve for prediction models for thrombotic events was 0·63 (95% CI 0·59–0·67) and for haemorrhagic events was 0·60 (0·55–0·64). We found no evidence that the net benefit from heparins increased with either increasing risk of thrombotic events or decreasing risk of haemorrhagic events. Interpretation There was no evidence that patients with ischaemic stroke who were at higher risk of thrombotic events or lower risk of haemorrhagic events benefited from heparins. 
We were therefore unable to define a targeted approach to select the patients who would benefit from treatment with early anticoagulant therapy. We recommend that guidelines for routine or selective use of heparin in stroke should be revised. Funding MRC. PMID:23642343
Alessi, Edward J.; Martin, James I.; Gyamerah, Akua; Meyer, Ilan H.
2013-01-01
This mixed-methods study examined associations between prejudice events and posttraumatic stress disorder (PTSD) among 382 lesbians, gays, and bisexuals (LGB) and 126 heterosexuals. Using the Composite International Diagnostic Interview, we assessed PTSD but relaxed Criterion A1, that is, allowed prejudice events that did not involve threat to life or physical integrity to also qualify as traumatic. First, we tested whether exposure to prejudice events differed with respect to sexual orientation and race. White LGBs were more likely than White heterosexuals to encounter a prejudice event, but Black and Latino LGBs were no more likely than White LGBs to experience a prejudice event. Second, we used qualitative analysis to examine the prejudice events that precipitated relaxed Criterion A1 PTSD among 8 participants. Two specific themes emerged: the need to make major changes and compromised sense of safety and security following exposure to the prejudice event. PMID:24348008
1992-01-01
cognitive function. For example, physiological methods allow for visual sensitivity measurements in infants and children with about the same level of...potential (ERP), the event-related magnetic field (ERF), and pupillometry. Where possible, we cite specific experiments that deal with display or stimulus...technical barrier preventing the application of these methods to the analysis of human performance with color displays. Pupillometry. The pupillary
Reliability computation using fault tree analysis
NASA Technical Reports Server (NTRS)
Chelson, P. O.
1971-01-01
A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
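Assuming independent basic events, fault-tree gate probabilities combine multiplicatively, as sketched below for a hypothetical top event (A AND B) OR C. This is the simplest case only: as the abstract notes, a basic failure appearing in more than one fault path requires conditional probabilities, which this sketch does not handle.

```python
def and_gate(probs):
    """Probability that all independent input events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Probability that at least one independent input event occurs
    (complement of 'none occur')."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical top event: (A AND B) OR C, with independent basic events.
top = or_gate([and_gate([0.1, 0.2]), 0.05])
```

With P(A)=0.1, P(B)=0.2, P(C)=0.05, the subtree A AND B contributes 0.02, and the top-event probability is 1 - (1-0.02)(1-0.05) = 0.069.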
Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J
2012-02-01
The statistics usually reported from Randomized Controlled Trials (RCTs) with time-to-event outcomes are the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on numbers of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data was assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and total number of events alongside KM curves.
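For orientation, the forward Kaplan-Meier calculation that such a reconstruction must invert is short. This is a sketch of the standard estimator, not the authors' digitisation and inversion algorithm; the times and censoring flags in the test are hypothetical.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns [(t, S(t))] with one entry per distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        block = [e for tt, e in data if tt == t]   # all subjects at time t
        d = sum(block)                             # events at time t
        if d > 0:
            s *= 1.0 - d / n_at_risk               # KM product-limit step
            curve.append((t, s))
        n_at_risk -= len(block)                    # drop events and censorings
        i += len(block)
    return curve
```

The inversion direction works backwards: given digitised S(t) steps plus reported numbers at risk, one can solve these same product-limit equations for the event and censoring counts in each interval, which is why reporting numbers at risk alongside the curve matters so much.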
ERIC Educational Resources Information Center
Gross, Jacob P. K.; Torres, Vasti
2010-01-01
Using a competing risks event history model this study explores the effects of differentiated forms of financial aid on the postsecondary enrollment patterns of Latino college students in Indiana. Much of the prior research on financial aid has employed cross-sectional methods, which assume that the effects of aid do not vary across time. This…
Microlensing observations rapid search for exoplanets: MORSE code for GPUs
NASA Astrophysics Data System (ADS)
McDougall, Alistair; Albrow, Michael D.
2016-02-01
The rapid analysis of ongoing gravitational microlensing events has been integral to the successful detection and characterization of cool planets orbiting low-mass stars in the Galaxy. In this paper, we present an implementation of search and fit techniques on graphical processing unit (GPU) hardware. The method allows for the rapid identification of candidate planetary microlensing events and their subsequent follow-up for detailed characterization.
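For context, candidate events in such searches are typically first compared against the standard point-source point-lens (PSPL) model; a minimal sketch of that lightcurve model (not the MORSE GPU code) is:

```python
import math

def pspl_magnification(t, t0, tE, u0):
    """Standard point-source point-lens magnification A(u(t)) with
    u(t) = sqrt(u0^2 + ((t - t0)/tE)^2): t0 = peak time, tE = Einstein
    crossing time, u0 = impact parameter (in Einstein radii)."""
    u = math.sqrt(u0 * u0 + ((t - t0) / tE) ** 2)
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

# Peak magnification of an event with u0 = 0.1 is about 10
peak = pspl_magnification(0.0, t0=0.0, tE=20.0, u0=0.1)
```

Departures of an observed lightcurve from this single-lens form are what flag candidate planetary anomalies for follow-up.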
Automated defect spatial signature analysis for semiconductor manufacturing process
Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed
1999-01-01
An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high-level categories, classifying the categorized data contained in each high-level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.
ERIC Educational Resources Information Center
Kim, Hongsoo; Capezuti, Elizabeth; Kovner, Christine; Zhao, Zhonglin; Boockvar, Kenneth
2010-01-01
Purpose of the Study: To examine the effects of the present on admission (POA) indicator on the prevalence of and factors associated with postsurgical adverse events in older patients. Design and Methods: This is a secondary data analysis of 82,898 surgical patients aged 65 years or older in 252 acute care hospitals in California in 2004. Four…
Top-down proteomics for the analysis of proteolytic events - Methods, applications and perspectives.
Tholey, Andreas; Becker, Alexander
2017-11-01
Mass spectrometry based proteomics is an indispensable tool for almost all research areas relevant for the understanding of proteolytic processing, ranging from the identification of substrates, products and cleavage sites up to the analysis of structural features influencing protease activity. The majority of methods for these studies are based on bottom-up proteomics performing analysis at peptide level. As this approach is characterized by a number of pitfalls, e.g. loss of molecular information, there is an ongoing effort to establish top-down proteomics, performing separation and MS analysis both at intact protein level. We briefly introduce major approaches of bottom-up proteomics used in the field of protease research and highlight the shortcomings of these methods. We then discuss the present state-of-the-art of top-down proteomics. Together with the discussion of known challenges we show the potential of this approach and present a number of successful applications of top-down proteomics in protease research. This article is part of a Special Issue entitled: Proteolysis as a Regulatory Event in Pathophysiology edited by Stefan Rose-John. Copyright © 2017 Elsevier B.V. All rights reserved.
An analysis of high-impact, low-predictive skill severe weather events in the northeast U.S.
NASA Astrophysics Data System (ADS)
Vaughan, Matthew T.
An objective evaluation of Storm Prediction Center slight-risk convective outlooks, as well as a method to identify high-impact severe weather events with poor predictive skill, are presented in this study. The objectives are to assess severe weather forecast skill over the northeast U.S. relative to the continental U.S., build a climatology of high-impact, low-predictive skill events between 1980 and 2013, and investigate the dynamic and thermodynamic differences between severe weather events with low predictive skill and high predictive skill over the northeast U.S. Severe storm reports of hail, wind, and tornadoes are used to calculate skill scores, including probability of detection (POD), false alarm ratio (FAR), and threat score (TS), for each convective outlook. Low-predictive skill events are binned into low POD (type 1) and high FAR (type 2) categories to assess the temporal variability of low-predictive skill events. Type 1 events were found to occur in every year of the dataset, with an average of 6 events per year. Type 2 events occur less frequently and are more common in the earlier half of the study period. An event-centered composite analysis is performed on the low-predictive skill database using the National Centers for Environmental Prediction Climate Forecast System Reanalysis 0.5° gridded dataset to analyze the dynamic and thermodynamic conditions prior to high-impact severe weather events with varying predictive skill. Deep-layer vertical shear between 1000 and 500 hPa is found to be a significant discriminator of slight-risk forecast skill: high-impact events with less than 31-kt shear have lower threat scores than high-impact events with higher shear values. Case study analysis of type 1 events suggests the environment over which severe weather occurs is characterized by high downdraft convective available potential energy, steep low-level lapse rates, and high lifting condensation level heights that contribute to an elevated risk of severe wind.
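The verification scores named above come from the standard 2x2 contingency table; a minimal sketch, with hypothetical counts:

```python
def forecast_skill(hits, misses, false_alarms):
    """Standard contingency-table skill scores for outlook verification:
    probability of detection (POD), false alarm ratio (FAR), and threat
    score (TS, also called the critical success index)."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    ts = hits / (hits + misses + false_alarms)
    return pod, far, ts

# e.g. 30 verified reports inside outlooks, 10 missed, 20 false alarms
pod, far, ts = forecast_skill(30, 10, 20)
```

Binning events by low POD versus high FAR, as in the study, separates "missed" events from "overforecast" ones even when their threat scores are similar.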
Cost analysis of adjustments of the epidemiological surveillance system to mass gatherings.
Zieliński, Andrzej
2011-01-01
The article addresses the economic analysis of public health activities at mass gatherings. After a brief review of basic economic approaches to cost analysis, the author examines the applicability of those methods to the planning of mass gatherings. Difficulties in comparing different events and the lack of outcome data at the planning stage make most economic approaches unsuitable for application during planning. Even the applicability of cost-minimization analysis may be limited to comparing the predicted costs of preconceived standards of epidemiological surveillance. Cost-effectiveness analysis performed ex post, after the event, when both costs and effects are known, may provide more information for the future selection of the most effective procedures.
Evidential Networks for Fault Tree Analysis with Imprecise Knowledge
NASA Astrophysics Data System (ADS)
Yang, Jianping; Huang, Hong-Zhong; Liu, Yu; Li, Yan-Feng
2012-06-01
Fault tree analysis (FTA), one of the most powerful tools in reliability engineering, has been widely used to enhance system quality attributes. In most fault tree analyses, precise values are adopted to represent the probabilities of occurrence of the events. Due to the lack of sufficient data, or the imprecision of existing data at the early stage of product design, it is often difficult to accurately estimate the failure rates of individual events or the probabilities of their occurrence. Such imprecision and uncertainty therefore need to be taken into account in reliability analysis. In this paper, evidential networks (EN) are employed to quantify and propagate this uncertainty and imprecision in fault tree analysis. The detailed processes for converting fault tree (FT) logic gates to EN are described. The figures of the logic gates and the converted equivalent EN, together with the associated truth tables and conditional belief mass tables, are also presented. A new epistemic importance measure is proposed to describe the effect of the degree of ignorance about an event. The fault tree of an aircraft engine damaged by oil filter plugs is presented to demonstrate the proposed method.
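The effect of imprecise inputs on gate outputs can be illustrated with interval probabilities. This sketch propagates [belief, plausibility] bounds through AND/OR gates assuming independent inputs; it is a simplification of the conditional belief-mass tables used in the paper, and the numbers are hypothetical.

```python
def and_gate(p1, p2):
    """AND gate for imprecise probabilities given as (belief, plausibility)
    intervals, assuming independent input events; the product is monotone,
    so interval endpoints map to interval endpoints."""
    return (p1[0] * p2[0], p1[1] * p2[1])

def or_gate(p1, p2):
    """OR gate for (belief, plausibility) intervals, independent inputs."""
    return (1 - (1 - p1[0]) * (1 - p2[0]),
            1 - (1 - p1[1]) * (1 - p2[1]))

# Two imprecisely known basic events feeding an OR gate
top = or_gate((0.01, 0.03), (0.02, 0.05))
```

The width of the output interval is one crude proxy for how much the ignorance about basic events matters at the top event, which is the intuition behind an epistemic importance measure.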
Nonparametric methods for drought severity estimation at ungauged sites
NASA Astrophysics Data System (ADS)
Sadri, S.; Burn, D. H.
2012-12-01
The objective of frequency analysis is to estimate, for extreme events such as droughts of a given severity or duration, the relationship between the event magnitude and the associated return period at a catchment. Neural networks and other artificial intelligence approaches to function estimation and regression analysis are relatively new techniques in engineering, providing an attractive alternative to traditional statistical models. There are, however, few applications of neural networks and support vector machines in the area of severity quantile estimation for drought frequency analysis. In this paper, we compare three methods for this task: multiple linear regression, radial basis function neural networks, and least squares support vector regression (LS-SVR). The area selected for this study includes 32 catchments in the Canadian Prairies. Drought severities extracted from each catchment are fitted to a Pearson type III distribution, and the resulting quantiles act as the observed values. For each method-duration pair, we use a jackknife algorithm to produce estimated values at each site. The results from the three approaches are compared and analyzed, and it is found that LS-SVR provides the best quantile estimates and extrapolation capacity.
Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis
NASA Technical Reports Server (NTRS)
Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
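The idea of a Pc distribution rather than a point value can be sketched with a toy Monte Carlo: sample hypothetical spreads for the covariance scale and hard-body radius, and estimate a simple circular-Gaussian encounter-plane Pc for each draw. The distributions and parameters below are illustrative assumptions, not the CARA models.

```python
import math
import random

def pc_samples(miss, sigma_nom, hbr_nom, n_outer=100, n_inner=1000, seed=1):
    """Monte Carlo sketch of a Pc distribution: draw a covariance scale
    factor and a hard-body radius from hypothetical spreads, then estimate
    a simple 2-D encounter-plane Pc for each draw."""
    rng = random.Random(seed)
    pcs = []
    for _ in range(n_outer):
        sigma = sigma_nom * math.exp(rng.gauss(0.0, 0.3))  # covariance uncertainty
        hbr = hbr_nom * rng.uniform(0.8, 1.2)              # hard-body-radius uncertainty
        hits = 0
        for _ in range(n_inner):
            x = rng.gauss(miss, sigma)  # relative position at closest approach
            y = rng.gauss(0.0, sigma)
            if x * x + y * y < hbr * hbr:
                hits += 1
        pcs.append(hits / n_inner)
    return sorted(pcs)

# Distribution of plausible Pc values for a 200 m miss, 100 m sigma, 20 m HBR
pcs = pc_samples(miss=200.0, sigma_nom=100.0, hbr_nom=20.0)
```

Percentiles of the sorted `pcs` list then stand in for the density function the abstract describes: an event's risk posture can change depending on whether one looks at the nominal Pc or at its upper tail.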
Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter; Ghali, William A; Marshall, Deborah A
2018-01-01
Kaplan-Meier survival analysis overestimates cumulative incidence in competing risks (CRs) settings. The extent of overestimation (or its clinical significance) has been questioned, and CRs methods are infrequently used. This meta-analysis compares the Kaplan-Meier method to the cumulative incidence function (CIF), a CRs method. We searched MEDLINE, EMBASE, BIOSIS Previews, Web of Science (1992-2016), and article bibliographies for studies estimating cumulative incidence using the Kaplan-Meier method and CIF. For studies with sufficient data, we calculated pooled risk ratios (RRs) comparing Kaplan-Meier and CIF estimates using DerSimonian and Laird random effects models. We performed stratified meta-analyses by clinical area, rate of CRs (CRs/events of interest), and follow-up time. Of 2,192 identified abstracts, we included 77 studies in the systematic review and meta-analyzed 55. The pooled RR demonstrated the Kaplan-Meier estimate was 1.41 [95% confidence interval (CI): 1.36, 1.47] times higher than the CIF. Overestimation was highest among studies with high rates of CRs [RR = 2.36 (95% CI: 1.79, 3.12)], studies related to hepatology [RR = 2.60 (95% CI: 2.12, 3.19)], and obstetrics and gynecology [RR = 1.84 (95% CI: 1.52, 2.23)]. The Kaplan-Meier method overestimated the cumulative incidence across 10 clinical areas. Using CRs methods will ensure accurate results inform clinical and policy decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
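The overestimation being quantified can be reproduced on toy data: with competing risks, 1 - KM (competing events treated as censored) exceeds the Aalen-Johansen cumulative incidence. A minimal sketch (not the meta-analysis code), assuming distinct event times and no true censoring:

```python
def one_minus_km_vs_cif(times, causes):
    """Compare 1 - KM (competing events censored) with the cumulative
    incidence function (CIF) for cause 1 on competing-risks data.

    times:  distinct event times (every subject has some event)
    causes: 1 = event of interest, 2 = competing event
    Returns (1 - KM, CIF) at the last event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    surv_all = 1.0   # all-cause survival S(t-), used by Aalen-Johansen
    km_cause1 = 1.0  # KM for cause 1, competing events treated as censored
    cif = 0.0
    at_risk = len(times)
    for i in order:
        if causes[i] == 1:
            cif += surv_all / at_risk         # Aalen-Johansen increment
            km_cause1 *= 1.0 - 1.0 / at_risk  # KM step (cause-1 events only)
        surv_all *= 1.0 - 1.0 / at_risk       # any event removes a subject
        at_risk -= 1
    return 1.0 - km_cause1, cif

# Four subjects, alternating event of interest (1) and competing event (2)
over, cif = one_minus_km_vs_cif([1, 2, 3, 4], [1, 2, 1, 2])
```

Here 1 - KM is 0.625 while the CIF is 0.5: the KM approach implicitly lets subjects who had the competing event remain "eligible" for the event of interest, which is the mechanism behind the pooled risk ratios above.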
Ruckart, Perri Z; Wattigney, Wendy A; Kaye, Wendy E
2004-01-01
Background Releases of hazardous materials can cause substantial morbidity and mortality. To reduce and prevent the public health consequences (victims or evacuations) from uncontrolled or illegally released hazardous substances, a more comprehensive analysis is needed to determine risk factors for hazardous materials incidents. Methods Hazardous Substances Emergency Events Surveillance (HSEES) data from 1996 through 2001 were analyzed using bivariate and multiple logistic regression. Fixed-facility and transportation-related events were analyzed separately. Results For fixed-facility events, 2,327 (8%) resulted in at least one victim and 2,844 (10%) involved ordered evacuations. For transportation-related events, 759 (8%) resulted in at least one victim, and 405 (4%) caused evacuation orders. Fire and/or explosion were the strongest risk factors for events involving either victims or evacuations. Stratified analysis of fixed-facility events involving victims showed a strong association for acid releases in the agriculture, forestry, and fisheries industry. Chlorine releases in fixed-facility events resulted in victims and evacuations in more industry categories than any other substance. Conclusions Outreach efforts should focus on preventing and preparing for fires and explosions, acid releases in the agricultural industry, and chlorine releases in fixed facilities. PMID:15496226
Reducing Information Overload in Large Seismic Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.
2000-08-02
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site.
As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. They could also be used to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize, and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
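A bare-bones stand-in for waveform-correlation clustering (not the dendrogram tool itself) can be sketched as single-linkage grouping on maximum normalised cross-correlation; the threshold and the synthetic signals below are illustrative:

```python
import numpy as np

def correlation_clusters(waveforms, threshold=0.8):
    """Single-linkage grouping of waveforms whose maximum normalised
    cross-correlation exceeds a threshold (union-find over event pairs)."""
    n = len(waveforms)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            a = waveforms[i] - waveforms[i].mean()
            b = waveforms[j] - waveforms[j].mean()
            cc_max = np.correlate(a, b, mode="full").max()
            cc_max /= np.linalg.norm(a) * np.linalg.norm(b)
            if cc_max >= threshold:
                parent[find(i)] = find(j)  # merge the two clusters

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

# Two time-shifted copies of one event plus an unrelated noise trace
t = np.linspace(0.0, 1.0, 200)
w1 = np.sin(40 * t) * np.exp(-4 * t)
w2 = np.roll(w1, 10)
w3 = np.random.default_rng(0).normal(size=200)
groups = correlation_clusters([w1, w2, w3])
```

The shifted copy joins its parent event while the noise trace stays alone; a full dendrogram additionally records the correlation level at which each merge happens, supporting multi-station clustering as described above.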
Learn about the EPA guide (Selected Analytical Methods for Environmental Remediation and Recovery), which helps labs around the country quickly select the appropriate environmental testing and analysis methods to use after a wide-scale chemical event.
40 CFR 60.2120 - Affirmative Defense for Exceedance of an Emission Limit During Malfunction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Standards of Performance for Commercial and Industrial Solid Waste Incineration Units Emission Limitations... monitoring methods and engineering judgment, the amount of excess emissions that were the result of the... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
Predictability of rogue events.
Birkholz, Simon; Brée, Carsten; Demircan, Ayhan; Steinmeyer, Günter
2015-05-29
Using experimental data from three different rogue-wave-supporting systems, the determinism and predictability of the underlying dynamics are evaluated with methods of nonlinear time series analysis. We included original records from the Draupner platform in the North Sea as well as time series from two optical systems in our analysis. One of the latter was measured in the infrared tail of optical fiber supercontinua, the other in the fluence profiles of multifilaments. All three data sets exhibit extreme-value statistics and exceed the significant wave height in the respective system by a factor larger than 2. Nonlinear time series analysis indicates a different degree of determinism in the systems. The optical fiber scenario is found to be driven by quantum noise, whereas rogue waves emerge as a consequence of turbulence in the other two. With the large number of rogue events observed in the multifilament system, we can systematically explore the predictability of such events in a turbulent system. We observe that rogue events do not necessarily appear without warning but are often preceded by a short phase of relative order. This surprising finding sheds new light on the fascinating phenomenon of rogue waves.
Dimensional Representation and Gradient Boosting for Seismic Event Classification
NASA Astrophysics Data System (ADS)
Semmelmayer, F. C.; Kappedal, R. D.; Magana-Zook, S. A.
2017-12-01
In this research, we conducted experiments with representational structures for 5009 seismic signals with the intent of finding a method to classify signals as either an explosion or an earthquake in an automated fashion, and applied a gradient boosted classifier. While perfect classification was not attained (our best model reached approximately 88%), in many cases events can be filtered out as having a very high probability of being explosions or earthquakes, diminishing subject-matter experts' (SME) workload for first-stage analysis. It is our hope that these methods can be refined, further increasing the classification performance.
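A from-scratch miniature of gradient boosting with decision stumps (L2 boosting on +/-1 labels) conveys the idea; this is not the study's classifier or its features, and the synthetic two-class data below are illustrative:

```python
import numpy as np

def fit_stump(x, residual):
    """Least-squares best single-feature threshold stump for the residual."""
    best = None
    for f in range(x.shape[1]):
        xs = np.sort(x[:, f])
        for i in range(1, len(xs)):
            if xs[i] == xs[i - 1]:
                continue
            thr = 0.5 * (xs[i - 1] + xs[i])
            mask = x[:, f] <= thr
            left, right = residual[mask].mean(), residual[~mask].mean()
            err = float(((residual - np.where(mask, left, right)) ** 2).sum())
            if best is None or err < best[0]:
                best = (err, f, thr, left, right)
    return best[1:]

def gradient_boost(x, y, n_rounds=20, lr=0.5):
    """L2 gradient boosting with decision stumps for +/-1 labels: each
    round fits a stump to the current residual and takes a shrunken step."""
    pred = np.zeros(len(y))
    stumps = []
    for _ in range(n_rounds):
        f, thr, left, right = fit_stump(x, y - pred)
        pred += lr * np.where(x[:, f] <= thr, left, right)
        stumps.append((f, thr, left, right))
    return stumps, pred

# Synthetic two-feature discriminants with a linear class boundary
rng = np.random.default_rng(2)
x = rng.normal(size=(120, 2))
y = np.where(x[:, 0] + 0.5 * x[:, 1] > 0, 1.0, -1.0)
stumps, pred = gradient_boost(x, y)
accuracy = float((np.sign(pred) == y).mean())
```

The magnitude of `pred` doubles as a confidence score, so events far from zero can be auto-labeled while borderline cases go to an analyst, matching the workload-reduction strategy in the abstract.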
WFIRST: Microlensing Analysis Data Challenge
NASA Astrophysics Data System (ADS)
Street, Rachel; WFIRST Microlensing Science Investigation Team
2018-01-01
WFIRST will produce thousands of high-cadence, high-photometric-precision lightcurves of microlensing events, from which a wealth of planetary and stellar systems will be discovered. However, the analysis of such lightcurves has historically been very time consuming and expensive in both labor and computing facilities. This poses a potential bottleneck to deriving the full science potential of the WFIRST mission. To address this problem, the WFIRST Microlensing Science Investigation Team is designing a series of data challenges to stimulate research on outstanding problems of microlensing analysis. These range from the classification and modeling of triple-lens events to methods for efficiently yet thoroughly searching a high-dimensional parameter space for the best-fitting models.
NASA Astrophysics Data System (ADS)
Matsuda, Takashi S.; Nakamura, Takuji; Shiokawa, Kazuo; Tsutsumi, Masaki; Suzuki, Hidehiko; Ejiri, Mitsumu K.; Taguchi, Makoto
Atmospheric gravity waves (AGWs), which are generated in the lower atmosphere, transport a significant amount of energy and momentum into the mesosphere and lower thermosphere and cause mean-wind accelerations in the mesosphere. This momentum deposition drives the general circulation and affects the temperature structure. Among the many parameters that characterize AGWs, the horizontal phase velocity is very important for discussing vertical propagation. Airglow imaging is a useful technique for investigating the horizontal structures of AGWs at around 90 km altitude. Recently, there have been many reports on the statistical characteristics of AGWs observed by airglow imaging. However, comparing results obtained at various locations is difficult because each research group uses its own method for extracting and analyzing AGW events. We have developed a new statistical analysis method for obtaining the power spectrum in the horizontal phase velocity domain from airglow image data, so as to deal with huge amounts of imaging data obtained in different years and at various observation sites, without the bias caused by observer-dependent event extraction criteria. This method was applied to the data obtained at Syowa Station, Antarctica, in 2011 and compared with a conventional event analysis in which the phase fronts were traced manually to estimate horizontal characteristics. This comparison shows that our new method is adequate for deriving the horizontal phase velocity characteristics of AGWs observed by the airglow imaging technique. We plan to apply this method to airglow imaging data observed at Syowa Station in 2002 and between 2008 and 2013, and also to data observed at other stations in Antarctica (e.g., Rothera Station (67S, 68W) and Halley Station (75S, 26W)), in order to investigate the behavior of AGW propagation direction and source distribution in the MLT region over Antarctica.
In this presentation, we will report interim analysis results of the data from Syowa Station.
Risk-Stratified Imputation in Survival Analysis
Kennedy, Richard E.; Adragni, Kofi P.; Tiwari, Hemant K.; Voeks, Jenifer H.; Brott, Thomas G.; Howard, George
2013-01-01
Background Censoring that is dependent on covariates associated with survival can arise in randomized trials due to changes in recruitment and eligibility criteria to minimize withdrawals, potentially leading to biased treatment effect estimates. Imputation approaches have been proposed to address censoring in survival analysis; and while these approaches may provide unbiased estimates of treatment effects, imputation of a large number of outcomes may over- or underestimate the associated variance based on the imputation pool selected. Purpose We propose an improved method, risk-stratified imputation, as an alternative to address withdrawal related to the risk of events in the context of time-to-event analyses. Methods Our algorithm performs imputation from a pool of replacement subjects with similar values of both treatment and covariate(s) of interest, that is, from a risk-stratified sample. This stratification prior to imputation addresses the requirement of time-to-event analysis that censored observations are representative of all other observations in the risk group with similar exposure variables. We compared our risk-stratified imputation to case deletion and bootstrap imputation in a simulated dataset in which the covariate of interest (study withdrawal) was related to treatment. A motivating example from a recent clinical trial is also presented to demonstrate the utility of our method. Results In our simulations, risk-stratified imputation gives estimates of treatment effect comparable to bootstrap and auxiliary variable imputation while avoiding inaccuracies of the latter two in estimating the associated variance. Similar results were obtained in analysis of clinical trial data. Limitations Risk-stratified imputation has little advantage over other imputation methods when covariates of interest are not related to treatment, although its performance is superior when covariates are related to treatment. 
Risk-stratified imputation is intended for categorical covariates, and may be sensitive to the width of the matching window if continuous covariates are used. Conclusions The use of the risk-stratified imputation should facilitate the analysis of many clinical trials, in which one group has a higher withdrawal rate that is related to treatment. PMID:23818434
Reaction Event Counting Statistics of Biopolymer Reaction Systems with Dynamic Heterogeneity.
Lim, Yu Rim; Park, Seong Jun; Park, Bo Jung; Cao, Jianshu; Silbey, Robert J; Sung, Jaeyoung
2012-04-10
We investigate the reaction event counting statistics (RECS) of an elementary biopolymer reaction in which the rate coefficient depends on the states of the biopolymer and the surrounding environment, and discover a universal kinetic phase transition in the RECS of a reaction system with dynamic heterogeneity. From an exact analysis of a general model of elementary biopolymer reactions, we find that the variance in the number of reaction events depends on the square of the mean number of reaction events when the measurement time is short compared with the relaxation time scale of the rate-coefficient fluctuations, which does not conform to renewal statistics. On the other hand, when the measurement time interval is much greater than the relaxation time of the rate-coefficient fluctuations, the variance becomes linearly proportional to the mean reaction number, in accordance with renewal statistics. Gillespie's stochastic simulation method is generalized for reaction systems with a fluctuating rate coefficient. The simulation results confirm the correctness of the analytic results for the time-dependent mean and variance of the reaction event number distribution. On the basis of the obtained results, we propose a method of quantitative analysis for the reaction event counting statistics of reaction systems with rate-coefficient fluctuations, which enables one to extract information about the magnitude and relaxation times of the fluctuating reaction rate coefficient without the bias that can be introduced by assuming a particular kinetic model of conformational dynamics and conformation-dependent reactivity. An exact relationship is established between a higher moment of the reaction event number distribution and the multitime correlation of the reaction rate, for reaction systems with a nonequilibrium initial state distribution as well as for systems with the equilibrium initial state distribution.
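A generalized Gillespie scheme of this kind can be sketched as a competing-risks simulation in which the environment switch is just another reaction channel; the two-state rates and switching rate below are illustrative, not the paper's model. With slow switching the Fano factor (variance/mean of event counts) greatly exceeds 1; with fast switching it approaches the renewal value of 1.

```python
import random

def count_events(T, k, gamma, rng):
    """One Gillespie trajectory of a single reaction whose rate coefficient
    switches between two values k[0], k[1] (dynamic heterogeneity) with
    switching rate gamma; returns the number of reaction events in [0, T]."""
    t, n = 0.0, 0
    state = rng.randint(0, 1)               # random initial environment state
    while True:
        total = k[state] + gamma            # reaction channel + switch channel
        t += rng.expovariate(total)
        if t > T:
            return n
        if rng.random() < k[state] / total:
            n += 1                          # a reaction event fired
        else:
            state = 1 - state               # the environment switched

def fano(T, k, gamma, trials=500, seed=3):
    """Variance-to-mean ratio of the event count over many trajectories."""
    rng = random.Random(seed)
    counts = [count_events(T, k, gamma, rng) for _ in range(trials)]
    m = sum(counts) / trials
    v = sum((c - m) ** 2 for c in counts) / trials
    return v / m

slow = fano(10.0, (1.0, 9.0), 0.01)   # switching much slower than measurement
fast = fano(10.0, (1.0, 9.0), 100.0)  # switching much faster than measurement
```

The contrast between the two regimes mirrors the phase transition described above: a super-Poissonian count distribution when the measurement window is short relative to the rate-coefficient relaxation, and near-renewal statistics when it is long.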
A systematic comparison of recurrent event models for application to composite endpoints.
Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine
2018-01-04
Many clinical trials focus on the comparison of the treatment effect between two or more groups with respect to a rarely occurring event. In this situation, showing a relevant effect with acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints can be analyzed with models for recurrent events. A number of such models exist, e.g., regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams, and Peterson; or Wei, Lin, and Weissfeld. Although some of these methods have already been compared in the literature, there exists no systematic investigation of the special requirements regarding composite endpoints. Within this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators that can deviate considerably under commonly met data scenarios. Based on the conducted simulation study, this paper helps to clarify the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
Pertinent anatomy and analysis for midface volumizing procedures.
Surek, Christopher C; Beut, Javier; Stephens, Robert; Jelks, Glenn; Lamb, Jerome
2015-05-01
The study was conducted to construct an anatomically inspired midfacial analysis facilitating safe, accurate, and dynamic nonsurgical rejuvenation. Emphasis is placed on determining injection target areas and adverse event zones. Twelve hemifacial fresh cadavers were dissected in a layered fashion. Dimensional measurements between the midfacial fat compartments, prezygomatic space, mimetic muscles, and neurovascular bundles were used to develop a topographic analysis for clinical injections. A longitudinal line from the base of the alar crease to the medial edge of the levator anguli oris muscle (1.9 cm), lateral edge of the levator anguli oris muscle (2.6 cm), and zygomaticus major muscle (4.6 cm) partitions the cheek into two aesthetic regions. A six-step facial analysis outlines three target zones and two adverse event zones and triangulates the point of maximum cheek projection. The lower adverse event zone yields an anatomical explanation to inadvertent jowling during anterior cheek injection. The upper adverse event zone localizes the palpebral branch of the infraorbital artery. The medial malar target area isolates quadrants for anterior cheek projection and tear trough effacement. The middle malar target area addresses lid-cheek blending and superficial compartment turgor. The lateral malar target area highlights lateral cheek projection and locates the prezygomatic space. This stepwise analysis illustrates target areas and adverse event zones to achieve midfacial support, contour, and profile in the repose position and simultaneous molding of a natural shape during animation. This reproducible method can be used both procedurally and in record-keeping for midface volumizing procedures.
NASA Astrophysics Data System (ADS)
Hu, H.; Ge, Y. J.
2013-11-01
As social networking and the socialisation of networks bring ever more text information and social relationships into our daily lives, the question of whether big data can be fully used to study phenomena in the natural sciences has prompted many specialists and scholars to innovate in their research. Although politics has been integrally involved in hyperlinked-world issues since the 1990s, and the automatic assembly of geospatial web services and distributed geospatial information systems using service chaining has been explored and built recently, the collection and visualisation of geo-event data have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness, and unexpectedness of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency, and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology with text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie chart, individual mention ratios, and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the space-time relationships of text information and to trace the dissemination of geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion on political events.
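The word-frequency step of such a text analysis can be sketched minimally as follows (illustrative only; the study's Heritrix-based pipeline is far richer, and the stopword list here is a placeholder):

```python
from collections import Counter

def word_frequencies(documents, stopwords=frozenset({"the", "a", "of"})):
    """Count content-word frequencies over crawled page texts; the top
    entries would drive a tag cloud or frequency map."""
    counts = Counter()
    for text in documents:
        for token in text.lower().split():
            token = token.strip(".,;:!?\"'()")
            if token and token not in stopwords:
                counts[token] += 1
    return counts

pages = ["The island incident dominated the news.",
         "Opinions on the incident spread quickly."]
freq = word_frequencies(pages)
```

Sentiment tendency and dissemination paths would be layered on top of the same tokenised corpus.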
Nazarzadeh, Kimia; Arjunan, Sridhar P; Kumar, Dinesh K; Das, Debi Prasad
2016-08-01
In this study, we analyzed the accelerometer data recorded during gait analysis of Parkinson disease patients to detect freezing of gait (FOG) episodes. The proposed method filters the recordings to reduce noise in the leg movement signals and computes wavelet coefficients to detect FOG events. A publicly available FOG database was used, and the technique was evaluated using receiver operating characteristic (ROC) analysis. Results show a higher performance of the wavelet feature in discriminating FOG events from background activity when compared with the existing technique.
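As an illustration of the idea, a single-level Haar transform can flag abrupt changes in an accelerometer trace; this is a generic sketch with a made-up threshold, not the authors' exact wavelet feature:

```python
def haar_detail(signal):
    """Single-level Haar wavelet detail coefficients: scaled differences of
    adjacent sample pairs; large magnitudes mark abrupt signal changes."""
    return [(signal[i] - signal[i + 1]) / 2 ** 0.5
            for i in range(0, len(signal) - 1, 2)]

def flag_events(signal, threshold):
    """Indices of detail coefficients whose energy exceeds the threshold."""
    return [i for i, d in enumerate(haar_detail(signal)) if d * d > threshold]

accel = [0.0, 0.1, 0.0, 0.1, 2.0, -2.0, 0.1, 0.0]  # burst at samples 4-5
events = flag_events(accel, threshold=1.0)
```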
NASA Astrophysics Data System (ADS)
Tramutoli, V.; Inan, S.; Jakowski, N.; Pulinets, S.; Romanov, A.; Filizzola, C.; Shagimuratov, I.; Pergola, N.; Genzano, N.; Lisi, M.; Alparslan, E.; Wilken, V.; Tsybulia, K.; Romanov, A.; Paciello, R.; Balasco, M.; Zakharenkova, I.; Ouzounov, D.; Papadopoulos, G. A.; Parrot, M.
2012-04-01
The PRE-EARTHQUAKES (Processing Russian and European EARTH observations for earthQUAKE precursors Studies) EU-FP7 project is devoted to demonstrating - by integrating different observational data and comparing and improving different data analysis methods - how it is possible to progressively increase the reliability of short-term seismic risk assessment. Three main testing areas were selected (Italy, Turkey, and Sakhalin) in order to concentrate observation and integration efforts, starting with a learning phase on selected past events devoted to identifying the most suitable parameters, observation technologies, and data analysis algorithms. To this aim, events offering the greatest possibilities (variety) of integration were particularly considered - the Abruzzo EQ (April 6th 2009, Mw 6.3) for Italy, the Elazig EQ (March 8th 2010, Mw 6.1) for Turkey, and the Nevelsk EQ (August 2nd 2007, Mw 6.2) for Sakhalin - without excluding other significant events that occurred during 2011, such as those of Tōhoku in Japan and Van in Turkey. For these events, different ground-based (80 radon and 29 spring water stations in the Turkey region, 2 magneto-telluric stations in Italy) and satellite-based (18 different systems) observations, and 11 data analysis methods for 7 measured parameters, have been compared and integrated. Results achieved by applying a validation/confutation approach devoted to evaluating the presence/absence of anomalous space-time transients in single and/or integrated observation time series will be discussed, also in comparison with results independently achieved by other authors.
Challenges of Guarantee-Time Bias
Giobbie-Hurder, Anita; Gelber, Richard D.; Regan, Meredith M.
2013-01-01
The potential for guarantee-time bias (GTB), also known as immortal time bias, exists whenever an analysis that is timed from enrollment or random assignment, such as disease-free or overall survival, is compared across groups defined by a classifying event occurring sometime during follow-up. The types of events associated with GTB are varied and may include the occurrence of objective disease response, onset of toxicity, or seroconversion. However, comparative analyses using these types of events as predictors are different from analyses using baseline characteristics that are specified completely before the occurrence of any outcome event. Recognizing the potential for GTB is not always straightforward, and it can be challenging to know when GTB is influencing the results of an analysis. This article defines GTB, provides examples of GTB from several published articles, and discusses three analytic techniques that can be used to remove the bias: conditional landmark analysis, extended Cox model, and inverse probability weighting. The strengths and limitations of each technique are presented. As an example, we explore the effect of bisphosphonate use on disease-free survival (DFS) using data from the BIG (Breast International Group) 1-98 randomized clinical trial. An analysis using a naive approach showed substantial benefit for patients who received bisphosphonate therapy. In contrast, analyses using the three methods known to remove GTB showed no statistical evidence of a reduction in risk of a DFS event with bisphosphonate therapy. PMID:23835712
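The setup of a conditional landmark analysis, the first of the three techniques, can be sketched as follows (hypothetical field names; illustrative only): patients are kept only if still outcome-free at the landmark time, exposure status is frozen at the landmark, and the survival clock restarts there.

```python
def landmark_cohort(patients, landmark):
    """Conditional landmark analysis setup: keep only patients still
    outcome-free at the landmark time, classify them by whether the
    time-dependent exposure occurred BEFORE the landmark, and restart
    the clock at the landmark. Each patient is a dict with
    'outcome_time' and 'exposure_time' (None if never exposed)."""
    cohort = []
    for p in patients:
        if p["outcome_time"] <= landmark:  # event before landmark: excluded
            continue
        exposed = (p["exposure_time"] is not None
                   and p["exposure_time"] <= landmark)
        cohort.append({"exposed": exposed,
                       "time": p["outcome_time"] - landmark})
    return cohort

pts = [{"outcome_time": 3.0, "exposure_time": 1.0},   # excluded
       {"outcome_time": 10.0, "exposure_time": 2.0},  # exposed at landmark
       {"outcome_time": 8.0, "exposure_time": 7.0}]   # exposed only later
cohort = landmark_cohort(pts, landmark=5.0)
```

Freezing exposure at the landmark is exactly what removes the guarantee time: a patient can no longer be classified by an event they had not yet lived to experience.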
Merging of rain gauge and radar data for urban hydrological modelling
NASA Astrophysics Data System (ADS)
Berndt, Christian; Haberlandt, Uwe
2015-04-01
Urban hydrological processes are generally characterised by short response times, and therefore rainfall data with a high resolution in space and time are required for their modelling. In many smaller towns, no recordings of rainfall exist within the urban catchment. Precipitation radar helps to provide extensive rainfall data with a temporal resolution of five minutes, but the rainfall amounts can be highly biased, and hence the data should not be used directly as a model input. However, several methods have been proposed for adjusting radar data to station measurements. This work evaluates rainfall inputs for a hydrological model with regard to two different applications: dimensioning of urban drainage systems and analysis of single-event flow. The input data used for this analysis can be divided into two groups: methods that rely on station data only (Nearest Neighbour Interpolation, Ordinary Kriging), and methods that incorporate station as well as radar information (Conditional Merging, bias correction of radar data based on quantile mapping with rain gauge recordings). Additionally, rainfall intensities obtained directly from radar reflectivities are used. A model of the urban catchment of the city of Brunswick (Lower Saxony, Germany) is utilised for the evaluation. First results show that radar data cannot help with the dimensioning of sewer systems, since rainfall amounts of convective events are often overestimated. Gauges in catchment proximity can provide more reliable rainfall extremes. Whether radar data can be helpful for simulating single-event flow depends strongly on the data quality and thus on the selected event. Ordinary Kriging is often not suitable for the interpolation of rainfall data in urban hydrology: this technique induces a strong smoothing of rainfall fields and therefore a severe underestimation of rainfall intensities for convective events.
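The quantile-mapping bias correction can be sketched empirically: each radar value is replaced by the gauge value at the same empirical quantile of the calibration samples (a simplified illustration, not the study's implementation).

```python
import bisect

def quantile_map(radar_values, radar_sorted, gauge_sorted):
    """Empirical quantile mapping: replace each radar value with the gauge
    value at the same empirical quantile. radar_sorted/gauge_sorted are
    sorted calibration samples of equal length."""
    n = len(radar_sorted)
    corrected = []
    for v in radar_values:
        # rank of v within the sorted radar calibration sample
        k = min(bisect.bisect_left(radar_sorted, v), n - 1)
        corrected.append(gauge_sorted[k])
    return corrected

radar_cal = [1.0, 2.0, 4.0, 8.0]   # sorted radar calibration values
gauge_cal = [0.5, 1.0, 2.0, 4.0]   # sorted co-located gauge values
adjusted = quantile_map([4.0, 8.0], radar_cal, gauge_cal)
```

Mapping quantile to quantile corrects the systematic overestimation of convective intensities while preserving the radar's spatial pattern.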
Dong, Huiru; Robison, Leslie L; Leisenring, Wendy M; Martin, Leah J; Armstrong, Gregory T; Yasui, Yutaka
2015-04-01
Cumulative incidence has been widely used to estimate the cumulative probability of developing an event of interest by a given time, in the presence of competing risks. When it is of interest to measure the total burden of recurrent events in a population, however, the cumulative incidence method is not appropriate because it considers only the first occurrence of the event of interest for each individual in the analysis: Subsequent occurrences are not included. Here, we discuss a straightforward and intuitive method termed "mean cumulative count," which reflects a summarization of all events that occur in the population by a given time, not just the first event for each subject. We explore the mathematical relationship between mean cumulative count and cumulative incidence. Detailed calculation of mean cumulative count is described by using a simple hypothetical example, and the computation code with an illustrative example is provided. Using follow-up data from January 1975 to August 2009 collected in the Childhood Cancer Survivor Study, we show applications of mean cumulative count and cumulative incidence for the outcome of subsequent neoplasms to demonstrate different but complementary information obtained from the 2 approaches and the specific utility of the former. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
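A much-simplified version of the mean cumulative count (ignoring competing risks) can be sketched as a sum, over event times, of the number of events divided by the number of subjects still under observation; the paper's estimator handles competing risks more carefully.

```python
def mean_cumulative_count(events_per_subject, censor_times, t):
    """Simplified mean cumulative count at time t (no competing risks):
    at each event time s <= t, add (events at s) / (subjects still under
    observation at s). Unlike cumulative incidence, every event per
    subject contributes, not just the first."""
    times = sorted({s for evs in events_per_subject for s in evs if s <= t})
    mcc = 0.0
    for s in times:
        at_risk = sum(1 for c in censor_times if c >= s)
        d = sum(evs.count(s) for evs in events_per_subject)
        mcc += d / at_risk
    return mcc

# 4 subjects followed to time 5; one has events at 1 and 3, another at 2
mcc = mean_cumulative_count([[1, 3], [2], [], []], [5, 5, 5, 5], t=4)
```

A cumulative-incidence calculation on the same data would count only two first events; the MCC counts all three.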
NASA Astrophysics Data System (ADS)
Arnaud, Nicolas; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Kreckelbergh, Stephane; Porter, Edward K.
2003-11-01
Network data analysis methods are the only way to properly separate real gravitational wave (GW) transient events from detector noise. They can be divided into two generic classes: the coincidence method and the coherent analysis. The former uses lists of selected events provided by each interferometer belonging to the network and tries to correlate them in time to identify a physical signal. Instead of this binary treatment of detector outputs (signal present or absent), the latter method first merges the interferometer data and looks for a common pattern, consistent with an assumed GW waveform and a given source location in the sky. Thresholds are only applied later, to validate or reject the hypothesis made. As coherent algorithms use more complete information than coincidence methods, they are expected to provide better detection performance, but at a higher computational cost. An efficient filter must yield a good compromise between a low false alarm rate (hence triggering on data at a manageable rate) and a high detection efficiency. Therefore, the comparison of the two approaches is carried out using so-called receiver operating characteristics (ROC), giving the relationship between the false alarm rate and the detection efficiency for a given method. This paper investigates this question via Monte Carlo simulations, using the network model developed in a previous article. Its main conclusions are the following. First, a three-interferometer network such as Virgo-LIGO is found to be too small to reach good detection efficiencies at low false alarm rates: larger configurations are needed to reach a confidence level high enough to validate a detected event as a true GW.
In addition, an efficient network must contain interferometers with comparable sensitivities: studying the three-interferometer LIGO network shows that the 2-km interferometer, with half the sensitivity of the others, leads to a strong reduction in performance compared to a network of three interferometers at full sensitivity. Finally, it is shown that coherent analyses are feasible for burst searches and are clearly more efficient than coincidence strategies. Therefore, developing such methods should be an important goal of worldwide collaborative data analysis.
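The ROC comparison described above reduces to sweeping a threshold over detection-statistic samples from noise-only and signal-plus-noise data; a generic sketch (toy numbers, not the paper's pipeline):

```python
def roc_points(noise_scores, signal_scores, thresholds):
    """(false-alarm rate, detection efficiency) pairs: the ROC used to
    compare pipelines at matched thresholds."""
    pts = []
    for th in thresholds:
        far = sum(s >= th for s in noise_scores) / len(noise_scores)
        eff = sum(s >= th for s in signal_scores) / len(signal_scores)
        pts.append((far, eff))
    return pts

noise = [0.1, 0.2, 0.3, 0.4]      # detection statistic on noise-only data
signal = [0.35, 0.5, 0.7, 0.9]    # statistic with injected signals
curve = roc_points(noise, signal, thresholds=[0.35, 0.6])
```

A method whose curve lies above another's achieves higher efficiency at every false alarm rate, which is exactly the comparison criterion used for coincidence versus coherent strategies.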
Categorizing accident sequences in the external radiotherapy for risk analysis
2013-01-01
Purpose This study identifies accident sequences from past accidents in order to support the application of risk analysis to external radiotherapy. Materials and Methods This study reviews 59 accidental cases from two retrospective safety analyses that collected incidents in external radiotherapy extensively. The two accident analysis reports are investigated to identify accident sequences, including initiating events, failures of safety measures, and consequences. This study classifies the accidents by treatment stage and source of error for initiating events, by type of failure in the safety measures, and by type of undesirable consequence and the number of affected patients. The accident sequences are then grouped into several categories on the basis of similarity of progression; as a result, these cases can be categorized into 14 groups of accident sequences. Results The result indicates that risk analysis needs to pay attention not only to the planning stage, but also to the calibration stage performed prior to the main treatment process. It also shows that human error is the largest contributor to initiating events as well as to failures of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in calibration. Conclusion This study is expected to provide insights into accident sequences for prospective risk analysis through the review of experiences. PMID:23865005
Legrand, Helen; Pihlsgård, Mats; Nordell, Eva; Elmståhl, Sölve
2015-08-01
Few studies on fall risk factors use long-recommended methods for the analysis of recurrent events. Previous falls are the biggest risk factor for future falls, but few fall studies focus on the youngest-old. This study's objective was to apply Cox regression for recurrent events to identify factors associated with injurious falls in the youngest-old. Participants were community-dwelling residents of southern Sweden (n = 1,133), aged 59-67 at baseline (median 61.2), from the youngest cohorts of the larger Good Aging in Skåne (GÅS) study. Exposure variable data were collected from baseline study visits and medical records. Injurious falls, defined as emergency, inpatient, or specialist visits associated with ICD-10 fall codes during the follow-up period (2001-2011), were gathered from national and regional registries. Analysis was conducted using time-to-event Cox regression for recurrent events. A majority (77.1 %) of injurious falls caused serious injuries such as fractures and open wounds. Exposure to nervous system medications [hazard ratio (HR) 1.40, 95 % confidence interval (CI) 1.03-1.89], central nervous system disease (HR 1.79, CI 1.18-2.70), and previous injurious fall(s) (HR 2.00, CI 1.50-2.68) were associated with an increased hazard of injurious fall. Regression for recurrent events is feasible with data from typical fall studies. The association of certain exposures with an increased hazard of injurious falls begins earlier than previously studied. Different patterns of risk factors by age can provide insight into the progression of frailty. Tailored fall prevention screening and intervention may be of value in populations younger than those traditionally screened.
Holmskov, Mathilde; Storebø, Ole Jakob; Moreira-Maia, Carlos R.; Ramstad, Erica; Magnusson, Frederik Løgstrup; Krogh, Helle B.; Groth, Camilla; Gillies, Donna; Zwi, Morris; Skoog, Maria; Gluud, Christian; Simonsen, Erik
2017-01-01
Objectives To study in more depth the relationship between the type, dose, or duration of methylphenidate offered to children and adolescents with attention deficit hyperactivity disorder and their risks of gastrointestinal adverse events, based on our Cochrane systematic review. Methods and findings We used data from our review, which included 185 randomised clinical trials. Randomised parallel-group trials and cross-over trials reporting gastrointestinal adverse events associated with methylphenidate were included. Data were extracted and quality assessed according to Cochrane guidelines. Data were summarised as risk ratios (RR) with 95% confidence intervals (CI) using the inverse variance method. Bias risks were assessed according to domains. Trial Sequential Analysis (TSA) was used to control random errors. Eighteen parallel-group trials and 43 cross-over trials reported gastrointestinal adverse events. All trials were at high risk of bias. In parallel-group trials, methylphenidate increased the risk of decreased appetite (RR 3.66, 95% CI 2.56 to 5.23) and weight loss (RR 3.89, 95% CI 1.43 to 10.59). In cross-over trials, methylphenidate increased abdominal pain (RR 1.61, 95% CI 1.27 to 2.04). We found no significant differences in risk according to type, dose, or duration of administration. The required information size was achieved in three out of four outcomes. Conclusion Methylphenidate increases the risks of decreased appetite, weight loss, and abdominal pain in children and adolescents with attention deficit hyperactivity disorder. No differences in the risks of gastrointestinal adverse events according to type, dose, or duration of administration were found. PMID:28617801
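The inverse-variance pooling used for such risk ratios can be sketched as follows; the standard error of each log RR is recovered from its 95% CI half-width on the log scale (an illustrative helper with toy inputs, not the review's code):

```python
import math

def pooled_risk_ratio(rrs, cis):
    """Fixed-effect inverse-variance pooling of risk ratios: weights are
    1/var(log RR), with se(log RR) = (ln hi - ln lo) / (2 * 1.96)."""
    num = den = 0.0
    for rr, (lo, hi) in zip(rrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        num += w * math.log(rr)
        den += w
    return math.exp(num / den)

# two hypothetical trials with identical estimates pool back to RR = 2.0
pooled = pooled_risk_ratio([2.0, 2.0], [(1.0, 4.0), (1.0, 4.0)])
```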
NASA Astrophysics Data System (ADS)
Verdhora Ry, Rexha; Septyana, T.; Widiyantoro, S.; Nugraha, A. D.; Ardjuna, A.
2017-04-01
Microseismic monitoring and constraining hypocenters in and around hydrocarbon reservoirs provides insight into induced deformation related to hydraulic fracturing. In this study, we used data from a single vertical array of sensors in a borehole, providing measurements of arrival times and polarizations. Microseismic events are located using 1-D velocity models and arrival times of P- and S-waves. However, when all sensors are deployed in a near-vertical borehole, there is high ambiguity in the source location. Herein, we applied a procedure using the azimuth of P-wave particle motion to constrain and improve the source location. We used a dataset acquired during one day of fracture stimulation at a CBM field in Indonesia, applying a five-step location procedure to investigate microseismic events induced by these hydraulic fracturing activities. First, arrival times for 1584 candidate events were manually picked. We then refined the arrival times using an energy ratio method to obtain highly consistent picks. Using these arrival times, we estimated back-azimuths using P-wave polarization analysis, adding polarity analysis to remove the 180° ambiguity. Finally, we determined hypocenter locations using a grid-search method, constrained to the back-azimuth trace area, that minimizes the misfit function of the arrival times. We have successfully removed the ambiguity and produced good solutions for the hypocenter locations, as indicated statistically by small RMS values. Most of the event clusters highlight coherent structures around the treatment well site and reveal faults. The same procedure can be applied to various other cases, such as microseismic monitoring in geothermal fields and shale gas/oil exploration, as well as CCS (Carbon Capture and Storage) development.
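The final grid-search step can be sketched for a constant-velocity medium with a toy geometry (illustrative only; the study additionally restricts the grid to the back-azimuth trace area and uses 1-D velocity models):

```python
def locate_grid_search(stations, obs_times, grid, velocity, origin_time=0.0):
    """Pick the grid node minimizing the L2 misfit between observed P
    arrival times and straight-ray travel times at constant velocity."""
    def tt(src, sta):
        return sum((a - b) ** 2 for a, b in zip(src, sta)) ** 0.5 / velocity
    best, best_misfit = None, float("inf")
    for node in grid:
        misfit = sum((obs - (origin_time + tt(node, sta))) ** 2
                     for sta, obs in zip(stations, obs_times))
        if misfit < best_misfit:
            best, best_misfit = node, misfit
    return best

stations = [(0.0, 0.0, 0.0), (1000.0, 0.0, 0.0), (0.0, 1000.0, 0.0)]
true_src = (500.0, 500.0, 800.0)
v = 3000.0  # m/s, hypothetical P velocity
obs = [sum((a - b) ** 2 for a, b in zip(true_src, s)) ** 0.5 / v
       for s in stations]
grid = [(x, y, z) for x in (0.0, 500.0, 1000.0)
                  for y in (0.0, 500.0, 1000.0)
                  for z in (400.0, 800.0)]
best = locate_grid_search(stations, obs, grid, v)
```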
Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul
2011-01-01
Background Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
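Fingerprint-based similarity of the kind used to enhance the signals is commonly measured with the Tanimoto coefficient; a minimal sketch on bit-set fingerprints (illustrative, not the paper's implementation):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) coefficient between two binary molecular
    fingerprints represented as sets of on-bit indices."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

# hypothetical on-bit sets for two drugs
drug_a = {1, 4, 7, 9}
drug_b = {1, 4, 8}
sim = tanimoto(drug_a, drug_b)
```

Ranking AERS signal candidates by similarity to drugs with an established association is the "structurally similar drugs" step the abstract describes.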
Modeling eye gaze patterns in clinician-patient interaction with lag sequential analysis.
Montague, Enid; Xu, Jie; Chen, Ping-Yu; Asan, Onur; Barrett, Bruce P; Chewning, Betty
2011-10-01
The aim of this study was to examine whether lag sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multiuser health care settings in which trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Nonverbal communication patterns are important aspects of clinician-patient interactions and may affect patient outcomes. The eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag sequential method to identify significant behavior sequences. Lag sequential analysis included both event-based lag and time-based lag. Results from event-based lag analysis showed that the patient's gaze followed that of the clinician, whereas the clinician's gaze did not follow the patient's. Time-based sequential analysis showed that responses from the patient usually occurred within 2 s after the initial behavior of the clinician. Our data suggest that the clinician's gaze significantly affects the medical encounter but that the converse is not true. Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs.
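The counting step of event-based lag sequential analysis can be sketched by tallying (antecedent, consequent) pairs at a fixed lag; the gaze codes below are hypothetical, and the subsequent significance testing is omitted:

```python
from collections import Counter

def lag_transitions(sequence, lag=1):
    """Count (antecedent, consequent) pairs at a given lag in a coded
    behaviour stream; these counts feed the lag sequential tests."""
    return Counter(zip(sequence, sequence[lag:]))

# hypothetical codes: "C>P" clinician gazes at patient, "P>C" the reverse
gaze = ["C>P", "P>C", "P>C", "C>chart", "P>C"]
pairs = lag_transitions(gaze)
```

Comparing each observed pair count against its expectation under independence is what identifies significant behaviour sequences such as "patient gaze follows clinician gaze".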
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2015-10-01
The paper proposes the evaluation of the technical performance of a regional landslide early warning system by means of an original approach, called EDuMaP method, comprising three successive steps: identification and analysis of the Events (E), i.e. landslide events and warning events derived from available landslides and warnings databases; definition and computation of a Duration Matrix (DuMa), whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model Performance (P) by means of performance criteria and indicators applied to the duration matrix. During the first step, the analyst takes into account the features of the warning model by means of ten input parameters, which are used to identify and classify landslide and warning events according to their spatial and temporal characteristics. In the second step, the analyst computes a time-based duration matrix having a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The proposed method is based on a framework clearly distinguishing between local and regional landslide early warning systems as well as among correlation laws, warning models and warning systems. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslides and warnings data from the municipal early warning system operating in Rio de Janeiro (Brazil).
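A much-simplified reading of the duration matrix step can be sketched as follows (illustrative only; the EDuMaP matrix involves classed warning and landslide events and more elaborate time accounting than this toy accumulation):

```python
def duration_matrix(warning_events, landslide_events, n_warn, n_land):
    """Toy duration matrix: element [i][j] accumulates the duration of
    warnings of class i during which landslide events of class j occurred.
    warning_events: (class, start, end); landslide_events: (class, time)."""
    m = [[0.0] * n_land for _ in range(n_warn)]
    for wclass, wstart, wend in warning_events:
        for lclass, t in landslide_events:
            if wstart <= t < wend:
                m[wclass][lclass] += wend - wstart
    return m

# one class-1 warning in force for 10 h, one class-0 landslide inside it
m = duration_matrix([(1, 0.0, 10.0)], [(0, 4.0)], n_warn=2, n_land=2)
```

Performance indicators (missed alarms, false alarms, correct warnings) are then read off from which cells of the matrix carry the time.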
Ren, Yilong; Wang, Yunpeng; Wu, Xinkai; Yu, Guizhen; Ding, Chuan
2016-10-01
Red light running (RLR) has become a major safety concern at signalized intersections. To prevent RLR-related crashes, it is critical to identify the factors that significantly impact drivers' RLR behavior and to predict potential RLR in real time. In this research, nine months of RLR events extracted from high-resolution traffic data collected by loop detectors at three signalized intersections were used to identify the factors that significantly affect RLR behavior. The data analysis indicated that occupancy time, time gap, used yellow time, time left to yellow start, whether the preceding vehicle runs through the intersection during yellow, and whether there is a vehicle passing through the intersection in the adjacent lane were significant factors for RLR behavior. Furthermore, owing to the rare-events nature of RLR, a modified rare events logistic regression model was developed for RLR prediction. The rare events logistic regression method has been applied in many fields for rare events studies and shows impressive performance, but so far no previous research has applied this method to study RLR. The results showed that the rare events logistic regression model performed significantly better than the standard logistic regression model. More importantly, the proposed RLR prediction method is purely based on loop detector data collected from a single advance loop detector located 400 feet from the stop bar. This brings great potential for future field applications of the proposed method, since loops have been widely implemented at many intersections and can collect data in real time. This research is expected to contribute significantly to the improvement of intersection safety. Copyright © 2016 Elsevier Ltd. All rights reserved.
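One standard rare-events adjustment, the King-Zeng prior correction of the logistic intercept, can be sketched as follows (the paper's particular modification is not specified in the abstract; this is a generic illustration with toy numbers):

```python
import math

def prior_corrected_intercept(beta0, ybar, tau):
    """King-Zeng prior correction for rare events logistic regression:
    adjust the intercept for the gap between the sample event fraction
    ybar and the true population event rate tau."""
    return beta0 - math.log(((1 - tau) / tau) * (ybar / (1 - ybar)))

# e.g. a balanced training sample (ybar = 0.5) when RLR occurs in ~1% of
# signal cycles (tau = 0.01); beta0 = -0.2 is a hypothetical fitted value
b0 = prior_corrected_intercept(beta0=-0.2, ybar=0.5, tau=0.01)
```

Without this correction, a model fit on an event-enriched sample systematically overstates the probability of the rare event in deployment.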
Tan, Francisca M; Caballero-Gaudes, César; Mullinger, Karen J; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L; Francis, Susan T; Gowland, Penny A
2017-11-01
Most functional MRI (fMRI) studies map task-driven brain activity using a block or event-related paradigm. Sparse paradigm free mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of activation likelihood estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the sensorimotor network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing, and eye blinks). We validated the framework using simultaneous electromyography (EMG)-fMRI experiments and motor tasks with short and long duration, and random interstimulus interval. The decoding scores were considerably lower for eye movements relative to other movement types tested. The average success rate for short and long motor events was 77 ± 13% and 74 ± 16%, respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with a range in sensitivity between 55% and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average success rate of 22 ± 12%. Finally, this article discusses methodological implications and improvements to increase the decoding performance. Hum Brain Mapp 38:5778-5794, 2017. © 2017 Wiley Periodicals, Inc.
Blade loss transient dynamic analysis of turbomachinery
NASA Technical Reports Server (NTRS)
Stallone, M. J.; Gallardo, V.; Storace, A. F.; Bach, L. J.; Black, G.; Gaffney, E. F.
1982-01-01
This paper reports on work completed to develop an analytical method for predicting the transient non-linear response of a complete aircraft engine system due to the loss of a fan blade, and to validate the analysis by comparing the results against actual blade loss test data. The solution, which is based on the component element method, accounts for rotor-to-casing rubs, high damping, and the rapid deceleration rates associated with the blade loss event. A comparison of test results and predicted response shows good agreement, except for an initial overshoot spike not observed in the test. The method is effective for the analysis of large systems.
Microseismic event location by master-event waveform stacking
NASA Astrophysics Data System (ADS)
Grigoli, F.; Cesca, S.; Dahm, T.
2016-12-01
Waveform stacking location methods are nowadays extensively used to monitor induced seismicity monitoring assoiciated with several underground industrial activities such as Mining, Oil&Gas production and Geothermal energy exploitation. In the last decade a significant effort has been spent to develop or improve methodologies able to perform automated seismological analysis for weak events at a local scale. This effort was accompanied by the improvement of monitoring systems, resulting in an increasing number of large microseismicity catalogs. The analysis of microseismicity is challenging, because of the large number of recorded events often characterized by a low signal-to-noise ratio. A significant limitation of the traditional location approaches is that automated picking is often done on each seismogram individually, making little or no use of the coherency information between stations. In order to improve the performance of the traditional location methods, in the last year, alternative approaches have been proposed. These methods exploits the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. The main advantage of this methods relies on their robustness even when the recorded waveforms are very noisy. On the other hand, like any other location method, the location performance strongly depends on the accuracy of the available velocity model. When dealing with inaccurate velocity models, in fact, location results can be affected by large errors. Here we will introduce a new automated waveform stacking location method which is less dependent on the knowledge of the velocity model and presents several benefits, which improve the location accuracy: 1) it accounts for phase delays due to local site effects, e.g. surface topography or variable sediment thickness 2) theoretical velocity model are only used to estimate travel times within the source volume, and not along the whole source-sensor path. 
We finally compare the location results for both synthetic and real data with those obtained using classical waveform stacking approaches.
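The delay-and-stack principle underlying such waveform stacking locators can be illustrated with a minimal grid-search sketch (illustrative Python, not the authors' implementation; a homogeneous velocity model and one characteristic function per station are assumed):

```python
import numpy as np

def stack_locate(waveforms, station_xyz, grid_points, v, dt):
    """Hypothetical delay-and-stack grid search (not the authors' code).
    waveforms: (n_sta, n_samp) characteristic functions (e.g. envelopes).
    Returns the index of the grid point with maximum stacked amplitude."""
    n_sta, n_samp = waveforms.shape
    best_val, best_idx = -np.inf, -1
    for gi, src in enumerate(grid_points):
        # predicted travel time from candidate source to each station
        delays = np.linalg.norm(station_xyz - src, axis=1) / v
        shifts = np.round(delays / dt).astype(int)
        # remove the predicted moveout, then stack across stations
        stacked = np.zeros(n_samp)
        for s in range(n_sta):
            # np.roll wraps around; acceptable for this sketch
            stacked += np.roll(waveforms[s], -shifts[s])
        val = stacked.max()
        if val > best_val:
            best_val, best_idx = val, gi
    return best_idx
```

A coherent event stacks constructively only at the candidate source whose predicted moveout matches the data, which is why no picking is needed.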
Winter maintenance performance measure.
DOT National Transportation Integrated Search
2016-01-01
The Winter Performance Index is a method of quantifying winter storm events and the DOT's response to them. It is a valuable tool for evaluating the State's maintenance practices, performing post-storm analysis, and training maintenance personnel...
40 CFR 60.2120 - Affirmative defense for violation of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Standards of Performance for Commercial and Industrial Solid Waste Incineration Units Emission Limitations... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.
Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong
2015-11-01
Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.
NASA Astrophysics Data System (ADS)
Omenzetter, Piotr; Brownjohn, James M. W.; Moyo, Pilate
2003-08-01
Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to replace visual inspection for assessment of condition and soundness of civil infrastructure. However, converting large amount of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure in Singapore and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of foundation, ground movement, excessive traffic load or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localizing sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later effectively used for detection of anomalous post-construction events.
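The multivariate outlier detection step described above can be sketched with a standard Mahalanobis-distance rule (illustrative Python; the study additionally uses a wavelet transform to separate abrupt from slowly varying strain changes, which is not shown):

```python
import numpy as np

def mahalanobis_outliers(X, threshold):
    """Flag multivariate outliers via squared Mahalanobis distance.
    Generic sketch of outlier detection in multi-sensor data, not the
    paper's exact procedure. X: (n_obs, n_vars) array of observations.
    Returns a boolean mask marking outlying observations."""
    mu = X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    # squared Mahalanobis distance of every observation at once
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    return d2 > threshold
```

A threshold from the chi-squared distribution with n_vars degrees of freedom is a common choice for flagging anomalous sensor readings.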
Patterns and Sequences: Interactive Exploration of Clickstreams to Understand Common Visitor Paths.
Liu, Zhicheng; Wang, Yang; Dontcheva, Mira; Hoffman, Matthew; Walker, Seth; Wilson, Alan
2017-01-01
Modern web clickstream data consists of long, high-dimensional sequences of multivariate events, making it difficult to analyze. Following the overarching principle that the visual interface should provide information about the dataset at multiple levels of granularity and allow users to easily navigate across these levels, we identify four levels of granularity in clickstream analysis: patterns, segments, sequences and events. We present an analytic pipeline consisting of three stages: pattern mining, pattern pruning and coordinated exploration between patterns and sequences. Based on this approach, we discuss properties of maximal sequential patterns, propose methods to reduce the number of patterns and describe design considerations for visualizing the extracted sequential patterns and the corresponding raw sequences. We demonstrate the viability of our approach through an analysis scenario and discuss the strengths and limitations of the methods based on user feedback.
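The support counting at the heart of sequential pattern mining over clickstreams can be sketched in a few lines (minimal Python illustration of the support measure, not the authors' pipeline):

```python
def count_pattern(sequences, pattern):
    """Count how many clickstream sequences contain `pattern` as a
    (possibly non-contiguous) subsequence -- the support measure used
    when mining sequential patterns. Sketch only, not the paper's code."""
    def contains(seq, pat):
        it = iter(seq)
        # each `ev in it` advances the iterator, enforcing event order
        return all(ev in it for ev in pat)
    return sum(contains(s, pattern) for s in sequences)
```

Patterns whose support exceeds a chosen threshold are kept; maximal-pattern pruning then removes patterns subsumed by longer ones.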
Monitoring hydrofrac-induced seismicity by surface arrays - the DHM-Project Basel case study
NASA Astrophysics Data System (ADS)
Blascheck, P.; Häge, M.; Joswig, M.
2012-04-01
The method "nanoseismic monitoring" was applied during the hydraulic stimulation at the Deep Heat Mining Project (DHM-Project) Basel. Two small arrays at distances of 2.1 km and 4.8 km from the borehole recorded continuously for two days. During this time more than 2500 seismic events were detected. This surface monitoring of induced seismicity was compared against the reference provided by the downhole hydrofrac monitoring, which was conducted with a network of borehole seismometers by Geothermal Explorers Limited. Array processing provides an outlier-resistant, graphical jack-knifing localization method, which resulted in an average deviation of 850 m from the reference locations. Additionally, by applying the relative master-event localization method, the NNW-SSE strike direction of the reference was confirmed. It was shown that, at the event rate and detection sensitivity present, 3 h segments of data are sufficient to successfully estimate the magnitude of completeness as well as the b-value. This is supported by two segments out of over 13 h of evaluated data, chosen to represent both the high seismic noise during normal working hours in daytime and the minimum anthropogenic noise at night. The low signal-to-noise ratio was compensated by applying sonogram event detection and a coincidence analysis within each array. Sonograms use autoadaptive, non-linear filtering to enhance signals whose amplitudes are just above noise level. For these events the magnitude was determined by the master-event method, allowing the magnitude of completeness to be computed with the entire-magnitude-range method provided by the ZMAP toolbox. Additionally, the b-values were determined and compared to the reference values. An introduction to the method of "nanoseismic monitoring" is given, together with the comparison to reference data in the Basel case study.
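The b-value estimation mentioned above is commonly done with Aki's maximum-likelihood estimator; a minimal sketch follows (the study itself used the entire-magnitude-range method of the ZMAP toolbox, which is more involved):

```python
import numpy as np

def b_value_aki(mags, m_c, dm=0.1):
    """Maximum-likelihood b-value (Aki, 1965) of the Gutenberg-Richter
    law for magnitudes at or above the completeness magnitude m_c;
    dm is the magnitude binning width. Shown for illustration only."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]  # discard events below completeness
    # b = log10(e) / (mean(M) - (Mc - dm/2)), the binning-corrected form
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
```

With the ~2500 detected events, such an estimate stabilizes quickly, which is consistent with the abstract's finding that 3 h data segments suffice.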
Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments
NASA Astrophysics Data System (ADS)
Berk, Mario; Špačková, Olga; Straub, Daniel
2017-12-01
The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
NASA Astrophysics Data System (ADS)
Allison, Lesley; Hawkins, Ed; Woollings, Tim
2015-01-01
Many previous studies have shown that unforced climate model simulations exhibit decadal-scale fluctuations in the Atlantic meridional overturning circulation (AMOC), and that this variability can have impacts on surface climate fields. However, the robustness of these surface fingerprints across different models is less clear. Furthermore, with the potential for coupled feedbacks that may amplify or damp the response, it is not known whether the associated climate signals are linearly related to the strength of the AMOC changes, or if the fluctuation events exhibit nonlinear behaviour with respect to their strength or polarity. To explore these questions, we introduce an objective and flexible method for identifying the largest natural AMOC fluctuation events in multicentennial/multimillennial simulations of a variety of coupled climate models. The characteristics of the events are explored, including their magnitude, meridional coherence and spatial structure, as well as links with ocean heat transport and the horizontal circulation. The surface fingerprints in ocean temperature and salinity are examined, and compared with the results of linear regression analysis. It is found that the regressions generally provide a good indication of the surface changes associated with the largest AMOC events. However, there are some exceptions, including a nonlinear change in the atmospheric pressure signal, particularly at high latitudes, in HadCM3. Some asymmetries are also found between the changes associated with positive and negative AMOC events in the same model. Composite analysis suggests that there are signals that are robust across the largest AMOC events in each model, which provides reassurance that the surface changes associated with one particular event will be similar to those expected from regression analysis. 
However, large differences are found between the AMOC fingerprints in different models, which may hinder the prediction and attribution of such events in reality.
Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies
Balador, Ali; Uhlemann, Elisabeth; Calafate, Carlos T; Cano, Juan-Carlos
2018-01-01
Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications. PMID:29570676
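The age-ordered token passing rule can be sketched in a few lines (illustrative Python; the vehicle identifiers and timestamp bookkeeping are assumptions, not the protocol's actual data structures):

```python
def next_token_holder(last_beacon_time, now):
    """Pick the platoon member whose status information is stalest,
    i.e. token passing ordered by beacon data age rather than
    round-robin. Sketch of the selection rule from the abstract.
    last_beacon_time: dict mapping vehicle id -> time of last beacon."""
    # largest (now - last_beacon_time) = oldest data = next to transmit
    return max(last_beacon_time, key=lambda vid: now - last_beacon_time[vid])
```

Because the vehicle with the oldest beacon always transmits next, every member repeatedly gets contention-free transmission opportunities, which is what bounds the beacon delivery delay.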
On event-based optical flow detection
Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko
2015-01-01
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to whole-image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection ranging from gradient-based methods over plane-fitting to filter-based methods and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion related activations. PMID:25941470
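The plane-fitting approach to event-based flow can be sketched as a least-squares fit of a time surface to the local event cloud (illustrative Python under the assumption of a single coherent motion in the neighborhood):

```python
import numpy as np

def plane_fit_flow(events):
    """Fit a plane t = px*x + py*y + c to a local cloud of events
    (x, y, t); the flow estimate points along the gradient of the
    fitted time surface. Illustrative sketch of the plane-fitting
    family of methods, not a specific published algorithm.
    events: (n, 3) array of (x, y, t). Returns (vx, vy)."""
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    (px, py, c), *_ = np.linalg.lstsq(A, t, rcond=None)
    g2 = px**2 + py**2  # squared norm of the time-surface gradient
    # speed is 1/|grad t|, direction along the gradient
    return (px / g2, py / g2)
```

An edge moving at 2 px per time unit in x produces events on the plane t = x/2, from which the fit recovers (vx, vy) = (2, 0).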
Lu, Guo-Cai; Wei, Rui-Li
2012-01-01
Background Intravitreal anti-vascular endothelial growth factor (VEGF) monoclonal antibodies are used in ocular neovascular diseases. A consensus has emerged that intravenous anti-VEGF can increase the risk of arterial thromboembolic events. However, the role of intravitreal anti-VEGF in arterial thromboembolism is controversial. Therefore, we performed a systematic review and meta-analysis to investigate the effects of intravitreal anti-VEGF on the risk of arterial thromboembolic events. Methods Electronic databases were searched to identify relevant randomized clinical trials comparing intravitreal anti-VEGF with controls. Criteria for inclusion in our meta-analysis included a study duration of no less than 12 months, the use of a randomized control group not receiving any intravitreal active agent, and the availability of outcome data for arterial thromboembolic events, myocardial infarction, cerebrovascular accidents, and vascular death. The risk ratios and 95% CIs were calculated using a fixed-effects or random-effects model, depending on the heterogeneity of the included studies. Results A total of 4942 patients with a variety of ocular neovascular diseases from 13 randomized controlled trials were identified and included for analysis. There was no significant difference between intravitreal anti-VEGF and control in the risk of all events, with risk ratios of 0.87 (95% CI, 0.64–1.19) for arterial thromboembolic events, 0.96 (95% CI, 0.55–1.68) for cerebrovascular accidents, 0.69 (95% CI, 0.40–1.21) for myocardial infarctions, and 0.68 (95% CI, 0.37–1.27) for vascular death. Conclusions The strength of this evidence suggests that the intravitreal use of anti-VEGF antibodies is not associated with an increased risk of arterial thromboembolic events. PMID:22829940
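The fixed-effects pooling used in such meta-analyses follows the standard inverse-variance formula; a minimal sketch (illustrative Python, not the authors' code; the random-effects variant used under heterogeneity is not shown):

```python
import numpy as np

def pooled_risk_ratio(events_t, n_t, events_c, n_c):
    """Fixed-effects (inverse-variance) pooled risk ratio across trials.
    Standard meta-analysis arithmetic, shown for illustration. Inputs
    are per-trial event counts and sample sizes for treatment (t) and
    control (c) arms. Returns (pooled RR, (95% CI low, 95% CI high))."""
    et, nt = np.asarray(events_t, float), np.asarray(n_t, float)
    ec, nc = np.asarray(events_c, float), np.asarray(n_c, float)
    log_rr = np.log((et / nt) / (ec / nc))
    var = 1 / et - 1 / nt + 1 / ec - 1 / nc  # variance of each log RR
    w = 1.0 / var                            # inverse-variance weights
    pooled = np.sum(w * log_rr) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    ci = (np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))
    return np.exp(pooled), ci
```

A confidence interval straddling 1.0, as for all four outcomes above, indicates no statistically significant difference in risk.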
Matoza, Robin S.; Shearer, Peter M.; Okubo, Paul G.
2016-01-01
Long-period (0.5–5 Hz, LP) seismicity has been recorded for decades in the summit region of Kīlauea Volcano, Hawai‘i, and is postulated to be linked with the magma transport and shallow hydrothermal systems. To better characterize its spatiotemporal occurrence, we perform a systematic analysis of 49,030 seismic events occurring in the Kīlauea summit region from January 1986 to March 2009 recorded by the ∼50-station Hawaiian Volcano Observatory permanent network. We estimate 215,437 P wave spectra, considering all events on all stations, and use a station-averaged spectral metric to consistently classify LP and non-LP seismicity. We compute high-precision relative relocations for 5327 LP events (43% of all classified LP events) using waveform cross correlation and cluster analysis with 6.4 million event pairs, combined with the source-specific station term method. The majority of intermediate-depth (5–15 km) LPs collapse to a compact volume, with remarkable source location stability over 23 years indicating a source process controlled by geological or conduit structure.
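The station-averaged spectral classification of LP versus non-LP events can be illustrated with a simple frequency-index metric (sketch only; the band edges and any classification threshold here are assumptions, not the study's values):

```python
import numpy as np

def frequency_index(spec, freqs, low=(0.5, 2.0), high=(5.0, 10.0)):
    """Frequency index FI = log10(A_high / A_low) of a P-wave amplitude
    spectrum; averaged over stations, a low FI flags long-period (LP)
    events. Band edges are illustrative placeholders.
    spec: spectral amplitudes; freqs: corresponding frequencies (Hz)."""
    lo = spec[(freqs >= low[0]) & (freqs <= low[1])].mean()
    hi = spec[(freqs >= high[0]) & (freqs <= high[1])].mean()
    return np.log10(hi / lo)
```

Averaging the index over all stations recording an event suppresses site effects, which is the point of a station-averaged metric.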
Pseudo and conditional score approach to joint analysis of current count and current status data.
Wen, Chi-Chung; Chen, Yi-Hau
2018-04-17
We develop a joint analysis approach for recurrent and nonrecurrent event processes subject to case I interval censorship, which are also known in literature as current count and current status data, respectively. We use a shared frailty to link the recurrent and nonrecurrent event processes, while leaving the distribution of the frailty fully unspecified. Conditional on the frailty, the recurrent event is assumed to follow a nonhomogeneous Poisson process, and the mean function of the recurrent event and the survival function of the nonrecurrent event are assumed to follow some general form of semiparametric transformation models. Estimation of the models is based on the pseudo-likelihood and the conditional score techniques. The resulting estimators for the regression parameters and the unspecified baseline functions are shown to be consistent with rates of square and cubic roots of the sample size, respectively. Asymptotic normality with closed-form asymptotic variance is derived for the estimator of the regression parameters. We apply the proposed method to a fracture-osteoporosis survey data to identify risk factors jointly for fracture and osteoporosis in elders, while accounting for association between the two events within a subject. © 2018, The International Biometric Society.
NASA Astrophysics Data System (ADS)
Pinto, Victor A.; Kim, Hee-Jeong; Lyons, Larry R.; Bortnik, Jacob
2018-02-01
We have identified 61 relativistic electron enhancement events and 21 relativistic electron persistent depletion events during 1996 to 2006 from the Geostationary Operational Environmental Satellites (GOES) 8 and 10, using >2 MeV fluxes from the Energetic Particle Sensor (EPS). We then performed a superposed epoch analysis of the events to find the characteristic solar wind parameters that determine the occurrence of such events, using the OMNI database. We found that there are clear differences between the enhancement events and the persistent depletion events, and we used these to establish a set of threshold values in solar wind speed, proton density and interplanetary magnetic field (IMF) Bz that can potentially be useful to predict sudden increases in flux. Persistent depletion events are characterized by a low solar wind speed, a sudden increase in proton density that remains elevated for a few days, and a northward turning of IMF Bz shortly after the depletion starts. We have also found that all relativistic electron enhancement or persistent depletion events occur when some geomagnetic disturbance is present, either a coronal mass ejection or a corotating interaction region; however, the storm index, SYM-H, does not show a strong connection with relativistic electron enhancement events or persistent depletion events. We have tested a simple threshold method for predictability of relativistic electron enhancement events using data from GOES 11 for the years 2007-2010 and found that around 90% of large increases in electron fluxes can be identified with this method.
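A threshold rule of the kind tested by the authors can be sketched as follows (the threshold values below are placeholders for illustration, NOT the ones derived in the study):

```python
def predict_enhancement(v_sw, n_p, bz, v_thr=500.0, n_thr=10.0, bz_thr=-5.0):
    """Toy threshold rule in the spirit of the abstract: flag a likely
    relativistic electron enhancement when the solar wind speed is high,
    the proton density is low, and IMF Bz is southward (negative).
    All threshold values here are hypothetical placeholders.
    v_sw: solar wind speed (km/s); n_p: proton density (cm^-3);
    bz: IMF Bz (nT). Returns True if an enhancement is predicted."""
    return v_sw > v_thr and n_p < n_thr and bz < bz_thr
```

The appeal of such a rule is operational simplicity: it needs only real-time upstream solar wind measurements, no model integration.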
CoinCalc-A new R package for quantifying simultaneities of event series
NASA Astrophysics Data System (ADS)
Siegmund, Jonatan F.; Siegmund, Nicole; Donner, Reik V.
2017-01-01
We present the new R package CoinCalc for performing event coincidence analysis (ECA), a novel statistical method to quantify the simultaneity of events contained in two series of observations, either as simultaneous or lagged coincidences within a user-specified temporal tolerance window. The package also provides different analytical as well as surrogate-based significance tests (valid under different assumptions about the nature of the observed event series) as well as an intuitive visualization of the identified coincidences. We demonstrate the usage of CoinCalc based on two typical geoscientific example problems addressing the relationship between meteorological extremes and plant phenology as well as that between soil properties and land cover.
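The core computation of event coincidence analysis, counting lagged coincidences within a tolerance window, can be sketched in Python (CoinCalc itself is an R package; this is only an illustration of the idea, not its API):

```python
def coincidence_rate(events_a, events_b, tol, lag=0.0):
    """Precursor-style coincidence rate in the sense of event coincidence
    analysis: the fraction of events in series A that are followed,
    within the window [lag, lag + tol], by at least one event in
    series B. Minimal sketch of what CoinCalc computes in R.
    events_a, events_b: sorted event times; tol: tolerance window;
    lag: optional time lag between the two series."""
    hits = sum(
        any(lag <= b - a <= lag + tol for b in events_b)
        for a in events_a
    )
    return hits / len(events_a)
```

Significance is then assessed by comparing the observed rate against its distribution under an independence assumption, analytically or with surrogate event series.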
NASA Astrophysics Data System (ADS)
Che, Il-Young; Jeon, Jeong-Soo
2010-05-01
Korea Institute of Geoscience and Mineral Resources (KIGAM) operates an infrasound network consisting of seven seismo-acoustic arrays in South Korea. Development of the arrays began in 1999, partially in collaboration with Southern Methodist University, with the goal of detecting distant infrasound signals from natural and anthropogenic phenomena in and around the Korean Peninsula. The main operational purpose of this network is to discriminate man-made seismic events from the regional seismicity, which includes thousands of seismic events per year. Man-made seismic events are a major cause of error in estimating natural seismicity, especially where seismic activity is weak or moderate, as in the Korean Peninsula. In order to discriminate man-made explosions from earthquakes, we have applied a seismo-acoustic analysis associating seismic and infrasonic signals generated by surface explosions. Observations of infrasound at multiple arrays make it possible to discriminate surface explosions, because small or moderate earthquakes are generally not sufficient to generate infrasound. To date we have annually discriminated hundreds of seismic events in the seismological catalog as surface explosions using this seismo-acoustic analysis. Besides surface explosions, the network has also detected infrasound signals from other sources, such as bolides, typhoons, rocket launches, and an underground nuclear test in and around the Korean Peninsula. In this study, ten years of seismo-acoustic data are reviewed with a recent infrasonic detection algorithm and association method, finally linked to the seismic monitoring system of KIGAM to increase the detection rate of surface explosions. We present the long-term results of the seismo-acoustic analysis, the detection capability of the multiple arrays, and implications for seismic source location.
Since the seismo-acoustic analysis has proved to be a definitive method to discriminate surface explosions, it will continue to be used for estimating natural seismicity and understanding infrasonic sources.
Performance analysis of landslide early warning systems at regional scale: the EDuMaP method
NASA Astrophysics Data System (ADS)
Piciullo, Luca; Calvello, Michele
2016-04-01
Landslide early warning systems (LEWSs) reduce landslide risk by disseminating timely and meaningful warnings when the level of risk is judged intolerably high. Two categories of LEWSs can be defined on the basis of their scale of analysis: "local" systems and "regional" systems. LEWSs at regional scale (ReLEWSs) are used to assess the probability of occurrence of landslides over appropriately defined homogeneous warning zones of relevant extension, typically through the prediction and monitoring of meteorological variables, in order to issue generalized warnings to the public. Despite many studies on ReLEWSs, no standard requirements exist for assessing their performance. Empirical evaluations are often carried out by simply analysing the time frames during which significant high-consequence landslides occurred in the test area. Alternatively, the performance evaluation is based on 2x2 contingency tables computed for the joint frequency distribution of landslides and alerts, both considered as dichotomous variables. In all these cases, model performance is assessed neglecting some important aspects peculiar to ReLEWSs, among which: the possible occurrence of multiple landslides in a warning zone; the duration of the warnings in relation to the time of occurrence of the landslides; the level of the warning issued in relation to the landslide spatial density in the warning zone; and the relative importance system managers attribute to different types of errors. An original approach, called the EDuMaP method, is proposed to assess the performance of landslide early warning models operating at regional scale. The method is composed of three main phases: Events analysis, Duration Matrix, Performance analysis. The events analysis phase focuses on the definition of landslide events (LEs) and warning events (WEs), which are derived from available landslide and warning databases according to their spatial and temporal characteristics by means of ten input parameters.
The evaluation of the time associated with the occurrence of landslide events (LEs) in relation to the occurrence of warning events (WEs) in their respective classes is a fundamental step in determining the duration matrix elements, while the classification of LEs and WEs establishes the structure of the matrix: the number of rows and columns is equal to the number of classes defined for the warning and landslide events, respectively. Thus the matrix is not a 2x2 contingency table, and LEs and WEs are not treated as dichotomous variables. The final phase of the method is the evaluation of the duration matrix based on a set of performance criteria assigning a performance meaning to the elements of the matrix. To this aim different criteria can be defined, for instance employing an alert classification scheme derived from 2x2 contingency tables or assigning a colour code to the elements of the matrix according to their grade of correctness. Finally, performance indicators can be derived from the performance criteria to quantify successes and errors of the early warning models. EDuMaP has already been applied to different real case studies, highlighting the adaptability of the method for analysing the performance of structurally different ReLEWSs.
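The duration-matrix construction can be sketched as follows (illustrative Python; representing each event as a (start, end, class) tuple is a simplifying assumption of this sketch, not the EDuMaP specification):

```python
import numpy as np

def duration_matrix(warning_events, landslide_events, n_w, n_l):
    """Build an EDuMaP-style duration matrix: element (i, j) accumulates
    the time during which a warning of class i was active while a
    landslide event of class j was occurring. Simplified sketch: each
    event is a (start, end, cls) tuple; overlap time is credited to
    the corresponding matrix element."""
    D = np.zeros((n_w, n_l))
    for ws, we, wc in warning_events:
        for ls, le, lc in landslide_events:
            overlap = max(0.0, min(we, le) - max(ws, ls))
            D[wc, lc] += overlap
    return D
```

Performance criteria then assign a meaning (correct alert, missed alarm, false alarm, etc.) to each element of this matrix rather than to cells of a 2x2 table.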
StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
NASA Astrophysics Data System (ADS)
Grund, Michael
2017-04-01
The SplitLab package (Wüstefeld et al., Computers and Geosciences, 2008), written in MATLAB, is a powerful and widely used tool for analysing seismological shear wave splitting from single-event measurements. However, in many cases, especially for temporary station deployments close to the coast or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-established main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows the user the most flexible processing of individual multi-event splitting measurements for a single recording station. Besides the functions provided by the plugin, no other external program is needed for the multi-event analyses, since StackSplit performs within the available SplitLab structure.
NASA Astrophysics Data System (ADS)
Moloshnikov, I. A.; Sboev, A. G.; Rybka, R. B.; Gydovskikh, D. V.
2016-02-01
We present a composite algorithm integrating, on one hand, an algorithm for finding documents on a given topic and, on the other hand, a method for evaluating the emotiveness of topical texts. This method is convenient for analysing opinions expressed in social media and, as a result, for automated analysis of event evolution in social media. Several examples of such analyses are demonstrated and discussed.
NASA Astrophysics Data System (ADS)
Uchide, T.; Shearer, P. M.
2009-12-01
Introduction Uchide and Ide [SSA Spring Meeting, 2009] proposed a new framework for studying the scaling and overall nature of earthquake rupture growth in terms of cumulative moment functions. For a better understanding of rupture growth processes, spatiotemporally local processes are also important. The nature of high-frequency (HF) radiation has been investigated for some time, but its role in the earthquake rupture process is still unclear. A wavelet analysis reveals that the HF radiation (e.g., 4 - 32 Hz) of the 2004 Parkfield earthquake is peaky, which implies that the sources of the HF radiation are isolated in space and time. We experiment with applying a matched filter analysis using small template events occurring near the target event's rupture area to test whether it can reveal the HF radiation sources of a regular large earthquake. Method We design a matched filter for multiple components and stations. Shelly et al. [2007] attempted to identify low-frequency earthquakes (LFEs) in non-volcanic tremor waveforms by stacking the correlation coefficients (CC) between the seismograms of the tremor and the LFEs. Differing from their method, our event detection indicator is the CC between the seismograms of the target and template events recorded at the same stations, since the key information for detecting the sources is the arrival-time differences and the amplitude ratios among stations. Data from both the target and template events are normalized by the maximum amplitude of the seismogram of the template event in the cross-correlation time window. This process accounts for the radiation pattern and the distance between the source and the stations. For each template event, high values in the CC time series suggest the possibility of HF radiation during the mainshock rupture from a location similar to that of the template event. Application to the 2004 Parkfield earthquake We apply the matched filter method to the 2004 Parkfield earthquake (Mw 6.0).
We use seismograms recorded at the 13 stations of UPSAR [Fletcher et al., 1992]. At each station both acceleration and velocity sensors are installed, so both large and small earthquakes are observable. We employ 184 earthquakes (M 2.0 - 3.5) as template events, using 0.5 s windows of the P waves on the vertical components and of the S waves on all three components. The data are bandpass-filtered between 4 and 16 Hz. One source is detected at 4 s and 12 km northwest of the hypocenter. Although the CC values are generally low, this peak is more than five times the standard deviation of the series and is thus remarkably high. This source is close to the secondary onset revealed by a back-projection analysis of 2 - 8 Hz data from Parkfield strong-motion stations [Allmann and Shearer, 2007]. While the back-projection approach images the peak of the HF radiation, our method detects its onset time, which is slightly different. Another source is located at 1.2 s and 2 km southeast of the hypocenter, which may correspond to deceleration of the initial rupture. Comparing the derived HF radiation sources to the whole rupture process will help us reveal general earthquake source dynamics.
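The station-wise detection indicator described above reduces to a normalized sliding cross-correlation of a short template waveform against the mainshock record, with detections flagged when a peak stands several standard deviations above the series mean (mirroring the five-sigma peak in the abstract). The following is a minimal single-channel sketch of that idea, not the authors' multi-station implementation; the function names and the threshold are ours.

```python
import math

def matched_filter_cc(target, template):
    """Slide a short template-event waveform along a longer target
    seismogram; return the normalized correlation coefficient series."""
    n = len(template)
    tmean = sum(template) / n
    t0 = [v - tmean for v in template]
    tnorm = math.sqrt(sum(v * v for v in t0))
    cc = []
    for i in range(len(target) - n + 1):
        w = target[i:i + n]
        wmean = sum(w) / n
        w0 = [v - wmean for v in w]
        wnorm = math.sqrt(sum(v * v for v in w0))
        if wnorm == 0.0 or tnorm == 0.0:
            cc.append(0.0)
            continue
        cc.append(sum(a * b for a, b in zip(w0, t0)) / (wnorm * tnorm))
    return cc

def detect_peaks(cc, nsigma=5.0):
    """Flag samples whose CC exceeds the series mean by nsigma std devs."""
    m = sum(cc) / len(cc)
    sd = math.sqrt(sum((v - m) ** 2 for v in cc) / len(cc))
    return [i for i, v in enumerate(cc) if v - m > nsigma * sd]
```

Because the Pearson correlation is amplitude-invariant, a CC of 1.0 appears wherever the target contains a scaled copy of the template, which is what lets small events act as templates for a much larger rupture.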
Computer-Aided Design Of Turbine Blades And Vanes
NASA Technical Reports Server (NTRS)
Hsu, Wayne Q.
1988-01-01
Quasi-three-dimensional method for determining aerothermodynamic configuration of turbine uses computer-interactive analysis and design and computer-interactive graphics. Design procedure executed rapidly so designer easily repeats it to arrive at best performance, size, structural integrity, and engine life. Sequence of events in aerothermodynamic analysis and design starts with engine-balance equations and ends with boundary-layer analysis and viscous-flow calculations. Analysis-and-design procedure interactive and iterative throughout.
Rejection of randomly coinciding events in ZnMoO4 scintillating bolometers
NASA Astrophysics Data System (ADS)
Chernyak, D. M.; Danevich, F. A.; Giuliani, A.; Mancuso, M.; Nones, C.; Olivieri, E.; Tenconi, M.; Tretyak, V. I.
2014-06-01
Random coincidences of events (particularly from two-neutrino double beta decay) could be one of the main sources of background in the search for neutrinoless double beta decay with cryogenic bolometers, due to their poor time resolution. Pulse-shape discrimination using front-edge analysis and the mean-time and χ2 methods was applied to discriminate randomly coinciding events in ZnMoO4 cryogenic scintillating bolometers. These events can be effectively rejected at the level of 99% by the analysis of the heat signals, with a rise time of about 14 ms and a signal-to-noise ratio of 900, and at the level of 92% by the analysis of the light signals, with a rise time of about 3 ms and a signal-to-noise ratio of 30, under the requirement to detect 95% of single events. These rejection efficiencies are compatible with extremely low background levels in the region of interest of neutrinoless double beta decay of 100Mo for enriched ZnMoO4 detectors, of the order of 10^-4 counts/(y keV kg). The pulse-shape parameters have been chosen on the basis of the performance of a real massive ZnMoO4 scintillating bolometer. The importance of the signal-to-noise ratio, of correctly finding the signal start, and of choosing an appropriate sampling frequency is discussed.
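The mean-time parameter used above is simply the amplitude-weighted mean arrival time of a digitized pulse; a random coincidence of two events adds late-arriving charge and shifts it to later times, which is what makes it a discriminator. A minimal sketch, with a hypothetical exponential pulse shape (decay constants and amplitudes are invented, not the bolometer's):

```python
import math

def mean_time(samples, dt=1.0):
    """Mean-time pulse-shape parameter:
    <t> = sum(f_i * t_i) / sum(f_i) over the digitized pulse."""
    total = sum(samples)
    return sum(i * dt * f for i, f in enumerate(samples)) / total

# Hypothetical single pulse: exponential decay after onset at t = 0.
single = [math.exp(-t / 20.0) for t in range(200)]
# Randomly coinciding event: a second, half-amplitude pulse 40 samples late.
piled = [f + (0.5 * math.exp(-(t - 40) / 20.0) if t >= 40 else 0.0)
         for t, f in enumerate(single)]
```

A cut on mean_time (together with rise-time and χ2-style shape tests, as in the abstract) then separates piled-up pulses from single ones.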
Monogamy on the Street: A Mixed Methods Study of Homeless Men
ERIC Educational Resources Information Center
Brown, Ryan A.; Kennedy, David P.; Tucker, Joan S.; Golinelli, Daniela; Wenzel, Suzanne L.
2013-01-01
In this study, we used a mixed methods approach to explore the determinants of relationship patterns and risky sex among homeless men living in downtown Los Angeles. This involved analysis of qualitative interviews focused on gender ideology and sexual events ("n" = 30) as well as structured interviews ("n" = 305) focused on…
Kapucu, Fikret E.; Välkki, Inkeri; Mikkonen, Jarno E.; Leone, Chiara; Lenk, Kerstin; Tanskanen, Jarno M. A.; Hyttinen, Jari A. K.
2016-01-01
Synchrony and asynchrony are essential aspects of the functioning of interconnected neuronal cells and networks. New information on neuronal synchronization can be expected to aid in understanding these systems. Synchronization provides insight into the functional connectivity and the spatial distribution of information processing in the networks. Synchronization is generally studied with time-domain analysis of neuronal events, or using direct frequency-spectrum analysis, e.g., in specific frequency bands. However, these methods have their pitfalls. Thus, we have previously proposed a method to analyze temporal changes in the complexity of the frequency content of signals originating from different network regions. The method is based on the correlation of time-varying spectral entropies (SEs). SE assesses the regularity, or complexity, of a time series by quantifying the uniformity of the frequency spectrum distribution. It has previously been employed, e.g., in electroencephalogram analysis. Here, we revisit our correlated spectral entropy method (CorSE), providing evidence of its justification, usability, and benefits. CorSE is assessed with simulations and with in vitro microelectrode array (MEA) data. It is first demonstrated with a specifically tailored toy simulation to illustrate how it can identify synchronized populations. To provide a form of validation, the method was tested with simulated data from integrate-and-fire-model-based computational neuronal networks. To demonstrate the analysis of real data, CorSE was applied to in vitro MEA data measured from rat cortical cell cultures, and the results were compared with three known event-based synchronization measures. Finally, we show its usability by tracking the development of networks in dissociated mouse cortical cell cultures. The results show that temporal correlations in frequency spectrum distributions reflect the network relations of neuronal populations.
In the simulated data, CorSE unraveled the synchronizations. With the real in vitro MEA data, CorSE produced biologically plausible results. Since CorSE analyses continuous data, it is not affected by possibly poor spike or other event detection quality. We conclude that CorSE can reveal neuronal network synchronization based on in vitro MEA field potential measurements, and we expect it to be equally applicable to the analysis of corresponding in vivo and ex vivo data. PMID:27803660
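The core of CorSE can be condensed into two steps: a windowed spectral entropy (the uniformity of the power spectrum, normalized to [0, 1]) and a Pearson correlation between the SE time series of two channels. The following is our own minimal sketch of that idea, not the published implementation; the window length and test signals are arbitrary choices.

```python
import cmath, math

def spectral_entropy(x):
    """Normalized Shannon entropy of the one-sided power spectrum:
    near 0 for a pure tone, near 1 for a flat (complex) spectrum."""
    n = len(x)
    power = []
    for k in range(1, n // 2 + 1):  # plain DFT; fine for short windows
        s = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        power.append(abs(s) ** 2)
    total = sum(power)
    p = [v / total for v in power]
    h = -sum(v * math.log(v) for v in p if v > 0.0)
    return h / math.log(len(p))

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def corse(sig_a, sig_b, win):
    """Correlated spectral entropy of two channels: the correlation of
    their time-varying windowed spectral entropies."""
    se_a = [spectral_entropy(sig_a[i:i + win])
            for i in range(0, len(sig_a) - win + 1, win)]
    se_b = [spectral_entropy(sig_b[i:i + win])
            for i in range(0, len(sig_b) - win + 1, win)]
    return pearson(se_a, se_b)
```

Two channels whose spectral complexity rises and falls together score near +1 even when their raw waveforms differ, which is why the method does not depend on spike detection.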
A data skimming service for locally resident analysis data
NASA Astrophysics Data System (ADS)
Cranshaw, J.; Gardner, R. W.; Gieraltowski, J.; Malon, D.; Mambelli, M.; May, E.
2008-07-01
A Data Skimming Service (DSS) is a site-level service for rapid event filtering and selection from locally resident datasets, based on metadata queries to associated 'tag' databases. In US ATLAS, we expect most if not all of the AOD-based datasets to be replicated to each of the five Tier 2 regional facilities in the US Tier 1 'cloud' coordinated by Brookhaven National Laboratory. Entire datasets will consist of on the order of several terabytes of data, and providing easy, quick access to skimmed subsets of these data will be vital to physics working groups. Typically, physicists will be interested in portions of the complete datasets, selected according to event-level attributes (number of jets, missing Et, etc.) and content (specific analysis objects for subsequent processing). In this paper we describe methods used to classify data (metadata tag generation) and to store these results in a local database. Next we discuss a general framework which includes methods for accessing this information, defining skims, specifying event output content, accessing locally available storage through a variety of interfaces (SRM, dCache/dccp, gridftp), accessing remote storage elements as specified, and user job submission through local or grid schedulers. The advantages of the DSS are the ability to quickly 'browse' datasets and design skims, for example pre-adjusting cuts to reach a desired skim level with minimal use of compute resources, and to encode these analysis operations in a database for re-analysis and archival purposes. Additionally, the framework has provisions to operate autonomously in the event that external, central resources are not available, and to provide, as a reduced package, a minimal skimming service tailored to the needs of small Tier 3 centres or individual users.
Atighechian, Golrokh; Maleki, Mohammadreza; Aryankhesal, Aidin; Jahangiri, Katayoun
2016-01-01
Introduction: Oil spills in fresh water can affect ecological processes and can accordingly influence human health. Iran, due to having 58.8% of the world's oil reserves, is highly vulnerable to water contamination by oil products. Aim: The aim of this study was to determine the environmental factors affecting the management of an oil spill into one of the rivers of Iran, using the PESTLE analysis. Material and methods: This was a qualitative case study conducted in 2015 on an oil spill incident in Iran and its roots, from a disaster management approach. Semi-structured interviews were conducted for data collection. Seventy managers and staff members responsible for or involved in the management of the oil spill incident were recruited to the study. A qualitative content analysis approach was employed for the data analysis, and document analysis was used to collect additional information. Results: The findings of the present study indicated that different factors affected the management of the oil spill into one of the central rivers and, consequently, the management of drinking water resources. Using this analysis, managers can plan for such events and develop scenarios for them, in order to perform better in future events. PMID:27698608
Space Frontiers for New Pedagogies: A Tale of Constraints and Possibilities
ERIC Educational Resources Information Center
Jessop, Tansy; Gubby, Laura; Smith, Angela
2012-01-01
This article draws together two linked studies on formal teaching spaces within one university. The first consisted of a multi-method analysis, including observations of four teaching events, interviews with academics and estates staff, analysis of architectural plans, and a talking campus tour. The second study surveyed 166 students about their…
NASA Astrophysics Data System (ADS)
Felipe-Sesé, Luis; López-Alba, Elías; Siegmann, Philip; Díaz, Francisco A.
2016-12-01
A low-cost approach for three-dimensional (3-D) full-field displacement measurement is applied to the analysis of large displacements involved in two different mechanical events. The method is based on a combination of fringe projection (FP) and two-dimensional digital image correlation (DIC) techniques. The two techniques are employed simultaneously using an RGB camera and a color-encoding method; it is therefore possible to measure in-plane and out-of-plane displacements at the same time with only one camera, even at high speed rates. The potential of the proposed methodology has been demonstrated through the analysis of large displacements during contact experiments on a soft material block. Displacement results have been successfully compared with those obtained using a commercial 3D-DIC system. Moreover, the analysis of displacements during an impact test on a metal plate was performed to emphasize the application of the methodology to dynamic events. Results show a good level of agreement, highlighting the potential of FP + 2D DIC as a low-cost alternative for the analysis of large-deformation problems.
Logistic regression applied to natural hazards: rare event logistic regression with replications
NASA Astrophysics Data System (ADS)
Guns, M.; Vanacker, V.
2012-06-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
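The replication idea can be sketched as follows: keep every rare event, repeatedly draw a random subsample of the abundant non-events, refit the logistic model on each replication, and aggregate the coefficients (the full method additionally tracks which variables are selected robustly across replications). This is a schematic of our own under assumed data structures, not the authors' code; the subsample size, averaging rule, and optimizer settings are illustrative choices.

```python
import math, random

def fit_logistic(X, y, iters=500, lr=0.5):
    """Minimal logistic regression via gradient descent.
    Returns [intercept, w_1, ..., w_d]."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)
    for _ in range(iters):
        g = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            g[0] += err
            for j in range(d):
                g[j + 1] += err * xi[j]
        w = [wj - lr * gj / n for wj, gj in zip(w, g)]
    return w

def replicated_logistic(events, non_events, n_rep=50, seed=1):
    """Rare-event logistic regression with replications (sketch): keep all
    rare events, subsample the non-events n_rep times, refit each time,
    and average the fitted coefficients."""
    rng = random.Random(seed)
    coefs = []
    for _ in range(n_rep):
        sub = rng.sample(non_events, len(events))
        X = list(events) + sub
        y = [1] * len(events) + [0] * len(sub)
        coefs.append(fit_logistic(X, y))
    d = len(coefs[0])
    return [sum(c[j] for c in coefs) / n_rep for j in range(d)]
```

The spread of each coefficient across replications is exactly the sample-dependence the paper warns about; averaging (or selecting only consistently significant predictors) is what makes the final model robust.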
Hoffmann, Susanne; Frei, Irena Anna
2017-01-01
Background: Analysing adverse events is an effective patient safety measure. Aim: We show how clinical nurse specialists have been enabled to analyse adverse events with the „Learning from Defects-Tool“ (LFD-Tool). Method: Our multi-component implementation strategy addressed both the safety knowledge of clinical nurse specialists and their attitude towards patient safety. The culture of practice development was taken into account. Results: Clinical nurse specialists attribute their competency building in patient safety to the application of the LFD-Tool. Applying the tool fosters the reflection of adverse events in care teams. Conclusion: Applying the „Learning from Defects-Tool“ promotes work-based learning, and analysing adverse events with it contributes to the safety culture in a hospital.
Pealing, Louise; Perel, Pablo; Prieto-Merino, David; Roberts, Ian
2012-01-01
Background Vascular occlusive events can complicate recovery following trauma. We examined risk factors for venous and arterial vascular occlusive events in trauma patients and the extent to which the risk of vascular occlusive events varies with the severity of bleeding. Methods and Findings We conducted a cohort analysis using data from a large international, double-blind, randomised, placebo-controlled trial (The CRASH-2 trial) [1]. We studied the association between patient demographic and physiological parameters at hospital admission and the risk of vascular occlusive events. To assess the extent to which risk of vascular occlusive events varies with severity of bleeding, we constructed a prognostic model for the risk of death due to bleeding and assessed the relationship between risk of death due to bleeding and risk of vascular occlusive events. There were 20,127 trauma patients with outcome data including 204 (1.01%) patients with a venous event (pulmonary embolism or deep vein thrombosis) and 200 (0.99%) with an arterial event (myocardial infarction or stroke). There were 81 deaths due to vascular occlusive events. Increasing age, decreasing systolic blood pressure, increased respiratory rates, longer central capillary refill times, higher heart rates and lower Glasgow Coma Scores (all p<0.02) were strong risk factors for venous and arterial vascular occlusive events. Patients with more severe bleeding as assessed by predicted risk of haemorrhage death had a greatly increased risk for all types of vascular occlusive event (all p<0.001). Conclusions Patients with severe traumatic bleeding are at greatly increased risk of venous and arterial vascular occlusive events. Older age and blunt trauma are also risk factors for vascular occlusive events. Effective treatment of bleeding may reduce venous and arterial vascular occlusive complications in trauma patients. PMID:23251374
Zhou, Bing-Yang; Guo, Yuan-Lin; Wu, Na-Qiong; Zhu, Cheng-Gang; Gao, Ying; Qing, Ping; Li, Xiao-Lin; Wang, Yao; Dong, Qian; Liu, Geng; Xu, Rui Xia; Cui, Chuan-Jue; Sun, Jing; Li, Jian-Jun
2017-03-01
Big endothelin-1 (big ET-1) has been proposed as a novel prognostic indicator in acute coronary syndrome, while its role in predicting cardiovascular outcomes in patients with stable coronary artery disease (CAD) is unclear. A total of 3154 consecutive patients with stable CAD were enrolled and followed up for 24 months. The outcomes included all-cause death, non-fatal myocardial infarction, stroke and unplanned revascularization (percutaneous coronary intervention and coronary artery bypass grafting). Baseline big ET-1 was measured using a sandwich enzyme immunoassay method. Cox proportional hazards regression analysis and Kaplan-Meier analysis were used to evaluate the prognostic value of big ET-1 for cardiovascular outcomes. One hundred and eighty-nine (5.99%) events occurred during follow-up. Patients were divided into two groups: an events group (n=189) and a non-events group (n=2965). The events group had higher levels of big ET-1 than the non-events group. Multivariable Cox proportional hazards regression analysis showed that big ET-1 was positively and significantly associated with clinical outcomes (hazard ratio: 1.656, 95% confidence interval: 1.099-2.496, p=0.016). Additionally, Kaplan-Meier analysis revealed that patients with higher big ET-1 had lower event-free survival (p=0.016). The present study is the first to suggest that big ET-1 is an independent risk marker of cardiovascular outcomes in patients with stable CAD; more studies are needed to confirm our findings.
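The Kaplan-Meier event-free survival compared above is built by multiplying conditional survival probabilities at each observed event time, with censored patients leaving the risk set. A minimal sketch with hypothetical follow-up data (not the study's):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier event-free survival curve.
    times:  follow-up time of each patient
    events: 1 if the outcome occurred at that time, 0 if censored
    Returns [(event_time, survival_probability), ...]."""
    event_times = sorted(set(t for t, e in zip(times, events) if e == 1))
    s = 1.0
    curve = []
    for t in event_times:
        at_risk = sum(1 for ti in times if ti >= t)          # still in risk set
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        s *= 1.0 - d / at_risk                               # conditional survival
        curve.append((t, s))
    return curve
```

Comparing two such curves (e.g., for high versus low big ET-1) with a log-rank test is the usual next step; the hazard ratio itself comes from the Cox model.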
Avalanche Statistics Identify Intrinsic Stellar Processes near Criticality in KIC 8462852
NASA Astrophysics Data System (ADS)
Sheikh, Mohammed A.; Weaver, Richard L.; Dahmen, Karin A.
2016-12-01
The star KIC8462852 (Tabby's star) has shown anomalous drops in light flux. We perform a statistical analysis of the more numerous smaller dimming events by using methods found useful for avalanches in ferromagnetism and plastic flow. Scaling exponents for avalanche statistics and temporal profiles of the flux during the dimming events are close to mean field predictions. Scaling collapses suggest that this star may be near a nonequilibrium critical point. The large events are interpreted as avalanches marked by modified dynamics, limited by the system size, and not within the scaling regime.
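The scaling exponents mentioned above are typically estimated by maximum likelihood on the tail of the event-size distribution: for a continuous power law p(s) ∝ s^(-α) with s ≥ s_min, the estimator has the closed form α̂ = 1 + n / Σ ln(s_i / s_min) (the Clauset-style MLE). A sketch with synthetic sizes; the specific exponent and s_min here are illustrative, not the paper's fitted values.

```python
import math

def powerlaw_mle(sizes, s_min):
    """Continuous maximum-likelihood estimate of the power-law exponent
    alpha for event sizes s >= s_min, assuming p(s) ~ s^(-alpha):
        alpha_hat = 1 + n / sum(ln(s_i / s_min))"""
    tail = [s for s in sizes if s >= s_min]
    return 1.0 + len(tail) / sum(math.log(s / s_min) for s in tail)
```

In practice one also needs a principled choice of s_min and a goodness-of-fit test before reading the exponent as evidence of critical scaling.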
Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen
2014-05-16
To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products.
Improved pulse shape discrimination in EJ-301 liquid scintillators
NASA Astrophysics Data System (ADS)
Lang, R. F.; Masson, D.; Pienaar, J.; Röttger, S.
2017-06-01
Digital pulse shape discrimination has become readily available to distinguish nuclear-recoil and electronic-recoil events in scintillation detectors. We evaluate digital implementations of pulse shape discrimination algorithms discussed in the literature, namely the Charge Comparison Method, Pulse-Gradient Analysis, Fourier Series and Standard Event Fitting. In addition, we present a novel algorithm based on a Laplace transform. Instead of comparing the performance of these algorithms based on a single figure of merit, we evaluate them as a function of recoil energy. Specifically, using commercial EJ-301 liquid scintillators, we examine both the resulting acceptance of nuclear recoils at a given rejection level of electronic recoils and the purity of the selected nuclear-recoil event samples. We find that both a Standard Event fit and a Laplace transform can be used to significantly improve the discrimination capabilities over the whole considered energy range of 0 - 800 keVee. Furthermore, we show that the Charge Comparison Method performs poorly in accurately identifying nuclear recoils.
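Of the algorithms evaluated, the Charge Comparison Method is the simplest: integrate the delayed 'tail' of the digitized pulse and divide by the total integrated charge. Nuclear recoils in EJ-301 excite a larger slow scintillation component, so they yield a larger ratio. A toy sketch; the pulse shapes, decay constants, and gate position below are hypothetical, not calibrated EJ-301 values.

```python
import math

def charge_comparison(pulse, tail_start):
    """Charge Comparison Method: tail-to-total charge ratio of a pulse."""
    return sum(pulse[tail_start:]) / sum(pulse)

# Hypothetical pulse shapes: both event types share a fast component;
# the nuclear recoil carries an extra slow component.
gamma_like = [math.exp(-t / 3.0) for t in range(100)]
neutron_like = [math.exp(-t / 3.0) + 0.3 * math.exp(-t / 30.0)
                for t in range(100)]
```

A single cut on this ratio is what the paper benchmarks against the Standard Event fit and the Laplace-transform approach, which exploit the full pulse shape instead of two integrals.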
NASA Astrophysics Data System (ADS)
Huamán Bustamante, Samuel G.; Cavalcanti Pacheco, Marco A.; Lazo Lazo, Juan G.
2018-07-01
The method we propose in this paper seeks to estimate interface displacements among strata related to reflection seismic events, relative to the interfaces at other reference points. To do so, we search for reflection events at the reference point of a second seismic trace taken from the same 3D survey, close to a well. However, the nature of the seismic data introduces uncertainty into the results; therefore, we perform an uncertainty analysis using the standard deviation of the results from several experiments with cross-correlation of signals. To estimate the displacements of events in depth between two seismic traces, we create a synthetic seismic trace with an empirical wavelet and the sonic log of the well close to the second seismic trace. Then, we relate the events of the seismic traces to the depth of the sonic log. Finally, we test the method with data from the Namorado Field in Brazil. The results show that the accuracy of the estimated event depth depends on the results of the parallel cross-correlations, primarily those from the procedures used in the integration of the seismic data with data from the well. The proposed approach can correctly identify several similar events in two seismic traces without requiring all the seismic traces between two distant points of interest in order to correlate strata in the subsurface.
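At its core, the displacement estimate between two traces reduces to finding the lag that maximizes their cross-correlation, and the uncertainty analysis described above repeats that measurement over several perturbed realizations and reports the standard deviation. A self-contained sketch of that loop; the trace content, noise level, and number of experiments are invented for illustration.

```python
import random

def best_lag(a, b, max_lag):
    """Lag (in samples) of b relative to a that maximizes their
    cross-correlation."""
    best, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        v = sum(a[i] * b[i + lag] for i in range(len(a))
                if 0 <= i + lag < len(b))
        if v > best_val:
            best, best_val = lag, v
    return best

def lag_with_uncertainty(a, b, max_lag, noise, n_exp=20, seed=0):
    """Repeat the correlation with noise-perturbed copies of the second
    trace; return the mean and standard deviation of the estimated lag."""
    rng = random.Random(seed)
    lags = []
    for _ in range(n_exp):
        bn = [x + rng.gauss(0.0, noise) for x in b]
        lags.append(best_lag(a, bn, max_lag))
    m = sum(lags) / n_exp
    sd = (sum((l - m) ** 2 for l in lags) / n_exp) ** 0.5
    return m, sd
```

A small standard deviation indicates a stable, well-resolved event match; a large one flags events whose depth tie to the sonic log should not be trusted.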
An Instructor's Diagnostic Aid for Feedback in Training.
ERIC Educational Resources Information Center
Andrews, Dee H.; Uliano, Kevin C.
1988-01-01
Instructor's Diagnostic Aid for Feedback in Training (IDAFT) is a computer-assisted method based on error analysis, domains of learning, and events of instruction. Its use with Navy team instructors is currently being explored. (JOW)
Prejudice Events and Traumatic Stress among Heterosexuals and Lesbians, Gay Men and Bisexuals
Alessi, Edward J.; Martin, James I.; Gyamerah, Akua; Meyer, Ilan H.
2013-01-01
This mixed-methods study examined associations between prejudice events and posttraumatic stress disorder (PTSD) among 382 lesbians, gays, and bisexuals (LGB) and 126 heterosexuals. Using the Composite International Diagnostic Interview, we assessed PTSD with a relaxed Criterion A1; that is, we allowed events that did not involve threat to life or physical integrity to also qualify as traumatic. We first assessed whether exposure to prejudice-related qualifying events differed with respect to participants’ sexual orientation and race. We found that White LGBs were more likely than White heterosexuals to encounter a prejudice-related qualifying event, and among LGBs, Black and Latino LGBs were no more likely than White LGBs to experience this type of event. We then used qualitative analysis of participants’ brief narratives to examine prejudice events that precipitated Relaxed Criterion A1 PTSD among 8 participants. Two themes emerged: (a) the need to make major changes and (b) compromised sense of safety and security following exposure to the prejudice event. PMID:24415898
Rejection of randomly coinciding 2ν2β events in ZnMoO4 scintillating bolometers
NASA Astrophysics Data System (ADS)
Chernyak, D. M.; Danevich, F. A.; Giuliani, A.; Mancuso, M.; Nones, C.; Olivieri, E.; Tenconi, M.; Tretyak, V. I.
2014-01-01
Random coincidence of 2ν2β decay events could be one of the main sources of background for 0ν2β decay searches in cryogenic bolometers due to their poor time resolution. Pulse-shape discrimination using front-edge analysis and the mean-time and χ2 methods was applied to discriminate randomly coinciding 2ν2β events in ZnMoO4 cryogenic scintillating bolometers. The background can be effectively rejected at the level of 99% by the mean-time analysis of heat signals with a rise time of about 14 ms and a signal-to-noise ratio of 900, and at the level of 98% for light signals with a 3 ms rise time and a signal-to-noise ratio of 30 (under a requirement to detect 95% of single events). The importance of the signal-to-noise ratio, of correctly finding the signal start, and of choosing an appropriate sampling frequency is discussed.
High-speed extended-term time-domain simulation for online cascading analysis of power system
NASA Astrophysics Data System (ADS)
Fu, Chuan
A high-speed extended-term (HSET) time-domain simulator (TDS), intended to become part of an energy management system (EMS), has been newly developed for use in online extended-term dynamic cascading analysis of power systems. HSET-TDS includes the following attributes for providing situational awareness of high-consequence events: (i) online analysis, including n-1 and n-k events; (ii) the ability to simulate both fast and slow dynamics 1-3 hours in advance; (iii) rigorous protection-system modeling; (iv) intelligence for corrective-action identification, storage, and fast retrieval; and (v) high-speed execution. Very fast online computational capability is the most desired attribute of this simulator. Based on the process of solving the differential-algebraic equations describing the dynamics of a power system, HSET-TDS seeks computational efficiency at each of the following hierarchical levels: (i) hardware, (ii) strategies, (iii) integration methods, (iv) nonlinear solvers, and (v) linear solver libraries. This thesis first describes the Hammer-Hollingsworth 4 (HH4) implicit integration method. Like the trapezoidal rule, HH4 is symmetrically A-stable, but it possesses greater high-order precision (h^4) than the trapezoidal rule. Such precision enables larger integration steps and therefore improves simulation efficiency for variable-step-size implementations. This thesis provides the underlying theory on which we advocate the use of HH4 over other numerical integration methods for power system time-domain simulation. Second, motivated by the need to perform high-speed extended-term time-domain simulation for online purposes, this thesis presents principles for designing numerical solvers of the differential-algebraic systems associated with power system time-domain simulation, including DAE construction strategies (Direct Solution Method), integration methods (HH4), nonlinear solvers (Very Dishonest Newton), and linear solvers (SuperLU).
We have implemented a design appropriate for HSET-TDS, and we compare it to various solvers, including the commercial-grade PSSE program, with respect to computational efficiency and accuracy, using as examples the New England 39-bus system, the expanded 8775-bus system, and the 13029-bus PJM system. Third, we have explored a stiffness-decoupling method, intended to be part of a parallel design of time-domain simulation software for supercomputers. The stiffness-decoupling method combines the advantages of implicit methods (A-stability) and explicit methods (less computation). With the new stiffness detection method proposed herein, the stiffness can be captured. The expanded 975-bus system is used to test simulation efficiency. Finally, several parallel strategies for supercomputer deployment to simulate power system dynamics are proposed and compared. Design A partitions the task by scale using the stiffness-decoupling method, waveform relaxation, and a parallel linear solver. Design B partitions the task along the time axis using a highly precise integration method, the Kuntzmann-Butcher method of order 8 (KB8). The strategy of partitioning events is designed to partition the whole simulation along the time axis through a simulated sequence of cascading events. Among the strategies proposed, the strategy of partitioning cascading events is recommended, since the sub-tasks for each processor are totally independent and therefore minimum communication time is needed.
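The HH4 tableau itself is not given in the abstract, but the class of method it belongs to is easy to illustrate: an A-stable implicit integration step whose nonlinear stage equation is solved by Newton iteration. The sketch below uses the simpler trapezoidal rule (order h^2, the baseline HH4 improves on at order h^4) on a scalar ODE; it is our own illustration, not code from the thesis.

```python
def trapezoidal_step(f, dfdy, t, y, h, newton_iters=8):
    """One step of the implicit, A-stable trapezoidal rule for y' = f(t, y),
    solving  y_new = y + (h/2) * (f(t, y) + f(t + h, y_new))
    for y_new with Newton's method (dfdy is the Jacobian df/dy)."""
    fn = f(t, y)
    ynew = y + h * fn                        # explicit Euler predictor
    for _ in range(newton_iters):
        g = ynew - y - 0.5 * h * (fn + f(t + h, ynew))
        dg = 1.0 - 0.5 * h * dfdy(t + h, ynew)
        ynew -= g / dg
    return ynew
```

A-stability is what lets such a step remain bounded on stiff dynamics (fast protection and machine modes) even with step sizes far larger than the fastest time constant; the "Very Dishonest Newton" strategy mentioned above amounts to reusing the factorized Jacobian across many such Newton solves.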
Analysis of labour accidents in tunnel construction and introduction of prevention measures
KIKKAWA, Naotaka; ITOH, Kazuya; HORI, Tomohito; TOYOSAWA, Yasuo; ORENSE, Rolando P.
2015-01-01
At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents that possess the characteristics of a rock fall event at a work site. We also introduce accident prevention measures against rock fall events. PMID:26027707
MIRAP, microcomputer reliability analysis program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jehee, J.N.T.
1989-01-01
A program for a microcomputer is outlined that can determine minimal cut sets from a specified fault tree logic. The speed and memory limitations of the microcomputers on which the program is implemented (Atari ST and IBM) are addressed by reducing the fault tree's size and by storing the cut set data on disk. Extensive, well-proven fault tree restructuring techniques, such as the identification of sibling events and of independent gate events, reduce the fault tree's size but do not alter its logic. New methods are used for the Boolean reduction of the fault tree logic. Special criteria for combining events in the 'AND' and 'OR' logic avoid the creation of many subsuming cut sets which would all cancel out due to existing cut sets. Figures and tables illustrate these methods. 4 refs., 5 tabs.
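The top-down expansion plus Boolean absorption that such a program performs can be sketched compactly: 'OR' gates concatenate their children's cut sets, 'AND' gates take unions across the children, and any cut set that is a superset of another is subsumed and dropped. This is a small in-memory illustration of the general technique, not MIRAP's disk-backed implementation.

```python
def cut_sets(gate, tree):
    """Expand a fault tree into its (possibly non-minimal) cut sets.
    tree maps gate name -> ('AND' | 'OR', [children]);
    names absent from tree are basic events."""
    if gate not in tree:
        return [{gate}]
    op, children = tree[gate]
    child_sets = [cut_sets(c, tree) for c in children]
    if op == 'OR':
        return [s for cs in child_sets for s in cs]
    # AND: union every combination of the children's cut sets
    result = [set()]
    for cs in child_sets:
        result = [r | s for r in result for s in cs]
    return result

def minimize(sets):
    """Boolean absorption: drop any cut set that contains another."""
    minimal = []
    for s in sorted(sets, key=len):
        if not any(m <= s for m in minimal):
            minimal.append(s)
    return minimal
```

The combinatorial blow-up of the 'AND' expansion is exactly why the abstract's special combining criteria (avoiding subsuming cut sets before they are ever created) matter on memory-limited machines.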
Statistical analysis of experimental multifragmentation events in 64Zn+112Sn at 40 MeV/nucleon
NASA Astrophysics Data System (ADS)
Lin, W.; Zheng, H.; Ren, P.; Liu, X.; Huang, M.; Wada, R.; Chen, Z.; Wang, J.; Xiao, G. Q.; Qu, G.
2018-04-01
A statistical multifragmentation model (SMM) is applied to the experimentally observed multifragmentation events in an intermediate-energy heavy-ion reaction. Using the temperature and symmetry energy extracted with the isobaric yield ratio (IYR) method based on the modified Fisher model (MFM), SMM is applied to the reaction 64Zn+112Sn at 40 MeV/nucleon. The experimental isotope and mass distributions of the primary reconstructed fragments are compared with the SMM results without an afterburner and are well reproduced. The extracted temperature T and symmetry energy coefficient asym from SMM-simulated events, using the IYR method, are also consistent with those from the experiment. These results strongly suggest that in the multifragmentation process there is a freeze-out volume, in which thermal and chemical equilibrium is established before or at the time of intermediate-mass fragment emission.
Vossen, Catherine J.; Vossen, Helen G. M.; Marcus, Marco A. E.; van Os, Jim; Lousberg, Richel
2013-01-01
In analyzing time-locked event-related potentials (ERPs), many studies have focused on specific peaks and their differences between experimental conditions. In theory, each latency point after a stimulus contains potentially meaningful information, regardless of whether it is peak-related. Based on this assumption, we introduce a new concept which allows for flexible investigation of the whole epoch and does not primarily focus on peaks and their corresponding latencies. For each trial, the entire epoch is partitioned into event-related fixed-interval areas under the curve (ERFIAs). These ERFIAs, obtained at the single-trial level, act as dependent variables in a multilevel random regression analysis. The ERFIA multilevel method was tested in an existing ERP dataset of 85 healthy subjects, who underwent a rating paradigm of 150 painful and non-painful somatosensory electrical stimuli. We modeled the variability of each consecutive ERFIA with a set of predictor variables, among which were stimulus intensity and stimulus number. Furthermore, we corrected for latency variations of the P2 (260 ms). With respect to known relationships between stimulus intensity, habituation, and the pain-related somatosensory ERP, the ERFIA method generated results highly comparable to those of commonly used methods. Notably, effects of stimulus intensity and habituation were also observed in non-peak-related latency ranges. Further, cortical processing of the actual stimulus intensity depended on the intensity of the previous stimulus, which may reflect pain-memory processing. In conclusion, the ERFIA multilevel method is a promising tool that can be used to study event-related cortical processing. PMID:24224018
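The partitioning step, slicing a single-trial epoch into fixed-interval areas under the curve, can be sketched as below. This is a minimal sketch under assumed conventions (trapezoidal integration, signed areas, a sampling rate in Hz, window length in ms); the paper does not specify these implementation details.

```python
def erfia(epoch, fs, interval_ms):
    """Partition one trial's epoch (amplitude samples at fs Hz) into
    event-related fixed-interval areas under the curve (ERFIAs).
    Signed area per window via the trapezoidal rule; any trailing
    partial window is dropped. Illustrative sketch only."""
    samples_per_win = int(fs * interval_ms / 1000)
    areas = []
    for start in range(0, len(epoch) - samples_per_win + 1, samples_per_win):
        win = epoch[start:start + samples_per_win]
        # trapezoids between consecutive samples, spacing = 1/fs seconds
        area = sum((a + b) / 2 for a, b in zip(win, win[1:])) / fs
        areas.append(area)
    return areas
```

Each element of the returned list would then serve as one dependent variable in the multilevel regression, with trial-level predictors such as stimulus intensity and stimulus number.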
Crowther, Michael J; Look, Maxime P; Riley, Richard D
2014-09-28
Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models, including the exponential, Weibull, and Gompertz proportional hazards (PH) models and the log-logistic, log-normal, and generalized gamma accelerated failure time models, to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent) effects. Maximum likelihood is used to estimate the models, utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and an IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
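The Gauss-Hermite quadrature step these models rely on, integrating a normally distributed random effect out of the likelihood, can be illustrated in isolation. This is a generic sketch of the quadrature rule, not the authors' Stata implementation; the 5-point nodes and weights are standard tabulated values, and the function names are assumptions.

```python
import math

# Standard 5-point Gauss-Hermite nodes and weights for weight e^{-x^2}
GH_NODES = [-2.0201828704560856, -0.9585724646138185, 0.0,
            0.9585724646138185, 2.0201828704560856]
GH_WEIGHTS = [0.019953242059045913, 0.3936193231522412, 0.9453087204829419,
              0.3936193231522412, 0.019953242059045913]

def expect_normal(g, sigma):
    """Approximate E[g(b)] for b ~ N(0, sigma^2) by Gauss-Hermite
    quadrature: substitute b = sqrt(2)*sigma*x so the Gaussian density
    becomes the e^{-x^2} weight, then normalize by sqrt(pi)."""
    return sum(w * g(math.sqrt(2) * sigma * x)
               for x, w in zip(GH_NODES, GH_WEIGHTS)) / math.sqrt(math.pi)
```

In a frailty model, `g` would be the conditional likelihood of one cluster given its random effect `b`; the same rule extends to adaptive quadrature by recentering the nodes at each cluster's mode.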
Malicki, Julian; Bly, Ritva; Bulot, Mireille; Godet, Jean-Luc; Jahnen, Andreas; Krengli, Marco; Maingon, Philippe; Prieto Martin, Carlos; Przybylska, Kamila; Skrobała, Agnieszka; Valero, Marc; Jarvinen, Hannu
2017-04-01
To describe the current status of implementation of European directives for risk management in radiotherapy and to assess variability in risk management in the following areas: (1) in-country regulatory framework; (2) proactive risk assessment; (3) reactive analysis of events; and (4) reporting and learning systems. The original data were collected as part of the ACCIRAD project through two online surveys. Risk assessment criteria are closely associated with quality assurance programs. Only 9/32 responding countries (28%) with national regulations reported clear "requirements" for proactive risk assessment and/or reactive risk analysis, with wide variability in assessment methods. Reporting of adverse error events is mandatory in most (70%) but not all surveyed countries. Most European countries have taken steps to implement European directives designed to reduce the probability and magnitude of accidents in radiotherapy. Variability between countries is substantial in terms of legal frameworks, tools used to conduct proactive risk assessment and reactive analysis of events, and in the reporting and learning systems utilized. These findings underscore the need for greater harmonisation in common terminology, classification and reporting practices across Europe to improve patient safety and to enable more reliable inter-country comparisons. Copyright © 2017 Elsevier B.V. All rights reserved.
Reproducing an extreme flood with uncertain post-event information
NASA Astrophysics Data System (ADS)
Fuentes-Andino, Diana; Beven, Keith; Halldin, Sven; Xu, Chong-Yu; Reynolds, José Eduardo; Di Baldassarre, Giuliano
2017-07-01
Studies for the prevention and mitigation of floods require information on discharge and extent of inundation, which is commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. In this study we hypothesized that it is possible to estimate, in a trustworthy way considering large data uncertainties, this extreme 1998 flood discharge and the extent of the inundations that followed from a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum-Cunge-Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain. Identification of these locations is useful to improve the model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak, and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa in spite of no hydrometric gauging during the event.
The method proposed here can be part of a Bayesian framework in which more events can be added into the analysis as they become available.
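The GLUE framework used above can be sketched generically: sample parameter sets from prior ranges, score each simulation with a likelihood measure, and retain only the "behavioral" sets above a threshold, whose weighted predictions form the uncertainty bounds. This is a minimal illustrative sketch, not the authors' TOPMODEL/LISFLOOD-FP setup; the Nash-Sutcliffe-style likelihood and the 0.3 threshold are assumptions.

```python
import random

def glue(simulate, observed, priors, n=5000, threshold=0.3, seed=1):
    """Minimal GLUE sketch: Monte Carlo sample parameter sets from
    uniform priors, score each simulated series against observations
    with a Nash-Sutcliffe-style likelihood, and keep the 'behavioral'
    sets whose likelihood exceeds the threshold."""
    rng = random.Random(seed)
    mean_obs = sum(observed) / len(observed)
    var_obs = sum((o - mean_obs) ** 2 for o in observed)
    behavioral = []
    for _ in range(n):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in priors.items()}
        sim = simulate(params)
        err = sum((s - o) ** 2 for s, o in zip(sim, observed))
        likelihood = 1.0 - err / var_obs      # Nash-Sutcliffe efficiency
        if likelihood > threshold:
            behavioral.append((likelihood, params))
    return behavioral
```

The behavioral sets' likelihoods can then weight the simulated inundation extents to build a likelihood map such as the one described in the abstract.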
NASA Astrophysics Data System (ADS)
Li, ZhaoYu; Chen, Tao; Yan, GuangQing
2016-10-01
A new method for determining the central axial orientation of a two-dimensional coherent magnetic flux rope (MFR) via multipoint analysis of the magnetic-field structure is developed. The method is devised under the following geometrical assumptions: (1) on its cross section, the structure is left-right symmetric; (2) the projected structure velocity is vertical to the line of symmetry. The two conditions can be naturally satisfied for cylindrical MFRs and are expected to be satisfied for MFRs that are flattened within current sheets. The model test demonstrates that, for determining the axial orientation of such structures, the new method is more efficient and reliable than traditional techniques such as minimum-variance analysis of the magnetic field, Grad-Shafranov (GS) reconstruction, and the more recent method based on the cylindrically symmetric assumption. A total of five flux transfer events observed by Cluster are studied using the proposed approach, and the application results indicate that the observed structures, regardless of their actual physical properties, fit the assumed geometrical model well. For these events, the inferred axial orientations are all in excellent agreement with those obtained using the multi-GS reconstruction technique.
An Improved Method for Seismic Event Depth and Moment Tensor Determination: CTBT Related Application
NASA Astrophysics Data System (ADS)
Stachnik, J.; Rozhkov, M.; Baker, B.
2016-12-01
According to the Protocol to the CTBT, the International Data Centre is required to conduct expert technical analysis and special studies to improve event parameters and assist State Parties in identifying the source of a specific event. Determination of a seismic event's source mechanism and depth is part of these tasks. It is typically done through a strategic linearized inversion of the waveforms for a complete or partial set of source parameters, or a similarly defined grid search through precomputed Green's functions created for particular source models. We show preliminary results using the latter approach, from an improved software design, applied on a moderately powered computer. In this development we aimed to be compliant with the different modes of the CTBT monitoring regime: cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body- and surface-wave recordings, be fast enough for both on-demand studies and automatic processing, properly incorporate observed waveforms and any a priori uncertainties, and accurately estimate a posteriori uncertainties. The implemented HDF5-based pre-packaging of Green's functions allows much greater flexibility in utilizing different software packages and methods for computation. Future additions will allow rapid use of Instaseis/AxiSEM full-waveform synthetics in the pre-computed Green's function archive. Along with traditional post-processing analysis of waveform misfits through several objective functions and variance reduction, we follow a probabilistic approach to assess the robustness of the moment tensor solution. In the course of this project, full moment tensor and depth estimates are determined for the DPRK 2009, 2013, and 2016 events and for shallow earthquakes using a new implementation of waveform fitting of teleseismic P waves. A full grid search over the entire moment tensor space is used to appropriately sample all possible solutions.
A recent method by Tape & Tape (2012) to discretize the complete moment tensor space from a geometric perspective is used. Moment tensors for the DPRK events show isotropic percentages greater than 50%. Depth estimates for the DPRK events range from 1.0 to 1.4 km. Probabilistic uncertainty estimates on the moment tensor parameters provide robustness to the solution.
Partial and no recovery from delirium after hospital discharge predict increased adverse events.
Cole, Martin G; McCusker, Jane; Bailey, Robert; Bonnycastle, Michael; Fung, Shek; Ciampi, Antonio; Belzile, Eric
2017-01-08
The implications of partial and no recovery from delirium after hospital discharge are not clear. We sought to explore whether partial and no recovery from delirium among recently discharged patients predicted increased adverse events (emergency room visits, hospitalisations, death) during the subsequent 3 months. Prospective study of recovery from delirium in older hospital inpatients. The Confusion Assessment Method was used to diagnose delirium in hospital and determine recovery status after discharge (T0). Adverse events were determined during the 3 months after T0. Survival analysis to the first adverse event and counting process modelling for one or more adverse events were used to examine associations between recovery status (ordinal variable: 0, 1 or 2 for full, partial or no recovery, respectively) and adverse events. Of 278 hospital inpatients with delirium, 172 were discharged before the assessment of recovery status (T0). Delirium recovery status at T0 was determined for 152: 25 had full recovery, 32 had partial recovery and 95 had no recovery. Forty-four patients had at least one adverse event during the subsequent 3 months. In multivariable analysis of one or more adverse events, poorer recovery status predicted increased adverse events; the hazard ratio (HR) (95% confidence interval, CI) was 1.72 (1.09, 2.71). The association of recovery status with adverse events was stronger among patients without dementia. Partial and no recovery from delirium after hospital discharge appear to predict increased adverse events during the subsequent 3 months. These findings have potentially important implications for in-hospital and post-discharge management and policy.
Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo
2017-05-01
The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet with a coiflet mother wavelet to capture abrupt changes along the wavelength. Principal component analysis is then employed to approximate the spectra based on capture and fusion features. The Hotelling's T² statistic is calculated and its significance used to detect outliers. An alarm of a contamination event is triggered by sequential Bayesian analysis when outliers appear continuously in several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
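The PCA-plus-Hotelling-T² scoring step can be sketched as below; the wavelet preprocessing and the sequential Bayesian alarm are omitted, and the number of components and the outlier test used here are assumptions, not the paper's exact configuration. The sketch uses NumPy.

```python
import numpy as np

def hotelling_t2(X, n_components=2):
    """Hotelling's T^2 score per observation from a PCA of X
    (rows = observations, columns = spectral features). A minimal
    sketch of the detection step: project centered data onto the
    leading principal components and sum the variance-scaled
    squared scores."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T            # PC scores
    var = s[:n_components] ** 2 / (len(X) - 1)   # variance per component
    return (scores ** 2 / var).sum(axis=1)       # T^2 per observation
```

In practice an observation whose T² exceeds a control limit (e.g. an F-distribution quantile) would be flagged as an outlier, and an alarm raised only when several consecutive observations are flagged.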
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jankovsky, Zachary Kyle; Denman, Matthew R.
It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the values of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.
NASA Astrophysics Data System (ADS)
Forte, F.; Strobl, R. O.; Pennetta, L.
2006-07-01
The impact of calamitous meteorological events and their interaction with the geological and geomorphological environment represent an ongoing problem in the Supersano-Ruffano-Nociglia Graben in southern Italy. Indeed, severe floods take place frequently, not only in autumn and winter but also in summer. These calamities are not only triggered by exceptional events, but are also amplified by the peculiar geological and morpho-structural characteristics of the Graben. Flooding often affects vast agricultural areas, and consequently water-scooping machines cannot remove the rainwater. These events cause warnings and emergency states, involving people as well as socio-economic goods. This study represents an application of a state-of-the-art technique for loss estimation and flood vulnerability analysis, integrating a geographic information system (GIS) with aerial photos and remote sensing methods. The analysis results clearly show that the Graben area is potentially at greatest flood vulnerability, while flood vulnerability along the Horsts is lower.
van Roozendaal, Lori M.; Strobbe, Luc J. A.; Aebi, Stefan; Cameron, David A.; Dixon, J. Michael; Giuliano, Armando E.; Haffty, Bruce G.; Hickey, Brigid E.; Hudis, Clifford A.; Klimberg, V. Suzanne; Koczwara, Bogda; Kühn, Thorsten; Lippman, Marc E.; Lucci, Anthony; Piccart, Martine; Smith, Benjamin D.; Tjan-Heijnen, Vivianne C. G.; van de Velde, Cornelis J. H.; Van Zee, Kimberly J.; Vermorken, Jan B.; Viale, Giuseppe; Voogd, Adri C.; Wapnir, Irene L.; White, Julia R.; Smidt, Marjolein L.
2014-01-01
Background In breast cancer studies, many different endpoints are used. Definitions are often not provided or vary between studies. For instance, “local recurrence” may include different components in similar studies. This limits transparency and comparability of results. This project aimed to reach consensus on the definitions of local event, second primary breast cancer, and regional and distant event for breast cancer studies. Methods The RAND-UCLA Appropriateness method (modified Delphi method) was used. A Consensus Group of international breast cancer experts was formed, including representatives of all involved clinical disciplines. Consensus was reached in two rounds of online questionnaires and one meeting. Results Twenty-four international breast cancer experts participated. Consensus was reached on 134 items in four categories. A local event is defined as any epithelial breast cancer or ductal carcinoma in situ (DCIS) in the ipsilateral breast, or in the skin and subcutaneous tissue on the ipsilateral thoracic wall. Second primary breast cancer is defined as epithelial breast cancer in the contralateral breast. Regional events are breast cancer in ipsilateral lymph nodes. A distant event is breast cancer in any other location; this therefore includes metastasis in contralateral lymph nodes and breast cancer involving the sternal bone. If feasible, tissue sampling of a first, solitary lesion suspected of being a metastasis is highly recommended. Conclusion This project resulted in consensus-based event definitions for the classification of recurrence in breast cancer research. Future breast cancer research projects should adopt these definitions to increase transparency. This should facilitate the comparison of results and the conduct of reviews and meta-analyses. PMID:25381395
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2016-01-01
A schematic of the components of regional early warning systems for rainfall-induced landslides is herein proposed, based on a clear distinction between warning models and warning systems. According to this framework an early warning system comprises a warning model as well as a monitoring and warning strategy, a communication strategy and an emergency plan. The paper proposes the evaluation of regional landslide warning models by means of an original approach, called the "event, duration matrix, performance" (EDuMaP) method, comprising three successive steps: identification and analysis of the events, i.e., landslide events and warning events derived from available landslides and warnings databases; definition and computation of a duration matrix, whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model performance by means of performance criteria and indicators applied to the duration matrix. During the first step the analyst identifies and classifies the landslide and warning events, according to their spatial and temporal characteristics, by means of a number of model parameters. In the second step, the analyst computes a time-based duration matrix with a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslides and warning data from the municipal early warning system operating in Rio de Janeiro (Brazil).
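The central object of the EDuMaP method, the duration matrix, can be sketched with a toy encoding: each warning and landslide event carries a class index and a time span, and element [j][k] accumulates the time during which a class-j warning overlaps a class-k landslide event. This is a simplified illustration; the full method's parameterization of event classes and non-overlap periods is richer than shown here.

```python
def duration_matrix(warnings, landslides, n_warning_classes, n_landslide_classes):
    """Toy EDuMaP-style duration matrix. Events are (class, start, end)
    tuples in consistent time units; element [j][k] sums the overlap
    time between class-j warnings and class-k landslide events."""
    D = [[0.0] * n_landslide_classes for _ in range(n_warning_classes)]
    for w_class, w_start, w_end in warnings:
        for l_class, l_start, l_end in landslides:
            overlap = min(w_end, l_end) - max(w_start, l_start)
            if overlap > 0:
                D[w_class][l_class] += overlap
    return D
```

Performance indicators (hits, misses, false alarms) are then computed from this matrix: for example, time accumulated in cells pairing a high warning class with a large landslide class counts toward correct warnings, while time in a high-warning/no-landslide cell counts toward false alarms.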
Top-quark mass measurement using events with missing transverse energy and jets at CDF.
Aaltonen, T; Álvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Appel, J A; Apresyan, A; Arisawa, T; Artikov, A; Asaadi, J; Ashmanskas, W; Auerbach, B; Aurisano, A; Azfar, F; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Barria, P; Bartos, P; Bauce, M; Bauer, G; Bedeschi, F; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Bland, K R; Blumenfeld, B; Bocci, A; Bodek, A; Bortoletto, D; Boudreau, J; Boveia, A; Brigliadori, L; Brisuda, A; Bromberg, C; Brucken, E; Bucciantonio, M; Budagov, J; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Calancha, C; Camarda, S; Campanelli, M; Campbell, M; Canelli, F; Carls, B; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Chung, W H; Chung, Y S; Ciobanu, C I; Ciocci, M A; Clark, A; Clarke, C; Compostella, G; Convery, M E; Conway, J; Corbo, M; Cordelli, M; Cox, C A; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Dagenhart, D; d'Ascenzo, N; Datta, M; de Barbaro, P; De Cecco, S; De Lorenzo, G; Dell'Orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; Devoto, F; d'Errico, M; Di Canto, A; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Dorigo, M; Dorigo, T; Ebina, K; Elagin, A; Eppig, A; Erbacher, R; Errede, D; Errede, S; Ershaidat, N; Eusebi, R; Fang, H C; Farrington, S; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Frank, M J; Franklin, M; Freeman, J C; Funakoshi, Y; Furic, I; Gallinaro, M; Galyardt, J; Garcia, J E; Garfinkel, A F; Garosi, P; Gerberich, H; Gerchtein, E; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Ginsburg, C M; Giokaris, N; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; 
Gold, M; Goldin, D; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Grinstein, S; Grosso-Pilcher, C; Group, R C; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, S R; Halkiadakis, E; Hamaguchi, A; Han, J Y; Happacher, F; Hara, K; Hare, D; Hare, M; Harr, R F; Hatakeyama, K; Hays, C; Heck, M; Heinrich, J; Herndon, M; Hewamanage, S; Hidas, D; Hocker, A; Hopkins, W; Horn, D; Hou, S; Hughes, R E; Hurwitz, M; Husemann, U; Hussain, N; Hussein, M; Huston, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jang, D; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Junk, T R; Kamon, T; Karchin, P E; Kasmi, A; Kato, Y; Ketchum, W; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, H W; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirby, M; Klimenko, S; Kondo, K; Kong, D J; Konigsberg, J; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kuhr, T; Kurata, M; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; LeCompte, T; Lee, E; Lee, H S; Lee, J S; Lee, S W; Leo, S; Leone, S; Lewis, J D; Limosani, A; Lin, C-J; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, Q; Liu, T; Lockwitz, S; Loginov, A; Lucchesi, D; Lueck, J; Lujan, P; Lukens, P; Lungu, G; Lys, J; Lysak, R; Madrak, R; Maeshima, K; Makhoul, K; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, F; Marino, C; Martínez, M; Martínez-Ballarín, R; Mastrandrea, P; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Mesropian, C; Miao, T; Mietlicki, D; Mitra, A; Miyake, H; Moed, S; Moggi, N; Mondragon, M N; Moon, C S; Moore, R; Morello, M J; Morlock, J; Movilla Fernandez, P; Mukherjee, A; Muller, Th; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Naganoma, J; Nakano, I; Napier, A; Nett, J; Neu, C; Neubauer, M S; 
Nielsen, J; Nodulman, L; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Ortolan, L; Pagan Griso, S; Pagliarone, C; Palencia, E; Papadimitriou, V; Paramonov, A A; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pilot, J; Pitts, K; Plager, C; Pondrom, L; Poprocki, S; Potamianos, K; Poukhov, O; Prokoshin, F; Pronko, A; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Renton, P; Rescigno, M; Riddick, T; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rubbo, F; Ruffini, F; Ruiz, A; Russ, J; Rusu, V; Safonov, A; Sakumoto, W K; Sakurai, Y; Santi, L; Sartori, L; Sato, K; Saveliev, V; Savoy-Navarro, A; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sforza, F; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Shimojima, M; Shiraishi, S; Shochet, M; Shreyber, I; Simonenko, A; Sinervo, P; Sissakian, A; Sliwa, K; Smith, J R; Snider, F D; Soha, A; Somalwar, S; Sorin, V; Squillacioti, P; Stancari, M; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Strycker, G L; Sudo, Y; Sukhanov, A; Suslov, I; Takemasa, K; Takeuchi, Y; Tang, J; Tecchio, M; Teng, P K; Thom, J; Thome, J; Thompson, G A; Thomson, E; Ttito-Guzmán, P; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Trovato, M; Tu, Y; Ukegawa, F; Uozumi, S; Varganov, A; Vázquez, F; Velev, G; Vellidis, C; Vidal, M; Vila, I; Vilar, R; Vizán, J; Vogel, M; Volpi, G; Wagner, P; Wagner, R L; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Wilbur, S; Wick, F; Williams, H H; Wilson, J S; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, H; 
Wright, T; Wu, X; Wu, Z; Yamamoto, K; Yamaoka, J; Yang, T; Yang, U K; Yang, Y C; Yao, W-M; Yeh, G P; Yi, K; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanetti, A; Zeng, Y; Zucchelli, S
2011-12-02
We present a measurement of the top-quark mass using a sample of tt̄ events in 5.7 fb⁻¹ of integrated luminosity from pp̄ collisions at the Fermilab Tevatron with √s = 1.96 TeV, collected by the CDF II Detector. We select events having large missing transverse energy and four, five, or six jets with at least one jet tagged as coming from a b quark, and reject events with identified charged leptons. This analysis considers events from the semileptonic tt̄ decay channel, including events that contain tau leptons. The measurement is based on a multidimensional template method. We fit the data to signal templates of varying top-quark masses and background templates, and measure a top-quark mass of Mtop = 172.32 ± 2.4 (stat) ± 1.0 (syst) GeV/c². © 2011 American Physical Society
rpsftm: An R Package for Rank Preserving Structural Failure Time Models
Allison, Annabel; White, Ian R; Bond, Simon
2018-01-01
Treatment switching in a randomised controlled trial occurs when participants change from their randomised treatment to the other trial treatment during the study. Failure to account for treatment switching in the analysis (i.e. by performing a standard intention-to-treat analysis) can lead to biased estimates of treatment efficacy. The rank preserving structural failure time model (RPSFTM) is a method used to adjust for treatment switching in trials with survival outcomes. The RPSFTM is due to Robins and Tsiatis (1991) and has been developed by White et al. (1997, 1999). The method is randomisation based and uses only the randomised treatment group, observed event times, and treatment history in order to estimate a causal treatment effect. The treatment effect, ψ, is estimated by balancing counter-factual event times (that would be observed if no treatment were received) between treatment groups. G-estimation is used to find the value of ψ such that a test statistic Z(ψ) = 0. This is usually the test statistic used in the intention-to-treat analysis, for example, the log rank test statistic. We present an R package that implements the method of rpsftm. PMID:29564164
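The g-estimation step can be sketched in a stripped-down form: counter-factual times U(ψ) = T_off + exp(ψ)·T_on are computed for each participant, and ψ is chosen to balance them between arms. This is a minimal illustrative sketch, not the rpsftm package's implementation: a difference-in-means statistic stands in for the log-rank test, censoring and recensoring are ignored, and the function names are assumptions.

```python
import math

def counterfactual(t_off, t_on, psi):
    """Counter-factual event time U(psi) = T_off + exp(psi) * T_on."""
    return t_off + math.exp(psi) * t_on

def g_estimate(control, treated, lo=-2.0, hi=2.0, steps=2001):
    """Grid-search g-estimation sketch: each arm is a list of
    (T_off, T_on) pairs; return the psi that best balances the mean
    counter-factual times between arms (|Z(psi)| minimized)."""
    best = None
    for i in range(steps):
        psi = lo + (hi - lo) * i / (steps - 1)
        u_c = [counterfactual(off, on, psi) for off, on in control]
        u_t = [counterfactual(off, on, psi) for off, on in treated]
        z = abs(sum(u_t) / len(u_t) - sum(u_c) / len(u_c))
        if best is None or z < best[1]:
            best = (psi, z)
    return best[0]
```

With data constructed so that treatment multiplies event times by exp(−ψ), the grid search recovers the true ψ; the real package instead solves Z(ψ) = 0 for the intention-to-treat test statistic and handles censoring.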
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Song-Hua; Chang, James Y. H.; Boring,Ronald L.
2010-03-01
The Office of Nuclear Regulatory Research (RES) at the US Nuclear Regulatory Commission (USNRC) is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency, or guidance for the use of multiple methods. As part of this effort, an attempt to develop a comprehensive HRA qualitative approach is being pursued. This paper presents a draft of the method's middle layer, a part of the qualitative analysis phase that links failure mechanisms to performance shaping factors. Starting with a Crew Response Tree (CRT) that has identified human failure events, analysts identify potential failure mechanisms using the mid-layer model. The mid-layer model presented in this paper traces the identification of the failure mechanisms using the Information-Diagnosis/Decision-Action (IDA) model and cognitive models from the psychological literature. Each failure mechanism is grouped according to a phase of IDA. Under each phase of IDA, the cognitive models help identify the relevant performance shaping factors for the failure mechanism. The use of IDA and cognitive models can be traced through fault trees, which provide a detailed complement to the CRT.
Machine Learning Methods for Production Cases Analysis
NASA Astrophysics Data System (ADS)
Mokrova, Nataliya V.; Mokrov, Alexander M.; Safonova, Alexandra V.; Vishnyakov, Igor V.
2018-03-01
An approach to the analysis of events occurring during the production process is proposed. The described machine learning system is able to solve classification tasks related to production control and hazard identification at an early stage. Descriptors of the internal production network data were used for training and testing of the applied models. The k-Nearest Neighbors and Random Forest methods were used to illustrate and analyze the proposed solution. The quality of the developed classifiers was estimated using standard statistical metrics, such as precision, recall and accuracy.
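A rough illustration of the classification step the abstract describes: a minimal k-nearest-neighbours vote plus the precision/recall/accuracy metrics, on synthetic two-class data. The real study used internal production-network descriptors, which are not public, so everything below (cluster centres, feature count, k) is simulated or assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class "event descriptor" data: class 1 plays the role of hazardous events
X_train = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(3, 1, (100, 4))])
y_train = np.array([0] * 100 + [1] * 100)
X_test = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(3, 1, (30, 4))])
y_test = np.array([0] * 30 + [1] * 30)

def knn_predict(X_tr, y_tr, X, k=5):
    """Majority vote among the k nearest training points (Euclidean distance)."""
    preds = []
    for x in X:
        idx = np.argsort(np.linalg.norm(X_tr - x, axis=1))[:k]
        preds.append(np.bincount(y_tr[idx]).argmax())
    return np.array(preds)

y_pred = knn_predict(X_train, y_train, X_test)

# Standard statistical metrics, computed by hand
tp = np.sum((y_pred == 1) & (y_test == 1))
fp = np.sum((y_pred == 1) & (y_test == 0))
fn = np.sum((y_pred == 0) & (y_test == 1))
precision = tp / (tp + fp)
recall = tp / (tp + fn)
accuracy = np.mean(y_pred == y_test)
```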
Analysis of the Noise in Data from the Mt. Meron Array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chambers, D. H.; Breitfeller, E.
2010-07-15
This memo describes an analysis of the noise in data obtained from the Mt. Meron seismic array in northern Israel. The overall objective is to develop a method for removing noise from extraneous sources in the environment, increasing the sensitivity to seismic signals from distant events. For this initial work, we concentrated on understanding the propagation characteristics of the noise in the frequency band from 0.1-8 Hz, and on testing a model-based method for removing narrow-band (single-frequency) noise.
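One common model-based approach to single-frequency noise removal, shown here as an assumed stand-in rather than the method actually used on the Meron data, is to fit the amplitude and phase of a sinusoid at the known interference frequency by least squares and subtract the fitted tone:

```python
import numpy as np

fs = 100.0                           # sample rate, Hz (assumed)
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(2)

f_noise = 6.0                        # known narrow-band interferer, Hz (assumed)
signal = rng.normal(0, 0.1, t.size)  # stand-in for broadband seismic background
data = signal + 1.5 * np.sin(2 * np.pi * f_noise * t + 0.7)

# Least-squares fit of a*cos + b*sin at f_noise, then subtract the fitted tone
A = np.column_stack([np.cos(2 * np.pi * f_noise * t),
                     np.sin(2 * np.pi * f_noise * t)])
coef, *_ = np.linalg.lstsq(A, data, rcond=None)
cleaned = data - A @ coef
```

Because the fit uses only two basis functions at the interferer frequency, broadband signal content at other frequencies is left essentially untouched.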
NASA Astrophysics Data System (ADS)
Vieira, Rodrigo Drumond; Kelly, Gregory J.
2014-11-01
In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.
CREME96 and Related Error Rate Prediction Methods
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.
2012-01-01
Predicting the rate of occurrence of single event effects (SEEs) in space requires knowledge of the radiation environment and the response of electronic devices to that environment. Several analytical models have been developed over the past 36 years to predict SEE rates. The first error rate calculations were performed by Binder, Smith and Holman. Bradford, and Pickel and Blandford in their CRIER (Cosmic-Ray-Induced-Error-Rate) analysis code, introduced the basic Rectangular ParallelePiped (RPP) method for error rate calculations. For the radiation environment at the part, both made use of the cosmic ray LET (Linear Energy Transfer) spectra calculated by Heinrich for various absorber depths. A more detailed model for the space radiation environment within spacecraft was developed by Adams and co-workers. This model, together with a reformulation of the RPP method published by Pickel and Blandford, was used to create the CREME (Cosmic Ray Effects on Micro-Electronics) code. About the same time Shapiro wrote the CRUP (Cosmic Ray Upset Program) based on the RPP method published by Bradford. It was the first code to specifically take into account charge collection from outside the depletion region due to deformation of the electric field caused by the incident cosmic ray. Other early rate prediction methods and codes include the Single Event Figure of Merit, NOVICE, the Space Radiation code and the effective flux method of Binder, which is the basis of the SEFA (Scott Effective Flux Approximation) model. By the early 1990s it was becoming clear that CREME and the other early models needed revision. This revision, CREME96, was completed and released as a WWW-based tool, one of the first of its kind. The revisions in CREME96 included improved environmental models and improved models for calculating single event effects.
The need for a revision of CREME also stimulated the development of the CHIME (CRRES/SPACERAD Heavy Ion Model of the Environment) and MACREE (Modeling and Analysis of Cosmic Ray Effects in Electronics). The Single Event Figure of Merit method was also revised to use the solar minimum galactic cosmic ray spectrum and extended to circular orbits down to 200 km at any inclination. More recently a series of commercial codes was developed by TRAD (Test & Radiations) which includes the OMERE code which calculates single event effects. There are other error rate prediction methods which use Monte Carlo techniques. In this chapter the analytic methods for estimating the environment within spacecraft will be discussed.
Factor and prevention method of landslide event at FELCRA Semungkis, Hulu Langat, Selangor
NASA Astrophysics Data System (ADS)
Manap, N.; Jeyaramah, N.; Syahrom, N.
2017-12-01
Landslide is known as one of the most powerful geological events, happening unpredictably due to natural or human factors. A study was carried out at FELCRA Semungkis, Hulu Langat, which is known as one of the areas affected by a landslide that involved 16 casualties. The purpose of this study is to identify the main factor that caused the landslide at FELCRA Semungkis, Hulu Langat and to identify the protection method. Data were collected from three respondents working under government bodies through interview sessions. The data collected were analysed using the content analysis method. From the results, it can be concluded that the main factors that caused the landslide to happen are the human factor and the nature factor. The protection method that can be applied to stabilize FELCRA Semungkis, Hulu Langat is the soil nailing method with the support of a soil create system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
A Cost-Utility Model of Care for Peristomal Skin Complications
Inglese, Gary; Manson, Andrea; Townshend, Arden
2016-01-01
PURPOSE: The aim of this study was to evaluate the economic and humanistic implications of using ostomy components to prevent subsequent peristomal skin complications (PSCs) in individuals who experience an initial, leakage-related PSC event. DESIGN: Cost-utility analysis. METHODS: We developed a simple decision model to consider, from a payer's perspective, PSCs managed with and without the use of ostomy components over 1 year. The model evaluated the extent to which outcomes associated with the use of ostomy components (PSC events avoided; quality-adjusted life days gained) offset the costs associated with their use. RESULTS: Our base case analysis of 1000 hypothetical individuals over 1 year assumes that using ostomy components following a first PSC reduces recurrent events versus PSC management without components. In this analysis, component acquisition costs were largely offset by lower resource use for ostomy supplies (barriers; pouches) and lower clinical utilization to manage PSCs. The overall annual average resource use for individuals using components was about 6.3% ($139) higher versus individuals not using components. Each PSC event avoided yielded, on average, 8 additional quality-adjusted life days over 1 year. CONCLUSIONS: In our analysis, (1) acquisition costs for ostomy components were offset in whole or in part by the use of fewer ostomy supplies to manage PSCs and (2) use of ostomy components to prevent PSCs produced better outcomes (fewer repeat PSC events; more health-related quality-adjusted life days) over 1 year compared to not using components. PMID:26633166
Multivariate pattern analysis of fMRI: the early beginnings.
Haxby, James V
2012-08-15
In 2001, we published a paper on the representation of faces and objects in ventral temporal cortex that introduced a new method for fMRI analysis, which subsequently came to be called multivariate pattern analysis (MVPA). MVPA now refers to a diverse set of methods that analyze neural responses as patterns of activity that reflect the varying brain states that a cortical field or system can produce. This paper recounts the circumstances and events that led to the original study and later developments and innovations that have greatly expanded this approach to fMRI data analysis, leading to its widespread application. Copyright © 2012 Elsevier Inc. All rights reserved.
Event Recognition Based on Deep Learning in Chinese Texts
Zhang, Yajun; Liu, Zongtian; Zhou, Wen
2016-01-01
Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%. PMID:27501231
Project delay analysis of HRSG
NASA Astrophysics Data System (ADS)
Silvianita; Novega, A. S.; Rosyid, D. M.; Suntoyo
2017-08-01
Completion of an HRSG (Heat Recovery Steam Generator) fabrication project sometimes does not meet the target time written in the contract. Delay in the fabrication process can cause several disadvantages for the fabricator, including forfeit payments, delay in the HRSG construction process, and even delay of HRSG trials. In this paper, the authors apply a semi-quantitative approach to HRSG pressure-part fabrication delay, with a plant configuration of 1 GT (Gas Turbine) + 1 HRSG + 1 STG (Steam Turbine Generator), using the bow-tie analysis method. Bow-tie analysis is a combination of FTA (Fault Tree Analysis) and ETA (Event Tree Analysis) used to develop the risk matrix of the HRSG. The results of the FTA are used as threats for preventive measures, and the results of the ETA as impacts of the fabrication delay.
Mathematical aspects of assessing extreme events for the safety of nuclear plants
NASA Astrophysics Data System (ADS)
Potempski, Slawomir; Borysiewicz, Mieczyslaw
2015-04-01
In the paper the review of mathematical methodologies applied for assessing low frequencies of rare natural events - earthquakes, tsunamis, hurricanes or tornadoes, floods (in particular flash floods and storm surges), lightning, solar flares, etc. - will be given in the perspective of the safety assessment of nuclear plants. The statistical methods are usually based on extreme value theory, which deals with the analysis of extreme deviations from the median (or the mean). In this respect the application of various mathematical tools can be useful, such as: the extreme value theorem of Fisher-Tippett-Gnedenko, leading to possible choices of generalized extreme value distributions; the Pickands-Balkema-de Haan theorem for tail fitting; or methods related to large deviation theory. In the paper the most important stochastic distributions relevant for the statistical analysis of rare events will be presented. This concerns, for example, the analysis of data with annual extreme values (maxima - "Annual Maxima Series" - or minima), or peak values exceeding given thresholds in some period of interest ("Peak Over Threshold"), or the estimation of the size of exceedance. Despite the fact that there is a lack of sufficient statistical data directly containing rare events, in some cases it is still possible to extract useful information from existing larger data sets. As an example one can consider some data sets available from web sites for floods, earthquakes or natural hazards in general. Some aspects of such data sets will also be presented, taking into account their usefulness for the practical assessment of risk for nuclear power plants coming from extreme weather conditions.
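A toy version of the annual-maxima approach mentioned above: the sketch fits a Gumbel distribution (the ξ = 0 member of the GEV family, chosen here only because it has closed-form method-of-moments estimators) to simulated annual maxima and computes a T-year return level. A real plant assessment would use full GEV maximum-likelihood fits; the data and parameters below are simulated.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Simulated annual maxima from a Gumbel(mu=30, beta=10) hazard-intensity model
mu_true, beta_true = 30.0, 10.0
maxima = mu_true - beta_true * np.log(-np.log(rng.uniform(size=500)))

# Method-of-moments Gumbel fit: beta = s*sqrt(6)/pi, mu = mean - gamma*beta
gamma = 0.5772156649          # Euler-Mascheroni constant
beta_hat = maxima.std(ddof=1) * math.sqrt(6) / math.pi
mu_hat = maxima.mean() - gamma * beta_hat

def return_level(T):
    """Level exceeded on average once every T years under the fitted Gumbel."""
    return mu_hat - beta_hat * math.log(-math.log(1 - 1 / T))

r100 = return_level(100)      # e.g. the 100-year event magnitude
```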
Improved spring load restriction guidelines using mechanistic analysis
DOT National Transportation Integrated Search
2000-07-01
This project used research to develop more effective criteria for placement and removal of spring load restrictions (SLR). Researchers investigated a method that uses a thawing index equation based on air temperatures to predict thawing events. Resul...
40 CFR 63.762 - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... an affirmative defense to a claim for civil penalties for violations of such standards that are... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
40 CFR 63.1272 - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... an affirmative defense to a claim for civil penalties for violations of such standards that are... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
40 CFR 63.762 - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... an affirmative defense to a claim for civil penalties for violations of such standards that are... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
40 CFR 63.1272 - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... an affirmative defense to a claim for civil penalties for violations of such standards that are... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
The El Nino/Southern Oscillation and Future Soybean Prices
NASA Technical Reports Server (NTRS)
Keppenne, C.
1993-01-01
Recently, it was shown that the application of a method combining singular spectrum analysis (SSA) and the maximum entropy method to univariate indicators of the coupled ocean-atmosphere El Nino/Southern Oscillation (ENSO) phenomenon can be helpful in determining whether an El Nino (EN) or La Nina (LN) event will occur. SSA - a variant of principal component analysis applied in the time domain - filters out variability unrelated to ENSO and separates the quasi-biennial (QB), two-to-three year variability, from a lower-frequency (LF) four-to-six year EN-LN cycle; the total variance associated with ENSO combines the QB and LF modes. ENSO has been known to affect weather conditions over much of the globe. For example, EN events have been connected with unusually rainy weather over the Central and Western US, while the opposite phases of the oscillation (LN) have been plausibly associated with extreme dry conditions over much of the same geographical area...
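SSA as sketched in the abstract embeds the series in a trajectory matrix, takes an SVD, and reconstructs selected components by anti-diagonal averaging. The minimal version below runs on a synthetic noisy oscillation, not on ENSO indicators; the period, window length, and noise level are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n, L = 240, 48                          # series length and embedding window (assumed)
t = np.arange(n)
clean = np.sin(2 * np.pi * t / 30)      # stand-in for a quasi-periodic climate mode
x = clean + rng.normal(0, 0.5, n)

# Trajectory (Hankel) matrix and its SVD
K = n - L + 1
X = np.column_stack([x[i:i + L] for i in range(K)])
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Reconstruct the leading pair of components (one oscillation = two singular vectors)
Xr = (U[:, :2] * s[:2]) @ Vt[:2]
rec = np.zeros(n)
cnt = np.zeros(n)
for i in range(L):                      # anti-diagonal (Hankel) averaging
    for j in range(K):
        rec[i + j] += Xr[i, j]
        cnt[i + j] += 1
rec /= cnt
```

The leading singular pair filters out the noise and recovers the oscillatory mode, which is the filtering role SSA plays in the ENSO analysis described above.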
Schönweiler, R; Wübbelt, P; Tolloczko, R; Rose, C; Ptok, M
2000-01-01
Discriminant analysis (DA) and self-organizing feature maps (SOFM) were used to classify passively evoked auditory event-related potentials (ERP) P(1), N(1), P(2) and N(2). Responses from 16 children with severe behavioral auditory perception deficits, 16 children with marked behavioral auditory perception deficits, and 14 controls were examined. Eighteen ERP amplitude parameters were selected for examination of statistical differences between the groups. Different DA methods and SOFM configurations were trained on the values. SOFM had better classification results than the DA methods. Subsequently, measures from another 37 subjects that were unknown to the trained SOFM were used to test the reliability of the system. With 10-dimensional vectors, reliable classifications were obtained that matched behavioral auditory perception deficits in 96% of cases, implying central auditory processing disorder (CAPD). The results also support the assumption that CAPD includes a 'non-peripheral' auditory processing deficit. Copyright 2000 S. Karger AG, Basel.
MUSiC - Model-independent search for deviations from Standard Model predictions in CMS
NASA Astrophysics Data System (ADS)
Pieta, Holger
2010-02-01
We present an approach for a model independent search in CMS. Systematically scanning the data for deviations from the standard model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias the analysis is furthermore sensitive to a wide range of models for new physics, including the uncounted number of models not-yet-thought-of. After sorting the events into classes defined by their particle content (leptons, photons, jets and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.
Associating extreme precipitation events to parent cyclones in gridded data
NASA Astrophysics Data System (ADS)
Rhodes, Ruari; Shaffrey, Len; Gray, Sue
2015-04-01
When analysing the relationship of regional precipitation to its parent cyclone, it is insufficient to consider the cyclone's region of influence as a fixed radius from the centre due to the irregular shape of rain bands. A new method is therefore presented which allows the use of objective feature tracking data in the analysis of regional precipitation. Utilising the spatial extent of precipitation in gridded datasets, the most appropriate cyclone(s) may be associated with regional precipitation events. This method is applied in the context of an analysis of the influence of clustering and stalling of extra-tropical cyclones in the North Atlantic on total precipitation accumulations over England and Wales. Cyclone counts and residence times are presented for historical records (ERA-Interim) and future projections (HadGEM2-ES) of extreme (> 98th percentile) precipitation accumulations over England and Wales, for accumulation periods ranging from one day to one month.
SNIa detection in the SNLS photometric analysis using Morphological Component Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.
2015-04-01
Detection of supernovae (SNe) and, more generally, of transient events in large surveys can provide numerous false detections. In the case of a deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year data deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible signal coordinate contamination. We developed a subtracted image stack treatment to reduce the number of non-SN-like events using morphological component analysis. This technique exploits the morphological diversity of objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects while most spurious detections exhibit different shapes. A two-step procedure was necessary to have a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte-Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all faint ones. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamışlıoğlu, Miraç, E-mail: m.kamislioglu@gmail.com; Külahcı, Fatih, E-mail: fatihkulahci@firat.edu.tr
Nonlinear time series analysis techniques have large application areas in the geoscience and geophysics fields. Modern nonlinear methods have provided considerable evidence to explain seismicity phenomena. In this study, nonlinear time series analysis, fractal analysis and spectral analysis were carried out to research the chaotic behavior of released radon gas (²²²Rn) concentrations occurring during seismic events. Nonlinear time series analysis methods (Lyapunov exponent, Hurst phenomenon, correlation dimension and false nearest neighbor) were applied to the East Anatolian Fault Zone (EAFZ), Turkey and its surroundings, where there are about 35,136 radon measurements for each region. In this paper the behavior of ²²²Rn, which is used in earthquake prediction studies, was investigated.
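Of the methods listed, the Hurst phenomenon is the quickest to sketch. The rescaled-range (R/S) estimator below is a simplified textbook version, not the authors' exact procedure, run on white noise (where H should sit near 0.5, typically a little above because of well-known small-sample bias in R/S):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=4096)            # stand-in for a detrended radon series

def rs(series):
    """Rescaled range R/S of one window."""
    y = np.cumsum(series - series.mean())
    return (y.max() - y.min()) / series.std()

sizes = [16, 32, 64, 128, 256, 512]
avg_rs = []
for n in sizes:
    windows = x[: (len(x) // n) * n].reshape(-1, n)
    avg_rs.append(np.mean([rs(w) for w in windows]))

# Hurst exponent = slope of log(R/S) against log(window size)
H = np.polyfit(np.log(sizes), np.log(avg_rs), 1)[0]
```

Persistent (long-memory) series push H toward 1, anti-persistent ones toward 0, which is what makes the exponent useful as a chaos/memory diagnostic for radon records.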
Remote Sensing Analysis of Forest Disturbances
NASA Technical Reports Server (NTRS)
Asner, Gregory P. (Inventor)
2015-01-01
The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.
Remote sensing analysis of forest disturbances
NASA Technical Reports Server (NTRS)
Asner, Gregory P. (Inventor)
2012-01-01
The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.
Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory
ERIC Educational Resources Information Center
Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya
2015-01-01
Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…
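For a single-facet persons × items design, the variance components behind a G study can be estimated from ANOVA mean squares. The sketch below is generic G-theory arithmetic on simulated scores (it reproduces neither SPSS nor EduG output, and the variance values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
n_p, n_i = 50, 8                                  # persons x items design (assumed)

# Simulated scores: person effect + item effect + residual
person = rng.normal(0, 1.0, (n_p, 1))
item = rng.normal(0, 0.5, (1, n_i))
scores = person + item + rng.normal(0, 0.8, (n_p, n_i))

# Two-way ANOVA mean squares
grand = scores.mean()
ms_p = n_i * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)
ms_i = n_p * np.sum((scores.mean(axis=0) - grand) ** 2) / (n_i - 1)
resid = (scores - scores.mean(axis=1, keepdims=True)
         - scores.mean(axis=0, keepdims=True) + grand)
ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_i - 1))

# Expected-mean-square equations give the variance components
var_res = ms_res
var_p = (ms_p - ms_res) / n_i
var_i = (ms_i - ms_res) / n_p

# Generalizability coefficient for relative decisions over n_i items
g_coef = var_p / (var_p + var_res / n_i)
```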
NASA Technical Reports Server (NTRS)
Waller, Jess M.; Saulsberry, Regor L.; Lucero, Ralph; Nichols, Charles T.; Wentzel, Daniel J.
2010-01-01
ASTM-based ILH methods were found to give a reproducible, quantitative estimate of the stress threshold at which significant accumulated damage began to occur: a) FR events are low energy (<2 V(exp 20 microsec)); b) FR events occur close to the observed failure locus; c) FR events consist of more than 30% fiber breakage (>300 kHz); d) FR events show a consistent hierarchy of cooperative damage for composite tow, and for the COPV tested, regardless of applied load. Application of ILH or related stress profiles could lead to robust pass/fail acceptance criteria based on the FR. Initial application of FR and FFT analysis of AE data acquired on COPVs is promising.
Extreme risk assessment based on normalized historic loss data
NASA Astrophysics Data System (ADS)
Eichner, Jan
2017-04-01
Natural hazard risk assessment and risk management focus on the expected loss magnitudes of rare and extreme events. Such large-scale loss events typically comprise all aspects of compound events and accumulate losses from multiple sectors (including knock-on effects). Utilizing Munich Re's NatCatSERVICE direct economic loss data, we briefly recap a novel methodology of peril-specific loss data normalization, which improves the stationarity properties of highly non-stationary historic loss data (non-stationary due to socio-economic growth of the assets prone to destructive forces), and perform extreme value analysis (peaks-over-threshold method) to derive return-level estimates of, e.g., 100-year loss event scenarios for various types of perils, globally or per continent, and discuss uncertainty in the results.
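A stripped-down peaks-over-threshold calculation in the spirit of the abstract: excesses over a high threshold are fitted with an exponential tail (the ξ = 0 special case of the generalized Pareto, assumed here for a closed-form fit) and turned into a T-year return level. Real loss data would need a full GPD fit and the normalization step the abstract describes; the daily losses below are simulated.

```python
import math
import numpy as np

rng = np.random.default_rng(7)

years = 50
losses = rng.exponential(10.0, years * 365)      # simulated daily normalized losses

u = np.quantile(losses, 0.99)                    # high threshold
excess = losses[losses > u] - u
sigma_hat = excess.mean()                        # MLE of the exponential excess scale
lam = excess.size / years                        # exceedance rate per year

def return_level(T):
    """Loss exceeded on average once every T years under the fitted tail."""
    return u + sigma_hat * math.log(lam * T)

r100 = return_level(100)                         # e.g. the 100-yr loss scenario
```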
Hydroacoustic monitoring of a salt cavity: an analysis of precursory events of the collapse
NASA Astrophysics Data System (ADS)
Lebert, F.; Bernardie, S.; Mainsant, G.
2011-09-01
One of the main features of "post mining" research relates to available methods for monitoring mine-degradation processes that could directly threaten surface infrastructures. In this respect, GISOS, a French scientific interest group, is investigating techniques for monitoring the eventual collapse of underground cavities. One of the methods under investigation was monitoring the stability of a salt cavity through recording microseismic precursor signals that may indicate the onset of rock failure. The data were recorded in a salt mine in Lorraine (France) when monitoring the controlled collapse of 2,000,000 m³ of rocks surrounding a cavity at 130 m depth. The monitoring in the 30 Hz to 3 kHz frequency range highlights the occurrence of high-energy events during periods of macroscopic movement, once the layers had ruptured; they appear to be the consequence of post-rupture rock movements related to the intense deformation of the cavity roof. Moreover, the analysis shows the presence of some interesting precursory signals before the cavity collapsed. They occurred a few hours before the failure phases, when the rocks were being weakened and damaged. They originated from the damaging and breaking process, when micro-cracks appear and then coalesce. From these results we expect that deeper signal analysis and statistical analysis of the complete event time distribution (several million files) will allow us to finalize a complete typology of each signal family and its relation to the evolution steps of the cavity over the five years of monitoring.
Cross-scale analysis of cluster correspondence using different operational neighborhoods
NASA Astrophysics Data System (ADS)
Lu, Yongmei; Thill, Jean-Claude
2008-09-01
Cluster correspondence analysis examines the spatial autocorrelation of multi-location events at the local scale. This paper argues that patterns of cluster correspondence are highly sensitive to the definition of operational neighborhoods that form the spatial units of analysis. A subset of multi-location events is examined for cluster correspondence if they are associated with the same operational neighborhood. This paper discusses the construction of operational neighborhoods for cluster correspondence analysis based on the spatial properties of the underlying zoning system and the scales at which the zones are aggregated into neighborhoods. Impacts of this construction on the degree of cluster correspondence are also analyzed. Empirical analyses of cluster correspondence between paired vehicle theft and recovery locations are conducted on different zoning methods and across a series of geographic scales and the dynamics of cluster correspondence patterns are discussed.
Chesnokov, Yuriy V
2008-06-01
Paroxysmal atrial fibrillation (PAF) is a serious arrhythmia associated with morbidity and mortality. We explore the possibility of distant prediction of PAF by analyzing changes in heart rate variability (HRV) dynamics of non-PAF rhythms immediately before a PAF event, and use that model for distant prognosis of PAF onset with artificial intelligence methods. We analyzed 30-min non-PAF HRV records from 51 subjects immediately before PAF onset and at least 45 min distant from any PAF event. We applied spectral and complexity analysis, with sample (SmEn) and approximate (ApEn) entropies and their multiscale versions, to the extracted HRV data, and used these features to train artificial neural network (ANN) and support vector machine (SVM) classifiers to differentiate the subjects. The trained classifiers were further tested for distant PAF event prognosis on 16 subjects from an independent database, with non-PAF rhythms lasting from 60 to 320 min before PAF onset, classifying the 30-min segments as distant from or leading to PAF. We found a statistically significant increase in the VLF, LF, and HF bands and total power (p<0.0001) in the 30-min non-PAF HRV recordings from the 51 subjects before a PAF event compared to PAF-distant ones. The SmEn and ApEn analysis showed a significant decrease in complexity (p<0.0001 and p<0.001) before PAF onset. For training the ANN and SVM classifiers, the data from the 51 subjects were randomly split into training, validation and testing sets. The ANN provided better results in terms of sensitivity (Se), specificity (Sp) and positive predictivity (Pp) than the SVM, which became biased towards the positive case. The validation results achieved with the ANN classifier were: Se 76%, Sp 93%, Pp 94%. Testing the ANN and SVM classifiers on the 16 subjects with non-PAF HRV data preceding PAF events, we obtained distant prediction of PAF onset with the SVM classifier in 10 subjects (58+/-18 min in advance) and with the ANN classifier in 13 subjects (62+/-21 min in advance).
From the results of distant PAF prediction we conclude that the ANN and SVM classifiers learned the changes in HRV dynamics immediately before a PAF event and successfully identified them during distant PAF prognosis on an independent database. This confirms results reported in the literature that corresponding changes in the HRV data occur about 60 min before PAF onset, and demonstrates the possibility of distant PAF prediction with ANN and SVM methods.
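Sample entropy, one of the complexity measures used above, can be sketched in a few lines. This is a minimal reference implementation (with tolerance r given in the units of the series), not the authors' code; the periodic and white-noise test series are synthetic stand-ins for regular and irregular HRV segments.

```python
import math, random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): B counts template pairs of length m within
    Chebyshev tolerance r, A does the same for length m+1; self-matches excluded."""
    def count_matches(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A periodic series is highly regular, so its SampEn is near zero;
# white noise repeats few templates and scores much higher.
regular = [float(i % 4) for i in range(200)]
random.seed(42)
noise = [random.random() for _ in range(200)]
```

The drop in SmEn the abstract reports before PAF onset corresponds to the HRV series moving towards the "regular" end of this scale.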
Walton, Merrilyn; Smith-Merry, Jennifer; Harrison, Reema; Manias, Elizabeth; Iedema, Rick; Kelly, Patrick
2014-01-01
Introduction Evidence of patients’ experiences is fundamental to creating effective health policy and service responses, yet is missing from our knowledge of adverse events. This protocol describes explorative research redressing this significant deficit by investigating the experiences of a large cohort of recently hospitalised patients aged 45 years and above in hospitals in New South Wales (NSW), Australia. Methods and analysis The 45 and Up Study is a cohort of 265 000 adults aged 45 years and above in NSW. Patients who were hospitalised between 1 January and 30 June 2014 will be identified from this cohort using data linkage, and a random sample of 20 000 invited to participate. A cross-sectional survey (including qualitative and quantitative components) will capture patients’ experiences in hospital, and specifically of adverse events. Approximately 25% of respondents are likely to report experiencing an adverse event. Quantitative components will capture the nature and type of events as well as common features of patients’ experiences; qualitative data provide contextual knowledge of their condition and care and of the impact of the event on individuals. Respondents who do not report an adverse event will report their experience in hospital and serve as the control group. Statistical and thematic analysis will be used to present a patient perspective of experiences in hospital; the characteristics of patients experiencing an adverse event; experiences of information sharing after an event (open disclosure); and the other avenues of redress pursued. Interviews with key policymakers and a document analysis will be used to create a map of current practice. Ethics and dissemination Dissemination via a one-day workshop, peer-reviewed publications and conference presentations will enable effective clinical, service provision and policy responses to adverse events to be developed. PMID:25311039
Exploring the evolution of node neighborhoods in Dynamic Networks
NASA Astrophysics Data System (ADS)
Orman, Günce Keziban; Labatut, Vincent; Naskali, Ahmet Teoman
2017-09-01
Dynamic Networks are a popular way of modeling and studying the behavior of evolving systems. However, their analysis constitutes a relatively recent subfield of Network Science, and the number of available tools is consequently much smaller than for static networks. In this work, we propose a method specifically designed to take advantage of the longitudinal nature of dynamic networks. It characterizes each individual node by studying the evolution of its direct neighborhood, based on the assumption that the way this neighborhood changes reflects the role and position of the node in the whole network. For this purpose, we define the concept of neighborhood event, which corresponds to the various transformations such groups of nodes can undergo, and describe an algorithm for detecting such events. We demonstrate the value of our method on three real-world networks: DBLP, LastFM and Enron. We apply frequent pattern mining to extract meaningful information from temporal sequences of neighborhood events. This results in the identification of behavioral trends emerging in the whole network, as well as the individual characterization of specific nodes. We also perform a cluster analysis, which reveals that, in all three networks, one can distinguish two types of nodes exhibiting different behaviors: a very small group of active nodes, whose neighborhoods undergo diverse and frequent events, and a very large group of stable nodes.
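The notion of a neighborhood event can be illustrated with a toy classifier over network snapshots: compare a node's neighbor set in consecutive snapshots and label the change. The three-label typology and the adjacency-list data below are simplifications of the paper's richer event catalogue.

```python
def neighborhood_events(snapshots, node):
    """Classify how a node's direct neighborhood changes between consecutive
    snapshots of a dynamic network (a simplified event typology)."""
    events = []
    for g_prev, g_next in zip(snapshots, snapshots[1:]):
        prev = set(g_prev.get(node, ()))
        nxt = set(g_next.get(node, ()))
        joined, left = nxt - prev, prev - nxt
        if joined and not left:
            events.append("growth")
        elif left and not joined:
            events.append("contraction")
        elif joined and left:
            events.append("turnover")
        else:
            events.append("stable")
    return events

# Toy dynamic network: adjacency lists per snapshot (hypothetical data).
snapshots = [
    {"a": ["b", "c"], "b": ["a"], "c": ["a"]},
    {"a": ["b", "c", "d"], "b": ["a"], "c": ["a"], "d": ["a"]},
    {"a": ["c", "d"], "b": [], "c": ["a"], "d": ["a"]},
    {"a": ["c", "e"], "b": [], "c": ["a"], "e": ["a"]},
]
events_a = neighborhood_events(snapshots, "a")
```

Feeding such per-node event sequences into frequent pattern mining is what lets the method separate the small group of "active" nodes from the large stable majority.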
NASA Astrophysics Data System (ADS)
Riyadi, Eko H.
2014-09-01
An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: first, definition of possible events, for example by carrying out a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by combinations of system responses. The purpose of this paper is to discuss the identification of initiating events in the event tree development process and to review other probabilistic safety assessments (PSAs). The identification of initiating events also draws on past operating experience, review of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By studying the traditional US PSA categorization in detail, the important initiating events can be obtained and categorized into LOCAs, transients and external events.
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Machine learning methods may complement traditional analytic methods for medical device surveillance. Using data from the National Cardiovascular Data Registry for implantable cardioverter-defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs, based on two propensity score (PS) models: one specified by subject-matter experts (PS-SME) and the other by machine-learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for the three surveyed safety signals: death from any cause, 12.8%-20.9%; nonfatal ICD-related adverse events, 19.3%-26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%-37.6%. Agreement on safety signals detected/not detected was 90.9% between the time-to-event and DELTA approaches (360 of 396, k=0.068), 91.7% between the time-to-event and embedded feature-selection approaches (363 of 396, k=-0.028), and 88.1% between the DELTA and embedded feature-selection approaches (349 of 396, k=-0.042). The three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance.
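The agreement figures above are Cohen's kappa computed from paired detection outcomes. A minimal sketch with invented detector outputs, showing how kappa discounts the chance agreement contributed by the many shared negatives (here raw agreement is 90% while kappa is about 0.62, and with rarer positives it can drop near zero, as in the abstract):

```python
def cohens_kappa(x, y):
    """Cohen's kappa for two binary detectors (1 = safety signal detected)."""
    n = len(x)
    po = sum(1 for a, b in zip(x, y) if a == b) / n          # observed agreement
    p1x, p1y = sum(x) / n, sum(y) / n
    pe = p1x * p1y + (1 - p1x) * (1 - p1y)                   # chance agreement
    return (po - pe) / (1 - pe)

# Invented outputs for two detectors over ten device-signal pairs.
det_a = [1, 0, 0, 0, 0, 0, 0, 0, 0, 1]
det_b = [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]
kappa = cohens_kappa(det_a, det_b)
```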
Design and Deployment of a Pediatric Cardiac Arrest Surveillance System
Newton, Heather Marie; McNamara, Leann; Engorn, Branden Michael; Jones, Kareen; Bernier, Meghan; Dodge, Pamela; Salamone, Cheryl; Bhalala, Utpal; Jeffers, Justin M.; Engineer, Lilly; Diener-West, Marie; Hunt, Elizabeth Anne
2018-01-01
Objective We aimed to increase detection of pediatric cardiopulmonary resuscitation (CPR) events and collection of physiologic and performance data for use in quality improvement (QI) efforts. Materials and Methods We developed a workflow-driven surveillance system that leveraged organizational information technology systems to trigger CPR detection and analysis processes. We characterized detection by notification source, type, location, and year, and compared it to previous methods of detection. Results From 1/1/2013 through 12/31/2015, there were 2,986 unique notifications associated with 2,145 events, 317 of which required CPR. The PICU and PEDS-ED accounted for 65% of CPR events, whereas floor care areas were responsible for only 3%. Under previous methods, 100% of PEDS-OR and >70% of PICU CPR events would not have been included in QI efforts. Performance data from both defibrillators and bedside monitors increased annually (2013: 1%; 2014: 18%; 2015: 27%). Discussion After deployment of this system, detection increased ∼9-fold and performance data collection increased annually. Had the system not been deployed, 100% of PEDS-OR and 50-70% of PICU, NICU, and PEDS-ED events would have been missed. Conclusion By leveraging hospital information technology and medical device data, identification of pediatric cardiac arrest with an associated increase in the proportion of objective performance data captured is possible. PMID:29854451
Two Wrongs Make a Right: Addressing Underreporting in Binary Data from Multiple Sources.
Cook, Scott J; Blas, Betsabe; Carroll, Raymond J; Sinha, Samiran
2017-04-01
Media-based event data (i.e., data compiled from reporting by media outlets) are widely used in political science research. However, events of interest (e.g., strikes, protests, conflict) are often underreported by these primary and secondary sources, producing incomplete data that risks inconsistency and bias in subsequent analysis. While general strategies exist to help ameliorate this bias, these methods do not make full use of the information often available to researchers. Specifically, much of the event data used in the social sciences is drawn from multiple, overlapping news sources (e.g., Agence France-Presse, Reuters). Therefore, we propose a novel maximum likelihood estimator that corrects for misclassification in data arising from multiple sources. In the most general formulation of our estimator, researchers can specify separate sets of predictors for the true-event model and for each of the misclassification models characterizing whether a source fails to report on an event. As such, researchers are able to accurately test theories on both the causes of and reporting on an event of interest. Simulations show that our technique regularly outperforms current strategies that neglect misclassification, the unique features of the data-generating process, or both. We also illustrate the utility of this method with a model of repression using the Social Conflict in Africa Database.
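In the special case of two sources, no covariates, and reporting independent of everything else, the maximum likelihood estimates have a closed capture-recapture form. This simplified sketch (with assumed rates, not the paper's general covariate-driven estimator) simulates underreported events and recovers the true event rate from the overlap pattern alone:

```python
import random

random.seed(7)

# Simulate "true" events and two news sources that each miss events independently.
N, p_true, s1, s2 = 200_000, 0.30, 0.70, 0.60   # assumed rates for the sketch
n11 = n10 = n01 = n00 = 0
for _ in range(N):
    event = random.random() < p_true
    r1 = event and random.random() < s1          # source 1 reports the event
    r2 = event and random.random() < s2          # source 2 reports the event
    if r1 and r2: n11 += 1
    elif r1: n10 += 1
    elif r2: n01 += 1
    else: n00 += 1

# Closed-form ML estimates under independent reporting (capture-recapture logic).
s1_hat = n11 / (n11 + n01)                       # P(source 1 reports | source 2 did)
s2_hat = n11 / (n11 + n10)
p_hat = (n11 + n10) * (n11 + n01) / (n11 * N)    # true event rate
```

The paper's contribution is to generalize exactly this logic so that both the true-event probability and each source's reporting probability can depend on covariates.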
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sasmal, S.; Chakrabarti, S. K.; S. N. Bose National Centre for Basic Sciences, JD Block, Salt-Lake Kolkata-70098
2010-10-20
VLF (Very Low Frequency) signals have long been thought to carry important information about lithosphere-ionosphere coupling. It has recently been established that the ionosphere may be perturbed by seismic activity, and the effects of this perturbation can be detected through the VLF wave amplitude. There are several methods to find these correlations, and they can be used for the prediction of seismic events. In this paper, we first present a brief history of the use of the VLF propagation method for the study of seismo-ionospheric correlations. We then present different methods proposed by us to find seismo-ionospheric correlations. At the Indian Centre for Space Physics, Kolkata, we have been monitoring the VTX station at Vijayanarayanam since 2002. In the initial stage we received the 17 kHz signal, and later the 18.2 kHz signal. We first present the results for the 17 kHz signal during the 2004 Sumatra earthquake obtained from the terminator time analysis method, and then present a much more detailed statistical analysis using some new methods, with results for the 18.2 kHz signal. In order to establish the correlation between ionospheric activity and earthquakes, we need to understand what the reference signals are throughout the year. We present the sunrise and sunset terminators for the 18.2 kHz signal as a function of the day of the year for a period of four years, 2005 to 2008, when solar activity was very low. In this case the signal would primarily be affected by the Sun through normal sunrise and sunset effects. Any deviation from this standardized calibration curve would point to influences by terrestrial events (such as earthquakes) and extra-terrestrial events (such as solar activity and other high-energy phenomena).
We present examples of deviations which occur in a period of sixteen months and show that the correlations with seismic events are significant; typically the highest deviation in terminator shift takes place up to a couple of days prior to the seismic event. We introduce a new method in which we examine the effects of seismic activity on the D-layer preparation time (DLPT) and the D-layer disappearance time (DLDT). We identify the days on which DLPT and DLDT deviate from their average values and correlate those days with seismic events. Separately, we compute the energy released by the earthquakes, use it to compute the total energy released locally from distant earthquakes, and find correlations of the deviations with these values. In this case also we find precursors a few days before the seismic events. In a third approach, we consider the nighttime fluctuation method (quantified differently from the conventional way). We analyzed the nighttime data for the year 2007 to check the correlation between the nighttime fluctuation of the signal amplitude and seismic events. Using the statistical method for all the events of the year and for individual earthquakes (magnitude > 5), we found that the nighttime signal amplitude becomes very high in the three days prior to the seismic events.
An integrated logit model for contamination event detection in water distribution systems.
Housh, Mashor; Ostfeld, Avi
2015-05-15
The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators, and unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The fusion of the individual indicator probabilities, which is left out of focus in many existing event detection models, is confirmed to be a crucial part of the system, and modelling it with a discrete choice model improves performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies.
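The core idea of fusing per-indicator alarm scores with a statistically estimated discrete choice model can be sketched as a binary logit fit by maximizing the likelihood (here via stochastic gradient descent on the log-loss). The indicator scores and event labels below are synthetic, and the paper's genetic-algorithm joint calibration is not reproduced:

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(X, y, lr=0.3, epochs=200):
    """Logistic regression by stochastic gradient descent on the log-loss."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Synthetic single-indicator alarm scores (e.g. a chlorine-based and a
# turbidity-based score, both hypothetical) and true event labels.
random.seed(3)
X, y = [], []
for _ in range(400):
    event = random.random() < 0.5
    X.append([random.gauss(0.8 if event else 0.2, 0.15),
              random.gauss(0.7 if event else 0.3, 0.15)])
    y.append(1 if event else 0)

w, b = fit_logit(X, y)
fused = [sigmoid(w[0] * xi[0] + w[1] * xi[1] + b) for xi in X]  # fused event probability
accuracy = sum((p > 0.5) == (yi == 1) for p, yi in zip(fused, y)) / len(y)
```

The fitted weights play the role of a learned fusion rule: instead of a hand-made heuristic, each indicator's alarm contributes to the event probability in proportion to how informative it proved on the training data.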
Multifractal analysis of the 2001 Mw 7.7 Bhuj earthquake sequence in Gujarat, Western India
NASA Astrophysics Data System (ADS)
Aggarwal, Sandeep Kumar; Pastén, Denisse; Khan, Prosanta Kumar
2017-12-01
The seismic sequence of the 2001 Mw 7.7 Bhuj mainshock in the Kachchh area, occurring from 2001 to 2012, has been analyzed using mono-fractal and multi-fractal dimension spectrum analysis techniques. This region was characterized by frequent moderate shocks of Mw ≥ 5.0 for more than a decade after the 2001 Bhuj earthquake, so the present study is important for precursory analysis using this sequence. The selected long sequence has been investigated for the first time, with a completeness magnitude Mc 3.0 determined using the maximum curvature method. Multi-fractal Dq spectrum (Dq ∼ q) analysis was carried out using an effective window length of 200 earthquakes with a moving window of 20 events (i.e., 180 events of overlap). The robustness of the analysis was tested by applying a magnitude completeness correction term of 0.2 to Mc 3.0, giving Mc 3.2, and the error in the calculation of Dq was evaluated for each magnitude threshold. The stability of the analysis was also investigated down to the minimum magnitude of Mw ≥ 2.6 in the sequence. The analysis shows that the multi-fractal dimension spectrum Dq decreases with increasing clustering of events in time before a moderate-magnitude earthquake in the sequence, which in turn accounts for non-randomness in the spatial distribution of epicenters and its self-organized criticality. Similar behavior is ubiquitous elsewhere around the globe, and warns of the proximity of a damaging seismic event in an area.
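The generalized dimension spectrum Dq underlying this kind of analysis can be sketched with a one-dimensional box-counting estimate. The point sets below are synthetic stand-ins for epicentre coordinates; a clustered set yields a lower Dq than a uniform one, mirroring the decrease the study reports before moderate events:

```python
import math, random

def dq_spectrum(points, eps, qs):
    """Generalized (Renyi) dimensions from a 1-D box-counting partition:
    D_q = log(sum_i p_i^q) / ((q - 1) * log(eps)); the q = 1 limit is skipped."""
    counts = {}
    for x in points:
        box = int(x / eps)
        counts[box] = counts.get(box, 0) + 1
    probs = [c / len(points) for c in counts.values()]
    return {q: math.log(sum(p ** q for p in probs)) / ((q - 1) * math.log(eps))
            for q in qs if q != 1}

# Uniform epicentre stand-ins give D_q close to 1; clustering lowers D_q.
random.seed(0)
uniform = [random.random() for _ in range(5000)]
clustered = [random.gauss(0.5, 0.02) for _ in range(5000)]

d_uniform = dq_spectrum(uniform, eps=1 / 64, qs=[0, 2])
d_clustered = dq_spectrum(clustered, eps=1 / 64, qs=[0, 2])
```

Real analyses estimate Dq from the scaling of the partition sum over several box sizes rather than a single eps, but the contrast between uniform and clustered sets is already visible in this one-scale sketch.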
NASA Astrophysics Data System (ADS)
Zhou, Cong; Chase, J. Geoffrey; Rodgers, Geoffrey W.; Xu, Chao
2017-02-01
The model-free hysteresis loop analysis (HLA) method for structural health monitoring (SHM) has significant advantages over traditional model-based SHM methods, which require a suitable baseline model to represent the actual system response. This paper provides a unique validation against both an experimental reinforced concrete (RC) building and a calibrated numerical model to delineate the capability of the model-free HLA method and the adaptive least mean squares (LMS) model-based method in detecting, localizing and quantifying damage that may not be visible or observable in the overall structural response. Results clearly show the model-free HLA method is capable of adapting to changes in how structures transfer load or demand across structural elements over time and over multiple events of different size. The adaptive LMS model-based method, however, produced an image of lesser damage spread more widely over time and stories when the baseline model is not well defined. Finally, the two algorithms are tested on a typical steel structure with simpler hysteretic behaviour to quantify the impact of mismatch between the baseline model used for identification and the actual response. The overall results highlight the need for model-based methods to have an appropriate model that can capture the observed response in order to yield accurate results, even in small events where the structure remains linear.
Opinion data mining based on DNA method and ORA software
NASA Astrophysics Data System (ADS)
Tian, Ru-Ya; Wu, Lei; Liang, Xiao-He; Zhang, Xue-Fu
2018-01-01
Public opinion, especially online public opinion, is a critical issue when it comes to mining its characteristics, because it can form directly and intensely in a short time and may lead to the outbreak of online group events and the formation of an online public opinion crisis. This may become the pushing hand of a public crisis event, or even have negative social impacts, which brings great challenges to government management. Data from the mass media, which reveal implicit, previously unknown, and potentially valuable information, can effectively help us to understand the evolution of public opinion and provide a useful reference for rumor intervention. Based on the Dynamic Network Analysis method, this paper uses the ORA software to mine the characteristics of public opinion information, opinion topics, and public opinion agents through a series of indicators, and quantitatively analyzes the relationships between them. The results show that, through the analysis of the 8 indexes associated with opinion data mining, we can gain a basic understanding of the public opinion characteristics of an opinion event, such as who is important in the opinion spreading process, how well information is grasped, and how opinion topics are released.
Computational Electrocardiography: Revisiting Holter ECG Monitoring.
Deserno, Thomas M; Marx, Nikolaus
2016-08-05
Since 1942, when Goldberger introduced 12-lead electrocardiography (ECG), this diagnostic method has not changed. After 70 years of technologic development, we revisit Holter ECG from recording to understanding. A fundamental change is foreseen towards "computational ECG" (CECG), where continuous monitoring produces big data volumes that are impossible to inspect conventionally and instead require efficient computational methods. We draw parallels between CECG and computational biology, in particular with respect to computed tomography, computed radiology, and computed photography. From that, we identify the technology and methodology needed for CECG. Real-time transfer of raw data into meaningful parameters that are tracked over time will allow prediction of serious events, such as sudden cardiac death. Evolved from Holter's technology, portable smartphones with Bluetooth-connected textile-embedded sensors will capture noisy raw data (recording), process meaningful parameters over time (analysis), and transfer them to cloud services for sharing (handling), predicting serious events, and alarming (understanding). To make this happen, the following fields need more research: i) signal processing; ii) cycle decomposition; iii) cycle normalization; iv) cycle modeling; v) clinical parameter computation; vi) physiological modeling; and vii) event prediction. We shall start immediately developing methodology for CECG analysis and understanding.
The Impact of Future World Events on Iranians’ Social Health: A Qualitative Futurology
DAMARI, Behzad; HAJIAN, Maryam; MINAEE, Farima; RIAZI-ISFAHANI, Sahand
2016-01-01
Background: Social health is a dimension of health that affects and interacts with the other dimensions. Considering the rate of world changes, foresighting the influence of future events and possible trends on social health could provide advantageous information for social policy makers. Methods: This is a qualitative futurology study with a cross-impact analysis approach. After studying possible future trends and events, they were categorized into four domains (population, resources, climate change, and globalization) and 12 groups of events, and used to design a questionnaire. It was given to experts, and their opinions were collected through in-depth interviews between May 2013 and Sep 2013. Results: Analysis of the experts’ opinions revealed that future trends in the four main potential domains may have some positive and more negative impacts on Iranians’ social health. Conclusion: The global “resource challenge” is the most important incoming event across the four domains of global events, and its final and potential effect will be an increase in inequalities leading to social threat. Since inequalities are considered the most important risk factor for health in societies, the way to dispel the impact of world trends on Iranians’ social health is to manage the crisis of inequalities, starting with foresighting and adopting preventive strategies in all four domains. PMID:27648424
Hsu, F.C.; de Boer, I.H.; Ix, J.H.; Siscovick, D.; Szklo, M.; Burke, G.L.; Frazier-Wood, A.C.; Herrington, D.M.
2016-01-01
Background and Aims Many US adults use calcium supplements to address inadequate dietary intake and improve bone health. However, recent reports have suggested that use of calcium supplements may elevate cardiovascular disease (CVD) risk. In this study, we examined associations between baseline calcium supplement use and incident myocardial infarction (MI) (n=208 events) and CVD events (n=641 events) over 10.3 years in men and women from the Multi-Ethnic Study of Atherosclerosis (MESA) cohort (n=6,236), with dietary calcium intake at baseline also examined as a supplementary objective. Methods and Results Using Cox proportional hazards models, no compelling associations between calcium intake from supplements or diet and incident CVD events were observed upon multivariate adjustment for potential confounders. An association with lower MI risk was observed comparing those with low levels of calcium supplement use (1-499 mg) to those using no calcium supplements (hazard ratio 0.69, 95% CI 0.48, 0.98, p=0.039). Relationships were homogeneous across gender, race/ethnicity, and chronic kidney disease status. Results were also similar when the analysis was limited to postmenopausal women only. Conclusion Analysis of incident MI and CVD events in the MESA cohort does not support a substantial association of calcium supplement use with negative cardiovascular outcomes. PMID:27514606
Cores Of Recurrent Events (CORE) | Informatics Technology for Cancer Research (ITCR)
CORE is a statistically supported computational method for finding recurrently targeted regions in massive collections of genomic intervals, such as those arising from DNA copy number analysis of single tumor cells or bulk tumor tissues.
40 CFR 60.286a - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., you may assert an affirmative defense to a claim for civil penalties for violations of such standards... malfunction event at issue. The analysis must also specify, using best monitoring methods and engineering...
40 CFR 63.1344 - Affirmative defense for violation of emission standards during malfunction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... defense to a claim for civil penalties for violations of such standards that are caused by malfunction, as... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
40 CFR 63.1344 - Affirmative defense for violation of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... defense to a claim for civil penalties for violations of such standards that are caused by malfunction, as... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Hazardous Air Pollutants for Industrial, Commercial, and Institutional Boilers Area Sources Continuous... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
Jordá Aragón, Carlos; Peñalver Cuesta, Juan Carlos; Mancheño Franch, Nuria; de Aguiar Quevedo, Karol; Vera Sempere, Francisco; Padilla Alarcón, José
2015-09-07
Survival studies of non-small cell lung cancer (NSCLC) are usually based on the Kaplan-Meier method. However, other factors not covered by this method may modify the observation of the event of interest. There are cumulative incidence (CI) models that take these competing risks into account, enabling more accurate survival estimates and evaluation of the risk of death from other causes. We aimed to evaluate these models in resected early-stage NSCLC patients. This study included 263 patients with resected NSCLC whose tumour diameter was ≤ 3 cm without node involvement (N0). Demographic, clinical, morphopathological and surgical variables, TNM classification and long-term evolution were analysed. To analyse CI, death by another cause was considered a competing event. Gray's method was used for the univariate analysis, and Fine and Gray's method for the multivariate analysis. Mortality by NSCLC was 19.4% at 5 years, and 14.3% by another cause. The two curves crossed at 6.3 years, from which point the probability of death by another cause became greater. In the multivariate analysis, cancer mortality was conditioned by visceral pleural invasion (VPI) (P=.001) and vascular invasion (P=.020), while death from non-cancer causes was conditioned by age >50 years (P=.034), smoking (P=.009) and a Charlson index ≥ 2 (P=.000). By the CI method, VPI and vascular invasion conditioned cancer death in NSCLC ≤ 3 cm, while the non-tumour causes that determine long-term death were also identified.
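The cumulative incidence estimator that accounts for competing risks (the Aalen-Johansen form used here in place of Kaplan-Meier) can be sketched directly. The five (time, event) records below are invented for illustration, with cause 1 standing for death from NSCLC, cause 2 for death from another cause, and 0 for censoring:

```python
def cumulative_incidence(data, cause):
    """Aalen-Johansen cumulative incidence for one cause under competing risks.
    `data` is a list of (time, event); event = 0 means censored."""
    data = sorted(data)
    n_at_risk = len(data)
    surv = 1.0      # overall survival just before the current time
    cif = 0.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = sum(1 for tt, e in data if tt == t and e == cause)
        d_any = sum(1 for tt, e in data if tt == t and e != 0)
        n_t = sum(1 for tt, e in data if tt == t)
        cif += surv * d_cause / n_at_risk        # hazard for this cause, weighted
        surv *= 1 - d_any / n_at_risk            # overall survival update
        n_at_risk -= n_t
        curve.append((t, cif))
        i += n_t
    return curve

# Toy records (values are illustrative, not the paper's data).
data = [(1, 1), (2, 2), (3, 1), (4, 0), (5, 1)]
cif_cancer = cumulative_incidence(data, cause=1)
cif_other = cumulative_incidence(data, cause=2)
```

Unlike one-minus-Kaplan-Meier per cause, the cause-specific incidences estimated this way cannot sum to more than one, which is what makes the competing-risks estimates interpretable as probabilities of dying from each cause.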
Huang, Ying; Wang, Lin; Malfertheiner, Peter
2015-01-01
Background Eradication rates with triple therapy for Helicobacter pylori infections have currently declined to unacceptable levels worldwide. Newer quadruple therapies are burdened with a high rate of adverse events. Whether multi-strain probiotics can improve eradication rates or diminish adverse events remains uncertain. Methods Relevant publications in which patients with H. pylori infections were randomized to a multi-strain probiotic or control were identified in PubMed, Cochrane Databases, and other sources from 1 January 1960–3 June 2015. Primary outcomes included eradication rates, incidence of any adverse event and the incidence of antibiotic-associated diarrhea. As probiotic efficacy is strain-specific, pooled relative risks and 95% confidence intervals were calculated using meta-analysis stratified by similar multi-strain probiotic mixtures. Results A total of 19 randomized controlled trials (20 treatment arms, n = 2730) assessing one of six mixtures of strains of probiotics were included. Four multi-strain probiotics significantly improved H. pylori eradication rates, five significantly prevented any adverse reactions and three significantly reduced antibiotic-associated diarrhea. Only two probiotic mixtures (Lactobacillus acidophilus/Bifidobacterium animalis and an eight-strain mixture) had significant efficacy for all three outcomes. Conclusions Our meta-analysis found adjunctive use of some multi-strain probiotics may improve H. pylori eradication rates and prevent the development of adverse events and antibiotic-associated diarrhea, but not all mixtures were effective. PMID:27536365
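The pooled relative risks described above follow the usual inverse-variance fixed-effect recipe on the log scale. A minimal sketch with invented 2x2 trial counts (not the meta-analysis data):

```python
import math

def pooled_relative_risk(trials):
    """Inverse-variance fixed-effect pooled relative risk with a 95% CI.
    Each trial is (events_treated, n_treated, events_control, n_control)."""
    num = den = 0.0
    for a, n1, c, n2 in trials:
        rr = (a / n1) / (c / n2)
        se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)   # SE of log RR
        w = 1.0 / se ** 2
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - 1.96 * se_pooled),
            math.exp(log_rr + 1.96 * se_pooled))

# Illustrative eradication counts for three hypothetical trials of one
# probiotic mixture (numbers invented for the sketch).
trials = [(80, 100, 70, 100), (150, 200, 120, 200), (45, 60, 40, 60)]
rr, lo, hi = pooled_relative_risk(trials)
```

Stratifying by probiotic mixture, as the meta-analysis does, simply means running this pooling separately within each group of trials that used a comparable strain combination.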
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Jancso, Leonhardt M.; Rocco, Stefania Di; Staehelin, Johannes; Maeder, Joerg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; de Backer, Hugo; Koehler, Ulf; Krzyścin, Janusz; Vaníček, Karel
2011-11-01
We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear ‘fingerprints’ of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analysis for the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that in addition to ODS, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.
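As a rough counterpart to the EHO/ELO classification, values in the distribution tails of a total ozone series can be flagged directly. The paper fits extreme value distributions; the empirical-quantile rule below is only a simple stand-in, and the cutoff q is an arbitrary choice.

```python
def flag_extremes(series, q=0.05):
    """Label values below the lower empirical q-quantile as 'ELO'
    (extreme low) and above the upper one as 'EHO' (extreme high).
    A quantile proxy, not the paper's extreme-value-theory fit."""
    s = sorted(series)
    k = int(q * len(s))
    lo, hi = s[k], s[len(s) - k - 1]
    return ['ELO' if x < lo else 'EHO' if x > hi else '' for x in series]
```

Counting flags per year then gives the kind of frequency-of-extremes series whose trend the abstract shows dominates the 1970-1990s column ozone decline.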
Single event mass spectrometry
Conzemius, Robert J.
1990-01-16
A means and method for single-event time-of-flight mass spectrometry for analysis of specimen materials. The method of the invention includes pulsing an ion source, imposing at least one pulsed ion onto the specimen to produce a corresponding emission of at least one electrically charged particle. The emitted particle is then dissociated into a charged ion component and an uncharged neutral component. The ion and neutral components are then detected. The times of flight of the components are recorded and can be used to analyze the predecessor of the components, and therefore the specimen material. When more than one ion particle is emitted from the specimen per single ion impact, the single-event time-of-flight mass spectrometer described here furnishes the means to analyze each. This invention was made with Government support under Contract No. W-7405-ENG82 awarded by the Department of Energy. The Government has certain rights in the invention.
Probabilistic modelling of flood events using the entropy copula
NASA Astrophysics Data System (ADS)
Li, Fan; Zheng, Qian
2016-11-01
The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible method of analyzing flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. The copula method is now widely used for constructing multivariable dependence structures; however, the copula family must be chosen before application, and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, offers a way to avoid this relatively subjective process by combining the theories of copulas and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with popular copulas was made. The Gibbs sampling technique was applied to trivariate flood event simulation in order to mitigate the difficulty of extending the calculation directly to three dimensions. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
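For contrast with the entropy copula, the joint behaviour of two flood variables can be illustrated with a standard parametric family such as the Gumbel copula; the entropy copula itself is constructed differently and is not reproduced here. theta is an arbitrary dependence parameter (theta = 1 gives independence).

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula CDF C(u, v) for theta >= 1. A classical family
    shown only for contrast with the paper's entropy copula."""
    return math.exp(-(((-math.log(u))**theta
                       + (-math.log(v))**theta)**(1.0 / theta)))

def joint_exceedance(u, v, theta):
    """P(U > u, V > v): e.g. flood peak AND volume both exceed
    their marginal quantiles u and v."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)
```

Stronger upper-tail dependence (larger theta) raises the probability that peak and volume are simultaneously extreme, which is why the choice of family matters for flood risk.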
NASA Astrophysics Data System (ADS)
Gu, Chen; Marzouk, Youssef M.; Toksöz, M. Nafi
2018-03-01
Small earthquakes occur due to natural tectonic motions, and they can also be induced by oil and gas production processes. In many oil/gas fields and hydrofracking operations, induced earthquakes result from fluid extraction or injection. The locations and source mechanisms of these earthquakes provide valuable information about the reservoirs. Analysis of induced seismic events has mostly assumed a double-couple source mechanism. However, recent studies assuming a full moment tensor source mechanism have shown a non-negligible percentage of non-double-couple components in the source moment tensors of hydraulic fracturing events. Without uncertainty quantification of the moment tensor solution, it is difficult to determine the reliability of these source models. This study develops a Bayesian method to perform waveform-based full moment tensor inversion and uncertainty quantification for induced seismic events, accounting for both location and velocity model uncertainties. We conduct tests with synthetic events to validate the method, and then apply our newly developed Bayesian inversion approach to a real induced seismic event in an oil/gas field in the Sultanate of Oman, determining the uncertainties in the source mechanism and in the location of that event.
Misspecification of Cox regression models with composite endpoints
Wu, Longyang; Cook, Richard J
2012-01-01
Researchers routinely adopt composite endpoints in multicenter randomized trials designed to evaluate the effect of experimental interventions in cardiovascular disease, diabetes, and cancer. Despite their widespread use, relatively little attention has been paid to the statistical properties of estimators of treatment effect based on composite endpoints. We consider this here in the context of multivariate models for time to event data in which copula functions link marginal distributions with a proportional hazards structure. We then examine the asymptotic and empirical properties of the estimator of treatment effect arising from a Cox regression model for the time to the first event. We point out that even when the treatment effect is the same for the component events, the limiting value of the estimator based on the composite endpoint is usually inconsistent for this common value. We find that in this context the limiting value is determined by the degree of association between the events, the stochastic ordering of events, and the censoring distribution. Within the framework adopted, marginal methods for the analysis of multivariate failure time data yield consistent estimators of treatment effect and are therefore preferred. We illustrate the methods by application to a recent asthma study. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22736519
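The dependence structure studied here, component event times linked by a copula with the composite defined as the first event, can be simulated directly. Below is a sketch using a Clayton copula with exponential margins; the copula family, rates and theta are arbitrary illustrative choices, not those of the paper.

```python
import math, random

def clayton_pair(theta, rng):
    """One (u, v) draw from a Clayton copula via conditional sampling."""
    u = 1.0 - rng.random()   # uniform on (0, 1]
    w = 1.0 - rng.random()
    v = ((w**(-theta / (1.0 + theta)) - 1.0) * u**(-theta) + 1.0)**(-1.0 / theta)
    return u, v

def composite_times(n, theta, rate1, rate2, seed=1):
    """Correlated exponential component event times (t1, t2) plus the
    time to the first (composite) event. Illustrates the association
    the paper shows drives bias in composite-endpoint Cox estimates."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u, v = clayton_pair(theta, rng)
        t1 = -math.log(u) / rate1   # exponential margins
        t2 = -math.log(v) / rate2
        out.append((t1, t2, min(t1, t2)))
    return out
```

For a Clayton copula, Kendall's tau is theta/(theta + 2), so the strength of association between the component events, one of the drivers of the limiting bias identified in the paper, is directly controllable in such experiments.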
Fault tree analysis for system modeling in case of intentional EMI
NASA Astrophysics Data System (ADS)
Genender, E.; Mleczko, M.; Döring, O.; Garbe, H.; Potthast, S.
2011-08-01
The complexity of modern systems on the one hand and the rising threat of intentional electromagnetic interference (IEMI) on the other increase the necessity for systematic risk analysis. Most of the problems cannot be treated deterministically, since slight changes in the configuration (source, position, polarization, ...) can dramatically change the outcome of an event. For that purpose, methods known from probabilistic risk analysis can be applied. One of the most common approaches is fault tree analysis (FTA). FTA is used to determine the system failure probability and also the main contributors to its failure. In this paper fault tree analysis is introduced, and a possible application of the method is shown using a small computer network as an example. The constraints of this method are explained and conclusions for further research are drawn.
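For independent basic events, the failure-probability arithmetic of a fault tree reduces to AND/OR gate combinations. A minimal sketch with a made-up two-layer tree; a real IEMI fault tree for a computer network would be far larger.

```python
def or_gate(probs):
    """P(at least one input fails), inputs assumed independent."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

def and_gate(probs):
    """P(all inputs fail), inputs assumed independent."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical tree: the network fails if the router fails OR both
# redundant servers fail. All probabilities are invented for the sketch.
p_top = or_gate([0.01, and_gate([0.1, 0.1])])
```

Recomputing p_top while zeroing one basic event at a time is a quick way to rank the main contributors to system failure, the second output FTA is used for in the paper.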
Distinguishing signatures of determinism and stochasticity in spiking complex systems
Aragoneses, Andrés; Rubido, Nicolás; Tiana-Alsina, Jordi; Torrent, M. C.; Masoller, Cristina
2013-01-01
We describe a method to infer signatures of determinism and stochasticity in the sequence of apparently random intensity dropouts emitted by a semiconductor laser with optical feedback. The method uses ordinal time-series analysis to classify experimental data of inter-dropout intervals (IDIs) into two categories that display statistically significantly different features. Despite the apparent randomness of the dropout events, one IDI category is consistent with waiting times in a resting state until noise triggers a dropout, and the other is consistent with dropouts occurring during the return to the resting state, which have a clear deterministic component. The method we describe can be a powerful tool for inferring signatures of determinism in the dynamics of complex systems in noisy environments, at an event-level description of their dynamics.
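Ordinal time-series analysis rests on mapping each short window of the series to the permutation that sorts it (the Bandt-Pompe symbolization). The sketch below shows that generic step only, not the authors' IDI classification procedure.

```python
from collections import Counter
from itertools import permutations

def ordinal_patterns(series, d=3):
    """Relative frequency of each length-d ordinal pattern in `series`.
    A biased pattern distribution (vs. the uniform one expected for
    pure noise) is the kind of determinism signature the method seeks."""
    counts = Counter()
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        pattern = tuple(sorted(range(d), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    return {p: counts.get(p, 0) / total for p in permutations(range(d))}
```

Applied to inter-dropout intervals, forbidden or over-represented patterns indicate a deterministic component in the dropout sequence.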
Lifting Events in RDF from Interactions with Annotated Web Pages
NASA Astrophysics Data System (ADS)
Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad
In this paper we present a method and an implementation for creating and processing semantic events from interaction with Web pages, which opens possibilities to build event-driven applications for the (Semantic) Web. Events, simple or complex, are models for things that happen, e.g., when a user interacts with a Web page. Events are consumed in some meaningful way, e.g., for monitoring reasons or to trigger actions such as responses. In order for receiving parties to understand events, e.g., to comprehend what has led to an event, we propose a general event schema using RDFS. In this schema we cover the composition of complex events and event-to-event relationships. These events can then be used to route semantic information about an occurrence to different recipients, helping to make the Semantic Web active. Additionally, we present an architecture for detecting and composing events in Web clients. For the contents of events we show how they are enriched with semantic information about the context in which they occurred. The paper is presented in conjunction with the use case of Semantic Advertising, which extends traditional clickstream analysis by introducing semantic short-term profiling, enabling discovery of the current interest of a Web user and therefore supporting advertisement providers in responding with more relevant advertisements.
Gabbay, Itay E; Gabbay, Uri
2013-01-01
Excess adverse events may be attributable to poor surgical performance but also to case-mix, which is controlled for through the Standardized Incidence Ratio (SIR). SIR calculations can be complicated, resource-consuming, and unfeasible in some settings. This article suggests a novel method for SIR approximation. In order to evaluate a potential SIR surrogate measure, we predefined acceptance criteria. We developed a new measure, the Approximate Risk Index (ARI). The "Number Needed for Event" (NNE) is the theoretical number of patients needed "to produce" one adverse event. ARI is defined as the quotient of Ge, the group of patients needed for the observed events, by Ga, the total number of patients treated. Our evaluation compared 2500 surgical units and over 3 million heterogeneous-risk surgical patients generated through a computerized simulation. Surgical units' data were computed for SIR and ARI to evaluate compliance with the predefined criteria. Approximation was evaluated by correlation analysis and performance-prediction capability by Receiver Operating Characteristic (ROC) analysis. ARI strongly correlates with SIR (r(2) = 0.87, p < 0.05). ARI prediction of excessive risk revealed an excellent ROC (Area Under the Curve > 0.9), with 87% sensitivity and 91% specificity. ARI provides a good approximation of SIR and excellent prediction capability. ARI is simple and cost-effective, as it requires thorough risk evaluation of only the adverse-event patients. ARI can provide a crucial screening and performance-evaluation quality control tool. The ARI method may suit other clinical and epidemiological settings where a relatively small fraction of the entire population is affected. Copyright © 2013 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
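The abstract does not spell out how Ge is built. One reading consistent with "risk evaluation of only the adverse-event patients" is that Ge sums the NNE, the reciprocal of the predicted risk, over the patients who actually had an event. The sketch below rests on that assumption and is not the authors' definition; the risks and counts in the usage are invented.

```python
def approximate_risk_index(event_risks, n_total):
    """ARI = Ge / Ga under the ASSUMPTION that Ge is the sum of the
    Number Needed for Event (1/p) of the adverse-event patients only.
    event_risks: predicted risks p of the patients who had an event.
    n_total: Ga, the total number of patients treated."""
    ge = sum(1.0 / p for p in event_risks)  # patients "needed" for these events
    return ge / n_total
```

Under this reading only the event patients need a thorough risk evaluation, which is exactly the cost advantage over SIR claimed in the abstract.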
Community resilience and decision theory challenges for catastrophic events.
Cox, Louis Anthony
2012-11-01
Extreme and catastrophic events pose challenges for normative models of risk management decision making. They invite development of new methods and principles to complement existing normative decision and risk analysis. Because such events are rare, it is difficult to learn about them from experience. They can prompt both too little concern before the fact, and too much after. Emotionally charged and vivid outcomes promote probability neglect and distort risk perceptions. Aversion to acting on uncertain probabilities saps precautionary action; moral hazard distorts incentives to take care; imperfect learning and social adaptation (e.g., herd-following, group-think) complicate forecasting and coordination of individual behaviors and undermine prediction, preparation, and insurance of catastrophic events. Such difficulties raise substantial challenges for normative decision theories prescribing how catastrophe risks should be managed. This article summarizes challenges for catastrophic hazards with uncertain or unpredictable frequencies and severities, hard-to-envision and incompletely described decision alternatives and consequences, and individual responses that influence each other. Conceptual models and examples clarify where and why new methods are needed to complement traditional normative decision theories for individuals and groups. For example, prospective and retrospective preferences for risk management alternatives may conflict; procedures for combining individual beliefs or preferences can produce collective decisions that no one favors; and individual choices or behaviors in preparing for possible disasters may have no equilibrium. Recent ideas for building "disaster-resilient" communities can complement traditional normative decision theories, helping to meet the practical need for better ways to manage risks of extreme and catastrophic events. © 2012 Society for Risk Analysis.
Aarabi, Ardalan; Osharina, Victoria; Wallois, Fabrice
2017-07-15
Slow and rapid event-related designs are used in fMRI and functional near-infrared spectroscopy (fNIRS) experiments to temporally characterize the brain hemodynamic response to discrete events. Conventional averaging (CA) and the deconvolution method (DM) are the two techniques commonly used to estimate the Hemodynamic Response Function (HRF) profile in event-related designs. In this study, we conducted a series of simulations using synthetic and real NIRS data to examine the effect of the main confounding factors, including event sequence timing parameters, different types of noise, signal-to-noise ratio (SNR), temporal autocorrelation and temporal filtering on the performance of these techniques in slow and rapid event-related designs. We also compared systematic errors in the estimates of the fitted HRF amplitude, latency and duration for both techniques. We further compared the performance of deconvolution methods based on Finite Impulse Response (FIR) basis functions and gamma basis sets. Our results demonstrate that DM was much less sensitive to confounding factors than CA. Event timing was the main parameter largely affecting the accuracy of CA. In slow event-related designs, deconvolution methods provided similar results to those obtained by CA. In rapid event-related designs, our results showed that DM outperformed CA for all SNR, especially above -5 dB regardless of the event sequence timing and the dynamics of background NIRS activity. Our results also show that periodic low-frequency systemic hemodynamic fluctuations as well as phase-locked noise can markedly obscure hemodynamic evoked responses. Temporal autocorrelation also affected the performance of both techniques by inducing distortions in the time profile of the estimated hemodynamic response with inflated t-statistics, especially at low SNRs. 
We also found that high-pass temporal filtering could substantially affect the performance of both techniques by removing the low-frequency components of HRF profiles. Our results emphasize the importance of characterization of event timing, background noise and SNR when estimating HRF profiles using CA and DM in event-related designs. Copyright © 2017 Elsevier Inc. All rights reserved.
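FIR-basis deconvolution, one of the DM variants compared above, estimates one free HRF value per lag by least squares. Below is a stdlib-only sketch (normal equations solved by Gaussian elimination); real fNIRS pipelines add drift regressors, autocorrelation handling and the temporal filtering whose pitfalls the study documents.

```python
def fir_deconvolve(y, onsets, d):
    """Estimate a length-d impulse response (HRF) by least squares:
    y ~ X h with X[t][k] = 1 if an event started at sample t - k.
    Overlapping responses in rapid designs are handled by the solve."""
    n = len(y)
    X = [[0.0] * d for _ in range(n)]
    for t0 in onsets:
        for k in range(d):
            if t0 + k < n:
                X[t0 + k][k] = 1.0
    # normal equations A h = b
    A = [[sum(X[t][i] * X[t][j] for t in range(n)) for j in range(d)]
         for i in range(d)]
    b = [sum(X[t][i] * y[t] for t in range(n)) for i in range(d)]
    for i in range(d):                       # Gaussian elimination
        piv = max(range(i, d), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, d):
            f = A[r][i] / A[i][i]
            for c in range(i, d):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    h = [0.0] * d
    for i in range(d - 1, -1, -1):
        h[i] = (b[i] - sum(A[i][c] * h[c] for c in range(i + 1, d))) / A[i][i]
    return h
```

With noiseless synthetic data the true response is recovered exactly even when events overlap, which is the advantage DM holds over conventional averaging in rapid designs.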
2013-01-01
Background The fovea, which is the most sensitive part of the retina, is known to have birefringent properties, i.e. it changes the polarization state of light upon reflection. Existing devices use this property to obtain information on the orientation of the fovea and the direction of gaze. Such devices employ specific frequency components that appear during moments of fixation on a target. To detect them, previous methods have used solely the power spectrum of the Fast Fourier Transform (FFT), which, unfortunately, is an integral method, and does not give information as to where exactly the events of interest occur. With very young patients who are not cooperative enough, this presents a problem, because central fixation may be present only during very short-lasting episodes, and can easily be missed by the FFT. Method This paper presents a method for detecting short-lasting moments of central fixation in existing devices for retinal birefringence scanning, with the goal of a reliable detection of eye alignment. Signal analysis is based on the Continuous Wavelet Transform (CWT), which reliably localizes such events in the time-frequency plane. Even though the characteristic frequencies are not always strongly expressed due to possible artifacts, simple topological analysis of the time-frequency distribution can detect fixation reliably. Results In all six subjects tested, the CWT allowed precise identification of both frequency components. Moreover, in four of these subjects, episodes of intermittent but definitely present central fixation were detectable, similar to those in Figure 4. A simple FFT is likely to treat them as borderline cases, or entirely miss them, depending on the thresholds used. Conclusion Joint time-frequency analysis is a powerful tool in the detection of eye alignment, even in a noisy environment. 
The method is applicable to similar situations, where short-lasting diagnostic events need to be detected in time series acquired by means of scanning some substrate along a specific path. PMID:23668264
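Localizing a known frequency component in time, the core of the CWT-based detection described above, can be sketched with a single complex Morlet-like kernel. A full CWT evaluates this across many scales, and the topological analysis of the time-frequency plane is not reproduced here; frequencies and durations in the usage are invented.

```python
import math, cmath

def band_power(x, freq, fs, width=1.0):
    """Per-sample power of x at `freq` using a Gaussian-windowed
    complex exponential, i.e. one row of a continuous wavelet
    transform. Short bursts at `freq` show up as localized peaks,
    unlike in an FFT power spectrum."""
    half = int(width * fs / freq * 3)        # +/- 3 sigma support
    kern = []
    for k in range(-half, half + 1):
        g = math.exp(-(k / fs)**2 / (2 * (width / freq)**2))
        kern.append(g * cmath.exp(-2j * math.pi * freq * k / fs))
    out = []
    for n in range(len(x)):
        acc = 0j
        for k in range(-half, half + 1):
            if 0 <= n + k < len(x):
                acc += x[n + k] * kern[k + half]
        out.append(abs(acc))
    return out
```

A short sinusoidal burst buried in an otherwise flat record, the analogue of a brief episode of central fixation, produces power only around the burst, which is exactly what the integral FFT cannot localize.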
NASA Astrophysics Data System (ADS)
Anastasiadis, Anastasios; Sandberg, Ingmar; Papaioannou, Athanasios; Georgoulis, Manolis; Tziotziou, Kostas; Jiggens, Piers; Hilgers, Alain
2015-04-01
We present a novel integrated prediction system for both solar flares and solar energetic particle (SEP) events, which is in place to provide short-term warnings for hazardous solar radiation storms. The FORSPEF system provides forecasting of solar eruptive events, such as solar flares with a projection to coronal mass ejections (CMEs) (occurrence and velocity) and the likelihood of occurrence of an SEP event. It also provides nowcasting of SEP events based on actual solar flare and CME near-real-time alerts, as well as SEP characteristics (peak flux, fluence, rise time, duration) per parent solar event. The prediction of solar flares relies on a morphological method based on the sophisticated derivation of the effective connected magnetic field strength (Beff) of potentially flaring active-region (AR) magnetic configurations, utilizing analysis of a large number of AR magnetograms. For the prediction of SEP events, a new reductive statistical method has been implemented based on a newly constructed database of solar flares, CMEs and SEP events that covers the large time span 1984-2013. The method is based on flare location (longitude), flare size (maximum soft X-ray intensity), and the occurrence (or not) of a CME. Warnings are issued for all >C1.0 soft X-ray flares. The warning time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective warning time for the nowcasting scheme depends on the availability of the near-real-time data and falls between 15 and 20 minutes. We discuss the modules of the FORSPEF system, their interconnection and the operational set-up. The dual approach in the development of FORSPEF (i.e. forecasting and nowcasting schemes) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and in interplanetary space, while the combined usage of solar flare and SEP forecasting methods makes FORSPEF an integrated forecasting solution.
This work has been funded through the "FORSPEF: FORecasting Solar Particle Events and Flares", ESA Contract No. 4000109641/13/NL/AK
Uncertainty analysis in fault tree models with dependent basic events.
Pedroni, Nicola; Zio, Enrico
2013-06-01
In general, two types of dependence need to be considered when estimating the probability of the top event (TE) of a fault tree (FT): "objective" dependence between the (random) occurrences of different basic events (BEs) in the FT, and "state-of-knowledge" (epistemic) dependence between estimates of the epistemically uncertain probabilities of some BEs of the FT model. In this article, we study the effects of objective and epistemic dependences on the TE probability. The well-known Fréchet bounds and the distribution envelope determination (DEnv) method are used to model all kinds of (possibly unknown) objective and epistemic dependences, respectively. For exemplification, the analyses are carried out on a FT with six BEs. Results show that both types of dependence significantly affect the TE probability; however, the effects of epistemic dependence are likely to be overwhelmed by those of objective dependence (if present). © 2012 Society for Risk Analysis.
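The Fréchet bounds used for objective dependence give best/worst-case intervals for gate probabilities when the dependence between basic events is unknown. A minimal sketch of that single step; the DEnv propagation of epistemic uncertainty through a six-BE tree is beyond this fragment, and the probabilities in the usage are invented.

```python
def frechet_and(p_a, p_b):
    """Bounds on P(A and B) when the dependence of A and B is unknown."""
    return max(0.0, p_a + p_b - 1.0), min(p_a, p_b)

def frechet_or(p_a, p_b):
    """Bounds on P(A or B) when the dependence of A and B is unknown."""
    return max(p_a, p_b), min(1.0, p_a + p_b)

# Two-BE fault tree with an AND gate: interval for the TE probability
lo, hi = frechet_and(0.2, 0.3)
```

The independence assumption (P = 0.2 * 0.3 = 0.06 here) always falls inside the interval; the width of the interval is the price of not knowing the objective dependence.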
A Basis Function Approach to Simulate Storm Surge Events for Coastal Flood Risk Assessment
NASA Astrophysics Data System (ADS)
Wu, Wenyan; Westra, Seth; Leonard, Michael
2017-04-01
Storm surge is a significant contributor to flooding in coastal and estuarine regions, especially when it coincides with other flood-producing mechanisms such as extreme rainfall. Therefore, storm surge has always been a research focus in coastal flood risk assessment. Numerical models have often been developed to understand storm surge events for risk assessment (Kumagai et al. 2016; Li et al. 2016; Zhang et al. 2016; Bastidas et al. 2016; Bilskie et al. 2016; Dalledonne and Mayerle 2016; Haigh et al. 2014; Kodaira et al. 2016; Lapetina and Sheng 2015) and to assess how these events may change or evolve in the future (Izuru et al. 2015; Oey and Chou 2016). However, numerical models often require a lot of input information, and difficulties arise when insufficient data are available (Madsen et al. 2015). Alternatively, statistical methods have been used to forecast storm surge based on historical data (Hashemi et al. 2016; Kim et al. 2016) or to examine long-term trends in storm surge events, especially under climate change (Balaguru et al. 2016; Oh et al. 2016; Rueda et al. 2016). These studies often use only the peak of surge events, which results in the loss of dynamic information within a tidal cycle or surge event (i.e. the time series of storm surge values). In this study, we propose an alternative basis function (BF) based approach to examine the different attributes (e.g. peak and duration) of storm surge events using historical data. Two simple two-parameter BFs were used: the exponential function and the triangular function. High-quality hourly storm surge records from 15 tide gauges around Australia were examined. Significant location and seasonal variability was found in the peak and duration of storm surge events, which provides additional insight into coastal flood risk.
In addition, the simple form of these BFs allows fast simulation of storm surge events and minimises the complexity of joint probability analysis for flood risk analysis considering multiple flood producing mechanisms. This is the first step in applying a Monte Carlo based joint probability method for flood risk assessment.
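A two-parameter triangular basis function of the kind mentioned above can be written down directly. The parameterization below (peak value and total duration, with the peak at mid-duration) is an illustrative guess at the form, not the authors' exact definition.

```python
def triangular_surge(peak, duration, dt=1.0):
    """Two-parameter triangular basis function for one surge event:
    rises linearly to `peak` at duration/2, then falls back to zero.
    Sampled every dt (e.g. hourly, matching tide-gauge records)."""
    n = int(duration / dt) + 1
    half = duration / 2.0
    return [peak * (1 - abs(i * dt - half) / half) for i in range(n)]

def event_attributes(series):
    """Recover the (peak, samples-above-zero) attributes used to
    characterise observed surge events."""
    above = [x for x in series if x > 0]
    return max(series), len(above)
```

Because an event is reduced to two parameters, simulating thousands of synthetic surges for a Monte Carlo joint probability analysis amounts to sampling (peak, duration) pairs and evaluating the BF.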
Brodscholl, A.; Kirbani, S.B.; Voight, B.
2000-01-01
The broadband data were evaluated using the assumption that avalanches with the same source areas and descent paths exhibit a linear relation between source volume and recorded seismic-amplitude envelope area. A result of the analysis is the determination of the volume of selected individual events. From the field surveys, the total volume of the collapsed dome lava is 2.6 Mm3. Discounting the volumetric influence of rockfalls, the average size of the 44 nuées ardentes is therefore about 60,000 m3. The largest collapse event at 10:54 is estimated to involve 260,000 m3, based on an analysis of the seismicity. The remaining 23 phase I events averaged 60,000 m3, with the total volume of all phase I events accounting for 63% of the unstable dome. The 20 phase II events comprised 37% of the total volume and averaged 47,000 m3. The methods described here can be put to practical use in real-time monitoring situations. Broadband data were essential in this study primarily because of the wide dynamic range.
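The assumed linear relation between source volume and seismic-amplitude envelope area reduces to a one-constant calibration. The rectified-sum envelope and the calibration numbers below are simplifications for illustration, not the processing actually applied to the broadband records.

```python
def envelope_area(trace, dt):
    """Area of the rectified seismic-amplitude envelope
    (simple |x| sum; real processing would smooth the envelope)."""
    return sum(abs(v) for v in trace) * dt

def event_volume(trace, dt, k):
    """Volume estimate under the paper's assumption that source volume
    scales linearly with envelope area; k is calibrated on an event of
    known volume (e.g. from the field surveys)."""
    return k * envelope_area(trace, dt)
```

Once k is fixed from one surveyed collapse, every other event with the same source area and descent path can be sized from its seismogram alone, which is what makes the approach usable in real-time monitoring.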
StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
NASA Astrophysics Data System (ADS)
Grund, Michael
2017-08-01
SplitLab is a powerful and widely used tool for analysing seismological shear wave splitting from single-event measurements. However, in many cases, especially for temporary station deployments close to the noisy seaside, on the ocean bottom, or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows the user the most flexible processing of individual multi-event splitting measurements for a single recording station. Beyond the provided functions of the plugin, no external program is needed for the multi-event analyses, since StackSplit operates within the available SplitLab structure, which is based on MATLAB. The effectiveness and use of this plugin are demonstrated with data examples from a long-running seismological recording station in Finland.
Barteneva, Natasha S; Vorobjev, Ivan A
2018-01-01
In this paper, we review some of the recent advances in cellular heterogeneity and single-cell analysis methods. In modern research on cellular heterogeneity there are four major approaches: analysis of pooled samples, single-cell analysis, high-throughput single-cell analysis, and, lately, integrated analysis of a cellular population at the single-cell level. Recently developed high-throughput single-cell genetic analysis methods such as RNA-Seq require a purification step and destruction of the analyzed cell, often providing a snapshot of the investigated cell without spatiotemporal context. Correlative analysis of multiparameter morphological, functional, and molecular information is important for differentiating more uniform groups in the spectrum of different cell types. Simplified distributions (histograms and 2D plots) can underrepresent biologically significant subpopulations. Future directions may include the development of nondestructive methods for dissecting molecular events in intact cells, simultaneous correlative cellular analysis of phenotypic and molecular features by hybrid technologies such as imaging flow cytometry, and further progress in supervised and unsupervised statistical analysis algorithms.
NASA Astrophysics Data System (ADS)
Kim, Kyungsik; Lee, Dong-In
2013-04-01
There is considerable interest in cross-correlations in collective modes of real data from atmospheric geophysics, seismology, finance, physiology, genomics, and nanodevices. If two systems interact mutually, that interaction gives rise to collective modes. This phenomenon can be analyzed using the cross-correlation of traditional methods, random matrix theory, and the detrended cross-correlation analysis (DCCA) method. The DCCA method has previously been used to analyze several models and data sets, such as autoregressive fractionally integrated moving average processes, stock prices and their trading volumes, and taxi accidents. Particulate matter is composed of organic and inorganic mixtures such as natural sea salt, soil particles, vehicle exhaust, construction dust, and soot. PM10 is particulate matter with an aerodynamic diameter of less than 10 microns, small enough to enter the human respiratory system. The PM10 concentration affects climate change by unbalancing the global radiative equilibrium: its direct effect blocks the stomata of plants and cuts off solar radiation, while its indirect effect changes the optical properties, cover, and lifetime of clouds. Various factors contribute to the degree of PM10 concentration, notably land-use type, surface vegetation coverage, and meteorological factors. In this study, we analyze and simulate cross-correlations across time scales between the PM10 concentration and meteorological factors (temperature, wind speed and humidity) using the detrended cross-correlation analysis method, through the removal of specific trends, at eight cities on the Korean peninsula. We divide the time series data into Asian dust events and non-Asian dust events to analyze how the meteorological factors change the fluctuation of the PM10 concentration during Asian dust events.
In particular, our results are compared with analytical findings from the published literature. This work was supported by the Center for the ASER (CATER 2012-6110) and by the NRFK through a grant provided by the KMEST (No. K1663000201107900).
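The detrended cross-correlation analysis named above, in its cross-correlation coefficient form, can be sketched stdlib-only: integrate both series into profiles, detrend each box with an OLS line, and correlate the residuals. A generic sketch, not the authors' processing chain (box-size scans and specific trend-removal choices are omitted).

```python
def _profile(x):
    """Integrated (cumulative, mean-removed) series."""
    m = sum(x) / len(x)
    out, acc = [], 0.0
    for v in x:
        acc += v - m
        out.append(acc)
    return out

def _detrended(seg):
    """Residuals of an OLS line fitted to one box of a profile."""
    n = len(seg)
    tm, sm = (n - 1) / 2.0, sum(seg) / n
    beta = (sum((i - tm) * (seg[i] - sm) for i in range(n))
            / sum((i - tm)**2 for i in range(n)))
    return [seg[i] - (sm + beta * (i - tm)) for i in range(n)]

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient rho(s) of two equal-length
    series at box size s (built on the DCCA fluctuation functions the
    abstract applies to PM10 vs. meteorological factors)."""
    px, py = _profile(x), _profile(y)
    fxx = fyy = fxy = 0.0
    for start in range(0, len(px) - s + 1, s):
        rx = _detrended(px[start:start + s])
        ry = _detrended(py[start:start + s])
        fxx += sum(v * v for v in rx)
        fyy += sum(v * v for v in ry)
        fxy += sum(rx[i] * ry[i] for i in range(s))
    return fxy / (fxx * fyy) ** 0.5
```

Scanning s then shows at which time scales PM10 and a meteorological factor are coupled, which is how dust-event and non-dust-event periods can be contrasted.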
Superposed epoch analysis of O+ auroral outflow during sawtooth events and substorms
NASA Astrophysics Data System (ADS)
Nowrouzi, N.; Kistler, L. M.; Lund, E. J.; Cai, X.
2017-12-01
Sawtooth events are repeated injections of energetic particles at geosynchronous orbit. Studies have shown that 94% of sawtooth events occur during magnetic storms. The main factor that causes a sawtooth event is still an open question. Simulations have suggested that heavy ions like O+ may play a role in triggering the injections. One of the sources of O+ in the Earth's magnetosphere is the nightside aurora; O+ ions coming from the nightside auroral region have direct access to the near-Earth magnetotail. A model for interplanetary coronal mass ejection driven sawtooth events (Brambles et al. 2013) found that nightside O+ outflow caused the subsequent teeth of the sawtooth event through a feedback mechanism. This work is a superposed epoch analysis to test whether the observed auroral outflow supports this model. Using FAST spacecraft data from 1997-2007, we examine the auroral O+ outflow as a function of time relative to an injection onset. We then determine whether the profile of O+ outflow flux during sawtooth events differs from the outflow observed during isolated substorms. The auroral region boundaries are estimated using the method of Andersson et al. (2004). The O+ outflow flux inside these boundaries is then calculated and binned as a function of superposed epoch time for substorms and sawtooth "teeth". In this way, we will determine whether sawtooth events do in fact have greater O+ outflow, and whether that outflow is predominantly from the nightside, as suggested by the model results.
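Superposed epoch analysis itself is a small computation: align a window of the outflow series on each injection onset and average across events. A generic sketch with invented data; real analyses would also propagate uncertainties and handle uneven sampling.

```python
def superposed_epoch(series, onsets, before, after):
    """Average `series` over windows aligned on each onset: epoch time
    runs from -before to +after samples. Samples falling outside the
    record are simply skipped (counts track the coverage per bin)."""
    width = before + after + 1
    sums, counts = [0.0] * width, [0] * width
    for t0 in onsets:
        for j in range(width):
            t = t0 - before + j
            if 0 <= t < len(series):
                sums[j] += series[t]
                counts[j] += 1
    return [sums[j] / counts[j] for j in range(width)]
```

Running this once with sawtooth-tooth onsets and once with isolated-substorm onsets gives the two epoch profiles whose comparison the abstract proposes.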
Investigation of the Mechanism of Roof Caving in the Jinchuan Nickel Mine, China
NASA Astrophysics Data System (ADS)
Ding, Kuo; Ma, Fengshan; Guo, Jie; Zhao, Haijun; Lu, Rong; Liu, Feng
2018-04-01
On 13 March 2016, a sudden, violent roof caving event with a collapse area of nearly 11,000 m2 occurred in the Jinchuan Nickel Mine, accompanied by air blasts, loud noises and ground vibrations. This collapse event coincided with related, conspicuous surface subsidence across an area of nearly 19,000 m2. This article aims to analyse this collapse event. In previous studies, various mining-induced collapses have been studied, but collapse accidents associated with the filling mining method are very rare and have not been thoroughly studied. The filling method has long been regarded as a safe mining method, so research on associated collapse mechanisms is of considerable significance. In this study, a detailed field investigation of roadway damage was performed, and GPS monitoring results were used to analyse the surface failure. In addition, a numerical model was constructed based on the geometry of the ore body and a major fault. The analysis of the model revealed three failure mechanisms acting during different stages of destruction: double-sided embedded beam deformation, fault activation, and cantilever-articulated rock beam failure. Fault activation and the specific filling method are the key factors in this collapse event. To gain a better understanding of these factors, the shear stress and normal stress along the fault plane were monitored to determine the variation in stress at different failure stages. Discrete element models were established to study two filling methods and to analyse the stability of different filling structures.
Characterization of the Space Shuttle Ascent Debris using CFD Methods
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Aftosmis, Michael J.; Rogers, Stuart E.
2005-01-01
After video analysis of space shuttle flight STS-107's ascent showed that an object shed from the bipod-ramp region impacted the left wing, a transport analysis was initiated to determine a credible flight path and impact velocity for the piece of debris. This debris transport analysis was performed both while the orbiter was on-orbit and after the subsequent re-entry accident. The analysis provided an accurate prediction of the velocity a large piece of foam bipod ramp would have as it impacted the wing leading edge. This prediction was corroborated by video analysis and fully-coupled CFD/six-degree-of-freedom (DOF) simulations. While the prediction of impact velocity was accurate enough to predict critical damage in this case, one of the recommendations of the Columbia Accident Investigation Board (CAIB) for return-to-flight (RTF) was to analyze the complete debris environment experienced by the shuttle stack on ascent. This includes categorizing all possible debris sources, their probable geometric and aerodynamic characteristics, and their potential for damage. This paper is chiefly concerned with predicting the aerodynamic characteristics of a variety of potential debris sources (insulating foam and cork, nose-cone ablator, ice, ...) for the shuttle ascent configuration using CFD methods. These aerodynamic characteristics are used in the debris transport analysis to predict flight path, impact velocity and angle, and to provide statistical variation to perform risk analyses where appropriate. The debris aerodynamic characteristics are difficult to determine using traditional methods, such as static or dynamic test data, due to the scaling requirements of simulating a typical debris event. The use of CFD methods has been a critical element for building confidence in the accuracy of the debris transport code by bridging the gap between existing aerodynamic data and the dynamics of full-scale, in-flight events.
A Bayesian model for time-to-event data with informative censoring
Kaciroti, Niko A.; Raghunathan, Trivellore E.; Taylor, Jeremy M. G.; Julius, Stevo
2012-01-01
Randomized trials with dropouts or censored data and discrete time-to-event outcomes are frequently analyzed using the Kaplan–Meier or product limit (PL) estimation method. However, the PL method assumes that the censoring mechanism is noninformative, and when this assumption is violated, the inferences may not be valid. We propose an expanded PL method using a Bayesian framework to incorporate an informative censoring mechanism and perform sensitivity analysis on estimates of the cumulative incidence curves. The expanded method uses a model, which can be viewed as a pattern mixture model, where the odds of having an event during the follow-up interval (tk−1,tk], conditional on being at risk at tk−1, differ across the patterns of missing data. The sensitivity parameters relate the odds of an event, for each interval, between subjects from a missing-data pattern and the observed subjects. The large number of sensitivity parameters is reduced by treating them as random, assumed to follow a log-normal distribution with prespecified mean and variance. We then vary the mean and variance to explore the sensitivity of inferences. The missing at random (MAR) mechanism is a special case of the expanded model, thus allowing exploration of the sensitivity of inferences to departures from the MAR assumption. The proposed approach is applied to data from the TRial Of Preventing HYpertension. PMID:22223746
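The core of the pattern-mixture idea, that dropouts face event odds shifted by a sensitivity parameter relative to subjects still under observation, can be sketched for a single fixed odds ratio r. The paper instead treats the sensitivity parameters as random with a log-normal prior; the function and data below are illustrative only:

```python
import numpy as np

def sensitivity_cum_incidence(d, c, n0, r=1.0):
    """Pattern-mixture sketch for discrete time-to-event data with dropout.
    d[k], c[k]: events and censorings in interval k; n0: initial cohort size.
    A subject censored in interval k is assumed to face, in every later
    interval, event odds r times the odds seen among subjects still under
    observation (r = 1 mimics noninformative censoring).  Returns the
    probability of an event by the end of follow-up."""
    d = np.asarray(d, dtype=float)
    c = np.asarray(c, dtype=float)
    K = len(d)
    n = np.empty(K)
    at_risk = float(n0)
    for k in range(K):                      # number still observed per interval
        n[k] = at_risk
        at_risk -= d[k] + c[k]
    h_obs = d / n                           # observed discrete hazards
    odds = r * h_obs / (1.0 - h_obs)        # shifted odds for dropouts
    h_miss = odds / (1.0 + odds)            # corresponding dropout hazards
    events = d.sum()                        # events actually observed
    for k in range(K):                      # impute dropouts' later outcomes
        events += c[k] * (1.0 - np.prod(1.0 - h_miss[k + 1:]))
    return events / n0
```

With r = 1 the estimate coincides with the Kaplan–Meier cumulative incidence, so varying r traces out how inferences depart from the MAR answer.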
ERIC Educational Resources Information Center
Hillbrand, Marc; Waite, Bradley M.
1992-01-01
Used Experience Sampling Method to investigate experiences of anger in 10 patients at maximum security forensic institute who had histories of severe, violent behavior. Found severity of anger influenced by type of activity in which subject was engaged and by emotional valence of preceding events but not by time of day nor by type of interpersonal…
A Multi-Faceted Analysis of a New Therapeutic Model of Linking Appraisals to Affective Experiences.
ERIC Educational Resources Information Center
McCarthy, Christopher; And Others
I. Roseman, M. Spindel, and P. Jose (1990) had previously demonstrated that specific appraisals of events led to discrete emotional responses, but this model has not been widely tested by other research teams using alternative research methods. The present study utilized four qualitative research methods, taught by Patti Lather at the 1994…
EPRI/NRC-RES fire human reliability analysis guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan
2010-03-01
During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities', which addressed fire risk for at-power operations. NUREG/CR-6850 developed high-level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire-generated conditions, building upon existing human reliability analysis (HRA) methods.
This document provides a methodology and guidance for conducting a fire HRA. This process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. This document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification. Scoping is intended to provide less conservative HEPs than screening, but requires fewer resources than a detailed HRA. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.
Reece, Michael; Mark, Kristen; Herbenick, Debby; Hensel, Devon J; Jawed-Wessel, Sofia; Dodge, Brian
2012-03-01
Little is known about the characteristics of sexual events during which individuals choose to use lubricant with condoms. The aims of this article were to evaluate the determinants of adding lubricant to condoms at baseline and at the event level, to assess the event-level variables' influence on adding lubricants to condoms, and to assess the event-level influence of using condoms with lubricant on event-level condom attitudes. A total of 1,874 men completed a 30-day Internet-based prospective daily diary study of sexual behavior and condom use. Baseline data included demographic variables and information about condom education. Daily diary data included reports of penile-vaginal sex regarding intercourse duration, intercourse intensity, intoxication level, condom application method, partner contraceptive method, and partner and relationship characteristics. Lubricant was added to 24.3% of the study-provided condoms and 26.2% of the condoms selected by study participants. Those with more education and those who were married were more likely to add lubrication to condoms. Adding lubricant to condoms, a female partner putting the condom on with her hands and using contraception, and the event occurring with a wife (vs. girlfriend) were significantly associated with longer intercourse. Event-level lubricant and condom use significantly predicted lower willingness to buy the condom it was used with, as well as to recommend the condom. Adding exogenous lubricant was not related to the participants' confidence in condoms as a method to prevent pregnancy and sexually transmitted infections. The event-level nature of this study provided for a more comprehensive assessment of the situational factors that are associated with applying lubricant to condoms. Findings from this study suggest that men are adding lubricant to condoms for reasons other than to increase condom efficacy. © 2012 International Society for Sexual Medicine.
Challenges in leveraging existing human performance data for quantifying the IDHEAS HRA method
Liao, Huafei N.; Groth, Katrina; Stevens-Adams, Susan
2015-07-29
Our article documents an exploratory study for collecting and using human performance data to inform human error probability (HEP) estimates for a new human reliability analysis (HRA) method, the IntegrateD Human Event Analysis System (IDHEAS). The method was based on cognitive models and mechanisms underlying human behaviour and employs a framework of 14 crew failure modes (CFMs) to represent human failures typical for human performance in nuclear power plant (NPP) internal, at-power events [1]. A decision tree (DT) was constructed for each CFM to assess the probability of the CFM occurring in different contexts. Data needs for IDHEAS quantification are discussed. Then, the data collection framework and process is described and how the collected data were used to inform HEP estimation is illustrated with two examples. Next, five major technical challenges are identified for leveraging human performance data for IDHEAS quantification. Furthermore, these challenges reflect the data needs specific to IDHEAS. More importantly, they also represent the general issues with current human performance data and can provide insight for a path forward to support HRA data collection, use, and exchange for HRA method development, implementation, and validation.
Adverse Events in Affiliated Hospitals of Mazandaran University of Medical Sciences
Saravi, Benyamin Mohseni; Siamian, Hasan; Nezhad, Ayyob Barzegar; Asghari, Zoleleykha; Kabirzadeh, Azar
2014-01-01
Due to the complexity of the hospital environment, its structure faces multiple hazards. These risks, whether arising from the care provided or from the hospital environment itself, endanger patients, relatives and care providers. Therefore, accurate reporting and analysis of such events, focused on identifying preventive methods, are essential. In this study, the adverse events reported by affiliated hospitals of Mazandaran University of Medical Sciences to the deputy for treatment were studied. PMID:24944536
Li, Xiang; Pan, Liangwen; Li, Junyi; Zhang, Qigang; Zhang, Shuya; Lv, Rong; Yang, Litao
2011-12-28
For implementation of the issued regulations and labeling policies for genetically modified organism (GMO) supervision, the polymerase chain reaction (PCR) method has been widely used due to its high specificity and sensitivity. In particular, use of the event-specific PCR method based on the flanking sequence of transgenes has become the primary trend. In this study, both qualitative and quantitative PCR methods were established on the basis of the 5' flanking sequence of transgenic soybean A2704-12 and the 3' flanking sequence of transgenic soybean A5547-127, respectively. In qualitative PCR assays, the limits of detection (LODs) were 10 copies of haploid soybean genomic DNA for both A2704-12 and A5547-127. In quantitative real-time PCR assays, the LODs were 5 copies of haploid soybean genomic DNA for both A2704-12 and A5547-127, and the limits of quantification (LOQs) were 10 copies for both. Low bias and acceptable SD and RSD values were also achieved in quantification of four blind samples using the developed real-time PCR assays. In addition, the developed PCR assays for the two transgenic soybean events were used for routine analysis of soybean samples imported to Shanghai in a 6-month period from October 2010 to March 2011. A total of 27 lots of soybean from the United States and Argentina were analyzed: 8 lots from the United States were found to have the GM soybean A2704-12 event, and the GM contents were <1.5% in all eight analyzed lots. In contrast, no GM soybean A5547-127 content was found in any of the eight lots. These results demonstrated that the established event-specific qualitative and quantitative PCR methods could be used effectively in routine identification and quantification of GM soybeans A2704-12 and A5547-127 and their derived products.
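The quantification step of event-specific real-time PCR assays like these amounts to reading copy numbers off standard curves for the event-specific and endogenous-gene targets and taking their ratio. A minimal sketch with hypothetical curve parameters (a slope of about -3.32 corresponds to 100% amplification efficiency); this is not the authors' protocol:

```python
def copies_from_cq(cq, slope, intercept):
    """Invert a real-time PCR standard curve Cq = slope*log10(N) + intercept
    to estimate the starting copy number N."""
    return 10 ** ((cq - intercept) / slope)

def gm_content_percent(cq_event, cq_ref, curve_event, curve_ref):
    """GM content as the ratio of event-specific to endogenous-gene copy
    numbers, each read off its own standard curve (slope, intercept).
    Curve parameters here are hypothetical placeholders."""
    n_event = copies_from_cq(cq_event, *curve_event)
    n_ref = copies_from_cq(cq_ref, *curve_ref)
    return 100.0 * n_event / n_ref
```

A sample would then be reported against a labeling threshold (e.g. the <1.5% figures quoted above) after comparing the estimated percentage.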
Ross, Joseph S; Bates, Jonathan; Parzynski, Craig S; Akar, Joseph G; Curtis, Jeptha P; Desai, Nihar R; Freeman, James V; Gamble, Ginger M; Kuntz, Richard; Li, Shu-Xia; Marinac-Dabic, Danica; Masoudi, Frederick A; Normand, Sharon-Lise T; Ranasinghe, Isuru; Shaw, Richard E; Krumholz, Harlan M
2017-01-01
Background: Machine learning methods may complement traditional analytic methods for medical device surveillance. Methods and results: Using data from the National Cardiovascular Data Registry for implantable cardioverter–defibrillators (ICDs) linked to Medicare administrative claims for longitudinal follow-up, we applied three statistical approaches to safety-signal detection for commonly used dual-chamber ICDs that used two propensity score (PS) models: one specified by subject-matter experts (PS-SME), and the other by machine learning-based selection (PS-ML). The first approach used PS-SME and cumulative incidence (time-to-event), the second approach used PS-SME and cumulative risk (Data Extraction and Longitudinal Trend Analysis [DELTA]), and the third approach used PS-ML and cumulative risk (embedded feature selection). Safety-signal surveillance was conducted for eleven dual-chamber ICD models implanted at least 2,000 times over 3 years. Between 2006 and 2010, there were 71,948 Medicare fee-for-service beneficiaries who received dual-chamber ICDs. Cumulative device-specific unadjusted 3-year event rates varied for three surveyed safety signals: death from any cause, 12.8%–20.9%; nonfatal ICD-related adverse events, 19.3%–26.3%; and death from any cause or nonfatal ICD-related adverse event, 27.1%–37.6%. Agreement among safety signals detected/not detected between the time-to-event and DELTA approaches was 90.9% (360 of 396, k=0.068), between the time-to-event and embedded feature-selection approaches was 91.7% (363 of 396, k=−0.028), and between the DELTA and embedded feature-selection approaches was 88.1% (349 of 396, k=−0.042). Conclusion: Three statistical approaches, including one machine learning method, identified important safety signals, but without exact agreement. Ensemble methods may be needed to detect all safety signals for further evaluation during medical device surveillance. PMID:28860874
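The agreement statistics quoted above, high raw agreement yet near-zero or negative kappa, arise because detected signals are rare: most agreement is on "not detected" and is largely expected by chance. Cohen's kappa corrects for that; a minimal sketch for two binary raters:

```python
def cohen_kappa(a, b):
    """Cohen's kappa for two binary raters (detected = 1 / not detected = 0),
    as used to compare signal-detection approaches across device-outcome pairs."""
    assert len(a) == len(b)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pa1 = sum(a) / n                                    # rater-1 'detected' rate
    pb1 = sum(b) / n                                    # rater-2 'detected' rate
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)              # chance agreement
    return (po - pe) / (1 - pe)
```

With rare positives, raw agreement can exceed 95% while kappa stays near zero, which is exactly the pattern reported in the abstract.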
Root Cause Analysis Webinar: Q&A with Roni Silverstein. REL Mid-Atlantic Webinar
ERIC Educational Resources Information Center
Regional Educational Laboratory Mid-Atlantic, 2014
2014-01-01
Root cause analysis is a powerful method schools use to analyze data to solve problems; it aims to identify and correct the root causes of problems or events, rather than simply addressing their symptoms. In this webinar, veteran practitioner, Roni Silverstein, talked about the value of this process and practical ways to use it in your school or…
[Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].
de Almeida, Ildeberto Muniz; Vilela, Rodolfo Andrade de Gouveia; da Silva, Alessandro José Nunes; Beltran, Sandra Lorena
2014-12-01
The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that treat these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, highlighting practical aspects experienced in the use of the tool. A descriptive analytical method was used, presenting the steps of the model. To illustrate its contributions and difficulties, cases in which the tool was used in the context of health services were selected. MAPA integrates theoretical approaches that have already been tried in accident studies, providing useful conceptual support from the data collection stage through the conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design and safety management, involved in the accident. The main challenges lie in users' grasp of the concepts, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, and in intervening to change the determinants of these events.
[Adverse events in patients from a pediatric hospital].
Ornelas-Aguirre, José Manuel; Arriaga-Dávila, José de Jesús; Domínguez-Serrano, María Isabel; Guzmán-Bihouet, Beatriz Filomena; Navarrete-Navarro, Susana
2013-01-01
Background: detection of adverse events is part of safety management in hospitalized patients. The objective of this study was to describe the incidence of adverse events in a pediatric hospital. Methods: cross-sectional study of the adverse events that occurred in a pediatric hospital from 2007 to 2009. Factors associated with their causes were identified. The statistical analysis was descriptive and bivariate, with contingency tables to estimate the relationship between those factors. A p value ≤ 0.05 was considered significant. Results: a total of 177 adverse events were registered. Human factors were involved in 23 cases (13 %, OR = 1.41, p = 0.001), organizational factors in 71 cases (40 %, OR = 1.91, p = 0.236), and technical factors in 46 cases (26 %, OR = 0.87, p = 0.01). Blows or bruises from falls as a result of adverse events occurred in 71 cases (40 %, 95 % CI = 64-78). Conclusions: we found 1.84 events per 100 hospital discharges during the study period. Falls ranked first among the adverse events identified.
Estimation and evaluation of COSMIC radio occultation excess phase using undifferenced measurements
NASA Astrophysics Data System (ADS)
Xia, Pengfei; Ye, Shirong; Jiang, Kecai; Chen, Dezhong
2017-05-01
In the GPS radio occultation technique, the atmospheric excess phase (AEP) can be used to derive the refractivity, which is an important quantity in numerical weather prediction. The AEP is conventionally estimated based on GPS double-difference or single-difference techniques. These two techniques, however, rely on reference data in the processing, increasing the complexity of the computation. In this study, an undifferenced (ND) processing strategy is proposed to estimate the AEP. To begin with, we use PANDA (Positioning and Navigation Data Analyst) software to perform precise orbit determination (POD) in order to acquire the position and velocity of the mass centre of the COSMIC (Constellation Observing System for Meteorology, Ionosphere and Climate) satellites and the corresponding receiver clock offset. The bending angle, refractivity and dry temperature profiles are derived from the estimated AEP using the Radio Occultation Processing Package (ROPP) software. The ND method is validated against the COSMIC products for typical rising and setting occultation events. Results indicate that the rms (root mean square) errors of the relative refractivity differences between the undifferenced profiles and the atmospheric profiles (atmPrf) provided by UCAR/CDAAC (University Corporation for Atmospheric Research/COSMIC Data Analysis and Archive Centre) are better than 4 and 3 % for rising and setting occultation events respectively. In addition, we also compare the relative refractivity bias between the ND-derived and atmPrf profiles for 200 globally distributed COSMIC occultation events on 12 December 2013. The statistical results indicate that the average rms relative refractivity deviation between the ND-derived and COSMIC profiles is better than 2 % for rising occultation events and better than 1.7 % for setting occultation events.
Moreover, the observed COSMIC refractivity profiles from the ND processing strategy are further validated using European Centre for Medium-Range Weather Forecasts (ECMWF) analysis data, and the results indicate that the undifferenced method reduces the noise level on the excess phase paths in the lower troposphere compared to the single-difference processing strategy.
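The comparison statistic quoted above, the rms of the relative refractivity difference between two profiles, is simple to state explicitly. A sketch, assuming both profiles are already interpolated onto the same height grid:

```python
import numpy as np

def rms_relative_difference(n_test, n_ref):
    """RMS of the relative refractivity difference (in percent) between a
    test profile and a reference profile sampled on the same height grid."""
    n_test = np.asarray(n_test, dtype=float)
    n_ref = np.asarray(n_ref, dtype=float)
    rel = 100.0 * (n_test - n_ref) / n_ref      # pointwise relative difference, %
    return float(np.sqrt(np.mean(rel ** 2)))
```

A profile that is uniformly 2% high against its reference returns 2.0, matching the units of the thresholds quoted in the abstract.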
Winkens, Ieke; Ponds, Rudolf; Pouwels, Climmy; Eilander, Henk; van Heugten, Caroline
2014-01-01
The ABC method is a basic and simplified form of behavioural modification therapy for use by nurses. ABC refers to the identification of Antecedent events, target Behaviours, and Consequent events. A single-case experimental AB design was used to evaluate the effects of the ABC method on a woman diagnosed with olivo-ponto-cerebellar ataxia. Target behaviour was verbal aggressive behaviour during ADL care, assessed at 9 time points immediately before implementation of the ABC method and at 36 time points after implementation. A randomisation test showed a significant treatment effect between the baseline and intervention phases (t = .58, p = .03; ES [Nonoverlap All Pairs] = .62). Visual analysis, however, showed that the target behaviour was still present after implementation of the method and that on some days the nurses even judged the behaviour to be more severe than at baseline. Although the target behaviour was still present after treatment, the ABC method seems to be a promising tool for decreasing problem behaviour in patients with acquired brain injury. It is worth investigating the effects of this method in future studies. When interpreting single-subject data, both visual inspection and statistical analysis are needed to determine whether treatment is effective and whether the effects lead to clinically desirable results.
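A randomisation test for a single-case AB design like the one above builds its null distribution by relocating the phase-change point across all admissible positions. A minimal sketch; the exact scheme used in the study (test statistic, set of admissible change points) may differ:

```python
import numpy as np

def ab_randomization_test(scores, n_baseline, min_phase=3):
    """Randomisation test for a single-case AB design.
    Test statistic: baseline mean minus intervention mean (positive if the
    target behaviour decreases after the phase change).  The null
    distribution relocates the change point to every position leaving at
    least `min_phase` observations in each phase.
    Returns (observed statistic, one-sided p value)."""
    scores = np.asarray(scores, dtype=float)
    def stat(k):
        return scores[:k].mean() - scores[k:].mean()
    observed = stat(n_baseline)
    candidates = range(min_phase, len(scores) - min_phase + 1)
    null = np.array([stat(k) for k in candidates])
    p = float(np.mean(null >= observed))    # one-sided
    return float(observed), p
```

As the abstract illustrates, such a test should be read alongside visual inspection: a significant p value does not by itself show the behaviour reached a clinically acceptable level.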
Duc, Anh Nguyen; Wolbers, Marcel
2017-02-10
Composite endpoints are widely used as primary endpoints of randomized controlled trials across clinical disciplines. A common critique of the conventional analysis of composite endpoints is that all disease events are weighted equally, whereas their clinical relevance may differ substantially. We address this by introducing a framework for the weighted analysis of composite endpoints and interpretable test statistics, which are applicable to both binary and time-to-event data. To cope with the difficulty of selecting an exact set of weights, we propose a method for constructing simultaneous confidence intervals and tests that asymptotically preserve the family-wise type I error in the strong sense across families of weights satisfying flexible inequality or order constraints, based on the theory of χ̄² (chi-bar-squared) distributions. We show that the method achieves the nominal simultaneous coverage rate with substantial efficiency gains over Scheffé's procedure in a simulation study and apply it to trials in cardiovascular disease and enteric fever. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
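For one fixed weight vector, a weighted composite comparison of binary components reduces to a two-sample z test on per-subject weighted scores; the paper's contribution is making such tests simultaneous over whole families of weights. A sketch of the fixed-weight building block only, with hypothetical data shapes:

```python
import numpy as np

def weighted_composite_z(X_trt, X_ctl, w):
    """Z statistic for a weighted composite of binary component endpoints.
    X_trt, X_ctl: (subjects x components) 0/1 event-indicator arrays;
    w: component weights reflecting clinical severity.  Each subject gets
    the score w . x and the arms are compared by a two-sample z test."""
    s1 = X_trt @ w                          # per-subject weighted scores
    s0 = X_ctl @ w
    diff = s1.mean() - s0.mean()
    se = np.sqrt(s1.var(ddof=1) / len(s1) + s0.var(ddof=1) / len(s0))
    return diff / se
```

Scanning this statistic over a constrained family of weight vectors, with the χ̄²-based calibration, is what turns the pointwise test into the simultaneous procedure described above.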
Bizzarri, Anna Rita; Cannistraro, Salvatore
2014-08-22
Atomic force spectroscopy is able to extract kinetic and thermodynamic parameters of biomolecular complexes, provided that the registered unbinding force curves can be reliably attributed to the rupture of the specific complex interactions. To this aim, a commonly used strategy is based on the analysis of the stretching features of polymeric linkers which are suitably introduced in the biomolecule-substrate immobilization procedure. Alternatively, we present a method to select force curves corresponding to specific biorecognition events, which relies on a careful analysis of the force fluctuations of the biomolecule-functionalized cantilever tip during its approach to the partner molecules immobilized on a substrate. In the low-frequency region, a characteristic 1/f^α noise with α equal to one (flickering noise) is found to replace white noise in the cantilever fluctuation power spectrum when, and only when, a specific biorecognition process between the partners occurs. The method, which has been validated on a well-characterized antigen-antibody complex, represents a fast, yet reliable alternative to the use of linkers, which may involve additional surface chemistry and reproducibility concerns.
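Classifying a fluctuation record as white (α ≈ 0) or flickering (α ≈ 1) comes down to estimating the exponent of a 1/f^α power spectrum. A minimal periodogram-based sketch; the authors' actual spectral estimator is not specified here:

```python
import numpy as np

def spectral_exponent(signal, fs, fmin=None, fmax=None):
    """Estimate alpha in a 1/f^alpha power spectrum by a log-log linear fit
    to the periodogram: alpha ~ 0 for white noise, alpha ~ 1 for the
    flickering noise associated with specific biorecognition."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)[1:]                    # drop DC
    psd = np.abs(np.fft.rfft(signal - np.mean(signal)))[1:] ** 2  # periodogram
    mask = np.ones_like(freqs, dtype=bool)
    if fmin is not None:
        mask &= freqs >= fmin
    if fmax is not None:
        mask &= freqs <= fmax
    slope, _ = np.polyfit(np.log(freqs[mask]), np.log(psd[mask]), 1)
    return -slope
```

In practice one would average several periodograms (Welch's method) and restrict the fit to the low-frequency band the abstract refers to; the fmin/fmax arguments allow that restriction.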
NASA Astrophysics Data System (ADS)
Fedosimova, Anastasiya; Gaitinov, Adigam; Grushevskaya, Ekaterina; Lebedev, Igor
2017-06-01
In this work we study the peculiarities of multiparticle production in interactions of asymmetric nuclei in order to search for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of an analysis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC) and mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method in accordance with the behaviour of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: events of the LC type are processes of full destruction of the projectile nucleus, in which multicharge fragments are absent, whereas in events of the mixed type several multicharge fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions: the mean multiplicity of LC-type events is significantly higher than in mixed-type events. From the dependence of multiplicity on the number of target-nucleus fragments for events of various types, it is revealed that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e. nuclei with strongly asymmetric volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multicharged fragments.
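The Hurst method referred to above rests on estimating the Hurst exponent of a series. A classical rescaled-range (R/S) version can be sketched as follows; this is a generic implementation, not the authors' event-by-event pseudorapidity analysis:

```python
import numpy as np

def hurst_rs(series, min_chunk=8):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent H:
    H ~ 0.5 for uncorrelated data, H > 0.5 for persistent long-range
    correlations of the kind associated with LC-type events."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation
            r = dev.max() - dev.min()               # range
            s = chunk.std()                         # scale
            if s > 0:
                vals.append(r / s)
        if vals:
            sizes.append(size)
            rs.append(np.mean(vals))
        size *= 2
    h, _ = np.polyfit(np.log(sizes), np.log(rs), 1)  # slope of log(R/S) vs log(size)
    return h
```

The shape of the fitted log(R/S) curve, not just the single exponent, is what lets one separate long-range, short-range and mixed correlation types as described in the abstract.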
MIDAS: Software for the detection and analysis of lunar impact flashes
NASA Astrophysics Data System (ADS)
Madiedo, José M.; Ortiz, José L.; Morales, Nicolás; Cabrera-Caño, Jesús
2015-06-01
Since 2009 we have been running a project to identify flashes produced by the impact of meteoroids on the surface of the Moon. For this purpose we employ small telescopes and high-sensitivity CCD video cameras. To automatically identify these events, a software package called MIDAS was developed and tested. This package can also perform the photometric analysis of these flashes and estimate the value of the luminous efficiency. In addition, we have implemented in MIDAS a new method to establish the likely source of the meteoroids (a known meteoroid stream or the sporadic background). The main features of this computer program are analyzed here, and some examples of lunar impact events are presented.
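The detection idea behind software of this kind, flagging transient brightenings against the frame-to-frame noise level, can be illustrated with a toy frame-differencing detector. Everything below (array shapes, the 5σ threshold, the MAD noise estimate) is an assumption for illustration, not MIDAS's actual algorithm:

```python
import numpy as np

def detect_flashes(frames, k_sigma=5.0):
    """Toy flash detector: subtract consecutive video frames and flag pixels
    that brighten by more than k_sigma times the robust noise level of the
    difference image.  Returns a list of (frame_index, row, col, amplitude)."""
    events = []
    for i in range(1, len(frames)):
        diff = frames[i].astype(float) - frames[i - 1].astype(float)
        sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))  # MAD -> sigma
        thresh = k_sigma * max(sigma, 1e-12)
        ys, xs = np.where(diff > thresh)          # brightenings only
        for y, x in zip(ys, xs):
            events.append((i, int(y), int(x), float(diff[y, x])))
    return events
```

A real pipeline would additionally reject cosmic-ray hits and satellite glints (e.g. by requiring detection in two cameras) and pass candidate pixels to aperture photometry.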
Leitão, Cristiane B; Gross, Jorge L
2017-01-01
Objective: To evaluate the efficacy of coronary artery disease screening in asymptomatic patients with type 2 diabetes and assess the statistical reliability of the findings. Methods: Electronic databases (MEDLINE, EMBASE, Cochrane Library and ClinicalTrials.gov) were reviewed up to July 2016. Randomised controlled trials evaluating coronary artery disease screening in asymptomatic patients with type 2 diabetes and reporting cardiovascular events and/or mortality were included. Data were summarised with the Mantel-Haenszel relative risk. Trial sequential analysis (TSA) was used to evaluate the optimal sample size to detect a 40% reduction in outcomes. Main outcomes were all-cause mortality and cardiac events (non-fatal myocardial infarction and cardiovascular death); secondary outcomes were non-fatal myocardial infarction, myocardial revascularisations and heart failure. Results: One hundred thirty-five references were identified; 5 studies fulfilled the inclusion criteria, totalling 3315 patients, 117 all-cause deaths and 100 cardiac events. Screening for coronary artery disease was not associated with a decreased risk of all-cause death (RR 0.95 (95% CI 0.66 to 1.35)) or cardiac events (RR 0.72 (95% CI 0.49 to 1.06)). TSA shows that futility boundaries were reached for all-cause mortality, so a relative risk reduction of 40% between strategies could be discarded; however, there is not enough information for firm conclusions on cardiac events. For secondary outcomes, no benefit or harm was identified; optimal sample sizes were not reached. Conclusion: Currently available data do not support screening for coronary artery disease in patients with type 2 diabetes to prevent fatal events. Further studies are needed to assess the effects on cardiac events. PROSPERO CRD42015026627. PMID:28490559
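The pooled effect measure used above, the Mantel-Haenszel relative risk, is a weighted combination of per-trial 2×2 tables. A minimal sketch with invented counts, not the trial data:

```python
def mantel_haenszel_rr(tables):
    """Mantel-Haenszel pooled relative risk.  Each table is
    (events_screen, n_screen, events_control, n_control); the pooled RR is
    sum(a_i * n0_i / N_i) / sum(b_i * n1_i / N_i) over studies i."""
    num = sum(a * n0 / (n1 + n0) for a, n1, b, n0 in tables)
    den = sum(b * n1 / (n1 + n0) for a, n1, b, n0 in tables)
    return num / den
```

For example, a single trial with 10/100 events under screening versus 20/100 under control gives RR = 0.5; pooling a second trial with the same underlying RR leaves the estimate unchanged.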