Science.gov

Sample records for event analysis atheana

  1. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    SciTech Connect

    Forester, John A.; Bley, Dennis C.; Cooper, Susan E.; Kolaczkowski, Alan M.; Thompson, Catherine; Ramey-Smith, Ann; Wreathall, John

    2000-07-18

    This paper describes the most recent version of a human reliability analysis (HRA) method called ``A Technique for Human Event Analysis'' (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided and critical and unique aspects of the revised method are discussed.

  2. Human Events Reference for ATHEANA (HERA) Database Description and Preliminary User's Manual

    SciTech Connect

    Auflick, J.L.

    1999-08-12

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database (db) of analytical operational events, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  3. Human events reference for ATHEANA (HERA) database description and preliminary user's manual

    SciTech Connect

    Auflick, J.L.; Hahn, H.A.; Pond, D.J.

    1998-05-27

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  4. Discussion of Comments from a Peer Review of A Technique for Human Event Analysis (ATHEANA)

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Wreathall, J.

    1999-01-28

    In May of 1998, a technical basis and implementation guidelines document for A Technique for Human Event Analysis (ATHEANA) was issued as a draft report for public comment (NUREG-1624). In conjunction with the release of draft NUREG-1624, a peer review of the new human reliability analysis method, its documentation, and the results of an initial test of the method was held over a two-day period in June 1998 in Seattle, Washington. Four internationally known and respected experts in HRA or probabilistic risk assessment were selected to serve as the peer reviewers. In addition, approximately 20 other individuals with an interest in HRA and ATHEANA also attended the peer review and were invited to provide comments. The peer review team was asked to comment on any aspect of the method or the report in which improvements could be made and to discuss its strengths and weaknesses. They were asked to focus on two major aspects: Are the basic premises of ATHEANA on solid ground and is the conceptual basis adequate? Is the ATHEANA implementation process adequate, given the description of the intended users in the documentation? The four peer reviewers asked questions and provided oral comments during the peer review meeting and provided written comments approximately two weeks after the completion of the meeting. This paper discusses their major comments.

  5. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-10-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, ``A Technique for Human Error Analysis'' (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  6. Results of a nuclear power plant application of A New Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Whitehead, D.W.; Forester, J.A.; Bley, D.C.

    1998-03-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing-contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the success of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator on shift until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project.

  7. Philosophy of ATHEANA

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Thompson, C.M.; Whitehead, D.W.; Wreathall, J.

    1999-03-24

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  8. A technique for human error analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  9. A process for application of ATHEANA - a new HRA method

    SciTech Connect

    Parry, G.W.; Bley, D.C.; Cooper, S.E.

    1996-10-01

    This paper describes the analytical process for the application of ATHEANA, a new approach to the performance of human reliability analysis as part of a PRA. This new method, unlike existing methods, is based upon an understanding of the reasons why people make errors, and was developed primarily to address the analysis of errors of commission.

  10. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    SciTech Connect

    Taylor, J.H.; Luckas, W.J.; Wreathall, J.

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA) is a critical element of PRA, yet limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  11. ATHEANA: "a technique for human error analysis" entering the implementation phase

    SciTech Connect

    Taylor, J.; O'Hara, J.; Luckas, W.

    1997-02-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA) is a critical element of PRA, yet limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled 'Improved HRA Method Based on Operating Experience', is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97 ongoing).

  12. EVENT PLANNING USING FUNCTION ANALYSIS

    SciTech Connect

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  13. Concepts of event-by-event analysis

    SciTech Connect

    Stroebele, H.

    1995-07-15

    The particles observed in the final state of nuclear collisions can be divided into two classes: those which are susceptible to strong interactions and those which are not, like leptons and the photon. The bulk properties of the "matter" in the reaction zone may be read off the kinematical characteristics of the particles observable in the final state. These characteristics are strongly dependent on the last interaction these particles have undergone. In a densely populated reaction zone strongly interacting particles will experience many collisions after they have been formed and before they emerge into the asymptotic final state. For the particles which are not sensitive to strong interactions their formation is also their last interaction. Thus photons and leptons probe the period during which they are produced whereas hadrons reflect the so-called freeze-out processes, which occur during the late stage in the evolution of the reaction when the population density becomes small and the mean free paths long. The disadvantage of the leptons and photons is their small production cross section; they cannot be used in an analysis of the characteristics of individual collision events, because the number of particles produced per event is too small. The hadrons, on the other hand, stem from the freeze-out period. Information from earlier periods requires multiparticle observables in the most general sense. It is one of the challenges of present-day high-energy nuclear physics to establish and understand global observables which differentiate between mere hadronic scenarios, i.e., superposition of hadronic interactions, and the formation of a partonic (short duration) steady state which can be considered a new state of matter, the Quark-Gluon Plasma.

  14. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Event (DBE) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  15. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
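
    The sketch below is a toy illustration of the rejection-sampling view of Bayesian updating that BUS builds on, applied to a made-up capacity/demand problem. It uses crude Monte Carlo rather than the FORM, importance sampling, or Subset Simulation estimators discussed in the abstract, and none of the distributions or numbers come from the paper.

```python
# Minimal sketch (not the BUS code from the paper): Bayesian updating of a
# failure probability via rejection sampling, followed by crude Monte Carlo.
# The prior, observation model, and demand distribution are illustrative
# assumptions, not taken from Straub et al.
import numpy as np

rng = np.random.default_rng(0)

# Prior on an uncertain capacity parameter theta ~ N(10, 1)
def sample_prior(n):
    return rng.normal(10.0, 1.0, size=n)

# Observation model: one noisy measurement y_obs of theta (sigma = 0.5)
y_obs, sigma_obs = 9.2, 0.5
def likelihood(theta):
    return np.exp(-0.5 * ((y_obs - theta) / sigma_obs) ** 2)

# Rejection-sampling reinterpretation of Bayes' rule:
# accept a prior sample with probability L(theta) / L_max (L_max = 1 here).
theta = sample_prior(50_000)
accept = rng.uniform(size=theta.size) < likelihood(theta)
posterior = theta[accept]

# Rare(ish) event: demand S ~ N(6, 1) exceeds capacity theta ("failure").
def failure_prob(theta_samples, n_demand=200):
    s = rng.normal(6.0, 1.0, size=(theta_samples.size, n_demand))
    return np.mean(s > theta_samples[:, None])

print("prior P(F)     ~", failure_prob(sample_prior(20_000)))
print("posterior P(F) ~", failure_prob(posterior))
```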

  16. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  17. Negated bio-events: analysis and identification

    PubMed Central

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The

  18. Top Event Matrix Analysis Code System.

    Energy Science and Technology Software Center (ESTSC)

    2000-06-19

    Version 00 TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates.
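
    As a rough illustration of what quantifying a "top event" involves (not TEMAC itself, and not the SETS Boolean machinery), the sketch below sums minimal cut-set probabilities under the rare-event approximation and probes the sensitivity of the result to each basic-event probability. The cut sets and probabilities are invented.

```python
# Illustrative sketch only (not TEMAC): quantify a top event given minimal cut
# sets and basic-event probabilities, using the rare-event approximation, then
# compute a simple finite-difference sensitivity for each basic event.
from math import prod

# Hypothetical basic-event probabilities and minimal cut sets.
p = {"A": 1e-3, "B": 5e-4, "C": 2e-3, "D": 1e-4}
cut_sets = [{"A", "B"}, {"C"}, {"B", "D"}]

def top_event_prob(probs):
    # Rare-event approximation: P(top) ~ sum over cut sets of the product
    # of their basic-event probabilities.
    return sum(prod(probs[e] for e in cs) for cs in cut_sets)

def sensitivity(probs, event, delta=1e-8):
    # Finite-difference sensitivity of the top-event probability to one
    # basic-event probability.
    bumped = dict(probs, **{event: probs[event] + delta})
    return (top_event_prob(bumped) - top_event_prob(probs)) / delta

print("P(top) ~", top_event_prob(p))
for e in p:
    print(f"dP(top)/dP({e}) ~ {sensitivity(p, e):.3e}")
```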

  19. Cluster Analysis for CTBT Seismic Event Monitoring

    SciTech Connect

    Carr, Dorthe B.; Young, Chris J.; Aster, Richard C.; Zhang, Xiaobing

    1999-08-03

    The clustering techniques prove to be much more effective for the New Mexico data than the Wyoming data, apparently because the New Mexico mines are closer and consequently the signal to noise ratios (SNR's) for those events are higher. To verify this hypothesis we experiment with adding Gaussian noise to the New Mexico data to simulate data from more distant sites. Our results suggest that clustering techniques can be very useful for identifying small anomalous events if at least one good recording is available, and that the only reliable way to improve clustering results is to process the waveforms to improve SNR. For events with good SNR that do have strong grouping, cluster analysis will reveal the inherent groupings regardless of the choice of clustering method.
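
    A minimal sketch of the general idea, assuming synthetic waveforms rather than the New Mexico/Wyoming recordings: events are grouped by maximum normalized cross-correlation using hierarchical clustering, and the exercise is repeated after adding Gaussian noise to mimic lower-SNR recordings from more distant sites.

```python
# Minimal sketch, not the authors' processing chain: cluster synthetic "event"
# waveforms by maximum normalized cross-correlation, then repeat with added
# Gaussian noise to mimic lower SNR.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 400)

def synth(f, shift):
    # Toy waveform: damped sinusoid with a small time shift.
    return np.exp(-4 * t) * np.sin(2 * np.pi * f * (t - shift))

# Two "mines": events within a group share a source wavelet.
events = [synth(8, s) for s in (0.00, 0.01, 0.02)] + \
         [synth(14, s) for s in (0.00, 0.015)]

def max_xcorr(a, b):
    a = (a - a.mean()) / np.linalg.norm(a - a.mean())
    b = (b - b.mean()) / np.linalg.norm(b - b.mean())
    return np.max(np.abs(np.correlate(a, b, mode="full")))

def cluster(waveforms, noise=0.0):
    w = [x + noise * rng.standard_normal(x.size) for x in waveforms]
    n = len(w)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = 1.0 - max_xcorr(w[i], w[j])
    return fcluster(linkage(squareform(dist), method="average"),
                    t=0.4, criterion="distance")

print("clean waveforms :", cluster(events))
print("noisy waveforms :", cluster(events, noise=0.3))
```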

  20. Dynamic Event Tree Analysis Through RAVEN

    SciTech Connect

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics is not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) control logic driver for the new Thermo-Hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which the set of control logic laws (user defined) monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, the Dynamic PRA analysis, using Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.
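
    A toy dynamic event tree, not RAVEN/RELAP-7: a simple heat-up model branches at fixed times on whether a cooling train starts, prunes low-probability branches, and sums the probability of reaching a damage state. All parameters are illustrative assumptions.

```python
# Toy dynamic event tree sketch (not RAVEN): a tank heats up; at each branching
# time a standby cooling train either starts or fails to start, and low-
# probability branches are pruned. All numbers are illustrative.
BRANCH_TIMES = [10.0, 20.0, 30.0]   # seconds at which branching is considered
END_TIME = 60.0
P_START = 0.9                       # per-demand probability that cooling starts
CUTOFF = 1e-4                       # prune branches below this probability
T_DAMAGE = 120.0                    # temperature defining the end state

def advance(temp, cooling, dt):
    """Very simple thermal model: +2 deg/s heat-up, -1 deg/s when cooled."""
    return temp + (-1.0 if cooling else 2.0) * dt

def explore(i, t, temp, cooling, prob, sequences):
    """Depth-first walk over the event tree; records (damage?, probability)."""
    t_next = BRANCH_TIMES[i] if i < len(BRANCH_TIMES) else END_TIME
    temp = advance(temp, cooling, t_next - t)
    if temp >= T_DAMAGE or i == len(BRANCH_TIMES):
        sequences.append((temp >= T_DAMAGE, prob))
        return
    if cooling:                       # no further demand once cooling runs
        explore(i + 1, t_next, temp, True, prob, sequences)
        return
    for started, p in [(True, P_START), (False, 1.0 - P_START)]:
        if prob * p >= CUTOFF:        # branch pruning
            explore(i + 1, t_next, temp, started, prob * p, sequences)

sequences = []
explore(0, 0.0, 60.0, False, 1.0, sequences)
p_damage = sum(p for damaged, p in sequences if damaged)
print(f"{len(sequences)} sequences explored, P(damage) ~ {p_damage:.4f}")
```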

  1. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. Following the hierarchical approach, the Petri net is analysed from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets. Discrete event systems are a pragmatic tool for modelling industrial systems, and Petri nets are used here because the system is a discrete event system. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels and its sections are analysed successively. The proposed simulation of the robotic system using timed Petri nets offers the opportunity to observe the timing of the robotic operations. From transport and transmission times obtained by spot measurements, graphics are produced showing the average time for the transport activity, using the parameter sets of the individual finished products.

  2. Automated analysis of failure event data

    SciTech Connect

    Hennessy, Corey; Freerks, Fred; Campbell, James E.; Thompson, Bruce M.

    2000-03-27

    This paper focuses on fully automated analysis of failure event data in the concept and early development stage of a semiconductor-manufacturing tool. In addition to presenting a wide range of statistical and machine-specific performance information, algorithms have been developed to examine reliability growth and to identify major contributors to unreliability. These capabilities are being implemented in a new software package called Reliadigm. When coupled with additional input regarding repair times and parts availability, the analysis software also provides spare parts inventory optimization based on genetic optimization methods. The type of question to be answered is: If this tool were placed with a customer for beta testing, what would be the optimal spares kit to meet equipment reliability goals for the lowest cost? The new algorithms are implemented in Windows® software and are easy to apply. This paper presents a preliminary analysis of failure event data from three IDEA machines currently in development. The paper also includes an optimal spare parts kit analysis.

  3. Human Reliability Analysis in the U.S. Nuclear Power Industry: A Comparison of Atomistic and Holistic Methods

    SciTech Connect

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-09-01

    A variety of methods have been developed to generate human error probabilities for use in the US nuclear power industry. When actual operations data are not available, it is necessary for an analyst to estimate these probabilities. Most approaches, including THERP, ASEP, SLIM-MAUD, and SPAR-H, feature an atomistic approach to characterizing and estimating error. The atomistic approach is based on the notion that events and their causes can be decomposed and individually quantified. In contrast, in the holistic approach, such as found in ATHEANA, the analysis centers on the entire event, which is typically quantified as an indivisible whole. The distinction between atomistic and holistic approaches is important in understanding the nature of human reliability analysis quantification and the utility and shortcomings associated with each approach.
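
    A hedged sketch of the "atomistic" style of quantification in the spirit of SPAR-H: a nominal human error probability is scaled by performance-shaping-factor multipliers, with a bounding adjustment of the form used when several PSFs are negative. The multiplier values below are invented for illustration and are not taken from any of the cited methods' tables.

```python
# Illustrative sketch of an "atomistic" quantification in the SPAR-H style:
# a nominal human error probability (NHEP) is modified by multiplicative
# performance shaping factor (PSF) weights. The PSF values below are made up.
NHEP_DIAGNOSIS = 1e-2     # nominal HEP for a diagnosis task (SPAR-H cites 1E-2)

# Hypothetical multipliers chosen by an analyst for one scenario.
psf = {
    "available time": 10.0,       # barely adequate time
    "stress": 2.0,                # high stress
    "complexity": 2.0,            # moderately complex
    "experience/training": 1.0,   # nominal
    "procedures": 5.0,            # incomplete procedure
    "ergonomics/HMI": 1.0,        # nominal
    "fitness for duty": 1.0,      # nominal
    "work processes": 1.0,        # nominal
}

def hep(nhep, psf):
    composite = 1.0
    for m in psf.values():
        composite *= m
    if sum(m > 1.0 for m in psf.values()) >= 3:
        # Adjustment of the form used when several PSFs are negative; it keeps
        # the result bounded below 1.0: NHEP*C / (NHEP*(C - 1) + 1).
        return nhep * composite / (nhep * (composite - 1.0) + 1.0)
    return min(nhep * composite, 1.0)

print(f"HEP ~ {hep(NHEP_DIAGNOSIS, psf):.3e}")
```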

  4. Analysis of a Limb Eruptive Event

    NASA Astrophysics Data System (ADS)

    Kotrč, P.; Kupryakov, Yu. A.; Bárta, M.; Kashapova, L. K.; Liu, W.

    2016-04-01

    We present the analysis of an eruptive event that took place on the eastern limb on April 21, 2015, which was observed by the Ondřejov horizontal telescope and spectrograph. The eruption of the highly twisted prominence was followed by the onset of soft X-ray sources. We identified the structures observed in Hα spectra with the details on the Hα filtergrams and analyzed the evolution of Doppler component velocities. The timing and observed characteristics of the eruption were compared with the prediction of the model based on the twisting of the flux ropes and the kink/torus instability.

  5. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
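
    A much-simplified sketch of one ingredient of N-k contingency screening, on an invented 6-bus toy network rather than the 328-bus SERC model used in the paper: every N-2 pair of line outages is checked for islanding. A real analysis would also re-solve the power flow and test line loadings.

```python
# Simplified N-2 screening sketch (not the paper's implementation): remove every
# pair of lines from a toy network and flag contingencies that island the
# system. Only topology is checked here; flows and ratings are ignored.
from itertools import combinations

# Hypothetical 6-bus network: (from_bus, to_bus)
lines = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 1), (2, 5)]

def connected(active_lines, buses):
    adj = {b: set() for b in buses}
    for a, b in active_lines:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = set(), [next(iter(buses))]
    while stack:
        b = stack.pop()
        if b not in seen:
            seen.add(b)
            stack.extend(adj[b] - seen)
    return seen == buses

buses = {b for line in lines for b in line}
islanding = [
    pair for pair in combinations(lines, 2)
    if not connected([l for l in lines if l not in pair], buses)
]
n_pairs = len(list(combinations(lines, 2)))
print(f"{len(islanding)} of {n_pairs} N-2 pairs island the system")
for pair in islanding:
    print("  critical pair:", pair)
```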

  6. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. A. Wasiolek

    2003-07-21

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis report (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  7. Sisyphus - An Event Log Analysis Toolset

    Energy Science and Technology Software Center (ESTSC)

    2004-09-01

    Event logs are a ubiquitous source of system feedback from computer systems, but they have widely ranging formats and can be extremely numerous, particularly from systems with many logging components. Inspection of these logs is fundamental to system debugging; increased capability to quickly extract meaningful information will impact MTTR (mean time to repair) and may impact MTBF (mean time between failure). Sisyphus is a machine-learning analysis system whose goal is to enable content-novice analysts to efficiently understand evolving trends, identify anomalies, and investigate cause-effect hypotheses in large multiple-source log sets. The toolkit comprises a framework for utilizing the third-party frequent-itemset data mining tools Teiresias and SLCT, software to cluster messages according to time statistics, and an interactive results viewer.

  8. Analysis of a California Catalina eddy event

    NASA Technical Reports Server (NTRS)

    Bosart, L. F.

    1983-01-01

    During the period 26-29 May 1968 a shallow cyclonic circulation, known locally as a Catalina eddy, developed in the offshore waters of southern California. A synoptic and mesoscale analysis of the event establishes the following: (1) the incipient circulation forms on the coast near Santa Barbara downwind of the coastal mountains, (2) cyclonic shear vorticity appears offshore in response to lee troughing downstream of the coastal mountains between Vandenberg and Pt. Mugu, California, (3) mountain wave activity may be aiding incipient eddy formation in association with synoptic-scale subsidence and the generation of a stable layer near the crest of the coastal mountains, (4) a southeastward displacement and offshore expansion of the circulation occurs following the passage of the synoptic-scale ridge line, and (5) dissipation of the eddy occurs with the onset of a broad onshore flow.

  9. Sisyphus - An Event Log Analysis Toolset

    SciTech Connect

    Jon Stearley, Glenn Laguna

    2004-09-01

    Event logs are a ubiquitous source of system feedback from computer systems, but they have widely ranging formats and can be extremely numerous, particularly from systems with many logging components. Inspection of these logs is fundamental to system debugging; increased capability to quickly extract meaningful information will impact MTTR (mean time to repair) and may impact MTBF (mean time between failure). Sisyphus is a machine-learning analysis system whose goal is to enable content-novice analysts to efficiently understand evolving trends, identify anomalies, and investigate cause-effect hypotheses in large multiple-source log sets. The toolkit comprises a framework for utilizing the third-party frequent-itemset data mining tools Teiresias and SLCT, software to cluster messages according to time statistics, and an interactive results viewer.

  10. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  11. Disruptive Event Biosphere Doser Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathway's contribution to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report and so is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  12. Analysis of seismic events in and near Kuwait

    SciTech Connect

    Harris, D B; Mayeda, K M; Rodgers, A J; Ruppert, S D

    1999-05-11

    Seismic data for events in and around Kuwait were collected and analyzed. The authors estimated event moment, focal mechanism and depth by waveform modeling. Results showed that reliable seismic source parameters for events in and near Kuwait can be estimated from a single broadband three-component seismic station. This analysis will advance understanding of earthquake hazard in Kuwait.

  13. Modelling recurrent events: a tutorial for analysis in epidemiology

    PubMed Central

    Amorim, Leila DAF; Cai, Jianwen

    2015-01-01

    In many biomedical studies, the event of interest can occur more than once in a participant. These events are termed recurrent events. However, the majority of analyses focus only on time to the first event, ignoring the subsequent events. Several statistical models have been proposed for analysing multiple events. In this paper we explore and illustrate several modelling techniques for analysis of recurrent time-to-event data, including conditional models for multivariate survival data (AG, PWP-TT and PWP-GT), marginal means/rates models, frailty and multi-state models. We also provide a tutorial for analysing such type of data, with three widely used statistical software programmes. Different approaches and software are illustrated using data from a bladder cancer project and from a study on lower respiratory tract infection in children in Brazil. Finally, we make recommendations for modelling strategy selection for analysis of recurrent event data. PMID:25501468

  14. Modelling recurrent events: a tutorial for analysis in epidemiology.

    PubMed

    Amorim, Leila D A F; Cai, Jianwen

    2015-02-01

    In many biomedical studies, the event of interest can occur more than once in a participant. These events are termed recurrent events. However, the majority of analyses focus only on time to the first event, ignoring the subsequent events. Several statistical models have been proposed for analysing multiple events. In this paper we explore and illustrate several modelling techniques for analysis of recurrent time-to-event data, including conditional models for multivariate survival data (AG, PWP-TT and PWP-GT), marginal means/rates models, frailty and multi-state models. We also provide a tutorial for analysing such type of data, with three widely used statistical software programmes. Different approaches and software are illustrated using data from a bladder cancer project and from a study on lower respiratory tract infection in children in Brazil. Finally, we make recommendations for modelling strategy selection for analysis of recurrent event data. PMID:25501468
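
    As a small illustration of the marginal means/rates idea mentioned in the tutorial, the sketch below computes a nonparametric mean cumulative function (expected cumulative number of events per subject over time) from invented recurrent-event data; it is not the AG/PWP model fitting described in the paper.

```python
# Minimal sketch of one marginal means/rates quantity: a nonparametric mean
# cumulative function (MCF) for recurrent events. The toy data (per-subject
# event times and end of follow-up) are made up.

# subject -> (recurrent event times, end of follow-up)
data = {
    1: ([2.0, 5.0, 9.0], 12.0),
    2: ([3.0],            8.0),
    3: ([1.0, 4.0],      10.0),
    4: ([],               6.0),
}

def mean_cumulative_function(data):
    event_times = sorted({t for times, _ in data.values() for t in times})
    mcf, cum = [], 0.0
    for t in event_times:
        at_risk = sum(1 for _, cens in data.values() if cens >= t)
        d = sum(times.count(t) for times, _ in data.values())
        cum += d / at_risk          # increment = events / subjects under observation
        mcf.append((t, cum))
    return mcf

for t, m in mean_cumulative_function(data):
    print(f"t = {t:4.1f}   MCF = {m:.3f}")
```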

  15. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Under the situation that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of a government's governance ability. Visual analysis has become an important method of event analysis because of its advantage of being intuitive and effective. To analyse events' spatio-temporal distribution characteristics, correlations among event items and the development trend, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and how to use thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments have been carried out using the data provided by the Global Terrorism Database, and the results of the experiments demonstrate the feasibility of the methods.

  16. The flood event explorer - a web based framework for rapid flood event analysis

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Lüdtke, Stefan; Kreibich, Heidi; Merz, Bruno

    2015-04-01

    Flood disaster management, recovery and reconstruction planning benefit from rapid evaluations of flood events and expected impacts. The near real time in-depth analysis of flood causes and key drivers for flood impacts requires a close monitoring and documentation of hydro-meteorological and socio-economic factors. Within CEDIM's Rapid Flood Event Analysis project, a flood event analysis system is being developed which enables the near real-time evaluation of large-scale floods in Germany. The analysis system includes functionalities to compile event-related hydro-meteorological data, to evaluate the current flood situation, to assess hazard intensity and to estimate flood damage to residential buildings. A German flood event database is under development, which contains various hydro-meteorological information - in the future also impact information - for all large-scale floods since 1950. This database comprises data on historic flood events which allow the classification of ongoing floods in terms of triggering processes and pre-conditions, critical controls and drivers for flood losses. The flood event analysis system has been implemented in a database system which automatically retrieves and stores data from more than 100 online discharge gauges on a daily basis. The current discharge observations are evaluated in a long-term context in terms of flood frequency analysis. The web-based frontend visualizes the current flood situation in comparison to any past flood from the flood catalogue. The regional flood database for Germany contains hydro-meteorological data and aggregated severity indices for a set of 76 historic large-scale flood events in Germany. This database has been used to evaluate the key drivers for the flood in June 2013.
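
    A minimal sketch of the "long-term context" step described above, assuming a synthetic annual-maximum series rather than real gauge data: a Gumbel distribution is fitted and the current discharge is expressed as an approximate return period.

```python
# Sketch of a flood frequency evaluation step: fit a Gumbel distribution to
# annual maximum discharges and express the current observation as a return
# period. The series and the current value are synthetic, not gauge data.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(42)
annual_maxima = gumbel_r.rvs(loc=800.0, scale=250.0, size=60,
                             random_state=rng)   # m3/s, synthetic record

loc, scale = gumbel_r.fit(annual_maxima)
current_q = 1650.0                                # today's discharge, m3/s
p_exceed = 1.0 - gumbel_r.cdf(current_q, loc, scale)
return_period = 1.0 / p_exceed

print(f"Gumbel fit: loc={loc:.0f}, scale={scale:.0f}")
print(f"Q = {current_q:.0f} m3/s is roughly a {return_period:.0f}-year flood")
```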

  17. Multiscale analysis of a sustained precipitation event

    NASA Technical Reports Server (NTRS)

    Knupp, Kevin R.; Williams, Steven F.

    1987-01-01

    Mesoscale data collected during both the satellite precipitation and cloud experiment and the microburst and severe thunderstorm program are analyzed in order to describe features associated with two distinct mesoscale precipitation events that occurred about 10 hours apart over the region of northern Alabama to central Tennessee in June 1986. Data sets used include mesobeta-scale rawinsonde data, surface mesonet data, RADAP data, and GOES images. The present mesoscale environment involved the merger of Hurricane Bonnie remnants with a preexisting midlatitude short-wave trough.

  18. [Analysis of Spontaneously Reported Adverse Events].

    PubMed

    Nakamura, Mitsuhiro

    2016-01-01

    Observational study is necessary for the evaluation of drug effectiveness in clinical practice. In recent years, the use of spontaneous reporting systems (SRS) for adverse drug reactions has increased and they have become an important resource for regulatory science. SRS are among the largest and most well-known databases worldwide and are one of the primary tools used for postmarketing surveillance and pharmacovigilance. To analyze SRS, the US Food and Drug Administration Adverse Event Reporting System (FAERS) and the Japanese Adverse Drug Event Report Database (JADER) are reviewed. Authorized pharmacovigilance algorithms were used for signal detection, including the reporting odds ratio. An SRS is a passive reporting database and is therefore subject to numerous sources of selection bias, including overreporting, underreporting, and a lack of a denominator. Despite the inherent limitations of spontaneous reporting, SRS databases are a rich resource and data mining index that provide powerful means of identifying potential associations between drugs and their adverse effects. Our results, which are based on the evaluation of SRS databases, provide essential knowledge that could improve our understanding of clinical issues. PMID:27040337
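
    The reporting odds ratio mentioned above can be illustrated with a small 2x2 computation; the counts below are invented, not FAERS or JADER values, and the screening rule in the last line is only one commonly quoted convention.

```python
# Sketch of the reporting odds ratio (ROR) signal-detection measure, computed
# from a 2x2 contingency table of spontaneous reports. Counts are invented.
import math

#                       reaction of interest   all other reactions
# drug of interest              a = 40               b = 1960
# all other drugs               c = 220              d = 97780
a, b, c, d = 40, 1960, 220, 97780

ror = (a / b) / (c / d)
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(ror) - 1.96 * se_log)
ci_high = math.exp(math.log(ror) + 1.96 * se_log)

print(f"ROR = {ror:.2f}  (95% CI {ci_low:.2f}-{ci_high:.2f})")
# One common screening convention treats ROR as a signal when the lower CI
# bound exceeds 1 and there are at least a few reports.
print("signal" if ci_low > 1 and a >= 3 else "no signal")
```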

  19. Statistical Analysis of Small Ellerman Bomb Events

    NASA Astrophysics Data System (ADS)

    Nelson, C. J.; Doyle, J. G.; Erdélyi, R.; Huang, Z.; Madjarska, M. S.; Mathioudakis, M.; Mumford, S. J.; Reardon, K.

    2013-04-01

    The properties of Ellerman bombs (EBs), small-scale brightenings in the Hα line wings, have proved difficult to establish because their size is close to the spatial resolution of even the most advanced telescopes. Here, we aim to infer the size and lifetime of EBs using high-resolution data of an emerging active region collected using the Interferometric BIdimensional Spectrometer (IBIS) and Rapid Oscillations of the Solar Atmosphere (ROSA) instruments as well as the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO). We develop an algorithm to track EBs through their evolution, finding that EBs can often be much smaller (around 0.3″) and shorter-lived (less than one minute) than previous estimates. A correlation between G-band magnetic bright points and EBs is also found. Combining SDO/HMI and G-band data gives a good proxy of the polarity for the vertical magnetic field. It is found that EBs often occur both over regions of opposite polarity flux and strong unipolar fields, possibly hinting at magnetic reconnection as a driver of these events. The energetics of EB events is found to follow a power-law distribution in the range of a nanoflare (10^22-10^25 ergs).

  20. Dynamic Modelling and Statistical Analysis of Event Times

    PubMed Central

    Peña, Edsel A.

    2006-01-01

    This review article provides an overview of recent work in the modelling and analysis of recurrent events arising in engineering, reliability, public health, biomedical, and other areas. Recurrent event modelling possesses unique facets making it different and more difficult to handle than single event settings. For instance, the impact of an increasing number of event occurrences needs to be taken into account, the effects of covariates should be considered, potential association among the inter-event times within a unit cannot be ignored, and the effects of performed interventions after each event occurrence need to be factored in. A recent general class of models for recurrent events which simultaneously accommodates these aspects is described. Statistical inference methods for this class of models are presented and illustrated through applications to real data sets. Some existing open research problems are described. PMID:17906740

  1. Seismic Initiating Event Analysis For a PBMR Plant

    SciTech Connect

    Van Graan, Henriette; Serbanescu, Dan; Combrink, Yolanda; Coman, Ovidiu

    2004-07-01

    Seismic Initiating Event (IE) analysis is one of the most important tasks that control the level of effort and quality of the whole Seismic Probabilistic Safety Assessment (SPRA). The typical problems are related to the following aspects: how the internal PRA model and its complexity can be used; how to control the number of PRA components for which fragility evaluation should be performed; and how to obtain a manageable number of significant cut-sets for seismic risk quantification. The answers to these questions are highly dependent on the possibility to improve the interface between the internal events analysis and the external events analysis at the design stage. (authors)

  2. Peak event analysis: a novel empirical method for the evaluation of elevated particulate events

    PubMed Central

    2013-01-01

    Background We report on a novel approach to the analysis of suspended particulate data in a rural setting in southern Ontario. Analyses of suspended particulate matter and associated air quality standards have conventionally focussed on 24-hour mean levels of total suspended particulates (TSP) and particulate matter <10 microns, <2.5 microns and <1 micron in diameter (PM10, PM2.5, PM1, respectively). Less emphasis has been placed on brief peaks in suspended particulate levels, which may pose a substantial nuisance, irritant, or health hazard. These events may also represent a common cause of public complaint and concern regarding air quality. Methods Measurements of TSP, PM10, PM2.5, and PM1 levels were taken using an automated device following local complaints of dusty conditions in rural south-central Ontario, Canada. The data consisted of 126,051 by-minute TSP, PM10, PM2.5, and PM1 measurements between May and August 2012. Two analyses were performed and compared. First, conventional descriptive statistics were computed by month for TSP, PM10, PM2.5, and PM1, including mean values and percentiles (70th, 90th, and 95th). Second, a novel graphical analysis method, using density curves and line plots, was conducted to examine peak events occurring at or above the 99th percentile of per-minute TSP readings. We refer to this method as “peak event analysis”. Findings of the novel method were compared with findings from the conventional approach. Results Conventional analyses revealed that mean levels of all categories of suspended particulates and suspended particulate diameter ratios conformed to existing air quality standards. Our novel methodology revealed extreme outlier events above the 99th percentile of readings, with peak PM10 and TSP levels over 20 and 100 times higher than the respective mean values. Peak event analysis revealed and described rare and extreme peak dust events that would not have been detected using conventional descriptive statistics
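
    A rough sketch of the flagging step behind "peak event analysis", using synthetic by-minute TSP values rather than the study's monitor data: readings at or above the 99th percentile are flagged and consecutive flagged minutes are merged into distinct events.

```python
# Rough sketch of the 99th-percentile flagging idea; synthetic data stand in
# for the by-minute monitor output.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 10_000                                            # minutes of monitoring
tsp = rng.lognormal(mean=3.0, sigma=0.5, size=n)      # background, ug/m3
spikes = rng.choice(n, size=15, replace=False)
tsp[spikes] *= rng.uniform(20, 100, size=15)          # brief dust events

s = pd.Series(tsp)
threshold = s.quantile(0.99)
peak = s >= threshold

# Group consecutive flagged minutes into distinct peak events.
starts = peak & ~peak.shift(1, fill_value=False).astype(bool)
event_id = starts.cumsum().where(peak)
events = s[peak].groupby(event_id[peak]).agg(["size", "max"])

print(f"99th percentile threshold: {threshold:.0f} ug/m3")
print(f"{len(events)} peak events; worst peak {events['max'].max():.0f} ug/m3 "
      f"vs mean {s.mean():.0f} ug/m3")
```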

  3. Event/Time/Availability/Reliability-Analysis Program

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.; Hoffman, D. J.; Carr, Thomas

    1994-01-01

    ETARA is interactive, menu-driven program that performs simulations for analysis of reliability, availability, and maintainability. Written to evaluate performance of electrical power system of Space Station Freedom, but methodology and software applied to any system represented by block diagram. Program written in IBM APL.

  4. Event-Synchronous Analysis for Connected-Speech Recognition.

    NASA Astrophysics Data System (ADS)

    Morgan, David Peter

    The motivation for event-synchronous speech analysis originates from linear system theory where the speech-source transfer function is excited by an impulse-like driving function. In speech processing, the impulse response obtained from this linear system contains both semantic information and the vocal tract transfer function. Typically, an estimate of the transfer function is obtained via the spectrum by assuming a short-time stationary signal within some analysis window. However, this spectrum is often distorted by the periodic effects which occur when multiple (pitch) impulses are included in the analysis window. One method to remove these effects would be to deconvolve the excitation function from the speech signal to obtain the transfer function. The more attractive approach is to locate and identify the excitation function and synchronize the analysis frame with it. Event-synchronous analysis differs from pitch-synchronous analysis in that there are many events useful for speech recognition which are not pitch excited. In addition, event-synchronous analysis locates the important boundaries between speech events, such as voiced to unvoiced and silence to burst transitions. In asynchronous processing, an analysis frame which contains portions of two adjacent but dissimilar speech events is often so ambiguous as to distort or mask the important "phonetic" features of both events. Thus event-synchronous processing is employed to obtain an accurate spectral estimate and in turn enhance the estimate of the vocal-tract transfer function. Among the issues which have been addressed in implementing an event-synchronous recognition system are those of developing robust event (pitch, burst, etc.) detectors, synchronous-analysis methodologies, more meaningful feature sets, and dynamic programming algorithms for nonlinear time alignment. An advantage of event-synchronous processing is that the improved representation of the transfer function creates an opportunity for

  5. Statistical issues in the analysis of adverse events in time-to-event data.

    PubMed

    Allignol, Arthur; Beyersmann, Jan; Schmoor, Claudia

    2016-07-01

    The aim of this work is to shed some light on common issues in the statistical analysis of adverse events (AEs) in clinical trials, when the main outcome is a time-to-event endpoint. To begin, we show that AEs are always subject to competing risks. That is, the occurrence of a certain AE may be precluded by occurrence of the main time-to-event outcome or by occurrence of another (fatal) AE. This has raised concerns about 'informative' censoring. We show that, in general, neither simple proportions nor Kaplan-Meier estimates of AE occurrence should be used; common survival techniques for hazards that censor the competing event remain valid, but they are incomplete analyses. They must be complemented by an analogous analysis of the competing event for inference on the cumulative AE probability. The commonly used incidence rate (or incidence density) is a valid estimator of the AE hazard assuming it to be time constant. An estimator of the cumulative AE probability can be derived if the incidence rate of AE is combined with an estimator of the competing hazard. We discuss less restrictive analyses using non-parametric and semi-parametric approaches. We first consider time-to-first-AE analyses and then briefly discuss how they can be extended to the analysis of recurrent AEs. We will give a practical presentation with illustration of the methods by a simple example. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26929180
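
    A minimal numerical sketch of the constant-hazard argument above (hypothetical counts, not the paper's analysis): the AE incidence rate estimates the AE hazard, and the cumulative AE probability follows only once it is combined with the hazard of the competing event.

    # Cumulative AE probability under constant hazards, as discussed above.
    import math

    def cumulative_ae_probability(ae_events, competing_events, person_time, t):
        """Cumulative AE probability at time t under constant competing hazards."""
        alpha = ae_events / person_time        # AE hazard (incidence rate)
        beta = competing_events / person_time  # hazard of the competing event
        total = alpha + beta
        return (alpha / total) * (1.0 - math.exp(-total * t))

    if __name__ == "__main__":
        # Hypothetical trial arm: 60 AEs and 40 competing events over 500 patient-years.
        for t in (0.5, 1.0, 2.0):
            p = cumulative_ae_probability(60, 40, 500.0, t)
            print(f"t = {t:>3} y  P(AE by t) = {p:.3f}")
        # A naive 1 - exp(-alpha*t) ignores the competing event and overstates the risk.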

  6. Event-based Recession Analysis across Scales

    NASA Astrophysics Data System (ADS)

    Chen, B.; Krajewski, W. F.

    2012-12-01

    Hydrograph recessions have long been a window to investigate hydrological processes and their interactions. The authors conducted an exploratory analysis of about 1000 individual hydrograph recessions in a period of around 15 years (1995-2010) from time series of hourly discharge (USGS IDA stream flow data set) at 27 USGS gauges located in the Iowa and Cedar River basins with drainage area ranging from 6.7 to around 17000 km². They calculated recession exponents with the same recession length but different time lags from the hydrograph peak ranging from ~0 to 96 hours, and then plotted them against time lags to construct the evolution of the recession exponent. The result shows that, as recession continues, the recession exponent first increases quickly, then decreases quickly, and finally stays constant. Occasionally and for different reasons, the decreasing portion is missing due to negligible contribution from soil water storage. The increasing part of the evolution can be related to fast response to rainfall including overland flow and quick subsurface flow through macropores (or tiles), and the decreasing portion can be connected to the delayed soil water response. Lastly, the constant segment can be attributed to the groundwater storage with the slowest response. The points where the recession exponent reaches its maximum and begins to plateau are the times that fast response and soil water response end, respectively. The authors conducted further theoretical analysis by combining mathematical derivation and literature results to explain the observed evolution path of the recession exponent. Their results have a direct application in hydrograph separation and important implications for dynamic basin storage-discharge relation analysis and hydrological process understanding across scales.
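
    The recession-exponent calculation described above can be illustrated with a small sketch (synthetic discharge series; the window length, lags, and the log-log fitting choice are assumptions, not the authors' exact procedure): the exponent b in -dQ/dt = a*Q**b is fitted over a fixed-length window placed at increasing lags after the hydrograph peak.

    # Hedged sketch of recession-exponent estimation by log-log regression.
    import numpy as np

    def recession_exponent(q, start, length):
        """Fit b over hourly discharge q[start:start+length] on the falling limb."""
        seg = np.asarray(q[start:start + length], dtype=float)
        dq = np.diff(seg)                      # per-hour change
        mid_q = 0.5 * (seg[:-1] + seg[1:])
        mask = dq < 0                          # keep strictly receding steps
        slope, _ = np.polyfit(np.log(mid_q[mask]), np.log(-dq[mask]), 1)
        return slope                           # slope of log(-dQ/dt) vs log(Q) is b

    if __name__ == "__main__":
        t = np.arange(0, 240.0)                 # synthetic 10-day recession, hourly
        q = 100.0 * (1.0 + 0.05 * t) ** -2.0    # nonlinear-reservoir-like recession
        for lag in (0, 24, 48, 96):
            print(f"lag {lag:>3} h  b = {recession_exponent(q, lag, 72):.2f}")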

  7. Event shape analysis of deep inelastic scattering events with a large rapidity gap at HERA

    NASA Astrophysics Data System (ADS)

    ZEUS Collaboration; Breitweg, J.; Derrick, M.; Krakauer, D.; Magill, S.; Mikunas, D.; Musgrave, B.; Repond, J.; Stanek, R.; Talaga, R. L.; Yoshida, R.; Zhang, H.; Mattingly, M. C. K.; Anselmo, F.; Antonioli, P.; Bari, G.; Basile, M.; Bellagamba, L.; Boscherini, D.; Bruni, A.; Bruni, G.; Cara Romeo, G.; Castellini, G.; Cifarelli, L.; Cindolo, F.; Contin, A.; Corradi, M.; de Pasquale, S.; Gialas, I.; Giusti, P.; Iacobucci, G.; Laurenti, G.; Levi, G.; Margotti, A.; Massam, T.; Nania, R.; Palmonari, F.; Pesci, A.; Polini, A.; Ricci, F.; Sartorelli, G.; Zamora Garcia, Y.; Zichichi, A.; Amelung, C.; Bornheim, A.; Brock, I.; Coböken, K.; Crittenden, J.; Deffner, R.; Eckert, M.; Grothe, M.; Hartmann, H.; Heinloth, K.; Heinz, L.; Hilger, E.; Jakob, H.-P.; Katz, U. F.; Kerger, R.; Paul, E.; Pfeiffer, M.; Rembser, Ch.; Stamm, J.; Wedemeyer, R.; Wieber, H.; Bailey, D. S.; Campbell-Robson, S.; Cottingham, W. N.; Foster, B.; Hall-Wilton, R.; Hayes, M. E.; Heath, G. P.; Heath, H. F.; McFall, J. D.; Piccioni, D.; Roff, D. G.; Tapper, R. J.; Arneodo, M.; Ayad, R.; Capua, M.; Garfagnini, A.; Iannotti, L.; Schioppa, M.; Susinno, G.; Kim, J. Y.; Lee, J. H.; Lim, I. T.; Pac, M. Y.; Caldwell, A.; Cartiglia, N.; Jing, Z.; Liu, W.; Mellado, B.; Parsons, J. A.; Ritz, S.; Sampson, S.; Sciulli, F.; Straub, P. B.; Zhu, Q.; Borzemski, P.; Chwastowski, J.; Eskreys, A.; Figiel, J.; Klimek, K.; Przybycień , M. B.; Zawiejski, L.; Adamczyk, L.; Bednarek, B.; Bukowy, M.; Jeleń , K.; Kisielewska, D.; Kowalski, T.; Przybycień , M.; Rulikowska-Zarȩ Bska, E.; Suszycki, L.; Zaja C, J.; Duliń Ski, Z.; Kotań Ski, A.; Abbiendi, G.; Bauerdick, L. A. T.; Behrens, U.; Beier, H.; Bienlein, J. K.; Cases, G.; Deppe, O.; Desler, K.; Drews, G.; Fricke, U.; Gilkinson, D. J.; Glasman, C.; Göttlicher, P.; Haas, T.; Hain, W.; Hasell, D.; Johnson, K. F.; Kasemann, M.; Koch, W.; Kötz, U.; Kowalski, H.; Labs, J.; Lindemann, L.; Löhr, B.; Löwe, M.; Mań Czak, O.; Milewski, J.; Monteiro, T.; Ng, J. S. T.; Notz, D.; Ohrenberg, K.; Park, I. H.; Pellegrino, A.; Pelucchi, F.; Piotrzkowski, K.; Roco, M.; Rohde, M.; Roldán, J.; Ryan, J. J.; Savin, A. A.; Schneekloth, U.; Selonke, F.; Surrow, B.; Tassi, E.; Voß, T.; Westphal, D.; Wolf, G.; Wollmer, U.; Youngman, C.; Zsolararnecki, A. F.; Zeuner, W.; Burow, B. D.; Grabosch, H. J.; Meyer, A.; Schlenstedt, S.; Barbagli, G.; Gallo, E.; Pelfer, P.; Maccarrone, G.; Votano, L.; Bamberger, A.; Eisenhardt, S.; Markun, P.; Trefzger, T.; Wölfle, S.; Bromley, J. T.; Brook, N. H.; Bussey, P. J.; Doyle, A. T.; MacDonald, N.; Saxon, D. H.; Sinclair, L. E.; Strickland, E.; Waugh, R.; Bohnet, I.; Gendner, N.; Holm, U.; Meyer-Larsen, A.; Salehi, H.; Wick, K.; Gladilin, L. K.; Horstmann, D.; Kçira, D.; Klanner, R.; Lohrmann, E.; Poelz, G.; Schott, W.; Zetsche, F.; Bacon, T. C.; Butterworth, I.; Cole, J. E.; Howell, G.; Hung, B. H. Y.; Lamberti, L.; Long, K. R.; Miller, D. B.; Pavel, N.; Prinias, A.; Sedgbeer, J. K.; Sideris, D.; Walker, R.; Mallik, U.; Wang, S. M.; Wu, J. T.; Cloth, P.; Filges, D.; Fleck, J. I.; Ishii, T.; Kuze, M.; Suzuki, I.; Tokushuku, K.; Yamada, S.; Yamauchi, K.; Yamazaki, Y.; Hong, S. J.; Lee, S. B.; Nam, S. W.; Park, S. K.; Barreiro, F.; Fernández, J. P.; García, G.; Graciani, R.; Hernández, J. M.; Hervás, L.; Labarga, L.; Martínez, M.; del Peso, J.; Puga, J.; Terrón, J.; de Trocóniz, J. F.; Corriveau, F.; Hanna, D. S.; Hartmann, J.; Hung, L. W.; Murray, W. N.; Ochs, A.; Riveline, M.; Stairs, D. G.; St-Laurent, M.; Ullmann, R.; Tsurugai, T.; Bashkirov, V.; Dolgoshein, B. 
A.; Stifutkin, A.; Bashindzhagyan, G. L.; Ermolov, P. F.; Golubkov, Yu. A.; Khein, L. A.; Korotkova, N. A.; Korzhavina, I. A.; Kuzmin, V. A.; Lukina, O. Yu.; Proskuryakov, A. S.; Shcheglova, L. M.; Solomin, A. N.; Zotkin, S. A.; Bokel, C.; Botje, M.; Brümmer, N.; Chlebana, F.; Engelen, J.; Koffeman, E.; Kooijman, P.; van Sighem, A.; Tiecke, H.; Tuning, N.; Verkerke, W.; Vossebeld, J.; Vreeswijk, M.; Wiggers, L.; de Wolf, E.; Acosta, D.; Bylsma, B.; Durkin, L. S.; Gilmore, J.; Ginsburg, C. M.; Kim, C. L.; Ling, T. Y.; Nylander, P.; Romanowski, T. A.; Blaikley, H. E.; Cashmore, R. J.; Cooper-Sarkar, A. M.; Devenish, R. C. E.; Edmonds, J. K.; Große-Knetter, J.; Harnew, N.; Nath, C.; Noyes, V. A.; Quadt, A.; Ruske, O.; Tickner, J. R.; Uijterwaal, H.; Walczak, R.; Waters, D. S.; Bertolin, A.; Brugnera, R.; Carlin, R.; dal Corso, F.; Dosselli, U.; Limentani, S.; Morandin, M.; Posocco, M.; Stanco, L.; Stroili, R.; Voci, C.; Bulmahn, J.; Oh, B. Y.; Okrasiń Ski, J. R.; Toothacker, W. S.; Whitmore, J. J.; Iga, Y.; D'Agostini, G.; Marini, G.; Nigro, A.; Raso, M.; Hart, J. C.; McCubbin, N. A.; Shah, T. P.; Epperson, D.; Heusch, C.; Rahn, J. T.; Sadrozinski, H. F.-W.; Seiden, A.; Wichmann, R.; Williams, D. C.; Schwarzer, O.; Walenta, A. H.; Abramowicz, H.; Briskin, G.; Dagan, S.; Kananov, S.; Levy, A.; Abe, T.; Fusayasu, T.; Inuzuka, M.; Nagano, K.; Umemori, K.; Yamashita, T.; Hamatsu, R.; Hirose, T.; Homma, K.; Kitamura, S.; Matsushita, T.; Cirio, R.; Costa, M.; Ferrero, M. I.; Maselli, S.; Monaco, V.; Peroni, C.; Petrucci, M. C.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Dardo, M.; Bailey, D. C.; Fagerstroem, C.-P.; Galea, R.; Hartner, G. F.; Joo, K. K.; Levman, G. M.; Martin, J. F.; Orr, R. S.; Polenz, S.; Sabetfakhri, A.; Simmons, D.; Teuscher, R. J.; Butterworth, J. M.; Catterall, C. D.; Jones, T. W.; Lane, J. B.; Saunders, R. L.; Sutton, M. R.; Wing, M.; Ciborowski, J.; Grzelak, G.; Kasprzak, M.; Muchorowski, K.; Nowak, R. J.; Pawlak, J. M.; Pawlak, R.; Tymieniecka, T.; Wróblewski, A. K.; Zakrzewski, J. A.; Adamus, M.; Coldewey, C.; Eisenberg, Y.; Hochman, D.; Karshon, U.; Badgett, W. F.; Chapin, D.; Cross, R.; Dasu, S.; Foudas, C.; Loveless, R. J.; Mattingly, S.; Reeder, D. D.; Smith, W. H.; Vaiciulis, A.; Wodarczyk, M.; Deshpande, A.; Dhawan, S.; Hughes, V. W.; Bhadra, S.; Frisken, W. R.; Khakzad, M.; Schmidke, W. B.

    1998-03-01

    A global event shape analysis of the multihadronic final states observed in neutral current deep inelastic scattering events with a large rapidity gap with respect to the proton direction is presented. The analysis is performed in the range 5 ≤ Q² ≤ 185 GeV² and 160 ≤ W ≤ 250 GeV, where Q² is the virtuality of the photon and W is the virtual-photon proton centre of mass energy. Particular emphasis is placed on the dependence of the shape variables, measured in the γ*-pomeron rest frame, on the mass of the hadronic final state, MX. With increasing MX the multihadronic final state becomes more collimated and planar. The experimental results are compared with several models which attempt to describe diffractive events. The broadening effects exhibited by the data require in these models a significant gluon component of the pomeron.

  8. Glaciological parameters of disruptive event analysis

    SciTech Connect

    Bull, C.

    1980-04-01

    The possibility of complete glaciation of the earth is small and probably need not be considered in the consequence analysis by the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program. However, within a few thousand years an ice sheet may well cover proposed waste disposal sites in Michigan. Those in the Gulf Coast region and New Mexico are unlikely to be ice covered. The probability of ice cover at Hanford in the next million years is finite, perhaps about 0.5. Sea level will fluctuate as a result of climatic changes. As ice sheets grow, sea level will fall. Melting of ice sheets will be accompanied by a rise in sea level. Within the present interglacial period there is a definite chance that the West Antarctic ice sheet will melt. Ice sheets are agents of erosion, and some estimates of the amount of material they erode have been made. As an average over the area glaciated by late Quaternary ice sheets, only a few tens of meters of erosion is indicated. There were perhaps 3 meters of erosion per glaciation cycle. Under glacial conditions the surface boundary conditions for ground water recharge will be appreciably changed. In future glaciations melt-water rivers generally will follow pre-existing river courses. Some salt dome sites in the Gulf Coast region could be susceptible to changes in the course of the Mississippi River. The New Mexico site, which is on a high plateau, seems to be immune from this type of problem. The Hanford Site is only a few miles from the Columbia River, and in the future, lateral erosion by the Columbia River could cause changes in its course. A prudent assumption in the AEGIS study is that the present interglacial will continue for only a limited period and that subsequently an ice sheet will form over North America. Other factors being equal, it seems unwise to site a nuclear waste repository (even at great depth) in an area likely to be glaciated.

  9. A reference manual for the Event Progression Analysis Code (EVNTRE)

    SciTech Connect

    Griesmeyer, J.M.; Smith, L.N.

    1989-09-01

    This document is a reference guide for the Event Progression Analysis (EVNTRE) code developed at Sandia National Laboratories. EVNTRE is designed to process the large accident progression event trees and associated files used in probabilistic risk analyses for nuclear power plants. However, the general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. The EVNTRE code efficiently processes large, complex event trees. It has the capability to assign probabilities to event tree branch points in several different ways, to classify pathways or outcomes into user-specified groupings, and to sample input distributions of probabilities and parameters.
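
    To illustrate the kind of processing EVNTRE automates, the following hedged sketch (a made-up two-question tree with hypothetical branch names and probabilities, not an actual EVNTRE input) enumerates event-tree pathways, multiplies branch probabilities, and groups outcomes into user-specified bins:

    # Toy accident-progression event tree: enumerate pathways and bin the outcomes.
    from itertools import product

    # Each question has named branches with probabilities (a tiny, made-up example).
    QUESTIONS = [
        ("power_recovered", {"yes": 0.8, "no": 0.2}),
        ("containment_intact", {"yes": 0.95, "no": 0.05}),
    ]

    def classify(path):
        """User-specified grouping of a pathway into an outcome bin."""
        return "early_release" if path["containment_intact"] == "no" else "no_release"

    def enumerate_pathways(questions):
        names = [q[0] for q in questions]
        for combo in product(*(q[1].items() for q in questions)):
            path = {name: branch for name, (branch, _) in zip(names, combo)}
            prob = 1.0
            for _, p in combo:
                prob *= p
            yield path, prob

    if __name__ == "__main__":
        bins = {}
        for path, prob in enumerate_pathways(QUESTIONS):
            bins[classify(path)] = bins.get(classify(path), 0.0) + prob
        print(bins)   # e.g. {'no_release': 0.95, 'early_release': 0.05}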

  10. Human performance analysis of industrial radiography radiation exposure events

    SciTech Connect

    Reece, W.J.; Hill, S.G.

    1995-12-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  11. Nonlinear Analysis for Event Forewarning (NLAfEW)

    Energy Science and Technology Software Center (ESTSC)

    2013-05-23

    The NLAfEW computer code analyses noisy experimental data to forewarn of adverse events. The functionality of the analysis is as follows: it removes artifacts from the data, converts the continuous data values to discrete values, constructs time-delay embedding vectors, compares the unique nodes and links in one graph with those in another, and determines event forewarning on the basis of several successive occurrences of one (or more) of the dissimilarity measures above a threshold.
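
    A hedged sketch of the pipeline described above (the symbolization scheme, embedding parameters, and dissimilarity measure are illustrative assumptions, not the NLAfEW implementation): discretize the signal, form time-delay embedding vectors, treat distinct vectors as graph nodes and consecutive pairs as links, and compare each window's node/link sets against a baseline graph.

    # Illustrative forewarning sketch: symbolize, embed, build graphs, compare.
    import numpy as np

    def symbolize(x, n_symbols=4):
        edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
        return np.digitize(x, edges)

    def embed_graph(symbols, dim=3, lag=2):
        nodes = [tuple(symbols[i:i + dim * lag:lag]) for i in range(len(symbols) - dim * lag + 1)]
        links = set(zip(nodes[:-1], nodes[1:]))
        return set(nodes), links

    def dissimilarity(base, test):
        bn, bl = base
        tn, tl = test
        return len(bn ^ tn) + len(bl ^ tl)   # symmetric-difference count of nodes + links

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        signal = np.sin(np.linspace(0, 60, 3000)) + 0.1 * rng.standard_normal(3000)
        signal[2000:] += 0.8 * rng.standard_normal(1000)          # regime change ("event")
        base = embed_graph(symbolize(signal[:500]))
        scores = [dissimilarity(base, embed_graph(symbolize(signal[s:s + 500])))
                  for s in range(500, 2500, 250)]
        print(scores)   # several successive high values would constitute a forewarning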

  12. Life Events and Psychosis: A Review and Meta-analysis

    PubMed Central

    Beards, Stephanie; Fisher, Helen L.; Morgan, Craig

    2013-01-01

    Introduction: Recent models of psychosis implicate stressful events in its etiology. However, while evidence has accumulated for childhood trauma, the role of adult life events has received less attention. Therefore, a review of the existing literature on the relationship between life events and onset of psychotic disorder/experiences is timely. Methods: A search was conducted using PsychInfo, Medline, Embase, and Web of Science to identify studies of life events and the onset of psychosis or psychotic experiences within the general population. Given previous methodological concerns, this review included a novel quality assessment tool and focused on findings from the most robust studies. A meta-analysis was performed on a subgroup of 13 studies. Results: Sixteen studies published between 1968 and 2012 were included. Of these, 14 reported positive associations between exposure to adult life events and subsequent onset of psychotic disorder/experiences. The meta-analysis yielded an overall weighted OR of 3.19 (95% CI 2.15–4.75). However, many studies were limited by small sample sizes and the use of checklist measures of life events, with no consideration of contextual influences on the meaning and interpretation of events. Conclusions: Few studies have assessed the role of adult life events in the onset of psychosis. There was some evidence that reported exposure to adult life events was associated with increased risk of psychotic disorder and subclinical psychotic experiences. However, the methodological quality of the majority of studies was low, which urges caution in interpreting the results and points toward a need for more methodologically robust studies. PMID:23671196

  13. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    NASA Astrophysics Data System (ADS)

    Bouilloud, Ludovic; Delrieu, Guy; Boudevillain, Brice; Kirstetter, Pierre-Emmanuel

    2010-11-01

    A method to estimate rainfall from radar data for post-event analysis of flash-flood events has been developed within the EC-funded HYDRATE project. It follows a pragmatic approach including careful analysis of the observation conditions for the radar system(s) available for the considered case. Clutter and beam blockage are characterised by dry-weather observations and simulations based on a digital terrain model of the region of interest. The vertical profile of reflectivity (VPR) is either inferred from radar data if volume scanning data are available or simply defined using basic meteorological parameters (idealised VPR). Such information is then used to produce correction factor maps for each elevation angle to correct for range-dependent errors. In a second step, an effective Z-R relationship is optimised to remove the bias over the hit region. Due to limited data availability, the optimisation is carried out with reference to raingauge rain amounts measured at the event time scale. Sensitivity tests performed with two well-documented rain events show that a number of Z = aR^b relationships, organised along hyperbolic curves in the (a, b) parameter space, lead to optimum assessment results in terms of the Nash coefficient between the radar and raingauge estimates. A refined analysis of these equifinality patterns shows that the “total additive conditional bias” can be used to discriminate between the Nash coefficient equifinal solutions. We observe that the optimisation results are sensitive to the VPR description and also that the Z-R optimisation procedure can largely compensate for range-dependent errors, although this shifts the optimal coefficients in the parameter space. The time-scale dependency of the equifinality patterns is significant; however, near-optimal Z-R relationships can be obtained at all time scales from the event time step optimisation.
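
    The Z-R optimisation step can be sketched as follows (synthetic data and a plain grid search; the actual HYDRATE procedure is more elaborate): candidate a and b values in Z = a*R**b are scored by the Nash-Sutcliffe coefficient between radar-derived and raingauge event totals.

    # Hedged sketch of a Z-R grid-search optimisation against gauge event totals.
    import numpy as np

    def nash(obs, sim):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

    def radar_event_totals(z_linear, a, b, dt_hours=1.0):
        """Convert linear reflectivity (mm^6 m^-3) to rain rate and accumulate per gauge."""
        rain_rate = (z_linear / a) ** (1.0 / b)        # mm/h from Z = a*R**b
        return np.sum(rain_rate, axis=1) * dt_hours    # event total per gauge (mm)

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        true_rate = rng.gamma(2.0, 3.0, size=(10, 24))   # 10 gauges, 24 hourly steps
        z = 300.0 * true_rate ** 1.4                      # "truth" used to make radar Z
        gauge_totals = true_rate.sum(axis=1)
        best = max(((nash(gauge_totals, radar_event_totals(z, a, b)), a, b)
                    for a in np.arange(100, 501, 50)
                    for b in np.arange(1.1, 2.01, 0.1)))
        print("best Nash = %.3f at a = %d, b = %.1f" % best)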

  14. Case-cohort analysis of clusters of recurrent events.

    PubMed

    Chen, Feng; Chen, Kani

    2014-01-01

    The case-cohort sampling, first proposed in Prentice (Biometrika 73:1-11, 1986), is one of the most effective cohort designs for analysis of event occurrence, with the regression model being the typical Cox proportional hazards model. This paper extends to consider the case-cohort design for recurrent events with a certain specific clustering feature, which is captured by a properly modified Cox-type self-exciting intensity model. We discuss the advantage of using this model and validate the pseudo-likelihood method. Simulation studies are presented in support of the theory. Application is illustrated with an analysis of bladder cancer data. PMID:23832308

  15. External events analysis for the Savannah River Site K reactor

    SciTech Connect

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10⁻⁴ per year, of which seismic events are the major contributor (1.2 × 10⁻⁴ per year). Fire initiated events contribute 1.4 × 10⁻⁷ per year, tornados 5.8 × 10⁻⁷ per year, dam failures 1.5 × 10⁻⁶ per year and the crane failure scenario less than 10⁻⁴ per year to the core melt frequency. 8 refs., 3 figs., 5 tabs.

  16. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    SciTech Connect

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. (c) 1999 The American Astronomical Society.

  17. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis makes it possible to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
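
    A minimal sketch of the trigger-coincidence idea (the window, lag, and surrogate test are illustrative choices, not the paper's exact formulation): the trigger coincidence rate is the fraction of events in series A that are followed, within a tolerance window after a lag, by at least one event in series B, and a Poisson-type surrogate test gauges its significance.

    # Hedged sketch of an event coincidence rate with a surrogate significance test.
    import numpy as np

    def trigger_coincidence_rate(a_times, b_times, delta_t, tau=0.0):
        a = np.asarray(a_times, float)
        b = np.asarray(b_times, float)
        hits = sum(np.any((b >= t + tau) & (b <= t + tau + delta_t)) for t in a)
        return hits / len(a)

    def poisson_surrogate_pvalue(a_times, b_times, delta_t, t_max, n_sur=2000, seed=0):
        rng = np.random.default_rng(seed)
        observed = trigger_coincidence_rate(a_times, b_times, delta_t)
        sur = [trigger_coincidence_rate(a_times, np.sort(rng.uniform(0, t_max, len(b_times))), delta_t)
               for _ in range(n_sur)]
        return observed, float(np.mean(np.asarray(sur) >= observed))

    if __name__ == "__main__":
        floods = np.array([3.0, 10.0, 22.0, 35.0, 41.0])            # hypothetical event years
        outbreaks = np.array([3.4, 10.8, 23.1, 30.0, 41.2, 48.0])
        rate, p = poisson_surrogate_pvalue(floods, outbreaks, delta_t=1.5, t_max=50.0)
        print(f"coincidence rate = {rate:.2f}, surrogate p-value = {p:.3f}")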

  18. Heterogeneity and event dependence in the analysis of sickness absence

    PubMed Central

    2013-01-01

    Background Sickness absence (SA) is an important social, economic and public health issue. Identifying and understanding the determinants, whether biological, regulatory, or health services-related, of variability in SA duration is essential for better management of SA. The conditional frailty model (CFM) is useful when repeated SA events occur within the same individual, as it allows simultaneous analysis of event dependence and heterogeneity due to unknown, unmeasured, or unmeasurable factors. However, its use may encounter computational limitations when applied to very large data sets, as may frequently occur in the analysis of SA duration. Methods To overcome the computational issue, we propose a Poisson-based conditional frailty model (CFPM) for repeated SA events that accounts for both event dependence and heterogeneity. To demonstrate the usefulness of the model proposed in the SA duration context, we used data from all non-work-related SA episodes that occurred in Catalonia (Spain) in 2007, initiated by either a diagnosis of neoplasm or mental and behavioral disorders. Results As expected, the CFPM results were very similar to those of the CFM for both diagnosis groups. The CPU time for the CFPM was substantially shorter than for the CFM. Conclusions The CFPM is a suitable alternative to the CFM in survival analysis with recurrent events, especially with large databases. PMID:24040880

  19. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    NASA Astrophysics Data System (ADS)

    Delrieu, G.; Bouilloud, L.; Boudevillain, B.; Kirstetter, P.-E.; Borga, M.

    2009-09-01

    This communication is about a methodology for radar rainfall estimation in the context of post-event analysis of flash-flood events developed within the HYDRATE project. For such extreme events, some raingauge observations (operational, amateur) are available at the event time scale, while few raingauge time series are generally available at the hydrologic time steps. Radar data is therefore the only way to access the rainfall space-time organization, but the quality of the radar data may be highly variable as a function of (1) the relative locations of the event and the radar(s) and (2) the radar operating protocol(s) and maintenance. A positive point: heavy rainfall is associated with convection, implying better visibility and less bright band contamination compared with more common situations. In parallel with the development of a regionalized and adaptive radar data processing system (TRADHy; Delrieu et al. 2009), a pragmatic approach is proposed here to make best use of the available radar and raingauge data for a given flash-flood event by: (1) identifying and removing residual ground clutter, (2) applying the "hydrologic visibility" concept (Pellarin et al. 2002) to correct for range-dependent errors (screening and VPR effects for non-attenuating wavelengths), and (3) estimating an effective Z-R relationship through a radar-raingauge optimization approach to remove the mean field bias (Dinku et al. 2002). A sensitivity study, based on the high-quality volume radar datasets collected during two intense rainfall events of the Bollène 2002 experiment (Delrieu et al. 2009), is first proposed. Then the method is implemented for two other historical events that occurred in France (Avène 1997 and Aude 1999) with datasets of lesser quality. References: Delrieu, G., B. Boudevillain, J. Nicol, B. Chapon, P.-E. Kirstetter, H. Andrieu, and D. Faure, 2009: Bollène 2002 experiment: radar rainfall estimation in the Cévennes-Vivarais region, France. Journal of Applied

  20. Severe accident analysis using dynamic accident progression event trees

    NASA Astrophysics Data System (ADS)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time consuming, difficult to reproduce, and can be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgments. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user specified criteria for branching. It assigns user specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functionality of the dynamic APET developed includes prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident induced failure of steam generator tubes, are of particular interest. Also, the work is focused on treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a

  1. Offense Specialization of Arrestees: An Event History Analysis

    ERIC Educational Resources Information Center

    Lo, Celia C.; Kim, Young S.; Cheng, Tyrone C.

    2008-01-01

    The data set employed in the present study came from interviews with arrestees conducted between 1999 and 2001 as well as from their official arrest records obtained from jail administrators. A total of 238 arrestees ages 18 to 25 constituted the final sample. Event history analysis examined each arrestee's movement from periods of no arrests to…

  2. Using the DOE Knowledge Base for Special Event Analysis

    SciTech Connect

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  3. Analysis of Suprathermal Events Observed by STEREO/PLASTIC

    NASA Astrophysics Data System (ADS)

    Barry, J. A.; Galvin, A. B.; Farrugia, C. J.; Popecki, M.; Klecker, B.; Ellis, L.; Lee, M. A.; Kistler, L. M.; Luhmann, J. G.; Russell, C. T.; Simunac, K.; Kucharek, H.; Blush, L.; Bochsler, P.; Möbius, E.; Thompson, B. J.; Wimmer-Schweingruber, R.; Wurz, P.

    2008-12-01

    Since the late 1960s, suprathermal and energetic ion events with energies ranging from just above the solar wind energies up to 2 MeV and lasting for several minutes to hours, have been detected upstream of the Earth. Possible sources of these ions include magnetospheric ions, solar wind ions accelerated between the Earth's bow shock and hydromagnetic waves to energies just above the solar wind energies, and remnant ions from heliospheric processes (such as Solar Energetic Particle (SEP) events or Corotating Interaction Regions (CIRs)). The unique orbits of both STEREO spacecraft, STEREO-A (STA) drifting ahead in Earth's orbit and STEREO-B (STB) lagging behind in Earth's orbit, allow for analysis of upstream events in these unexamined regions. Using both the PLASTIC and IMPACT instruments on board STA/B, we can examine protons in the energy range of solar wind energies up to 80 keV, their spatial distribution, and determine if the spacecraft is magnetically connected to the Earth's bow shock. Suprathermal events observed by STEREO/PLASTIC during solar minimum conditions are examined for possible upstream events using anisotropy measurements, velocity dispersion, magnetic connection to the bow shock, and frequency of events as a function of time and distance.

  4. Analysis of recurrent event data with incomplete observation gaps.

    PubMed

    Kim, Yang-Jin; Jhun, Myoungshic

    2008-03-30

    In analysis of recurrent event data, recurrent events are not completely experienced when the terminating event occurs before the end of a study. To make valid inference of recurrent events, several methods have been suggested for accommodating the terminating event (Statist. Med. 1997; 16:911-924; Biometrics 2000; 56:554-562). In this paper, our interest is to consider a particular situation, where intermittent dropouts result in observation gaps during which no recurrent events are observed. In this situation, risk status varies over time and the usual definition of risk variable is not applicable. In particular, we consider the case when information on the observation gap is incomplete, that is, the starting time of intermittent dropout is known but the terminating time is not available. This incomplete information is modeled in terms of an interval-censored mechanism. Our proposed method is applied to the study of the Young Traffic Offenders Program on conviction rates, wherein a certain proportion of subjects experienced suspensions with intermittent dropouts during the study. PMID:17611955

  5. Root Cause Analysis: Learning from Adverse Safety Events.

    PubMed

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. PMID:26466177

  6. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    SciTech Connect

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-11-22

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility.

  7. Time-quefrency analysis of overlapping similar microseismic events

    NASA Astrophysics Data System (ADS)

    Nagano, Koji

    2016-05-01

    In this paper, I describe a new technique to determine the interval between P-waves in similar, overlapping microseismic events. The similar microseismic events that occur with overlapping waveforms are called `proximate microseismic doublets' herein. Proximate microseismic doublets had been discarded in previous studies because we had not noticed their usefulness. Analysis of similar events can show relative locations of sources between them. Analysis of proximate microseismic doublets can provide more precise relative source locations because variation in the velocity structure has little influence on their relative travel times. It is necessary to measure the interval between the P-waves in the proximate microseismic doublets to determine their relative source locations. A `proximate microseismic doublet' is a pair of microseismic events in which the second event arrives before the attenuation of the first event. Cepstrum analysis can provide the interval even though the second event overlaps the first event. However, a cepstrum of a proximate microseismic doublet generally has two peaks, one representing the interval between the arrivals of the two P-waves, and the other representing the interval between the arrivals of the two S-waves. It is therefore difficult to determine the peak that represents the P-wave interval from the cepstrum alone. I used window functions in cepstrum analysis to isolate the first and second P-waves and to suppress the second S-wave. I change the length of the window function and calculate the cepstrum for each window length. The result is represented in a three-dimensional contour plot of length-quefrency-cepstrum data. The contour plot allows me to identify the cepstrum peak that represents the P-wave interval. The precise quefrency can be determined from a two-dimensional quefrency-cepstrum graph, provided that the length of the window is appropriately chosen. I have used both synthetic and field data to demonstrate that this
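
    A small synthetic-data sketch of the windowed-cepstrum idea described above (the sampling rate, wavelet, and window lengths are arbitrary assumptions): the real cepstrum of a signal containing a delayed, scaled copy shows a peak at the quefrency equal to the delay, and the window length is varied to mimic the length-quefrency-cepstrum analysis.

    # Hedged sketch: real cepstrum of an overlapping "doublet" for several window lengths.
    import numpy as np

    def real_cepstrum(x):
        spectrum = np.abs(np.fft.rfft(x))
        return np.fft.irfft(np.log(spectrum + 1e-12))

    if __name__ == "__main__":
        fs = 1000.0                                    # Hz
        t = np.arange(0, 1.0, 1.0 / fs)
        wavelet = np.exp(-60 * t) * np.sin(2 * np.pi * 80 * t)   # decaying P-wave-like pulse
        delay = 0.045                                   # s, arrival offset of the second event
        x = wavelet + 0.7 * np.roll(wavelet, int(delay * fs))
        for win_len in (200, 400, 800):                 # samples; mimics varying window length
            c = real_cepstrum(x[:win_len])
            q = np.argmax(c[10:win_len // 2]) + 10      # skip the low-quefrency ramp
            print(f"window {win_len:>3}: peak quefrency = {q / fs * 1000:.1f} ms")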

  8. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the ''management concerns'' reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of the LLNL's reported occurrences have been reported as either ''management concerns'' or ''near misses.'' In 2010, 29% of the occurrences have been reported as ''management concerns'' or ''near misses.'' This rate indicates that LLNL is now reporting fewer ''management concern'' and ''near miss'' occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009

  9. Using discriminant analysis as a nucleation event classification method

    NASA Astrophysics Data System (ADS)

    Mikkonen, S.; Lehtinen, K. E. J.; Hamed, A.; Joutsensaari, J.; Facchini, M. C.; Laaksonen, A.

    2006-09-01

    More than three years of measurements of aerosol size-distribution and different gas and meteorological parameters made in the Po Valley, Italy, were analysed for this study to examine which of the meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, as used in non-parametric density estimation. The best classification result in our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events but, because of the large number of missing observations, we had to exclude them from the final analysis.
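
    A minimal sketch of non-parametric discriminant analysis with an Epanechnikov kernel (synthetic feature values, class means, and bandwidths are assumptions, not the study's data): a day is assigned to the class whose kernel density estimate, weighted by the class prior, is larger at that day's feature vector.

    # Hedged sketch of kernel-density discriminant classification of event days.
    import numpy as np

    def epanechnikov_kde(x, data, h):
        """Product-kernel density estimate at point x (per-feature bandwidths h)."""
        u = (x - data) / h                              # shape (n_samples, n_features)
        k = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)
        return np.mean(np.prod(k / h, axis=1))

    def classify(x, class_data, h):
        scores = {label: len(d) * epanechnikov_kde(x, d, h) for label, d in class_data.items()}
        return max(scores, key=scores.get)

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        # Columns: relative humidity (%), ozone (ppb) -- hypothetical feature pairs.
        event_days = rng.normal([55.0, 45.0], [8.0, 7.0], size=(120, 2))
        nonevent_days = rng.normal([75.0, 30.0], [8.0, 7.0], size=(200, 2))
        data = {"event": event_days, "non-event": nonevent_days}
        for day in ([60.0, 42.0], [80.0, 28.0]):
            print(day, "->", classify(np.asarray(day), data, h=np.array([10.0, 8.0])))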

  10. Using discriminant analysis as a nucleation event classification method

    NASA Astrophysics Data System (ADS)

    Mikkonen, S.; Lehtinen, K. E. J.; Hamed, A.; Joutsensaari, J.; Facchini, M. C.; Laaksonen, A.

    2006-12-01

    More than three years of measurements of aerosol size-distribution and different gas and meteorological parameters made in the Po Valley, Italy, were analysed for this study to examine which of the meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, as used in non-parametric density estimation. The best classification result in our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events but, because of the large number of missing observations, we had to exclude them from the final analysis.

  11. The new event analysis of the Fermi large area telescope

    NASA Astrophysics Data System (ADS)

    Sgrò, Carmelo

    2014-07-01

    Since its launch on June 11, 2008 the Fermi Large Area Telescope (LAT) has been exploring the gamma-ray sky at energies from 20 MeV to over 300 GeV. Five years of nearly flawless operation allowed a constant improvement of the detector knowledge and, as a consequence, continuous update of the event selection and the corresponding instrument response parametrization. The final product of this effort is a radical revision of the entire event-level analysis, from the event reconstruction algorithms in each subsystem to the background rejection strategy. The potential improvements include a larger acceptance coupled with a significant reduction in background contamination, better angular and energy resolution and an extension of the energy reach below 100 MeV and in the TeV range. In this paper I will describe the new reconstruction and the event-level analysis, show the expected instrument performance and discuss future prospects for astro-particle physics with the LAT.

  12. An analysis of selected atmospheric icing events on test cables

    SciTech Connect

    Druez, J.; McComber, P.; Laflamme, J.

    1996-12-01

    In cold countries, the design of transmission lines and communication networks requires the knowledge of ice loads on conductors. Atmospheric icing is a stochastic phenomenon and therefore probabilistic design is used more and more for structure icing analysis. For strength and reliability assessments, a data base on atmospheric icing is needed to characterize the distributions of ice load and corresponding meteorological parameters. A test site where icing is frequent is used to obtain field data on atmospheric icing. This test site is located on the Mt. Valin, near Chicoutimi, Quebec, Canada. The experimental installation is mainly composed of various instrumented but non-energized test cables, meteorological instruments, a data acquisition system, and a video recorder. Several types of icing events can produce large ice accretions dangerous for land-based structures. They are rime due to in-cloud icing, glaze caused by freezing rain, wet snow, and mixtures of these types of ice. These icing events have very different characteristics and must be distinguished, before statistical analysis, in a data base on atmospheric icing. This is done by comparison of data from a precipitation gauge, an icing rate meter and a temperature sensor. An analysis of selected icing periods recorded on the cables of two perpendicular test lines during the 1992--1993 winter season is presented. Only significant icing events have been considered. A comparative analysis of the ice load on the four test cables is drawn from the data, and typical accretion and shedding parameters are calculated separately for icing events related to in-cloud icing and precipitation icing.

  13. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  14. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  15. Topological Analysis of Emerging Bipole Clusters Producing Violent Solar Events

    NASA Astrophysics Data System (ADS)

    Mandrini, C. H.; Schmieder, B.; Démoulin, P.; Guo, Y.; Cristiani, G. D.

    2014-06-01

    During the rising phase of Solar Cycle 24 tremendous activity occurred on the Sun with rapid and compact emergence of magnetic flux leading to bursts of flares (C to M and even X-class). We investigate the violent events occurring in the cluster of two active regions (ARs), NOAA numbers 11121 and 11123, observed in November 2010 with instruments onboard the Solar Dynamics Observatory and from Earth. Within one day the total magnetic flux increased by 70 % with the emergence of new groups of bipoles in AR 11123. From all the events on 11 November, we study, in particular, the ones starting at around 07:16 UT in GOES soft X-ray data and the brightenings preceding them. A magnetic-field topological analysis indicates the presence of null points, associated separatrices, and quasi-separatrix layers (QSLs) where magnetic reconnection is prone to occur. The presence of null points is confirmed by a linear and a non-linear force-free magnetic-field model. Their locations and general characteristics are similar in both modelling approaches, which supports their robustness. However, in order to explain the full extension of the analysed event brightenings, which are not restricted to the photospheric traces of the null separatrices, we compute the locations of QSLs. Based on this more complete topological analysis, we propose a scenario to explain the origin of a low-energy event preceding a filament eruption, which is accompanied by a two-ribbon flare, and a consecutive confined flare in AR 11123. The results of our topology computation can also explain the locations of flare ribbons in two other events, one preceding and one following the ones at 07:16 UT. Finally, this study provides further examples where flare-ribbon locations can be explained when compared to QSLs and only, partially, when using separatrices.

  16. Analysis of large Danube flood events at Vienna since 1700

    NASA Astrophysics Data System (ADS)

    Kiss, Andrea; Blöschl, Günter; Hohensinner, Severin; Perdigao, Rui

    2014-05-01

    Whereas Danube water level measurements are available in Vienna from 1820 onwards, documentary evidence plays a significant role in the long-term understanding of Danube hydrological processes. Based on contemporary documentary evidence and early instrumental measurements, in the present paper we aim to provide an overview and a hydrological analysis of major Danube flood events, and of the changes that occurred in flood behaviour in Vienna in the last 300 years. Historical flood events are discussed and analysed according to types, seasonality, frequency and magnitude. For historical flood events we apply a classification of five-scaled indices that considers height, magnitude, length and impacts. The rich data coverage in Vienna, both in terms of documentary evidence and early instrumental measurements, provides us with the possibility of creating a relatively long overlap between documentary evidence and instrumental measurements. This makes it possible to evaluate and, to some extent, improve the index reconstruction. In examining the causes of changes in the flood regime, we aim to provide an overview of the atmospheric background through some characteristic examples, namely selected great flood events (e.g. 1787). Moreover, we also seek to answer questions such as in what way early (pre-instrumental period) human impacts such as water regulation and urban development changed flood behaviour in the town, and how much these might have an impact on the flood classification.

  17. Mixed-effects Poisson regression analysis of adverse event reports

    PubMed Central

    Gibbons, Robert D.; Segawa, Eisuke; Karabatsos, George; Amatya, Anup K.; Bhaumik, Dulal K.; Brown, C. Hendricks; Kapur, Kush; Marcus, Sue M.; Hur, Kwan; Mann, J. John

    2008-01-01

    SUMMARY A new statistical methodology is developed for the analysis of spontaneous adverse event (AE) reports from post-marketing drug surveillance data. The method involves both empirical Bayes (EB) and fully Bayes estimation of rate multipliers for each drug within a class of drugs, for a particular AE, based on a mixed-effects Poisson regression model. Both parametric and semiparametric models for the random-effect distribution are examined. The method is applied to data from Food and Drug Administration (FDA)’s Adverse Event Reporting System (AERS) on the relationship between antidepressants and suicide. We obtain point estimates and 95 per cent confidence (posterior) intervals for the rate multiplier for each drug (e.g. antidepressants), which can be used to determine whether a particular drug has an increased risk of association with a particular AE (e.g. suicide). Confidence (posterior) intervals that do not include 1.0 provide evidence for either significant protective or harmful associations of the drug and the adverse effect. We also examine EB, parametric Bayes, and semiparametric Bayes estimators of the rate multipliers and associated confidence (posterior) intervals. Results of our analysis of the FDA AERS data revealed that newer antidepressants are associated with lower rates of suicide adverse event reports compared with older antidepressants. We recommend improvements to the existing AERS system, which are likely to improve its public health value as an early warning system. PMID:18404622
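
    As a much-simplified stand-in for the shrinkage idea above (the paper fits a mixed-effects Poisson regression with EB and fully Bayes estimation; this hedged sketch uses a gamma-Poisson empirical Bayes model with made-up counts), per-drug rate multipliers are pulled toward the class mean in proportion to how little exposure each drug has:

    # Hedged, simplified empirical-Bayes shrinkage of per-drug AE rate multipliers.
    import numpy as np

    def eb_rate_multipliers(events, exposure, prior_shape=2.0):
        """Shrink raw rates events/exposure toward the class rate with a gamma prior."""
        events = np.asarray(events, float)
        exposure = np.asarray(exposure, float)
        class_rate = events.sum() / exposure.sum()
        prior_rate = prior_shape / class_rate            # gamma(prior_shape, prior_rate) prior
        posterior_mean = (events + prior_shape) / (exposure + prior_rate)
        return posterior_mean / class_rate               # multiplier relative to the class

    if __name__ == "__main__":
        # Hypothetical AE report counts and exposures (person-years) for four drugs.
        reports = [2, 15, 40, 1]
        exposure = [1_000, 5_000, 9_000, 4_000]
        for drug, m in zip("ABCD", eb_rate_multipliers(reports, exposure)):
            print(f"drug {drug}: shrunken rate multiplier = {m:.2f}")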

  18. Empirical Green's function analysis of recent moderate events in California

    USGS Publications Warehouse

    Hough, S.E.

    2001-01-01

    I use seismic data from portable digital stations and the broadband Terrascope network in southern California to investigate radiated earthquake source spectra and discuss the results in light of previous studies on both static stress drop and apparent stress. Applying the empirical Green's function (EGF) method to two sets of M 4-6.1 events, I obtain deconvolved source-spectra estimates and corner frequencies. The results are consistent with an ω⁻² source model and constant Brune stress drop. However, consideration of the raw spectral shapes of the largest events provides evidence for a high-frequency decay more shallow than ω⁻². The intermediate (∼f⁻¹) slope cannot be explained plausibly with attenuation or site effects and is qualitatively consistent with a model incorporating directivity effects and a fractional stress-drop rupture process, as suggested by Haddon (1996). However, the results obtained in this study are not consistent with the model of Haddon (1996) in that the intermediate slope is not revealed with EGF analysis. This could reflect either bandwidth limitations inherent in EGF analysis or perhaps a rupture process that is not self-similar. I show that a model with an intermediate spectral decay can also reconcile the apparent discrepancy between the scaling of static stress drop and that of apparent stress drop for moderate-to-large events.

  19. A Dendrochronological Analysis of Mississippi River Flood Events

    NASA Astrophysics Data System (ADS)

    Therrell, M. D.; Bialecki, M. B.; Peters, C.

    2012-12-01

    We used a novel tree-ring record of anatomically anomalous "flood rings" preserved in Oak (Quercus sp.) trees growing downstream of the Mississippi and Ohio River confluence to identify spring (MAM) flood events on the lower Mississippi River from C.E. 1694-2009. Our chronology includes virtually all of the observed high-magnitude spring floods of the 20th century as well as similar flood events in prior centuries occurring on the Mississippi River adjacent to the Birds Point-New Madrid Floodway. A response index analysis indicates that over half of the floods identified caused anatomical injury to well over 50% of the sampled trees and many of the greatest flood events are recorded by more than 80% of the trees at the site including 100% of the trees in the great flood of 1927. Twenty-five of the 40 floods identified as flood rings in the tree-ring record, occur during the instrumental observation period at New Madrid, Missouri (1879-2009), and comparison of the response index with average daily river stage height values indicates that the flood ring record can explain significant portions of the variance in both stage height (30%) and number of days in flood (40%) during spring flood events. The flood ring record also suggests that high-magnitude spring flooding is episodic and linked to basin-scale pluvial events driven by decadal-scale variability of the Pacific/North American pattern (PNA). This relationship suggests that the tree-ring record of flooding may also be used as a proxy record of atmospheric variability related to the PNA and related large-scale forcing.

  20. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differences among abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
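
    A sketch in the spirit of the descriptor and classifiers named above (magnitude-weighted histogram of optical-flow orientations, kernel PCA, one-class SVM). The flow fields here are synthetic stand-ins rather than flow computed from real video, and the parameter choices are assumptions.

      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.svm import OneClassSVM

      def flow_orientation_histogram(flow_u, flow_v, n_bins=16):
          """Magnitude-weighted histogram of flow orientations for one frame."""
          angles = np.arctan2(flow_v, flow_u).ravel()        # orientation of each flow vector
          magnitudes = np.hypot(flow_u, flow_v).ravel()      # weight strong motion more
          hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi), weights=magnitudes)
          total = hist.sum()
          return hist / total if total > 0 else hist

      rng = np.random.default_rng(0)
      # "Normal" frames: coherent rightward motion; "abnormal" frames: erratic directions.
      normal = np.array([flow_orientation_histogram(rng.normal(1.0, 0.2, (60, 80)),
                                                    rng.normal(0.0, 0.2, (60, 80))) for _ in range(200)])
      abnormal = np.array([flow_orientation_histogram(rng.normal(0.0, 1.5, (60, 80)),
                                                      rng.normal(0.0, 1.5, (60, 80))) for _ in range(20)])

      # Learning period on normal frames only, then classify unseen frames (+1 normal, -1 abnormal).
      kpca = KernelPCA(n_components=5, kernel="rbf", gamma=2.0).fit(normal)
      detector = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(kpca.transform(normal))
      print("normal frames  :", detector.predict(kpca.transform(normal[:5])))
      print("abnormal frames:", detector.predict(kpca.transform(abnormal[:5])))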

  1. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    Dana L. Kelly; Dale M. Rasmuson

    2008-09-01

    This paper describes the approach taken by the U. S. Nuclear Regulatory Commission to the treatment of common-cause failure in probabilistic risk assessment of operational events. The approach is based upon the Basic Parameter Model for common-cause failure, and examples are illustrated using the alpha-factor parameterization, the approach adopted by the NRC in their Standardized Plant Analysis Risk (SPAR) models. The cases of a failed component (with and without shared common-cause failure potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g., failure to start and failure to run) is a new feature of this paper. These methods are being applied by the NRC in assessing the risk significance of operational events for the Significance Determination Process (SDP) and the Accident Sequence Precursor (ASP) program.
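
    A hedged sketch of alpha-factor quantification for a common-cause failure group, using the non-staggered-testing form commonly quoted in the CCF literature; the group size, total failure probability and alpha factors below are invented, not values from any SPAR model.

      # Q_k = k / C(m-1, k-1) * (alpha_k / alpha_t) * Q_t, with alpha_t = sum_k k * alpha_k,
      # where m is the CCF group size, Q_t the total failure probability of one component,
      # and alpha_k the fraction of failure events involving exactly k components.
      from math import comb

      def ccf_probabilities(m, q_total, alphas):
          """Return {k: Q_k}, the probability of a basic event failing exactly k of m components."""
          alpha_t = sum(k * a for k, a in enumerate(alphas, start=1))
          return {k: k / comb(m - 1, k - 1) * (alphas[k - 1] / alpha_t) * q_total
                  for k in range(1, m + 1)}

      # Example: a group of 3 redundant pumps, hypothetical Q_t and (alpha_1, alpha_2, alpha_3).
      for k, qk in ccf_probabilities(m=3, q_total=1.0e-3, alphas=[0.95, 0.03, 0.02]).items():
          print(f"Q_{k} (exactly {k} of 3 fail together) = {qk:.2e}")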

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
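
    A minimal discrete-event simulation sketch of the kind of model described: service requests arrive according to a probability distribution and compete for a fixed pool of servers, and the sweep over pool sizes illustrates provisioning analysis. Arrival and service rates and the server counts are hypothetical.

      import heapq
      import random

      def simulate(n_servers=4, arrival_rate=3.0, service_rate=1.0, n_requests=10_000, seed=1):
          """Mean end-to-end response time for a first-come-first-served pool of servers."""
          random.seed(seed)
          t = 0.0
          server_free_at = [0.0] * n_servers          # earliest time each server becomes available
          heapq.heapify(server_free_at)
          response_times = []
          for _ in range(n_requests):
              t += random.expovariate(arrival_rate)   # Poisson arrivals
              service = random.expovariate(service_rate)
              free_at = heapq.heappop(server_free_at)  # earliest-available server
              start = max(t, free_at)                  # wait if all servers are busy
              finish = start + service
              heapq.heappush(server_free_at, finish)
              response_times.append(finish - t)
          return sum(response_times) / len(response_times)

      for servers in (2, 4, 8):                        # provisioning sweep
          print(f"{servers} servers -> mean response {simulate(n_servers=servers):.2f} time units")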

  3. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    SciTech Connect

    Lambert, H; Johnson, G

    2001-07-20

    The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in Table 4-2 of the HEAF SAR (Ref. 1). Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree. Bold downward branches indicate paths leading to the accident. The basic causes include conditions, failure of administrative controls (procedural or human error events) or failure of engineered controls (hardware, software or equipment failure) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events needed to cause an accident. Event trees can address statistical dependency of events, such as a sequence of human error events conducted by the same operator; in this case, dependent probabilities are used. Probabilities/frequencies are assigned to each branch. Another example of dependency would be when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
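
    A small sketch of the end-state quantification described in the last sentence above: an initiating frequency multiplied by the branch probabilities along a path. The operation frequency and branch probabilities are hypothetical, not taken from the HEAF SAR.

      # With dependent human errors, later branch values would be conditional probabilities
      # given the earlier branches rather than independent point values.
      operation_frequency = 50.0        # operations per year (hypothetical initiating branch)
      branch_probabilities = {
          "condition present":            0.10,    # fraction of time the condition exists
          "administrative control fails": 0.01,    # procedural / human error
          "engineered control fails":     1.0e-3,  # hardware, software or equipment failure
      }

      accident_frequency = operation_frequency
      for name, p in branch_probabilities.items():
          accident_frequency *= p

      print(f"end-state frequency ~ {accident_frequency:.1e} per year")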

  4. Bisphosphonates and Risk of Cardiovascular Events: A Meta-Analysis

    PubMed Central

    Kim, Dae Hyun; Rogers, James R.; Fulchino, Lisa A.; Kim, Caroline A.; Solomon, Daniel H.; Kim, Seoyoung C.

    2015-01-01

    Background and Objectives Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. Methods A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials with longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Results Absolute risks over 25–36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84–1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92–1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69–1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82–1.19]; I2 = 5.8%), and CV death (14 trials; 0.88 [0.72–1.07]; I2 = 0.0%) with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96–1.61]; I2 = 0.0%), not for oral bisphosphonates (26 trials; 1.02 [0.83–1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Conclusions Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large
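
    A sketch of the Mantel-Haenszel fixed-effects pooling used for the odds ratios above, with an approximate 95% confidence interval from the Robins-Breslow-Greenland variance as commonly stated; the 2x2 trial counts are invented for illustration.

      import numpy as np

      # (a, b, c, d) = (treated events, treated non-events, control events, control non-events)
      tables = np.array([
          [12, 488, 15, 485],
          [ 4, 196,  6, 194],
          [20, 980, 18, 982],
      ], dtype=float)

      a, b, c, d = tables.T
      n = a + b + c + d
      R, S = a * d / n, b * c / n
      or_mh = R.sum() / S.sum()                      # Mantel-Haenszel pooled odds ratio

      # Robins-Breslow-Greenland variance of log(OR_MH).
      P, Q = (a + d) / n, (b + c) / n
      var_log = ((P * R).sum() / (2 * R.sum() ** 2)
                 + (P * S + Q * R).sum() / (2 * R.sum() * S.sum())
                 + (Q * S).sum() / (2 * S.sum() ** 2))
      lo = or_mh * np.exp(-1.96 * np.sqrt(var_log))
      hi = or_mh * np.exp(1.96 * np.sqrt(var_log))
      print(f"MH pooled OR = {or_mh:.2f} (95% CI {lo:.2f}-{hi:.2f})")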

  5. An analysis of three nuclear events in P-Tunnel

    SciTech Connect

    Fourney, W.L.; Dick, R.D.; Taylor, S.R.; Weaver, T.A.

    1994-05-03

    This report examines experimental results obtained from three P Tunnel events -- Mission Cyber, Disko Elm, and Distant Zenith. The objective of the study was to determine if there were any differences in the explosive source coupling for the three events. It was felt that Mission Cyber might not have coupled well because the ground motions recorded for that event were much lower than expected based on experience from N Tunnel. Detailed examination of the physical and chemical properties of the tuff in the vicinity of each explosion indicated only minor differences. In general, the core samples are strong and competent out to at least 60 m from each working point. Qualitative measures of core sample strength indicate that the strength of the tuff near Mission Cyber may be greater than indicated by results of static testing. Slight differences in mineralogic content and saturation of the Mission Cyber tuff were noted relative to the other two tests, but probably would not result in large differences in ground motions. Examination of scaled free-field stress and acceleration records collected by Sandia National Laboratory (SNL) indicated that Disko Elm showed the least scatter and Distant Zenith the most scatter. Mission Cyber measurements tend to lie slightly below those of Distant Zenith, but still within two standard deviations. Analysis of regional seismic data from networks operated by Lawrence Livermore National Laboratory (LLNL) and SNL also show no evidence of Mission Cyber coupling low relative to the other two events. The overall conclusion drawn from the study is that there were no basic differences in the way that the explosions coupled to the rock.

  6. Multivariate cluster analysis of forest fire events in Portugal

    NASA Astrophysics Data System (ADS)

    Tonini, Marj; Pereira, Mario; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

    Portugal is one of the major fire-prone European countries, mainly due to its favourable climatic, topographic and vegetation conditions. Compared to the other Mediterranean countries, the number of events registered here from 1980 up to nowadays is the highest one; likewise, with respect to the burnt area, Portugal is the third most affected country. Portuguese mapped burnt areas are available from the website of the Institute for the Conservation of Nature and Forests (ICNF). This official geodatabase is the result of satellite measurements starting from the year 1990. The spatial information, delivered in shapefile format, provides a detailed description of the shape and the size of the area burnt by each fire, while the date/time information related to the fire ignition is restricted to the year of occurrence. In terms of a statistical formalism, wildfires can be associated with a stochastic point process, where events are analysed as a set of geographical coordinates corresponding, for example, to the centroid of each burnt area. The spatio-temporal pattern of stochastic point processes, including cluster analysis, is a basic procedure to discover predisposing factors as well as for prevention and forecasting purposes. These kinds of studies are primarily focused on investigating the spatial cluster behaviour of environmental data sequences and/or mapping their distribution at different times. To include both dimensions (space and time), a comprehensive spatio-temporal analysis is needed. In the present study the authors attempt to verify if, in the case of wildfires in Portugal, space and time act independently or if, conversely, neighbouring events are also closer in time. We present an application of the spatio-temporal K-function to a long dataset (1990-2012) of mapped burnt areas. Moreover, the multivariate K-function allowed checking for an eventual different distribution between small and large fires. The final objective is to elaborate a 3D
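
    A compact sketch of the space-time K-function idea used above, without edge corrections: compare the fraction of event pairs that are close in both space and time with the product of the purely spatial and purely temporal fractions; an excess indicates space-time clustering. The coordinates and years below are synthetic, not the ICNF data.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 300
      xy = rng.uniform(0, 100, size=(n, 2))                 # burnt-area centroids (km, synthetic)
      years = rng.integers(1990, 2013, size=n).astype(float)

      d_space = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
      d_time = np.abs(years[:, None] - years[None, :])
      off_diag = ~np.eye(n, dtype=bool)                     # exclude self-pairs

      def pair_fraction(mask):
          return mask[off_diag].mean()

      for r, t in [(5.0, 1.0), (10.0, 2.0), (20.0, 5.0)]:
          k_st = pair_fraction((d_space <= r) & (d_time <= t))
          k_s, k_t = pair_fraction(d_space <= r), pair_fraction(d_time <= t)
          verdict = "clustered in space-time" if k_st > k_s * k_t else "no extra clustering"
          print(f"r={r:>4} km, t={t:>3} yr: ST={k_st:.4f}  S*T={k_s * k_t:.4f}  {verdict}")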

  7. Mining for adverse drug events with formal concept analysis.

    PubMed

    Estacio-Moreno, Alexander; Toussaint, Yannick; Bousquet, Cédric

    2008-01-01

    The pharmacovigilance databases consist of several case reports involving drugs and adverse events (AEs). Some methods are applied consistently to highlight all signals, i.e. all statistically significant associations between a drug and an AE. These methods are appropriate for verification of more complex relationships involving one or several drug(s) and AE(s) (e.g. syndromes or interactions) but do not address their identification. We propose a method for the extraction of these relationships based on Formal Concept Analysis (FCA) associated with disproportionality measures. This method identifies all sets of drugs and AEs which are potential signals, syndromes or interactions. Compared to a previous experience of disproportionality analysis without FCA, the addition of FCA was more efficient for identifying false positives related to concomitant drugs. PMID:18487830
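
    A toy formal concept analysis sketch in the spirit of the method above: case reports are objects, the drugs and adverse events they mention are attributes, and every formal concept pairs a maximal set of reports with the maximal set of attributes they share. The reports are invented, and the disproportionality scoring of each concept is omitted for brevity.

      from itertools import combinations

      # object -> set of attributes (drugs prefixed "d:", adverse events prefixed "ae:")
      reports = {
          "case1": {"d:drugA", "ae:nausea"},
          "case2": {"d:drugA", "d:drugB", "ae:nausea", "ae:rash"},
          "case3": {"d:drugB", "ae:rash"},
          "case4": {"d:drugA", "d:drugB", "ae:nausea"},
      }
      attributes = set().union(*reports.values())

      def extent(intent):
          """All reports containing every attribute of the intent."""
          return {o for o, attrs in reports.items() if intent <= attrs}

      # Concept intents are the full attribute set plus all intersections of report rows.
      intents = {frozenset(attributes)}
      for k in range(1, len(reports) + 1):
          for group in combinations(reports.values(), k):
              intents.add(frozenset(set.intersection(*group)))

      for intent in sorted(intents, key=len, reverse=True):
          print(sorted(extent(intent)), "<->", sorted(intent))

    Each printed pair is a candidate drug/AE set; in the method described above, disproportionality measures would then be computed on these sets to decide which are potential signals, syndromes or interactions.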

  8. Analysis of warm convective rain events in Catalonia

    NASA Astrophysics Data System (ADS)

    Ballart, D.; Figuerola, F.; Aran, M.; Rigo, T.

    2009-09-01

    Between the end of September and November, events with high amounts of rainfall are quite common in Catalonia. The high sea surface temperature of the Mediterranean Sea near the Catalan coast is one of the most important factors that help the development of this type of storm. Some of these events have particular characteristics: elevated rain rate during short time periods, not very deep convection and low lightning activity. Consequently, the use of remote sensing tools for surveillance is quite limited. The high rain efficiency is caused by internal mechanisms of the clouds, and also by the air mass in which the precipitation structure develops. As mentioned above, the contribution of the sea to the air mass is very relevant, not only through the increase of large condensation nuclei, but also through the high temperature of the low layers of the atmosphere, which allows clouds with 5 or 6 km of particles in the liquid phase. In fact, liquid-phase particles can be found in these clouds up to the -15°C level. Due to these characteristics, this type of rain structure can produce high quantities of rainfall in a relatively brief period of time and, if quasi-stationary, precipitation values at the surface can be very important. From the point of view of remote sensing tools, the cloud nature implies that the different tools and methodologies commonly used for the analysis of heavy rain events are not useful. This is caused by the following features: lightning is rarely observed, the top temperatures of the clouds are not cold enough to be enhanced in the satellite imagery, and, finally, radar reflectivity values are lower than in other heavy rain cases. The third point to take into account is the vulnerability of the affected areas. A large percentage of the Catalan population lives in the coastal region. In the central coast of Catalonia, the urban areas are surrounded by a not very high mountain range with small basins and

  9. CDAW 9 analysis of magnetospheric events on May 3, 1986: Event C

    SciTech Connect

    Baker, D.N.; Pulkkinen, T.I. (Finnish Meteorological Inst., Helsinki); McPherron, R.L. (Univ. of California, Los Angeles); Craven, J.D.; Frank, L.A.; Elphinstone, R.D.; Murphree, J.S.; Fennell, J.F.; Lopez, R.E.; Nagai, T.

    1993-03-01

    The ninth Coordinated Data Analysis Workshop focused upon several intervals within the PROMIS period. Event interval C comprised the period 0000-1200 UT on May 3, 1986, which was a highly disturbed time near the end of a geomagnetic storm interval. A very large substorm early in the period commenced at 0111 UT and had a peak AE index value of approximately 1500 nT. Subsequent activity was lower, but at least three other substorms occurred at 2-3 hour intervals. The substorms on May 3 were well observed by a variety of satellites including ISEE 1, 2 and IMP 8 in the magnetotail plus SCATHA, GOES, GMS, and LANL spacecraft at or near geostationary orbit. A particularly important feature of the 0111 UT substorm was the simultaneous imaging of the southern auroral oval by DE 1 and of the northern auroral oval by Viking. The excellent constellation of spacecraft near local midnight in the radial range 5-9 RE made it possible to study the strong cross-tail current development during the expansion phase. A clear latitudinal separation (≥10°) of the initial region of auroral brightening and the region of intense westward electrojet current was identified. The combined ground, near-tail and imaging data for this event provided an unprecedented opportunity to investigate tail current development, field line mapping, and substorm onset mechanisms. Evidence is presented for strong current diversion within the near-tail plasma sheet during the late growth phase and strong current disruption and field-aligned current formation from deeper in the tail at substorm onset. The authors conclude that these results are consistent with a model of magnetic neutral line formation in the late growth phase which causes plasma sheet current diversion before the substorm onset. The expansion phase onset occurs later due to reconnection of lobelike magnetic field lines and roughly concurrent cross-tail disruption in the inner plasma sheet region. 52 refs., 14 figs., 1 tab.

  10. Collective analysis of ORPS-reportable electrical events (June, 2005-August 2009)

    SciTech Connect

    Henins, Rita J; Hakonson - Hayes, Audrey C

    2010-01-01

    The analysis of LANL electrical events between June 30, 2005 and August 31, 2009 provides data that indicate some potential trends regarding ISM failure modes, activity types associated with reportable electrical events, and ORPS causal codes. This report discusses the identified potential trends for Shock events and compares attributes of the Shock events against Other Electrical events and overall ORPS-reportable events during the same time frame.

  11. Bayesian analysis for extreme climatic events: A review

    NASA Astrophysics Data System (ADS)

    Chu, Pao-Shin; Zhao, Xin

    2011-11-01

    This article reviews Bayesian analysis methods applied to extreme climatic data. We particularly focus on applications to three different problems related to extreme climatic events including detection of abrupt regime shifts, clustering tropical cyclone tracks, and statistical forecasting for seasonal tropical cyclone activity. For identifying potential change points in an extreme event count series, a hierarchical Bayesian framework involving three layers - data, parameter, and hypothesis - is formulated to demonstrate the posterior probability of the shifts throughout the time. For the data layer, a Poisson process with a gamma distributed rate is presumed. For the hypothesis layer, multiple candidate hypotheses with different change-points are considered. To calculate the posterior probability for each hypothesis and its associated parameters we developed an exact analytical formula, a Markov Chain Monte Carlo (MCMC) algorithm, and a more sophisticated reversible jump Markov Chain Monte Carlo (RJMCMC) algorithm. The algorithms are applied to several rare event series: the annual tropical cyclone or typhoon counts over the central, eastern, and western North Pacific; the annual extremely heavy rainfall event counts at Manoa, Hawaii; and the annual heat wave frequency in France. Using an Expectation-Maximization (EM) algorithm, a Bayesian clustering method built on a mixture Gaussian model is applied to objectively classify historical, spaghetti-like tropical cyclone tracks (1945-2007) over the western North Pacific and the South China Sea into eight distinct track types. A regression based approach to forecasting seasonal tropical cyclone frequency in a region is developed. Specifically, by adopting large-scale environmental conditions prior to the tropical cyclone season, a Poisson regression model is built for predicting seasonal tropical cyclone counts, and a probit regression model is alternatively developed toward a binary classification problem. With a non
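
    A hedged sketch of the change-point layer described above: annual event counts are Poisson with a gamma-distributed rate, and candidate hypotheses place a single shift after year tau (or no shift). With a conjugate Gamma(a, b) prior the segment marginal likelihoods, and hence the posterior over hypotheses, are available in closed form. The count series and prior below are invented.

      import numpy as np
      from scipy.special import gammaln

      counts = np.array([2, 3, 1, 2, 4, 2, 6, 7, 5, 8, 6, 7])   # annual event counts (synthetic)
      a, b = 1.0, 0.5                                            # Gamma(shape a, rate b) prior on the rate

      def log_marginal(y):
          """log p(y) for a Poisson segment whose rate has a Gamma(a, b) prior."""
          n, s = len(y), y.sum()
          return (a * np.log(b) - gammaln(a)
                  + gammaln(a + s) - (a + s) * np.log(b + n)
                  - gammaln(y + 1).sum())

      # Hypotheses: no change, or a change after year index tau (tau = 1 .. n-1); uniform prior.
      taus = list(range(1, len(counts)))
      log_evidence = np.array([log_marginal(counts)]
                              + [log_marginal(counts[:tau]) + log_marginal(counts[tau:]) for tau in taus])
      posterior = np.exp(log_evidence - log_evidence.max())
      posterior /= posterior.sum()

      print(f"P(no change) = {posterior[0]:.3f}")
      best = taus[np.argmax(posterior[1:])]
      print(f"most probable shift after year index {best} (P = {posterior[1:].max():.3f})")

    The MCMC and reversible-jump algorithms mentioned in the review generalize this exact calculation to multiple change points and non-conjugate layers.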

  12. Systematic Analysis of Adverse Event Reports for Sex Differences in Adverse Drug Events

    PubMed Central

    Yu, Yue; Chen, Jun; Li, Dingcheng; Wang, Liwei; Wang, Wei; Liu, Hongfang

    2016-01-01

    Increasing evidence has shown that sex differences exist in Adverse Drug Events (ADEs). Identifying those sex differences in ADEs could reduce the experience of ADEs for patients and could be conducive to the development of personalized medicine. In this study, we analyzed a normalized US Food and Drug Administration Adverse Event Reporting System (FAERS). Chi-squared test was conducted to discover which treatment regimens or drugs had sex differences in adverse events. Moreover, reporting odds ratio (ROR) and P value were calculated to quantify the signals of sex differences for specific drug-event combinations. Logistic regression was applied to remove the confounding effect from the baseline sex difference of the events. We detected among 668 drugs of the most frequent 20 treatment regimens in the United States, 307 drugs have sex differences in ADEs. In addition, we identified 736 unique drug-event combinations with significant sex differences. After removing the confounding effect from the baseline sex difference of the events, there are 266 combinations remained. Drug labels or previous studies verified some of them while others warrant further investigation. PMID:27102014
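
    A sketch of the reporting odds ratio (ROR) and chi-squared calculations mentioned above, for a single hypothetical drug-event combination evaluated separately in female and male reports; all counts are made up and the logistic-regression confounding adjustment is not shown.

      import numpy as np
      from scipy.stats import chi2_contingency

      def ror(a, b, c, d):
          """ROR and 95% CI from a 2x2 table: a,b = target drug with/without event; c,d = other drugs."""
          est = (a * d) / (b * c)
          se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
          return est, est * np.exp(-1.96 * se), est * np.exp(1.96 * se)

      # Hypothetical FAERS-like counts for one drug-event pair, split by sex.
      female = ror(a=40, b=960, c=400, d=59600)
      male   = ror(a=15, b=985, c=380, d=58620)
      print("female ROR %.2f (%.2f-%.2f), male ROR %.2f (%.2f-%.2f)" % (*female, *male))

      # Chi-squared test: does the event rate for this drug differ by sex?
      table = np.array([[40, 960], [15, 985]])    # rows = sex, columns = event / no event
      chi2, p, _, _ = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, p = {p:.4f}")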

  13. Event-by-Event pseudorapidity fluctuation analysis: An outlook to multiplicity and phase space dependence

    NASA Astrophysics Data System (ADS)

    Bhoumik, Gopa; Bhattacharyya, Swarnapratim; Deb, Argha; Ghosh, Dipak

    2016-07-01

    A detailed study of event-by-event pseudorapidity fluctuation of the pions produced in 16O-AgBr interactions at 60A GeV and 32S-AgBr interactions at 200A GeV has been carried out in terms of φ, a variable defined as a measure of fluctuation. Non-zero φ values indicate the presence of strong correlation among the pions for both interactions. The multiplicity and rapidity dependence of the event-by-event pseudorapidity fluctuation has been investigated. A decrease of φ with average multiplicity and an increase of the same variable with pseudorapidity width are observed. The decrease of φ with average multiplicity is attributed to particle emission by several independent sources in higher-multiplicity events. The increase in φ values with pseudorapidity width, taken around central rapidity, might hint at the presence of long-range correlation and its dominance over the short-range one. We have compared our experimental results with a Monte Carlo simulation generated assuming independent particle emission. The comparison shows that the source of correlation and fluctuation is the dynamics of the pion production process. We have also compared our results with events generated by the FRITIOF code. Such events also show the presence of fluctuation and correlation; however, they fail to replicate the experimental findings.

  14. An event history analysis of union joining and leaving.

    PubMed

    Buttigieg, Donna M; Deery, Stephen J; Iverson, Roderick D

    2007-05-01

    This article examines parallel models of union joining and leaving using individual-level longitudinal panel data collected over a 5-year period. The authors utilized objective measures of joining and leaving collected from union and organizational records and took into account time by using event history analysis. The results indicated that union joining was negatively related to procedural justice and higher performance appraisals and positively related to partner socialization and extrinsic union instrumentality. Conversely, members were most likely to leave the union when they perceived lower procedural justice, where there was no union representative present in the workplace, and where they had individualistic orientations. The authors discuss the implications of these findings for theory and practice for trade unions. PMID:17484562

  15. Bootstrap analysis of the single subject with event related potentials.

    PubMed

    Oruç, Ipek; Krigolson, Olav; Dalrymple, Kirsten; Nagamatsu, Lindsay S; Handy, Todd C; Barton, Jason J S

    2011-07-01

    Neural correlates of cognitive states in event-related potentials (ERPs) serve as markers for related cerebral processes. Although these are usually evaluated in subject groups, the ability to evaluate such markers statistically in single subjects is essential for case studies in neuropsychology. Here we investigated the use of a simple test based on nonparametric bootstrap confidence intervals for this purpose, by evaluating three different ERP phenomena: the face-selectivity of the N170, error-related negativity, and the P3 component in a Posner cueing paradigm. In each case, we compare single-subject analysis with statistical significance determined using bootstrap to conventional group analysis using analysis of variance (ANOVA). We found that the proportion of subjects who show a significant effect at the individual level based on bootstrap varied, being greatest for the N170 and least for the P3. Furthermore, it correlated with significance at the group level. We conclude that the bootstrap methodology can be a viable option for interpreting single-case ERP amplitude effects in the right setting, probably with well-defined stereotyped peaks that show robust differences at the group level, which may be more characteristic of early sensory components than late cognitive effects. PMID:22292858
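
    A minimal sketch of the single-subject bootstrap idea described above: resample trials with replacement within each condition, recompute the mean amplitude difference in a component window, and ask whether the 95% bootstrap confidence interval excludes zero. The trial amplitudes are simulated stand-ins for, e.g., N170 face versus non-face trials.

      import numpy as np

      rng = np.random.default_rng(7)
      faces = rng.normal(-6.0, 3.0, size=80)        # single-trial N170 amplitudes (uV), condition A
      objects_ = rng.normal(-4.0, 3.0, size=80)     # condition B

      n_boot = 10_000
      diffs = np.empty(n_boot)
      for i in range(n_boot):
          diffs[i] = (rng.choice(faces, faces.size, replace=True).mean()
                      - rng.choice(objects_, objects_.size, replace=True).mean())

      lo, hi = np.percentile(diffs, [2.5, 97.5])
      observed = faces.mean() - objects_.mean()
      print(f"observed difference {observed:.2f} uV, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
      print("significant at the single-subject level" if lo > 0 or hi < 0 else "not significant")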

  16. CDAW-9 analysis of magnetospheric events on 3 May 1986: Event C. Technical report

    SciTech Connect

    Baker, D.N.; Pulkkinen, T.I.; McPherron, R.L.; Craven, J.D.; Frank, L.A.

    1993-10-01

    The ninth Coordinated Data Analysis Workshop (CDAW-9) focussed upon several intervals within the PROMIS period (March-June 1986). Event interval C comprised the period 0000-1200 UT on 3 May 1986 which was a highly disturbed time near the end of a geomagnetic storm interval. A very large substorm early in the period commenced at 0111 UT and had a peak AE index value of approx. 1500 nT. Subsequent activity was lower, but at least three other substorms occurred at 2-3 hour intervals. The substorms on 3 May were well observed by a variety of satellites, including ISEE-1, -2, and IMP-8 in the magnetotail plus SCATHA, GOES, GMS, and LANL spacecraft at or near geostationary orbit. A particularly important feature of the 0111 UT substorm was the simultaneous imaging of the southern auroral oval by DE-1 and of the northern oval by Viking. The excellent constellation of spacecraft near local midnight in the radial range 5-9 RE made it possible to study the strong cross-tail current development during the substorm growth phase and the current disruption and current wedge development during the expansion phase. The authors use a time-evolving magnetic field model to map observed auroral features out into the magnetospheric equatorial plane. There was both a dominant eastward and a weaker westward progression of activity following the expansion phase. A clear latitudinal separation of the initial region of auroral brightening and the region of intense westward electrojet current was identified.

  17. Analysis of marine stratocumulus clearing events during FIRE

    NASA Technical Reports Server (NTRS)

    Kloesel, Kevin A.

    1990-01-01

    During FIRE, three major stratocumulus clearing events took place over the project region. These clearing events are analyzed using synoptic variables to determine if these clearing events can be predicted by current modeling techniques. A preliminary statistical evaluation of the correlation between satellite cloud brightness parameters and NMC global model parameters is available in Wylie, et al., 1989.

  18. Analysis of Events Associated with First Charge of Desicooler Material

    SciTech Connect

    Alexander, D.E.

    2003-09-15

    HB-Line's mission included dissolution of uranium-aluminum scrap left over from a U3O8 scrap recovery program begun in 1972 with material returned from Rocky Flats and Oak Ridge. This material has been stored in desicooler containers, and is commonly referred to as the Desicoolers. The Scrap Recovery process includes the dissolution of scrap material and transfer of the resulting solution to H-Canyon for further disposition. During the first charge of this material into the HB-Line dissolvers, the solution heated to boiling without external heat being added. Yellow-colored fumes, which dissipated rapidly, were noted in the glovebox by operators, and a small amount of liquid was noted in the glovebox by operations after dissolver cooldown. This technical report documents analysis of the data from the event with respect to potential Safety Basis violation and the Integrated Safety Management System process. Based on the analysis presented, the safety basis has shown its ability to protect the worker, the facility and the public.

  19. Cluster analysis of intermediate deep events in the southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2015-04-01

    The Hellenic subduction zone (HSZ) is the seismically most active region in Europe, where the oceanic African lithosphere is subducting beneath the continental Aegean plate. Although there are numerous studies of seismicity in the HSZ, very few focus on the eastern HSZ and the Wadati-Benioff zone of the subducting slab in that part of the HSZ. In order to gain a better understanding of the geodynamic processes in the region a dense local seismic network is required. From September 2005 to March 2007, the temporary seismic network EGELADOS was deployed covering the entire HSZ. It consisted of 56 onshore and 23 offshore broadband stations, with the addition of 19 stations from GEOFON, NOA and MedNet to complete the network. Here, we focus on a cluster of intermediate-depth seismicity recorded by the EGELADOS network within the subducting African slab in the region of the Nisyros volcano. The cluster consists of 159 events at 80 to 190 km depth with magnitudes between 0.2 and 4.1 that were located using the nonlinear location tool NonLinLoc. A double-difference earthquake relocation using the HypoDD software is performed with both manual readings of onset times and differential traveltimes obtained by separate cross correlation of P- and S-waveforms. Single event locations are compared to relative relocations. The event hypocenters fall into a thin zone close to the top of the slab, defining its geometry with an accuracy of a few kilometers. At intermediate depth the slab is dipping towards the NW at an angle of about 30°, which means it is dipping more steeply than in the western part of the HSZ. The edge of the slab is clearly defined by an abrupt disappearance of intermediate-depth seismicity towards the NE. It is found approximately beneath the Turkish coastline. Furthermore, results of a cluster analysis based on the cross correlation of three-component waveforms are shown as a function of frequency, and the spatio-temporal migration of the seismic activity is analysed.

  20. The Tunguska event and Cheko lake origin: dendrochronological analysis

    NASA Astrophysics Data System (ADS)

    Rosanna, Fantucci; Romano, Serra; Gunther, Kletetschka; Mario, Di Martino

    2015-07-01

    Dendrochronological research was carried out on 23 tree samples (Larix sibirica and Picea obovata) collected during the 1999 expedition in two locations, close to the epicentre zone and near Cheko lake (N 60°57', E 101°51'). Basal Area Increment (BAI) analysis has shown a general long growth suppression before 1908, the year of the Tunguska event (TE), followed by a sudden growth increase due to the diminished competition from trees that died in the event. In one group of trees, we detected a growth decrease for several years (due to damage to the trunk, branches and crown), followed by a growth increase during the following 4-14 years. We show that trees that germinated after the TE and grew in close proximity to Cheko lake (Cheko lake trees) had different behaviour patterns compared to trees growing further from Cheko lake, inside the forest (Forest trees). Cheko lake trees have shown a vigorous, continuous growth increase. Forest trees have shown vigorous growth during the first 10-30 years of age, followed by a period of suppressed growth. We interpret the suppressed growth as the result of re-established competition with the surrounding trees. The Cheko lake pattern, however, is consistent with the formation of the lake at the time of the TE. This observation supports the hypothesis that Cheko lake formed from a fragment originating during the TE, creating a small impact crater in the permafrost and soft alluvial deposits of the Kimku River plain. This is further supported by the fact that Cheko lake has an elliptical shape elongated towards the epicentre of the TE.

  1. Event Detection and Spatial Analysis for Characterizing Extreme Precipitation

    NASA Astrophysics Data System (ADS)

    Jeon, S.; Prabhat, M.; Byna, S.; Collins, W.; Wehner, M. F.

    2013-12-01

    Atmospheric Rivers (ARs) are large spatially coherent weather systems with high concentrations of elevated water vapor that often cause severe downpours and flooding over western coastal United States. With the availability of more atmospheric moisture in the future under global warming, we expect ARs to play an important role as a potential cause of extreme precipitation. We have recently developed TECA software for automatically identifying and tracking features in climate datasets. In particular, we are able to identify ARs that make landfall on the western coast of North America. This detection tool examines integrated water vapor field above a certain threshold and performs geometric analysis. Based on the detection procedure, we investigate impacts of ARs by exploring spatial extent of AR precipitation for CMIP5 simulations, and characterize spatial pattern of dependence for future projections under climate change within the framework of extreme value theory. The results show that AR events in RCP8.5 scenario (2076-2100) tend to produce heavier rainfall with higher frequency and longer duration than the events from historical run (1981-2005). Range of spatial dependence between extreme precipitations is concentrated on smaller localized area in California under the highest emission scenario than present day. Preliminary results are illustrated in Figure 1 and 2. Fig 1: Boxplot of annual max precipitation (left two) and max AR precipitation (right two) from GFDL-ESM2M during 25-year time period by station in California, US. Fig 2: Spatial dependence of max AR precipitation calculated from Station 4 (triangle) for historical run (left) and for future projections of RCP8.5 (right) from GFDL-ESM2M. Green and orange colors represent complete dependence and independence between two stations respectively.

  2. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    SciTech Connect

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions the error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
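
    A short sketch of Fisher's and Tippett's combinations of single-phenomenology p-values (e.g. an Ms:mb test and a depth test) into one event-screening decision; the individual p-values below are hypothetical.

      import numpy as np
      from scipy import stats

      p_values = np.array([0.08, 0.20])          # e.g. p(Ms:mb) and p(depth) under H0

      # Fisher: -2 * sum(log p) ~ chi-squared with 2k degrees of freedom under H0.
      fisher_stat = -2.0 * np.log(p_values).sum()
      fisher_p = stats.chi2.sf(fisher_stat, df=2 * p_values.size)

      # Tippett: based on the smallest p-value; combined p = 1 - (1 - min p)^k.
      tippett_p = 1.0 - (1.0 - p_values.min()) ** p_values.size

      print(f"Fisher combined p = {fisher_p:.3f}, Tippett combined p = {tippett_p:.3f}")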

  3. Photographic Analysis Technique for Assessing External Tank Foam Loss Events

    NASA Technical Reports Server (NTRS)

    Rieckhoff, T. J.; Covan, M.; O'Farrell, J. M.

    2001-01-01

    A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.

  4. Teleseismic Events Analysis with AQDB and ITAB stations, Brazil

    NASA Astrophysics Data System (ADS)

    Felício, L. D.; Vasconcello, E.; Assumpção, M.; Rodrigues, F.; Facincani, E.; Dias, F.

    2013-05-01

    This work surveys seismic activity, mainly from the Andean region at distances over 1500 km, recorded by the Brazilian seismographic stations AQDB and ITAB in 2012. The stations are located in the cities of Aquidauana and Itajai, both in the central-west region of Brazil, with coordinates -20°48'S, -55°70'W and -27°24'S, -52°13'W, respectively. We determined the magnitudes mb and Ms, the epicentral distances, and the experimental and theoretical P-wave arrival times (using the IASP91 model). With the programs SAC (SEISMIC ANALYSIS CODE), TAUP and Seisgram (Seismogram Viewer), it was possible to determine the mentioned magnitudes. We identified around twenty events for each station and correlated the calculated magnitudes (AQDB and ITAB) with the magnitude data published in the bulletin of the National Earthquake Information Center (NEIC). The linear regression shows that the mb and Ms magnitudes at the two stations are close to the values reported by the NEIC (correlations of 97.1% for mb and 96.5% for Ms). The P-wave arrival times at stations ITAB and AQDB show average deviations of 2.2 and 2.7 seconds, respectively; in other words, the time difference between the experimental and theoretical P waves may be related to the positioning of each station and to the heterogeneity of the structure and composition of the rock massif in each region.

  5. Time to tenure in Spanish universities: an event history analysis.

    PubMed

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  6. Time to Tenure in Spanish Universities: An Event History Analysis

    PubMed Central

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  7. Regional Frequency Analysis of extreme rainfall events, Tuscany (Italy)

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Chiarello, V.; Rossi, G.

    2014-12-01

    The assessment of extreme hydrological events at sites characterized by short time series, or where no data record exists, has been mainly obtained by regional models. Regional frequency analysis based on the index variable procedure is implemented here to describe the annual maximum of rainfall depth of short durations in the Tuscany region. The TCEV (Two Component Extreme Value) probability distribution is used in the frame of the procedure for the parameter estimation, based on a three-level hierarchical approach. The methodology deals with the delineation of homogeneous regions, the identification of a robust regional frequency distribution and the assessment of the scale factor, i.e. the index rainfall. The data set includes the annual maximum of daily rainfall of 351 gauge stations with at least 30 years of records, in the period 1916-2012, and the extreme rainfalls of short duration (1, 3, 6, 12 and 24 hours). Different subdivision hypotheses have been tested. A four-region subdivision, coincident with four subregions, which takes into account the orography and the geomorphological and climatic peculiarities of the Tuscany region, has been adopted. In particular, for testing the regional homogeneity, the cumulative frequency distributions of the observed skewness and variation coefficients of the recorded time series are compared with the theoretical frequency distribution obtained through a Monte Carlo technique. The related L-skewness and L-variation coefficients are also examined. The Student t-test and the Wilcoxon test for the mean, as well as the χ2 test, were also applied. Further tests of the subdivision hypotheses have been made through the application of the discordancy D and heterogeneity H tests and the analysis of the observed and theoretical TCEV model growth curves. For each region the daily rainfall growth curve has been estimated. The growth curves for the hourly durations have been estimated when the daily rainfall growth curve

  8. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  9. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    SciTech Connect

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, synonymously termed Extreme Events. The implemented simulation method is currently confined to simulating the steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper proposes a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.

  10. Fault Tree Analysis: An Operations Research Tool for Identifying and Reducing Undesired Events in Training.

    ERIC Educational Resources Information Center

    Barker, Bruce O.; Petersen, Paul D.

    This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault-tree analysis investigates potentially undesirable events and then looks for sequences of failures that would lead to their occurrence. Relationships among these events are symbolized by AND or OR logic gates, AND being used when single events must coexist to…
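
    A small sketch of AND/OR gate quantification for independent basic events, in the spirit of the fault-tree approach described above; the event names and probabilities are hypothetical.

      from math import prod

      def or_gate(*probs):   # output event occurs if ANY input failure occurs
          return 1.0 - prod(1.0 - p for p in probs)

      def and_gate(*probs):  # output event occurs only if ALL input failures coexist
          return prod(probs)

      p_sensor_fails = 1.0e-3
      p_operator_misses_alarm = 5.0e-2
      p_backup_unavailable = 1.0e-2

      # Undesired top event: "hazard undetected AND backup unavailable",
      # where "hazard undetected" requires sensor failure OR a missed alarm.
      p_undetected = or_gate(p_sensor_fails, p_operator_misses_alarm)
      p_top = and_gate(p_undetected, p_backup_unavailable)
      print(f"P(top event) = {p_top:.2e}")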

  11. An integrated system for hydrological analysis of flood events

    NASA Astrophysics Data System (ADS)

    Katsafados, Petros; Chalkias, Christos; Karymbalis, Efthymios; Gaki-Papanastassiou, Kalliopi; Mavromatidis, Elias; Papadopoulos, Anastasios

    2010-05-01

    The significant increase of extreme flood events during recent decades has led to an urgent social and economic demand for improved prediction and sustainable prevention. Remedial actions require accurate estimation of the spatiotemporal variability of runoff volume and local peaks, which can be analyzed through integrated simulation tools. Such advanced modeling systems allow the investigation of the dynamics controlling the behavior of these complex processes, and they can also be used as early warning systems. Moreover, simulation is assumed to be the appropriate method to derive quantitative estimates of various atmospheric and hydrologic parameters, especially in cases where reliable and accurate measurements of precipitation and flow rates are absent. Such sophisticated techniques enable flood risk assessment and improve decision-making support for protection actions. This study presents an integrated system for the simulation of the essential atmospheric and soil parameters in the context of hydrological flood modeling. The system consists of two main cores: a numerical weather prediction model coupled with a geographical information system for the accurate simulation of groundwater advection and rainfall runoff estimation. Synoptic and mesoscale atmospheric motions are simulated with a non-hydrostatic limited area model on a very high resolution domain of integration. The model includes advanced schemes for the microphysics and the surface layer physics description as well as the longwave and shortwave radiation budget estimation. It is also fully coupled with a land-surface model in order to resolve the surface heat fluxes and simulate the air-land energy exchange processes. Detailed atmospheric and soil parameters derived from the atmospheric model are used as input data for the GIS-based runoff modeling. Geographical information system (GIS) technology is used for further hydrological analysis and estimation of direct

  12. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
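
    A hedged sketch of the LDA idea: fit topics on "normal" network-log documents and use the per-word likelihood under the fitted model as an anomaly score for new sessions. The toy log lines are invented, and this particular scoring rule is one plausible choice rather than the paper's method.

      import numpy as np
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation

      normal_sessions = [
          "dns query internal host http get intranet page",
          "http get intranet page dns query internal host",
          "smtp send mail internal relay dns query internal host",
      ] * 30
      suspect_session = ["dns query external host large outbound transfer encrypted archive"]

      vectorizer = CountVectorizer()
      X = vectorizer.fit_transform(normal_sessions)
      lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)

      def per_word_score(texts):
          """Average per-word log-likelihood under the fitted topic model (lower = more unusual)."""
          Xt = vectorizer.transform(texts)
          return lda.score(Xt) / max(Xt.sum(), 1)

      print("normal  :", per_word_score(normal_sessions[:3]))
      print("suspect :", per_word_score(suspect_session))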

  13. [MedDRA and its applications in statistical analysis of adverse events].

    PubMed

    Lu, Meng-jie; Liu, Yu-xiu

    2015-11-01

    Safety assessment in clinical trials is dependent on an in-depth analysis of the adverse events to a great extent. However, there are difficulties in summary classification, data management and statistical analysis of the adverse events because of the different expressions on the same adverse events caused by regional, linguistic, ethnic, cultural and other differences. In order to ensure the normative expressions, it's necessary to standardize the terms in recording the adverse events. MedDRA (medical dictionary for regulatory activities) has been widely recommended and applied in the world as a powerful support for the adverse events reporting in clinical trials. In this paper, the development history, applicable scope, hierarchy structure, encoding term selection and standardized query strategies of the MedDRA is introduced. Furthermore, the practical process of adverse events encoding with MedDRA is proposed. Finally, the framework of statistical analysis about adverse events is discussed. PMID:26911031

  14. Analysis of cumulus solar irradiance reflectance (CSIR) events

    NASA Astrophysics Data System (ADS)

    Laird, John L.; Harshvardhan

    Clouds are extremely important with regard to the transfer of solar radiation at Earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When Sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using UVA and UVB pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 Wm⁻² and 0.0169 Wm⁻² were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of Sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed. © 1997 Elsevier Science B.V.

  15. Analysis of Cumulus Solar Irradiance Reflectance (CSIR) Events

    NASA Technical Reports Server (NTRS)

    Laird, John L.; Harshvardham

    1996-01-01

    Clouds are extremely important with regard to the transfer of solar radiation at the earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using Yankee Environmental Systems UVA-1 and UVB-1 pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 Wm(exp -2) and 0.069 Wm(exp -2) were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed.

  16. CDAW 9 analysis of magnetospheric events on May 3, 1986 - Event C

    NASA Technical Reports Server (NTRS)

    Baker, D. N.; Pulkkinen, T. I.; Mcpherron, R. L.; Craven, J. D.; Frank, L. A.; Elphinstone, R. D.; Murphree, J. S.; Fennell, J. F.; Lopez, R. E.; Nagai, T.

    1993-01-01

    An intense geomagnetic substorm event on May 3, 1986, occurring toward the end of a strong storm period, is studied. The auroral electrojet indices and global imaging data from both the Northern and Southern Hemispheres clearly revealed the growth phase and expansion phase development for a substorm with an onset at 0111 UT. An ideally located constellation of four spacecraft allowed detailed observation of the substorm growth phase in the near-tail region. A realistic time-evolving magnetic field model provided a global representation of the field configuration throughout the growth and early expansion phase of the substorm. Evidence of a narrowly localized substorm onset region in the near-earth tail is found. This region spread rapidly eastward and poleward after the 0111 UT onset. The results are consistent with a model of late growth phase formation of a magnetic neutral line. This reconnection region caused plasma sheet current diversion before the substorm onset and eventually led to cross-tail current disruption at the time of the substorm onset.

  17. Further Evaluation of Antecedent Social Events during Functional Analysis

    ERIC Educational Resources Information Center

    Kuhn, David E.; Hardesty, Samantha L.; Luczynski, Kevin

    2009-01-01

    The value of a reinforcer may change based on antecedent events, specifically the behavior of others (Bruzek & Thompson, 2007). In the current study, we examined the effects of manipulating the behavior of the therapist on problem behavior while all dimensions of reinforcement were held constant. Both participants' levels of problem behaviors…

  18. Application of Key Events Analysis to Chemical Carcinogens and Noncarcinogens

    PubMed Central

    BOOBIS, ALAN R.; DASTON, GEORGE P.; PRESTON, R. JULIAN; OLIN, STEPHEN S.

    2009-01-01

    The existence of thresholds for toxicants is a matter of debate in chemical risk assessment and regulation. Current risk assessment methods are based on the assumption that, in the absence of sufficient data, carcinogenesis does not have a threshold, while noncarcinogenic endpoints are assumed to be thresholded. Advances in our fundamental understanding of the events that underlie toxicity are providing opportunities to address these assumptions about thresholds. A key events dose-response analytic framework was used to evaluate three aspects of toxicity. The first section illustrates how a fundamental understanding of the mode of action for the hepatic toxicity and the hepatocarcinogenicity of chloroform in rodents can replace the assumption of low-dose linearity. The second section describes how advances in our understanding of the molecular aspects of carcinogenesis allow us to consider the critical steps in genotoxic carcinogenesis in a key events framework. The third section deals with the case of endocrine disrupters, where the most significant question regarding thresholds is the possible additivity to an endogenous background of hormonal activity. Each of the examples suggests that current assumptions about thresholds can be refined. Understanding inter-individual variability in the events involved in toxicological effects may enable a true population threshold(s) to be identified. PMID:19690995

  19. Wheels within Wheels: The Analysis of a Cultural Event.

    ERIC Educational Resources Information Center

    Court, Deborah

    2001-01-01

    A qualitative research methods course, offered for faculty at an Israeli school of education, was a significant "cultural event" promoting inclusion of qualitative methods in a strongly positivist setting. A profound change in faculty attitudes was eased by administrative support and by participants' ability to find entry points to ideas through…

  20. Event-based prediction of stream turbidity using a combined cluster analysis and classification tree approach

    NASA Astrophysics Data System (ADS)

    Mather, Amanda L.; Johnson, Richard L.

    2015-11-01

    Stream turbidity typically increases during streamflow events; however, similar event hydrographs can produce markedly different event turbidity behaviors because many factors influence turbidity in addition to streamflow, including antecedent moisture conditions, season, and supply of turbidity-causing materials. Modeling of sub-hourly turbidity as a function of streamflow shows that event model parameters vary on an event-by-event basis. Here we examine the extent to which stream turbidity can be predicted through the prediction of event model parameters. Using three mid-sized streams from the Mid-Atlantic region of the U.S., we show the model parameter set for each event can be predicted based on the event characteristics (e.g., hydrologic, meteorologic and antecedent moisture conditions) using a combined cluster analysis and classification tree approach. The results suggest that the ratio of beginning event discharge to peak event discharge (an estimate of the event baseflow index), as well as catchment antecedent moisture, are important factors in the prediction of event turbidity. Indicators of antecedent moisture, particularly those derived from antecedent discharge, account for the majority of the splitting nodes in the classification trees for all three streams. For this study, prediction of turbidity during streamflow events is based upon observed data (e.g., measured streamflow, precipitation and air temperature). However, the results also suggest that the methods presented here can, in future work, be used in conjunction with forecasts of streamflow, precipitation and air temperature to forecast stream turbidity.
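
    The following is a minimal sketch of the combined cluster-analysis/classification-tree idea described above, using scikit-learn on synthetic data: per-event model parameters are first grouped with k-means, and a decision tree then predicts the parameter cluster from observable event characteristics. The feature set, cluster count and all data are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        n_events = 200

        # Hypothetical per-event turbidity-model parameters (e.g. a scale and a recession rate).
        params = rng.normal(size=(n_events, 2))

        # Hypothetical event characteristics: begin/peak discharge ratio (baseflow-index
        # estimate), antecedent discharge, event rainfall depth and day of year.
        X = np.column_stack([
            rng.uniform(0.05, 0.9, n_events),
            rng.lognormal(1.0, 0.5, n_events),
            rng.gamma(2.0, 10.0, n_events),
            rng.integers(1, 366, n_events),
        ])

        # Step 1: group events whose fitted model parameters are similar.
        clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(params)

        # Step 2: a classification tree predicts the parameter cluster from the
        # observable event characteristics, so new events can be assigned a parameter set.
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, clusters)
        print("training accuracy:", tree.score(X, clusters))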

  1. Early events in cell spreading as a model for quantitative analysis of biomechanical events.

    PubMed

    Wolfenson, Haguy; Iskratsch, Thomas; Sheetz, Michael P

    2014-12-01

    In this review, we focus on the early events in the process of fibroblast spreading on fibronectin matrices of different rigidities. We present a focused position piece that illustrates the many different tests that a cell makes of its environment before it establishes mature matrix adhesions. When a fibroblast is placed on fibronectin-coated glass surfaces at 37°C, it typically spreads and polarizes within 20-40 min primarily through αvβ3 integrin binding to fibronectin. In that short period, the cell goes through three major phases that involve binding, integrin activation, spreading, and mechanical testing of the surface. The advantage of using the model system of cell spreading from the unattached state is that it is highly reproducible and the stages that the cell undergoes can thus be studied in a highly quantitative manner, in both space and time. The mechanical and biochemical parameters that matter in this example are often surprising because of both the large number of tests that occur and the precision of the tests. We discuss our current understanding of those tests, the decision tree that is involved in this process, and an extension to the behavior of the cells at longer time periods when mature adhesions develop. Because many other matrices and integrins are involved in cell-matrix adhesion, this model system gives us a limited view of a subset of cellular behaviors that can occur. However, by defining one cellular process at a molecular level, we know more of what to expect when defining other processes. Because each cellular process will involve some different proteins, a molecular understanding of multiple functions operating within a given cell can lead to strategies to selectively block a function. PMID:25468330

  2. Links between Characteristics of Collaborative Peer Video Analysis Events and Literacy Teachers' Outcomes

    ERIC Educational Resources Information Center

    Arya, Poonam; Christ, Tanya; Chiu, Ming

    2015-01-01

    This study examined how characteristics of Collaborative Peer Video Analysis (CPVA) events are related to teachers' pedagogical outcomes. Data included 39 transcribed literacy video events, in which 14 in-service teachers engaged in discussions of their video clips. Emergent coding and Statistical Discourse Analysis were used to analyze the data.…

  3. Application of a Temporal Reasoning Framework Tool in Analysis of Medical Device Adverse Events

    PubMed Central

    Clark, Kimberly K.; Sharma, Deepak K.; Chute, Christopher G.; Tao, Cui

    2011-01-01

    The Clinical Narrative Temporal Relation Ontology (CNTRO) project offers a semantic-web based reasoning framework, which represents temporal events and relationships within clinical narrative texts and infers new knowledge over them. In this paper, the CNTRO reasoning framework is applied to temporal analysis of medical device adverse event files. One specific adverse event was used as a test case: late stent thrombosis. Adverse event narratives were obtained from the Food and Drug Administration’s (FDA) Manufacturing and User Facility Device Experience (MAUDE) database. 15 adverse event files in which late stent thrombosis was confirmed were randomly selected across multiple drug-eluting stent devices. From these files, 81 events and 72 temporal relations were annotated. 73 temporal questions were generated, of which 65 were correctly answered by the CNTRO system, giving an overall accuracy of 89%. This system should be pursued further to continue assessing its potential benefits in temporal analysis of medical device adverse events. PMID:22195199

  4. Observation and Analysis of Jovian and Saturnian Satellite Mutual Events

    NASA Technical Reports Server (NTRS)

    Tholen, David J.

    2001-01-01

    The main goal of this research was to acquire high time resolution photometry of satellite-satellite mutual events during the equatorial plane crossing for Saturn in 1995 and Jupiter in 1997. The data would be used to improve the orbits of the Saturnian satellites to support Cassini mission requirements, and also to monitor the secular acceleration of Io's orbit to compare with heat flow measurements.

  5. Analysis and RHBD technique of single event transients in PLLs

    NASA Astrophysics Data System (ADS)

    Zhiwei, Han; Liang, Wang; Suge, Yue; Bing, Han; Shougang, Du

    2015-11-01

    Single-event transient susceptibility of phase-locked loops has been investigated. The charge pump is the most sensitive component of the PLL to SET, and it is hard to mitigate this effect at the transistor level. A test circuit was designed on a 65 nm process using a new system-level radiation-hardening-by-design technique. Heavy-ion testing was used to evaluate the radiation hardness. Analyses and discussion of the feasibility of this method are also presented.

  6. Events in time: Basic analysis of Poisson data

    SciTech Connect

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for the case when the rate of occurrence varies randomly. Examples and SAS programs are given.
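
    For illustration, a short Python sketch of the kind of point estimate and confidence interval the report covers, using the standard chi-square relation for an exact Poisson-rate interval; the event count and exposure are hypothetical, and the report itself provides SAS programs.

        from scipy.stats import chi2

        # Hypothetical data: 7 events observed over 3.2 years of exposure.
        n_events, exposure = 7, 3.2
        rate_hat = n_events / exposure            # point estimate of the occurrence rate

        # Exact (chi-square based) two-sided 90% confidence interval for the rate.
        alpha = 0.10
        lower = chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure)
        upper = chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / (2 * exposure)
        print(f"rate = {rate_hat:.2f} per year, 90% CI = ({lower:.2f}, {upper:.2f})")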

  7. You Never Walk Alone: Recommending Academic Events Based on Social Network Analysis

    NASA Astrophysics Data System (ADS)

    Klamma, Ralf; Cuong, Pham Manh; Cao, Yiwei

    Combining Social Network Analysis and recommender systems is a challenging research field. In scientific communities, recommender systems have been applied to provide useful tools for finding papers, books and experts. However, academic events (conferences, workshops, international symposiums, etc.) are an important driving force for cooperation among research communities. We present an SNA-based approach to the academic event recommendation problem. Analysis and visualization of scientific communities are performed to provide an insight into the communities behind event series. A prototype is implemented based on data from DBLP and EventSeer.net, and the results are examined in order to validate the approach.
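
    A toy sketch of co-participation-based event recommendation in the spirit of the SNA approach described above; the actual system draws on DBLP and EventSeer.net, whereas the graph, names and scoring rule below are invented assumptions.

        import networkx as nx

        # Hypothetical attendance records: (researcher, event series).
        attendance = [
            ("alice", "ICWE"), ("alice", "WISE"), ("bob", "ICWE"),
            ("bob", "CAiSE"), ("carol", "WISE"), ("carol", "CAiSE"),
        ]
        G = nx.Graph()
        G.add_edges_from(attendance)              # bipartite researcher-event graph

        def recommend(researcher, k=3):
            """Rank events attended by researchers who share events with `researcher`."""
            my_events = set(G[researcher])
            scores = {}
            for event in my_events:
                for peer in G[event]:
                    if peer == researcher:
                        continue
                    for candidate in G[peer]:
                        if candidate not in my_events:
                            scores[candidate] = scores.get(candidate, 0) + 1
            return sorted(scores, key=scores.get, reverse=True)[:k]

        print(recommend("alice"))                 # -> ['CAiSE'] for this toy graph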

  8. Analysis of Adverse Events in Identifying GPS Human Factors Issues

    NASA Technical Reports Server (NTRS)

    Adams, Catherine A.; Hwoschinsky, Peter V.; Adams, Richard J.

    2004-01-01

    The purpose of this study was to analyze GPS-related adverse events such as accidents and incidents (A/I), Aviation Safety Reporting System (ASRS) reports and Pilot Deviations (PDs) to create a framework for developing a human factors risk awareness program. Although the occurrence of directly related GPS accidents is small, the frequency of PDs and ASRS reports indicates there is a growing problem with situational awareness in terminal airspace related to different types of GPS operational issues. This paper addresses the findings of the preliminary research and provides a brief discussion of some of the literature on related GPS and automation issues.

  9. Analysis of broadband seismograms from selected IASPEI events

    USGS Publications Warehouse

    Choy, G.L.; Engdahl, E.R.

    1987-01-01

    Broadband seismograms of body waves that are flat to displacement and velocity in the frequency range from 0.01 to 5.0 Hz can now be routinely obtained for most earthquakes of magnitude greater than about 5.5. These records are obtained either directly or through multichannel deconvolution of waveforms from digitally recording seismograph stations. In contrast to data from conventional narrowband seismographs, broadband records have sufficient frequency content to define the source-time functions of body waves, even for shallow events for which the source functions of direct and surface-reflected phases may overlap. Broadband seismograms for selected IASPEI events are systematically analysed to identify depth phases and the presence of subevents. The procedure results in improved estimates of focal depth, identification of subevents in complex earthquakes, and better resolution of focal mechanisms. We propose that it is now possible for reporting agencies, such as the National Earthquake Information Center, to use broadband digital waveforms routinely in the processing of earthquake data. ?? 1987.

  10. Analysis of Continuous Microseismic Recordings: Resonance Frequencies and Unconventional Events

    NASA Astrophysics Data System (ADS)

    Tary, J.; van der Baan, M.

    2012-12-01

    Hydrofracture experiments, where fluids and proppant are injected into reservoirs to create fractures and enhance oil recovery, are often monitored using microseismic recordings. The total stimulated volume is then estimated by the size of the cloud of induced micro-earthquakes. This implies that only brittle failure should occur inside reservoirs during the fracturing. Yet, this assumption may not be correct, as the total energy injected into the system is orders of magnitude larger than the total energy associated with brittle failure. Instead of using only triggered events, it has been shown recently that the frequency content of continuous recordings may also provide information on the deformations occurring inside reservoirs. Here, we use different kinds of time-frequency transforms to track the presence of resonance frequencies. We analyze different data sets using regular, long-period and broadband geophones. The resonance frequencies observed are mainly included in the frequency band of 5-60 Hz. We systematically examine first the possible causes of resonance frequencies, dividing them into source, path and receiver effects. We then conclude that some of the observed frequency bands likely result from source effects. The resonance frequencies could be produced by either interconnected fluid-filled fractures in the order of tens of meters, or by small repetitive events occurring at a characteristic periodicity. Still, other mechanisms may occur or be predominant during reservoir fracturing, depending on the lithology as well as the pressure and temperature conditions at depth. During one experiment, both regular micro-earthquakes, long-period long-duration events (LPLD) and resonance frequencies are observed. The lower part of the frequency band of these resonance frequencies (5-30 Hz) overlaps with the anticipated frequencies of observed LPLDs in other experiments (<50 Hz). The exact origin of both resonance frequencies and LPLDs is still under debate

  11. Combined cardiotocographic and ST event analysis: A review.

    PubMed

    Amer-Wahlin, Isis; Kwee, Anneke

    2016-01-01

    ST-analysis of the fetal electrocardiogram (ECG) (STAN(®)) combined with cardiotocography (CTG) for intrapartum fetal monitoring has been developed following many years of animal research. Changes in the ST-segment of the fetal ECG correlated with fetal hypoxia occurring during labor. In 1993 the first randomized controlled trial (RCT) comparing CTG with CTG + ST-analysis was published. STAN(®) was introduced into daily practice in 2000. To date, six RCTs have been performed, of which five have been published. Furthermore, there are six published meta-analyses. The meta-analyses showed that CTG + ST-analysis reduced the risk of vaginal operative delivery by about 10% and of fetal blood sampling by 40%. There are conflicting results regarding the effect on metabolic acidosis, largely because of controversies about which RCTs should be included in a meta-analysis, and because of differences in the methodology, execution and quality of the meta-analyses. Several cohort studies have been published, some showing a significant decrease in metabolic acidosis after the introduction of ST-analysis. In this review, we discuss not only the scientific evidence from the RCTs and meta-analyses, but also the limitations of these studies. In conclusion, ST-analysis is effective in reducing operative vaginal deliveries and fetal blood sampling, but the effect on neonatal metabolic acidosis is still under debate. Further research is needed to determine the place of ST-analysis in the labor ward in daily practice. PMID:26206514

  12. Data integration and analysis using the Heliophysics Event Knowledgebase

    NASA Astrophysics Data System (ADS)

    Hurlburt, Neal; Reardon, Kevin

    The Heliophysics Event Knowledgebase (HEK) system provides an integrated framework for automated data mining using a variety of feature-detection methods; high-performance data systems to cope with over 1TB/day of multi-mission data; and web services and clients for searching the resulting metadata, reviewing results, and efficiently accessing the data products. We have recently enhanced the capabilities of the HEK to support the complex datasets being produced by the Interface Region Imaging Spectrograph (IRIS). We are also developing the mechanisms to incorporate descriptions of coordinated observations from ground-based facilities, including the NSO's Dunn Solar Telescope (DST). We will discuss the system and its recent evolution and demonstrate its ability to support coordinated science investigations.

  13. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
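
    A minimal sketch of the wavelet-descriptor idea using PyWavelets on a synthetic trend; the MySQL storage of coefficients and the web query layer described above are omitted, and the signal, wavelet choice and threshold are illustrative assumptions.

        import numpy as np
        import pywt

        # Hypothetical heart-rate trend sampled once per minute over 8 hours.
        t = np.arange(480)
        hr = 80 + 5 * np.sin(2 * np.pi * t / 240) + np.random.default_rng(1).normal(0, 1, 480)
        hr[300:] -= 15                            # an abrupt hemodynamic event

        # Multilevel discrete wavelet transform: a compact descriptor of the trend.
        coeffs = pywt.wavedec(hr, "db4", level=5)
        approx, details = coeffs[0], coeffs[1:]
        print("approximation coefficients kept:", len(approx), "of", len(hr), "samples")

        # A crude "event query": large detail coefficients at a coarse scale flag
        # abrupt level shifts such as the drop introduced above.
        d5 = details[0]                           # coarsest detail band
        hits = np.nonzero(np.abs(d5) > 3 * np.std(d5))[0]
        print("coarse-scale coefficients flagging an event:", hits)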

  14. Perceptually-driven signal analysis for acoustic event classification

    NASA Astrophysics Data System (ADS)

    Philips, Scott M.

    In many acoustic signal processing applications human listeners are able to outperform automated processing techniques, particularly in the identification and classification of acoustic events. The research discussed in this paper develops a framework for employing perceptual information from human listening experiments to improve automatic event classification. We focus on the identification of new signal attributes, or features, that are able to predict the human performance observed in formal listening experiments. Using this framework, our newly identified features have the ability to elevate automatic classification performance closer to the level of human listeners. We develop several new methods for learning a perceptual feature transform from human similarity measures. In addition to providing a more fundamental basis for uncovering perceptual features than previous approaches, these methods also lead to a greater insight into how humans perceive sounds in a dataset. We also develop a new approach for learning a perceptual distance metric. This metric is shown to be applicable to modern kernel-based techniques used in machine learning and provides a connection between the fields of psychoacoustics and machine learning. Our research demonstrates these new methods in the area of active sonar signal processing. There is anecdotal evidence within the sonar community that human operators are adept in the task of discriminating between active sonar target and clutter echoes. We confirm this ability in a series of formal listening experiments. With the results of these experiments, we then identify perceptual features and distance metrics using our novel methods. The results show better agreement with human performance than previous approaches. While this work demonstrates these methods using perceptual similarity measures from active sonar data, they are applicable to any similarity measure between signals.

  15. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    SciTech Connect

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data were also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information intensive tasks.
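
    A small pandas sketch of the dwell-time calculation described above, applied to a handful of invented time-stamped workstation events and split into the short (< 2 s) and long (> 10 s) regimes; the actions and timestamps are hypothetical.

        import pandas as pd

        # Hypothetical millisecond-stamped workstation events from an analysis session.
        events = pd.DataFrame({
            "timestamp": pd.to_datetime([
                "2004-09-28 10:00:00.000", "2004-09-28 10:00:01.250",
                "2004-09-28 10:00:02.100", "2004-09-28 10:00:14.700",
                "2004-09-28 10:00:15.900",
            ]),
            "action": ["open_doc", "scroll", "scroll", "copy_text", "switch_window"],
        })

        # Dwell time of each event = time until the next event begins.
        events["dwell_s"] = events["timestamp"].diff().shift(-1).dt.total_seconds()

        # Split durations into the two regimes discussed above.
        search_like = events[events["dwell_s"] < 2]        # rapid search / filtering
        evaluate_like = events[events["dwell_s"] > 10]     # reading / evaluation
        print(len(search_like), "search-like events,", len(evaluate_like), "evaluation-like event")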

  16. A regional analysis of event runoff coefficients with respect to climate and catchment characteristics in Austria

    NASA Astrophysics Data System (ADS)

    Merz, Ralf; Blöschl, Günter

    2009-01-01

    In this paper we analyze the controls on the spatiotemporal variability of event runoff coefficients. A total of about 64,000 events in 459 Austrian catchments ranging from 5 to 10000 km2 are analyzed. Event runoff coefficients vary in space, depending on the long-term controls such as climate and catchment formation. Event runoff coefficients also vary in time, depending on event characteristics such as antecedent soil moisture and event rainfall depth. Both types of controls are analyzed separately in the paper. The spatial variability is analyzed in terms of a correlation analysis of the statistical moments of the runoff coefficients and catchment attributes. Mean runoff coefficients are most strongly correlated to indicators representing climate such as mean annual precipitation and the long-term ratio of actual evaporation to precipitation through affecting long-term soil moisture. Land use, soil types, and geology do not seem to exert a major control on runoff coefficients of the catchments under study. The temporal variability is analyzed by comparing the deviation of the event runoff coefficients from their mean depending on event characteristics. The analysis indicates that antecedent soil moisture conditions control runoff coefficients to a higher degree than does event rainfall. The analysis also indicates that soil moisture derived from soil moisture accounting schemes has more predictive power for the temporal variability of runoff coefficients than antecedent rainfall.

  17. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang

    2009-09-18

    Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.

  18. Predicting analysis time in events-driven clinical trials using accumulating time-to-event surrogate information.

    PubMed

    Wang, Jianming; Ke, Chunlei; Yu, Zhinuan; Fu, Lei; Dornseif, Bruce

    2016-05-01

    For clinical trials with time-to-event endpoints, predicting the accrual of the events of interest with precision is critical in determining the timing of interim and final analyses. For example, overall survival (OS) is often chosen as the primary efficacy endpoint in oncology studies, with planned interim and final analyses at a pre-specified number of deaths. Often, correlated surrogate information, such as time-to-progression (TTP) and progression-free survival, are also collected as secondary efficacy endpoints. It would be appealing to borrow strength from the surrogate information to improve the precision of the analysis time prediction. Currently available methods in the literature for predicting analysis timings do not consider utilizing the surrogate information. In this article, using OS and TTP as an example, a general parametric model for OS and TTP is proposed, with the assumption that disease progression could change the course of the overall survival. Progression-free survival, related both to OS and TTP, will be handled separately, as it can be derived from OS and TTP. The authors seek to develop a prediction procedure using a Bayesian method and provide detailed implementation strategies under certain assumptions. Simulations are performed to evaluate the performance of the proposed method. An application to a real study is also provided. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26689725

  19. Technical issues: flow cytometry and rare event analysis.

    PubMed

    Hedley, B D; Keeney, M

    2013-06-01

    Flow cytometry has become an essential tool for identification and characterization of hematological cancers and now, due to technological improvements, allows the identification and rapid enumeration of small tumor populations that may be present after induction therapy (minimal residual disease, MRD). The quantitation of MRD has been shown to correlate with relapse and survival rates in numerous diseases and in certain cases, and evidence of MRD is used to alter treatment protocols. Recent improvements in hardware allow for high data rate collection. Improved fluorochromes take advantage of violet laser excitation and maximize signal-to-noise ratio allowing the population of interest to be isolated in multiparameter space. This isolation, together with a low background rate, permits for detection of residual tumor populations in a background of normal cells. When counting such rare events, the distribution is governed by Poisson statistics, with precision increasing with higher numbers of cells collected. In several hematological malignancies, identification of populations at frequencies of 0.01% and lower has been attained. The choice of antibodies used in MRD detection facilitates the definition of a fingerprint to identify abnormal populations throughout treatment. Tumor populations can change phenotype, and an approach that relies on 'different from normal' has proven useful, particularly in the acute leukemias. Flow cytometry can and is used for detection of MRD in many hematological diseases; however, standardized approaches for specific diseases must be developed to ensure precise identification and enumeration that may alter the course of patient treatment. PMID:23590661
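
    A short numeric illustration of the Poisson counting-statistics point above: for a rare population at a fixed frequency, the relative precision of the event count improves roughly as one over the square root of the expected number of events. The frequency and cell totals below are hypothetical.

        import math

        # For a rare population at frequency f among N collected cells, the expected
        # count is N*f and, under Poisson statistics, its relative precision
        # (coefficient of variation) is roughly 1/sqrt(N*f).
        f = 0.0001                                # a 0.01% target population
        for n_cells in (100_000, 1_000_000, 5_000_000):
            expected = n_cells * f
            cv = 1.0 / math.sqrt(expected)
            print(f"{n_cells:>9,d} cells -> ~{expected:5.0f} events, CV ≈ {cv:5.1%}")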

  20. Statistical analysis of geodetic networks for detecting regional events

    NASA Technical Reports Server (NTRS)

    Granat, Robert

    2004-01-01

    We present an application of hidden Markov models (HMMs) to analysis of geodetic time series in Southern California. Our model fitting method uses a regularized version of the deterministic annealing expectation-maximization algorithm to ensure that model solutions are both robust and of high quality.

  1. Twelve Tips for Promoting Significant Event Analysis To Enhance Reflection in Undergraduate Medical Students.

    ERIC Educational Resources Information Center

    Henderson, Emma; Berlin, Anita; Freeman, George; Fuller, Jon

    2002-01-01

    Points out the importance of the facilitation of reflection and development of reflective abilities in professional development and describes 12 tips for undergraduate medical students to increase their abilities of writing reflective and creative event analysis. (Author/YDS)

  2. An analysis of fog events at Belgrade International Airport

    NASA Astrophysics Data System (ADS)

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.
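
    A minimal pandas sketch of evaluating a compound fog criterion of the kind quoted above (relative humidity, dew-point depression, cloud base and wind one hour before onset); the observations and the resulting probability are invented and do not reproduce the study's 51.2 % figure.

        import pandas as pd

        # Hypothetical observations taken one hour before potential fog onset.
        obs = pd.DataFrame({
            "rh":        [98, 99, 95, 97.5, 99, 92],        # relative humidity (%)
            "dpd":       [0.0, 0.0, 1.0, 0.0, 0.0, 2.0],    # dew-point depression (deg C)
            "cloudbase": [40, 30, 200, 45, 20, 600],        # cloud base height (m)
            "wind":      [0.5, 1.0, 4.0, 0.8, 0.2, 5.0],    # wind speed (m/s)
            "fog_next":  [1, 1, 0, 0, 1, 0],                # fog observed the following hour
        })

        # Compound criterion analogous to the one quoted above.
        crit = (obs.rh > 97) & (obs.dpd == 0) & (obs.cloudbase < 50) & (obs.wind <= 1)
        p_fog = obs.loc[crit, "fog_next"].mean()
        print(f"P(fog | criteria met) = {p_fog:.0%} over {crit.sum()} cases")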

  3. Analysis of 16 plasma vortex events in the geomagnetic tail

    NASA Technical Reports Server (NTRS)

    Birn, J.; Hones, E. W., Jr.; Bame, S. J.; Russell, C. T.

    1985-01-01

    The analysis of 16 plasma vortex occurrences in the magnetotail plasma sheet of Hones et al. (1983) is extended. Two- and three-dimensional plasma measurements and three-dimensional magnetic field measurements were used to study phase relations, energy propagation, and polarization properties. The results point toward an interpretation as a slow strongly damped MHD eigenmode which is generated by tailward traveling perturbations at the low-latitude interface between plasma sheet and magnetosheath.

  4. Subjective well-being and adaptation to life events: a meta-analysis.

    PubMed

    Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E

    2012-03-01

    Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on affective and cognitive well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to 4 family events (marriage, divorce, bereavement, childbirth) and 4 work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being and that for most events the effects of life events on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given. PMID:22059843

  5. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility at the Nevada Test Site for nuclear explosive operations called the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated into the understanding of those scenarios, and assuring that adequate positive measures are in place to control the risk associated with them.

  6. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-12-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.

  7. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-07-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.
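
    A minimal scikit-learn sketch of the feedforward-perceptron classification step on synthetic echo features; MESCAL itself operates on radio recordings and also uses self-organizing maps, so the features, labels and network size here are illustrative assumptions only.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(42)

        # Hypothetical feature vectors from radio-spectrogram windows (e.g. peak power,
        # duration, bandwidth); label 1 = meteor echo, label 0 = noise.
        noise = rng.normal(0.0, 1.0, size=(500, 3))
        meteors = rng.normal(2.0, 1.0, size=(100, 3))
        X = np.vstack([noise, meteors])
        y = np.array([0] * 500 + [1] * 100)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

        clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
        clf.fit(X_tr, y_tr)
        print("correct classification rate on held-out windows:", clf.score(X_te, y_te))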

  8. Modeling and analysis of single-event transients in charge pumps

    NASA Astrophysics Data System (ADS)

    Zhenyu, Zhao; Junfeng, Li; Minxuan, Zhang; Shaoqing, Li

    2009-05-01

    It has been shown that charge pumps (CPs) dominate single-event transient (SET) responses of phase-locked loops (PLLs). Using a pulse to represent a single event hit on CPs, the SET analysis model is established and the characteristics of SET generation and propagation in PLLs are revealed. An analysis of single event transients in PLLs demonstrates that the settling time of the voltage-controlled oscillators (VCOs) control voltage after a single event strike is strongly dependent on the peak control voltage deviation, the SET pulse width, and the settling time constant. And the peak control voltage disturbance decreases with the SET strength or the filter resistance. Furthermore, the analysis in the proposed PLL model is confirmed by simulation results using MATLAB and HSPICE, respectively.
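
    A first-order toy model of the dependence described above, assuming the control-voltage deviation ramps up during the SET pulse and then recovers exponentially with the loop settling time constant; the waveform shape and all parameter values are assumptions for illustration, not the paper's model.

        import numpy as np

        # Toy model: the control-voltage deviation ramps to dV_peak during a SET pulse of
        # width t_w, then recovers exponentially with the loop settling time constant tau.
        def control_voltage_deviation(t, dV_peak, t_w, tau):
            ramp = dV_peak * t / t_w
            recovery = dV_peak * np.exp(-(t - t_w) / tau)
            return np.where(t < t_w, ramp, recovery)

        t = np.linspace(0.0, 2e-6, 2000)                        # 2 microseconds
        dv = control_voltage_deviation(t, dV_peak=0.1, t_w=50e-9, tau=200e-9)

        # Settling time: last instant the deviation still exceeds 5% of its peak value.
        settled = t[np.abs(dv) > 0.05 * 0.1]
        print("approximate settling time: %.0f ns" % (settled[-1] * 1e9))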

  9. Long-term Statistical Analysis of the Simultaneity of Forbush Decrease Events at Middle Latitudes

    NASA Astrophysics Data System (ADS)

    Lee, Seongsuk; Oh, Suyeon; Yi, Yu; Evenson, Paul; Jee, Geonhwa; Choi, Hwajin

    2015-03-01

    Forbush Decreases (FD) are transient, sudden reductions of cosmic ray (CR) intensity lasting a few days to a week. Such events are observed globally using ground neutron monitors (NMs). Most studies of FD events indicate that an FD event is observed simultaneously at NM stations located all over the Earth. However, using statistical analysis, previous researchers verified that while FD events could occur simultaneously, in some cases they could occur non-simultaneously. Previous studies confirmed the statistical reality of non-simultaneous FD events and the mechanism by which they occur, using data from high-latitude and middle-latitude NM stations. In this study, we used long-term data (1971-2006) from middle-latitude NM stations (Irkutsk, Climax, and Jungfraujoch) to enhance statistical reliability. According to the results from this analysis, the variation of cosmic ray intensity during the main phase is larger (statistically significant) for simultaneous FD events than for non-simultaneous ones. Moreover, the distribution of main-phase-onset times shows differences that are statistically significant. While the onset times of simultaneous FDs are distributed evenly over 24-hour intervals (day and night), those of non-simultaneous FDs are mostly distributed over 12-hour intervals, in daytime. Thus, the existence of the two kinds of FD events, distinguished by differences in their statistical properties, was verified based on data from middle-latitude NM stations.

  10. Computational methods for analysis of dynamic events in cell migration.

    PubMed

    Castañeda, V; Cerda, M; Santibáñez, F; Jara, J; Pulgar, E; Palma, K; Lemus, C G; Osorio-Reich, M; Concha, M L; Härtel, S

    2014-02-01

    Cell migration is a complex biological process that involves changes in shape and organization at the sub-cellular, cellular, and supra-cellular levels. Individual and collective cell migration can be assessed in vitro and in vivo starting from the flagellar driven movement of single sperm cells or bacteria, bacterial gliding and swarming, and amoeboid movement to the orchestrated movement of collective cell migration. One key technology to access migration phenomena is the combination of optical microscopy with image processing algorithms. This approach resolves simple motion estimation (e.g. preferred direction of migrating cells or path characteristics), but can also reveal more complex descriptors (e.g. protrusions or cellular deformations). In order to ensure an accurate quantification, the phenomena under study, their complexity, and the required level of description need to be addressed by an adequate experimental setup and processing pipeline. Here, we review typical workflows for processing starting with image acquisition, restoration (noise and artifact removal, signal enhancement), registration, analysis (object detection, segmentation and characterization) and interpretation (high level understanding). Image processing approaches for quantitative description of cell migration in 2- and 3-dimensional image series, including registration, segmentation, shape and topology description, tracking and motion fields are presented. We discuss advantages, limitations and suitability for different approaches and levels of description. PMID:24467201

  11. Constraining shallow seismic event depth via synthetic modeling for Expert Technical Analysis at the IDC

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.

    2015-12-01

    Depth of event is an important criterion for seismic event screening at the International Data Center (IDC), CTBTO. However, a thorough determination of event depth can mostly be conducted only through special analysis, because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have their depth constrained to the surface. When the true origin depth is greater than that reasonable for a nuclear test (3 km based on existing observations), this may result in a heavier workload to manually distinguish between shallow and deep events. Also, the IDC depth criterion is not applicable to events with a small t(pP-P) travel-time difference, which is the case for nuclear tests. Since the shape of the first few seconds of signal of very shallow events is very sensitive to the presence of the depth phase, cross-correlation between observed and synthetic seismograms can provide an estimate of the event depth, and so extend the screening process. We exercised this approach mostly with events at teleseismic and partially regional distances. We found that such an approach can be very efficient for the seismic event screening process, with certain caveats related mostly to poorly defined crustal models at source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with Robert Herrmann's hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified for complex source topography.

  12. Extreme rainfall analysis based on precipitation events classification in Northern Italy

    NASA Astrophysics Data System (ADS)

    Campo, Lorenzo; Fiori, Elisabetta; Molini, Luca

    2016-04-01

    Extreme rainfall statistical analysis comprises a consolidated family of techniques for studying the frequency and the statistical properties of high-intensity meteorological events. These techniques are well established and include standard approaches such as fitting the GEV (Generalized Extreme Value) or TCEV (Two Components Extreme Value) probability distribution to the data recorded by a given raingauge at a given location. Regionalization techniques, which aim to spatialize the analysis over medium-to-large regions, are also well established and operationally used. In this work a novel procedure is proposed to statistically characterize the rainfall extremes in a given region, based on an "event-based" approach. Given a temporal sequence of continuous rain maps, an "event" is defined as an aggregate, continuous in time and space, of cells whose rainfall height is above a certain threshold. Based on this definition it is possible to classify, for a given region and period, a population of events and characterize them with a number of statistics, such as their total volume, maximum spatial extension, duration, average intensity, etc. The population of events so obtained constitutes the input of a novel extreme-value characterization technique: given a certain spatial scale, a moving-window analysis is performed and all the events that fall in the window are analysed from an extreme-value point of view. For each window, the extreme annual events are considered: maximum total volume, maximum spatial extension, maximum intensity and maximum duration are all considered for an extreme-value analysis and the corresponding probability distributions are fitted. In this way the analysis statistically characterizes the most intense events and, at the same time, spatializes these rain characteristics by exploring their variability in space. This methodology was employed on rainfall fields obtained by interpolation of
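
    A minimal sketch of the extreme-value fitting step mentioned above, applying a GEV fit with SciPy to hypothetical annual maxima of one event statistic within a single analysis window and deriving a return level; the data and window are invented for illustration.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(7)

        # Hypothetical annual maxima of one event statistic (total event volume, mm)
        # inside a single moving-window cell, one value per year for 40 years.
        annual_max = rng.gumbel(loc=60.0, scale=15.0, size=40)

        # Fit a GEV distribution to the annual maxima and derive a return level.
        shape, loc, scale = genextreme.fit(annual_max)
        rl_50 = genextreme.ppf(1 - 1 / 50, shape, loc=loc, scale=scale)
        print(f"GEV shape={shape:.2f}, loc={loc:.1f}, scale={scale:.1f}; 50-yr level={rl_50:.1f} mm")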

  13. Human Reliability Analysis for Small Modular Reactors

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2012-06-01

    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized to SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provided preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  14. Analysis of single events in ultrarelativistic nuclear collisions: A new method to search for critical fluctuations

    SciTech Connect

    Stock, R.

    1995-07-15

    The upcoming generation of experiments with ultrarelativistic heavy nuclear projectiles, at the CERN SPS and at RHIC and LHC, will confront researchers with several thousand identified hadrons per event, suitable detectors provided. An analysis of individual events becomes meaningful concerning a multitude of hadronic signals thought to reveal a transient deconfinement phase transition, or the related critical precursor fluctuations. Transverse momentum spectra, the kaon to pion ratio, and pionic Bose-Einstein correlation are examined, showing how to separate the extreme, probably rare candidate events from the bulk of average events. This type of observables can already be investigated with the Pb beam of the SPS. The author then discusses single event signals that add to the above at RHIC and LHC energies, kaon interferometry, rapidity fluctuation, jet and {gamma} production.

  15. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    PubMed

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd. PMID:26928768
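
    A small numeric sketch of the central point above: with unequal follow-up, naive proportions and exposure-adjusted incidence rates can point in opposite directions. The counts and person-years are hypothetical.

        # Hypothetical adverse-event data with unequal follow-up between arms: patients
        # on the new drug switch treatment early, so their follow-up is much shorter.
        events   = {"drug": 20, "control": 30}
        patients = {"drug": 200, "control": 200}
        person_years = {"drug": 80.0, "control": 180.0}

        for arm in ("drug", "control"):
            naive = events[arm] / patients[arm]          # crude proportion of patients with an event
            rate = events[arm] / person_years[arm]       # events per person-year at risk
            print(f"{arm:8s} naive proportion = {naive:.2f}  incidence rate = {rate:.2f}/person-year")

        # The naive proportions suggest fewer events on the drug (0.10 vs 0.15), while the
        # exposure-adjusted rates point the other way (0.25 vs 0.17 per person-year); a full
        # analysis would use survival-time methods that handle censoring and recurrent events.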

  16. [Analysis of the impact of two typical air pollution events on the air quality of Nanjing].

    PubMed

    Wang, Fei; Zhu, Bin; Kang, Han-Qing; Gao, Jin-Hui; Wang, Yin; Jiang, Qi

    2012-10-01

    Nanjing and the surrounding area experienced two consecutive serious air pollution events from late October to early November 2009. The first event was long-lasting haze pollution, and the second resulted from the combined impact of crop residue burning and local transportation. The effects of regional transport and local sources on the two events were examined by cluster analysis, using surface meteorological observations, the air pollution index, satellite remote sensing of fire hot spots, and a back-trajectory model. The results showed that the accumulation-mode aerosol number concentrations were higher than those of any other aerosol mode in the two pollution processes. The peak of the aerosol particle number concentration shifted to larger particle sizes compared with previous studies in this area. The ratio of SO4(2-)/NO3(-) was 1.30 and 0.99, indicating that stationary sources were more important than traffic sources in the first event and the reverse in the second event. Affected by local sources from the east and south, particle counts below 0.1 microm gradually accumulated in the first event. The second event was mainly affected by short-distance transport from the northeast and by local sources from the southwest and especially the south; the concentration of aerosol particles was higher than in other directions, indicating that the crop residue burning sources were mainly in that direction. PMID:23234001

  17. Seasonality analysis of hydrological characteristics and flash flood events in Greece

    NASA Astrophysics Data System (ADS)

    Koutroulis, A. G.; Tsanis, I. K.

    2009-04-01

    The seasonality of flash flood occurrence is strongly connected to the climate forcing mechanisms of each region. Hydrological characteristics such as precipitation and stream flow reflect the regional climate mechanisms. Comparison of daily and mean monthly seasonality of selected precipitation and runoff characteristics reveals valuable information within the context of flood occurrence. The present study presents the preliminary findings of a seasonality analysis of flash flood events that occurred in Greece during the 1925-2007 period, in combination with a seasonality analysis of their hydrological characteristics. A two-level approach at the national (Greece) and regional (Crete Island) level was followed, using a total of 206 flood events. Twenty-two of these flood events enriched the European Flash Flood database, which is being developed in the HYDRATE project. The analysis of hydrological characteristics through seasonality indices was based on a dataset of 83 monthly and daily precipitation stations and additionally 22 monthly and 15 daily flow stations. The analysis concludes that on the island of Crete the flood event-based seasonality coincides with the seasonality of the daily precipitation maxima during December and January. The seasonality of the 3 largest long-term daily precipitation maxima indicates that 50% of the maximum precipitation events occur during the November-December-January (NDJ) period. The event-based seasonality analysis for Greece indicated that 57% of the events occur during the NDJ period. The annual maximum daily precipitation and the annual maximum stream flows on Crete are offset by approximately one month. This is due to the snow melting process, the low soil percolation rates of the winter period and the high baseflow of the local karstic aquifers that contribute to the maximum flows. The results will be compared with six different hydrometeorological regions within Europe in the frame of the HYDRATE project, in order to

  18. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold increase) in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
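
    A heavily simplified, single-channel sketch of template matching by normalized cross-correlation, the operation at the core of the MFA described above; the real algorithm stacks multichannel correlations, rotates waveforms into ray-centred coordinates and estimates relative locations and magnitudes, none of which is shown here. The waveforms, noise level and detection threshold are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)

        # One 'parent' template: a short band-limited wavelet (hypothetical waveform).
        parent = np.sin(2 * np.pi * 25 * np.linspace(0.0, 0.2, 200)) * np.hanning(200)

        # Continuous single-channel record with two buried 'child' events.
        data = rng.normal(0.0, 0.3, 20_000)
        for onset, amp in [(4_000, 1.0), (12_500, 0.5)]:
            data[onset:onset + 200] += amp * parent

        def normalized_xcorr(trace, template):
            """Sliding normalized cross-correlation of a template against a longer trace."""
            n = len(template)
            t0 = (template - template.mean()) / (template.std() * n)
            out = np.empty(len(trace) - n + 1)
            for i in range(len(out)):
                w = trace[i:i + n]
                out[i] = np.sum(t0 * (w - w.mean())) / (w.std() + 1e-12)
            return out

        cc = normalized_xcorr(data, parent)
        detections = np.nonzero(cc > 0.4)[0]       # samples where a child detection is declared
        print("detections cluster near samples:", detections.min(), "and", detections.max())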

  19. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-05-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analyzing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multi-channel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multi-component waveforms into the ray-centered co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, i.e. microseismic events for which only one of the S- or P-wave arrival is evident due to unfavorable S/N conditions. A real-data example using microseismic monitoring data from 4 stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than four-fold increase) in the number of located events compared with the original catalog. Moreover, analysis of the new MFA catalog suggests that this approach leads to more robust interpretation of the

  20. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides and volcanic eruptions. It is located in the intertropical convergence zone (ITCZ) and presents a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and precipitation is observed on 70% of the days in a year. This rainfall, which favors the formation of large cloud masses, together with the presence of macroclimatic phenomena such as the El Niño Southern Oscillation, has historically caused great impacts in the region (Vélez et al., 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data for up to eight climate variables. Some of these stations keep 10 years of historical data. However, until now this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical rain events in an urban area of Manizales and investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. In particular, it helped to identify the influence of different meteorological variables triggering rainfall events in hazardous areas such as the city of Manizales.
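
    A rough sketch of how an event-heaviness exponent of this kind could be estimated from a single event's hyetograph. The power-law relation between maximum mean intensity and duration used below is an assumption for illustration; the exact definition of the "n" index should be taken from Monjo (2009).

    ```python
    import numpy as np

    def heavy_rain_n_index(depths, dt_minutes=5.0):
        """Estimate a heaviness exponent 'n' for one rainfall event.

        depths: 1-D array of rainfall depth per time step [mm]
        Assumes the form I(t) = I_ref * (t_ref / t)**n between the maximum
        mean intensity over duration t and the duration itself; this form and
        the least-squares fit are illustrative only.
        """
        n_steps = len(depths)
        durations, intensities = [], []
        for w in range(1, n_steps + 1):
            # maximum accumulation over any contiguous window of w steps
            max_acc = np.convolve(depths, np.ones(w), mode="valid").max()
            durations.append(w * dt_minutes)
            intensities.append(max_acc / (w * dt_minutes))
        slope, _ = np.polyfit(np.log(durations), np.log(intensities), 1)
        return -slope   # ~0 for uniform rain, -> 1 for a single short burst
    ```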

  1. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2016-04-01

    In order to gain a better understanding of geodynamic processes in the Hellenic subduction zone (HSZ), in particular in its eastern part, we analyze a cluster of intermediate-depth events in the region of Nisyros volcano. The cluster, recorded during the deployment of the temporary seismic network EGELADOS, consists of 159 events at 80 to 200 km depth with local magnitudes ranging from 0.2 to 4.1. The network itself consisted of 56 onshore and 23 offshore broadband stations, complemented by 19 permanent stations from NOA, GEOFON and MedNet. It was deployed from September 2005 to March 2007 and covered the entire HSZ. Here, both spatial and temporal clustering of the recorded events is studied using three-component similarity analysis. The waveform cross-correlation was performed for all event combinations using data recorded at 45 onshore stations. The results are shown as a function of frequency for individual stations and as averaged values over the network. The cross-correlation coefficients at the single stations show a decreasing similarity with increasing epicentral distance as well as the effect of local heterogeneities at particular stations, causing noticeable differences in waveform similarities. Event relocation was performed using the double-difference earthquake relocation software HypoDD, and the results are compared with previously obtained single-event locations calculated using the nonlinear location tool NonLinLoc and station corrections. For the relocation, both differential travel times obtained by separate cross-correlation of P- and S-waveforms and manual readings of onset times are used. It is shown that after the relocation the inter-event distance for highly similar events is reduced. By comparing the results of the cluster analysis with results obtained from synthetic catalogs, in which the event rate, proportion and occurrence time of the aftershocks are varied, it is shown that the event
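
    A compact sketch of the similarity step described above: pairwise maximum normalized cross-correlation between event waveforms at a single station, followed by simple hierarchical clustering. The similarity threshold and the averaging scheme are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import correlate
    from scipy.cluster.hierarchy import linkage, fcluster

    def similarity_matrix(waveforms):
        """Pairwise maximum normalized cross-correlation between event windows
        recorded at one station/component (rows of `waveforms`)."""
        n = len(waveforms)
        cc = np.eye(n)
        for i in range(n):
            for j in range(i + 1, n):
                a = waveforms[i] - waveforms[i].mean()
                b = waveforms[j] - waveforms[j].mean()
                denom = np.linalg.norm(a) * np.linalg.norm(b)
                cc[i, j] = cc[j, i] = correlate(a, b, mode="full").max() / denom
        return cc

    def cluster_events(cc, threshold=0.8):
        """Group events whose similarity exceeds `threshold` (illustrative value)
        using average-linkage hierarchical clustering on 1 - similarity."""
        dist = 1.0 - cc[np.triu_indices_from(cc, k=1)]   # condensed distance vector
        return fcluster(linkage(dist, method="average"),
                        t=1.0 - threshold, criterion="distance")
    ```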

  2. Sources of Error and the Statistical Formulation of Ms:mb Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb) computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh-wave magnitude (denoted Ms) is a measure of later-arriving surface-wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger Ms magnitude than explosions. This article proposes a hypothesis test (screening analysis) using Ms and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
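
    An illustrative sketch of an Ms:mb screening test of H0: explosion characteristics. The mean offset under H0 and the error terms are placeholder values; the article's point is that the standard error explicitly includes a physical-correction-model inadequacy term, represented below by sigma_model.

    ```python
    import numpy as np
    from scipy.stats import norm

    def ms_mb_screen(ms, mb, mu0=-1.25, sigma_model=0.30, sigma_meas=0.10, alpha=0.01):
        """Screening test of H0: 'explosion characteristics' using Ms and mb.

        mu0 is a placeholder for the expected Ms - mb under the explosion
        hypothesis; sigma_meas is measurement error and sigma_model stands
        for model inadequacy. All numerical values are illustrative.
        """
        se = np.sqrt(sigma_model**2 + sigma_meas**2)
        z = ((ms - mb) - mu0) / se          # large Ms relative to mb -> earthquake-like
        p_value = 1.0 - norm.cdf(z)         # one-sided p-value under H0
        return z, p_value, p_value < alpha  # True -> reject H0 (screened as earthquake)
    ```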

  3. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    PubMed Central

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with the risk of total CVD events (0.91, 0.85 to 0.97 per 0.1 mEq/L increment; P for nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P for nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and the risk of total CVD events. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
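
    For readers unfamiliar with the pooling step, a minimal sketch of inverse-variance random-effects (DerSimonian-Laird) pooling of study-level relative risks; the paper's full categorical and dose-response models are more involved.

    ```python
    import numpy as np

    def pool_relative_risks(rr, ci_low, ci_high):
        """DerSimonian-Laird random-effects pooling of study-level relative
        risks given their 95% confidence intervals (illustrative stand-in for
        the paper's models)."""
        y = np.log(rr)
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
        w = 1.0 / se**2
        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed)**2)                 # Cochran's Q
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
        w_re = 1.0 / (se**2 + tau2)
        y_re = np.sum(w_re * y) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        return np.exp(y_re), np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)
    ```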

  4. EVENT TREE ANALYSIS AT THE SAVANNAH RIVER SITE: A CASE HISTORY

    SciTech Connect

    Williams, R

    2009-05-25

    At the Savannah River Site (SRS), a Department of Energy (DOE) installation in west-central South Carolina, there is a unique geologic stratum at depth that has the potential to cause surface settlement resulting from a seismic event. In the past, the stratum in question has been remediated via pressure grouting; however, the benefits of remediation have always been debatable. Recently the SRS has attempted to frame the issue in terms of risk via an event tree or logic tree analysis. This paper describes that analysis, including the input data required.

  5. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    SciTech Connect

    Lisbeth A. Mitchell

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not reportable events) identified at the Idaho National Laboratory (INL) during the period of October 2012 through September 2013.

  6. Complete dose analysis of the November 12, 1960 solar cosmic ray event.

    PubMed

    Masley, A J; Goedeke, A D

    1963-01-01

    A detailed analysis of the November 12, 1960 solar cosmic ray event is presented as an integrated space flux and dose. This event is probably the most interesting solar cosmic ray event studied to date. Direct measurements were made of solar protons from 10 MeV to 6 GeV. During the double-peaked high-energy part of the event, evidence is presented for the trapping of relativistic particles in a magnetic cloud. The proton energy spectrum is divided into 3 energy intervals, with separate energy power law exponents and time profiles carried through for each. The three groups are: (1) (30 …). Also included in the analysis are the results of rocket measurements which determined the spectrum down to 10 MeV twice during the event, balloon results from Fort Churchill and Minneapolis, earth satellite measurements, neutron monitors in New Hampshire and at both the North and South Pole, and riometer results from Alaska and Kiruna, Sweden. The results are given in Table 1 [see text]. The results of our analyses of other solar cosmic ray events are also included with a general discussion of the solar flare hazards in space. PMID:12056429

  7. FFTF Event Fact Sheet root cause analysis calendar year 1985 through 1988

    SciTech Connect

    Griffin, G.B.

    1988-12-01

    The Event Fact Sheets written from January 1985 through mid-August 1988 were reviewed to determine their root causes. The review group represented many of the technical disciplines present in plant operation. The review was initiated as an internal critique aimed at maximizing the "lessons learned" from the event reporting system. The root causes were subjected to a Pareto analysis to determine the significant causal factor groups. Recommendations for correction of the high-frequency causal factors were then developed and presented to the FFTF Plant management. In general, the distributions of the causal factors were found to closely follow the industry averages. The impacts of the events were also studied, and it was determined that we generally report events of a level of severity below that of the available studies. Therefore it is concluded that the recommendations for corrective action are ones to improve the overall quality of operations and not to correct significant operational deficiencies. 17 figs.
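
    A minimal sketch of the Pareto step mentioned above: tabulating root-cause counts in descending order with cumulative percentages so that the dominant causal factor groups stand out. Function and column names are illustrative.

    ```python
    import pandas as pd

    def pareto_table(root_causes):
        """Pareto analysis of event root causes: counts in descending order
        with cumulative percentage over all reviewed events."""
        counts = pd.Series(root_causes).value_counts()
        table = counts.to_frame("count")
        table["cum_pct"] = 100.0 * table["count"].cumsum() / table["count"].sum()
        return table
    ```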

  8. Analysis of Pressurized Water Reactor Primary Coolant Leak Events Caused by Thermal Fatigue

    SciTech Connect

    Atwood, Corwin Lee; Shah, Vikram Naginbhai; Galyean, William Joseph

    1999-09-01

    We present statistical analyses of pressurized water reactor (PWR) primary coolant leak events caused by thermal fatigue, and discuss their safety significance. Our worldwide data contain 13 leak events (through-wall cracking) in 3509 reactor-years, all in stainless steel piping with diameter less than 25 cm. Several types of data analysis show that the frequency of leak events (events per reactor-year) is increasing with plant age, and the increase is statistically significant. When an exponential trend model is assumed, the leak frequency is estimated to double every 8 years of reactor age, although this result should not be extrapolated to plants much older than 25 years. Difficulties in arresting this increase include lack of quantitative understanding of the phenomena causing thermal fatigue, lack of understanding of crack growth, and difficulty in detecting existing cracks.
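
    A minimal sketch of the kind of exponential trend model described above, fitted to binned leak-event counts by Poisson maximum likelihood; the doubling time then follows as ln(2)/b. The data layout and starting values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def fit_exponential_trend(ages, events, exposure):
        """Fit lambda(age) = exp(a + b * age) to binned leak-event data by
        Poisson maximum likelihood; `events` are counts and `exposure` is
        reactor-years per age bin."""
        def neg_loglik(params):
            a, b = params
            mu = np.exp(a + b * ages) * exposure          # expected counts per bin
            return np.sum(mu - events * np.log(mu))
        a, b = minimize(neg_loglik, x0=[-6.0, 0.0], method="Nelder-Mead").x
        doubling_time = np.log(2.0) / b if b > 0 else np.inf   # years
        return a, b, doubling_time
    ```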

  9. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    SciTech Connect

    Attrill, Gemma D. R.

    2010-07-20

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered, employing base-difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of a purely MHD wave interpretation of the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions resulting from a misinterpretation of the running-difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  10. Human factors analysis and design methods for nuclear waste retrieval systems. Volume IV. Computerized Event-Tree Analysis Technique

    SciTech Connect

    Deretsky, Z.; Casey, S.M.

    1980-08-01

    This document contains a program listing and brief description of CETAT, the Computerized Event-Tree Analysis Technique. CETAT was developed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities associated with the tasks required during the retrieval of spent fuel canisters. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks.

  11. Measurements and data analysis of suburban development impacts on runoff event characteristics and unit hydrographs

    NASA Astrophysics Data System (ADS)

    Sillanpää, Nora; Koivusalo, Harri

    2014-05-01

    Urbanisation strongly changes the catchment hydrological response to rainfall. Monitoring data on hydrological variables are most commonly available from rural and large areas, but less so from urban areas, and rarely from small catchments undergoing hydrological changes during the construction processes associated with urban development. Moreover, changes caused by urbanisation in the catchment hydrological response to snowmelt have not been widely studied. In this study, the changes occurring in runoff generation were monitored in a developing catchment under construction and in two urban control catchments. The developing catchment experienced an extreme change from forest to a suburban residential area. The data used included rainfall and runoff observations from a five-year period (the years 2001-2006) with 2- to 10-minute temporal resolution. In total, 636 and 239 individual runoff events were investigated for summer and winter conditions, respectively. The changes occurring in runoff event characteristics such as event runoff volumes, peak flow rates, mean runoff intensities, and volumetric runoff coefficients were identified by means of exploratory data analysis and nonparametric comparison tests (the Kruskal-Wallis and the Mann-Whitney tests). The effect of urbanization on event runoff dynamics was investigated using instantaneous unit hydrographs (IUH) based on a two-parameter gamma distribution. The measurements and data analyses demonstrated how the impact of urbanization on runoff was best detected based on peak flow rates, volumetric runoff coefficients, and mean runoff intensities. Control catchments were essential to distinguish the hydrological impact caused by catchment characteristics from those caused by changes in the meteorological conditions or season. As the imperviousness of the developing catchment increased from 1.5% to 37%, significant increases were observed in event runoff depths and peak flows during rainfall-runoff events. At the
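
    A short sketch of the two-parameter gamma instantaneous unit hydrograph referred to above, h(t) = t^(n-1) exp(-t/k) / (k^n Γ(n)), and its convolution with effective rainfall to produce an event hydrograph. Parameter names and the time discretization are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.stats import gamma

    def gamma_iuh(t, n, k):
        """Two-parameter gamma IUH: h(t) = t**(n-1) * exp(-t/k) / (k**n * Gamma(n))."""
        return gamma.pdf(t, a=n, scale=k)

    def direct_runoff(effective_rain, n, k, dt=10.0 / 60.0):
        """Convolve effective rainfall per time step with the discretized IUH
        (dt in hours; 10-minute steps assumed here) to get the event hydrograph."""
        t = np.arange(0.0, 48.0, dt)
        h = gamma_iuh(t, n, k) * dt
        return np.convolve(effective_rain, h)
    ```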

  12. Two Point Autocorrelation Analysis of Auger Highest Energy Events Backtracked in Galactic Magnetic Field

    NASA Astrophysics Data System (ADS)

    Petrov, Yevgeniy

    2009-10-01

    Searches for sources of the highest-energy cosmic rays traditionally have included looking for clusters of event arrival directions on the sky. The smallest cluster is a pair of events falling within some angular window. In contrast to the standard two-point (2-pt) autocorrelation analysis, this work takes into account the influence of the galactic magnetic field (GMF). The highest-energy events, those above 50 EeV, collected by the surface detector of the Pierre Auger Observatory between January 1, 2004 and May 31, 2009 are used in the analysis. Having assumed protons as primaries, events are backtracked through the BSS/S, BSS/A, ASS/S and ASS/A versions of the Harari-Mollerach-Roulet (HMR) model of the GMF. For each version of the model, a 2-pt autocorrelation analysis is applied to the backtracked events and to 10^5 isotropic Monte Carlo realizations weighted by the Auger exposure. Scans in energy, separation angular window and different model parameters reveal clustering at different angular scales. Small-angle clustering at 2-3 deg is particularly interesting and is compared between different field scenarios. The strength of the autocorrelation signal at those angular scales differs between the BSS and ASS versions of the HMR model. The BSS versions of the model tend to defocus protons as they arrive at Earth, whereas the ASS versions, on the contrary, tend to focus them.
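
    A bare-bones sketch of the 2-pt autocorrelation step: counting event pairs within an angular window and estimating the chance probability from isotropic Monte Carlo sets. The backtracking through the GMF is omitted; in the study the backtracked arrival directions would be supplied as input.

    ```python
    import numpy as np

    def pair_count(ra_deg, dec_deg, psi_max_deg):
        """Number of event pairs with angular separation below psi_max_deg."""
        ra, dec = np.radians(ra_deg), np.radians(dec_deg)
        xyz = np.array([np.cos(dec) * np.cos(ra), np.cos(dec) * np.sin(ra), np.sin(dec)])
        dots = xyz.T @ xyz
        iu = np.triu_indices(len(ra), k=1)
        return int(np.sum(dots[iu] >= np.cos(np.radians(psi_max_deg))))

    def chance_probability(ra_deg, dec_deg, psi_max_deg, mc_sets):
        """Fraction of isotropic Monte Carlo sets (same exposure, same number
        of events) with at least as many pairs as observed."""
        n_obs = pair_count(ra_deg, dec_deg, psi_max_deg)
        n_mc = np.array([pair_count(r, d, psi_max_deg) for r, d in mc_sets])
        return n_obs, float(np.mean(n_mc >= n_obs))
    ```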

  13. Analysis of suprathermal proton events observed by STEREO/PLASTIC focusing on the observation of bow shock/magnetospheric events

    NASA Astrophysics Data System (ADS)

    Barry, J. A.; Galvin, A. B.; Popecki, M.; Klecker, B.; Kucharek, H.; Simunac, K.; Farrugia, C. J.; Luhmann, J. G.; Jian, L. K.

    2013-06-01

    The topic of suprathermal and energetic ion events upstream of the Earth's bow shock has been investigated since the late 1960s. Over the past 50 years, these events have been characterized as having energies ranging from just above solar wind energies up to 2 MeV, time spans of minutes to hours, and particle distributions ranging from field-aligned to isotropic. The seed particles of these events, accelerated within the magnetosphere and/or at the Earth's bow shock, have been shown to be ions originating in the magnetosphere and solar wind, as well as ions energized in other heliospheric processes (such as solar energetic particle (SEP) events, corotating interaction regions (CIRs), pick-up ions, etc.). In this study we utilize STEREO/PLASTIC to examine bow shock/magnetospheric energetic proton events observed throughout 2007 in the region far upstream of the Earth's ion foreshock. To do this, we first employ an automated procedure to identify suprathermal proton events in the energy range of 4 keV up to 80 keV. The occurrence of events, magnetic connection to the Earth, and Compton-Getting-transformed energy spectra of 66 possible STA bow shock/magnetospheric events are investigated as a function of spacecraft-Earth separation.

  14. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...

  15. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    ERIC Educational Resources Information Center

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  16. Effects of Interventions on Relapse to Narcotics Addiction: An Event-History Analysis.

    ERIC Educational Resources Information Center

    Hser, Yih-Ing; And Others

    1995-01-01

    Event-history analysis was applied to the life history data of 581 male narcotics addicts to specify the concurrent, postintervention, and durational effects of social interventions on relapse to narcotics use. Results indicate the advisability of supporting methadone maintenance with other prevention strategies. (SLD)

  17. An Event History Analysis of Teacher Attrition: Salary, Teacher Tracking, and Socially Disadvantaged Schools

    ERIC Educational Resources Information Center

    Kelly, Sean

    2004-01-01

    In this event history analysis of the 1990-1991 Schools and Staffing Survey and the 1992 Teacher Follow-up Survey, a retrospective person-year database was constructed to examine teacher attrition over the course of the teaching career. Consistent with prior research, higher teacher salaries reduced attrition, but only slightly so. Teacher…

  18. Uniting Secondary and Postsecondary Education: An Event History Analysis of State Adoption of Dual Enrollment Policies

    ERIC Educational Resources Information Center

    Mokher, Christine G.; McLendon, Michael K.

    2009-01-01

    This study, as the first empirical test of P-16 policy antecedents, reports the findings from an event history analysis of the origins of state dual enrollment policies adopted between 1976 and 2005. First, what characteristics of states are associated with the adoption of these policies? Second, to what extent do conventional theories on policy…

  19. Analysis of adverse events of sunitinib in patients treated for advanced renal cell carcinoma

    PubMed Central

    Cedrych, Ida; Jasiówka, Marek; Niemiec, Maciej; Skotnicki, Piotr

    2016-01-01

    Introduction Treatment of the metastatic stage of renal cell carcinoma is specific because classical chemotherapy is not applicable here. The treatment is mainly based on molecularly targeted drugs, including inhibitors of tyrosine kinases. In many cases the therapy takes many months, and patients often report to general practitioners due to adverse events. In this article, the effectiveness and side effects of one of these drugs are presented. The aim of the study was to analyse the toxicity and safety of treatment with sunitinib malate in patients with clear cell renal cell carcinoma in the metastatic stage. Material and methods Adverse events were analyzed using a retrospective analysis of data collected in a group of 39 patients treated in the Department of Systemic and Generalized Malignancies in the Cancer Center in Krakow, Poland. Results Toxicity of treatment affected 50% of patients. The most common side effects observed were hypertension, thrombocytopenia, stomatitis, diarrhea and weakness. Grade 3 serious adverse events according to the Common Terminology Criteria for Adverse Events (CTCAE) version 4 affected up to 10% of patients. The most common serious adverse events were hypertension and fatigue. Conclusions Sunitinib malate is characterized by a particular type of toxicity. Knowledge of the types and range of adverse events of this drug is an important part of oncological and internal medicine care. PMID:27186181

  20. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.

    2009-04-01

    This is a report for task one of the tail event analysis project for BPA. A tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, and the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; it is therefore referred to as a tail event. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) Large mismatch between generation and load can be caused by load forecast error, wind forecast error and generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements. The resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and wind can both become the cause leading to serious issues; (4) A look-ahead tool evaluating the system balancing requirement during real-time operations and comparing it with available system resources should be very helpful to system operators in anticipating similar events and planning ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.

  1. Analysis of extreme top event frequency percentiles based on fast probability integration

    SciTech Connect

    Staple, B.; Haskin, F.E.

    1993-10-01

    In risk assessments, a primary objective is to determine the frequency with which a collection of initiating and basic events, E_e, leads to some undesired top event, T. Uncertainties in the occurrence rates, x_t, assigned to the initiating and basic events cause uncertainty in the top event frequency, z_T. The quantification of the uncertainty in z_T is an essential part of risk assessment called uncertainty analysis. In the past, it has been difficult to evaluate the extreme percentiles of output variables like z_T. Analytic methods such as the method of moments do not provide estimates of output percentiles, and the Monte Carlo (MC) method can be used to estimate extreme output percentiles only by resorting to large sample sizes. A promising alternative to these methods is the fast probability integration (FPI) methods. These methods approximate the integrals of multivariate functions, representing percentiles of interest, without recourse to multi-dimensional numerical integration. FPI methods give precise results and have been demonstrated to be more efficient than MC methods for estimating extreme output percentiles. FPI allows the analyst to choose extreme percentiles of interest and perform sensitivity analyses in those regions. Such analyses can provide valuable insights as to the events driving the top event frequency response in extreme probability regions. In this paper, FPI methods are adapted (a) to precisely estimate extreme top event frequency percentiles and (b) to allow the quantification of sensitivity measures at these extreme percentiles. In addition, the relative precision and efficiency of alternative methods for treating lognormally distributed inputs is investigated. The methodology is applied to the top event frequency expression for the dominant accident sequence from a risk assessment of the Grand Gulf nuclear power plant.
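
    FPI itself replaces sampling with an analytic approximation of the relevant probability integrals; for contrast, the sketch below shows the brute-force Monte Carlo estimate of the same quantity, an extreme percentile of a top event frequency z_T built from a hypothetical product of lognormal basic-event rates.

    ```python
    import numpy as np

    def mc_extreme_percentile(n_samples=2_000_000, percentile=99.9, seed=0):
        """Brute-force Monte Carlo estimate of an extreme percentile of a top
        event frequency z_T; here z_T is a hypothetical product of two
        lognormal basic-event rates. FPI targets the same percentile without
        this sampling cost."""
        rng = np.random.default_rng(seed)
        x1 = rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=n_samples)
        x2 = rng.lognormal(mean=np.log(5e-4), sigma=0.8, size=n_samples)
        z_t = x1 * x2
        return np.percentile(z_t, percentile)
    ```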

  2. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.

  3. Analysis of the longitudinal dependence of the downstream fluence of large solar energetic proton events

    NASA Astrophysics Data System (ADS)

    Pacheco, Daniel; Sanahuja, Blai; Aran, Angels; Agueda, Neus; Jiggens, Piers

    2016-07-01

    Simulations of the solar energetic particle (SEP) intensity-time profiles are needed to estimate the radiation environment for interplanetary missions. At present, the physics-based models applied for such a purpose, and including a moving source of particles, are not able to model the portion of the SEP intensity enhancement occurring after the coronal/interplanetary shock crossing by the observer (a.k.a. the downstream region). This is the case, for example, of the shock-and-particle model used to build the SOLPENCO2 code. SOLPENCO2 provides the statistical modelling tool developed in the ESA/SEPEM project for interplanetary missions, with synthetic SEP event simulations for virtual spacecraft located at heliocentric distances between 0.2 AU and 1.6 AU (http://dev.sepem.oma.be/). In this work we present an analysis of 168 individual SEP events observed at 1 AU from 1988 to 2013. We identify the solar eruptive phenomena associated with these SEP events, as well as the in-situ passage of interplanetary shocks. For each event, we quantify the amount of fluence accounted for in the downstream region, i.e. after the passage of the shock, in the 11 SEPEM reference energy channels (i.e., from 5 to 300 MeV protons). First, from the subset of SEP events simultaneously detected by near-Earth spacecraft (using SEPEM reference data) and by one of the STEREO spacecraft, we select those events for which the downstream region can be clearly determined. From the 8 selected multi-spacecraft events, we find that the western observations of each event have a smaller downstream contribution than their eastern counterparts, and that the downstream-to-total fluence ratio of these events decreases as a function of energy. Hence, there is a variation of the downstream fluence with heliolongitude in SEP events. Based on this result, we study the variation of the downstream-to-total fluence ratios of the total set of individual events. We confirm the eastern-to-western decrease of the
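
    A small sketch of the downstream-to-total fluence ratio computed per energy channel, assuming a time series of proton intensity and a known shock arrival time; variable names and units are illustrative.

    ```python
    import numpy as np

    def downstream_fluence_ratio(times_s, intensity, shock_time_s):
        """Downstream-to-total fluence ratio for one energy channel: the
        time-integrated intensity after shock passage divided by the integral
        over the whole event."""
        total = np.trapz(intensity, times_s)
        after = times_s >= shock_time_s
        downstream = np.trapz(intensity[after], times_s[after])
        return downstream / total
    ```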

  4. Semiparametric Transformation Models with Random Effects for Joint Analysis of Recurrent and Terminal Events

    PubMed Central

    Zeng, Donglin; Lin, D. Y.

    2011-01-01

    Summary We propose a broad class of semiparametric transformation models with random effects for the joint analysis of recurrent events and a terminal event. The transformation models include proportional hazards/intensity and proportional odds models. We estimate the model parameters by the nonparametric maximum likelihood approach. The estimators are shown to be consistent, asymptotically normal, and asymptotically efficient. Simple and stable numerical algorithms are provided to calculate the parameter estimators and to estimate their variances. Extensive simulation studies demonstrate that the proposed inference procedures perform well in realistic settings. Applications to two HIV/AIDS studies are presented. PMID:18945267

  5. Characterization and Analysis of Networked Array of Sensors for Event Detection (CANARY-EDS)

    Energy Science and Technology Software Center (ESTSC)

    2011-05-27

    CANARY-EDS provides probabilistic event detection based on analysis of time-series data from water quality or other sensors. CANARY also can compare patterns against a library of previously seen data to indicate that a certain pattern has reoccurred, suppressing what would otherwise be considered an event. CANARY can be configured to analyze previously recorded data from files or databases, or it can be configured to run in real-time mode directly from a database, or through the US EPA EDDIES software.

  6. Analysis of Suprathermal Events Observed by STEREO/PLASTIC with a Focus on Upstream/Magnetospheric Events

    NASA Astrophysics Data System (ADS)

    Barry, J. A.; Galvin, A. B.; Popecki, M.; Ellis, L.; Klecker, B.; Lee, M. A.; Kucharek, H.

    2010-05-01

    The topic of suprathermal and energetic ion events upstream of the Earth's bow shock has been a topic of investigation since the late 1960s. Over the past 50 years these events have been characterized as having energies ranging from just above solar wind energies up to 2 MeV, time spans of minutes to hours, and particle distribution functions ranging from field-aligned to isotropic. The possible sources of these ions include magnetospheric ions and solar wind ions accelerated between the Earth's bow shock and low-frequency large-amplitude waves in the ion foreshock. Also, energetic ions from other heliospheric processes (such as Solar Energetic Particle (SEP) events or Corotating Interaction Regions (CIRs)) can be further accelerated at the Earth's bow shock. Utilizing the particularly quiet solar minimum and the unique orbit of STEREO-A (STA), drifting ahead of the Earth in its heliocentric orbit, we are able to examine field-aligned upstream/magnetospheric energetic ion events in the previously unexamined region far upstream of the Earth's ion foreshock. Using both the PLASTIC and IMPACT instruments on board STA we have examined protons throughout 2007 in the energy range of 4 keV up to 80 keV. We find that the occurrence of automatically defined suprathermal events falls off with increasing STA-Earth separation. More importantly, it is shown through a crude approximation of the magnetic field via the Parker spiral that after a STA-Earth separation of about 3000 Re it is unlikely that the Earth and STA will be magnetically connected. This corresponds well with the observed cutoff in the occurrence of suprathermal events with field-aligned anisotropies. The detection of upstream/magnetospheric events at these large distances from the Earth's bow shock indicates that the ions propagate relatively scatter-free beyond the ion foreshock.

  7. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest, a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  8. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized into four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.

  9. Signature Based Detection of User Events for Post-mortem Forensic Analysis

    NASA Astrophysics Data System (ADS)

    James, Joshua Isaac; Gladyshev, Pavel; Zhu, Yuandong

    This paper introduces a novel approach to user event reconstruction by showing the practicality of generating and implementing signature-based analysis methods to reconstruct high-level user actions from a collection of low-level traces found during a post-mortem forensic analysis of a system. Traditional forensic analysis, and the inferences an investigator normally makes when given digital evidence, are examined. It is then demonstrated that this natural process of inferring high-level events from low-level traces may be encoded using signature-matching techniques. Simple signatures using the defined method are created and applied to three popular Windows-based programs as a proof of concept.

  10. A canonical correlation analysis based method for contamination event detection in water sources.

    PubMed

    Li, Ruonan; Liu, Shuming; Smith, Kate; Che, Han

    2016-06-15

    In this study, a general framework integrating a data-driven estimation model is employed for contamination event detection in water sources. Sequential canonical correlation coefficients are updated in the model using multivariate water quality time series. The proposed method utilizes canonical correlation analysis for studying the interplay between two sets of water quality parameters. The model is assessed by precision, recall and F-measure. The proposed method is tested using data from a laboratory contaminant injection experiment. The proposed method could detect a contamination event 1 minute after the introduction of a 1.600 mg l^(-1) acrylamide solution. With optimized parameter values, the proposed method can correctly detect 97.50% of all contamination events with no false alarms. The robustness of the proposed method can be explained using the Bauer-Fike theorem. PMID:27264637
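
    An illustrative sketch of the idea: compute the leading canonical correlation between two groups of water quality parameters over successive windows and flag windows where it departs from its usual value. The block-wise windowing and the threshold are assumptions; the paper updates the coefficients sequentially rather than per block.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    def first_canonical_corr(X, Y):
        """Leading canonical correlation between two groups of water quality
        parameters (columns of X and Y) over one time window."""
        u, v = CCA(n_components=1).fit_transform(X, Y)
        return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

    def flag_windows(X, Y, window=60, threshold=0.5):
        """Flag windows whose canonical correlation drops below a threshold
        (illustrative stand-in for the sequential update scheme)."""
        flags = []
        for start in range(0, len(X) - window + 1, window):
            r = first_canonical_corr(X[start:start + window], Y[start:start + window])
            flags.append((start, r < threshold))
        return flags
    ```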

  11. The Logic of Surveillance Guidelines: An Analysis of Vaccine Adverse Event Reports from an Ontological Perspective

    PubMed Central

    Courtot, Mélanie; Brinkman, Ryan R.; Ruttenberg, Alan

    2014-01-01

    Background When increased rates of adverse events following immunization are detected, regulatory action can be taken by public health agencies. However, to be interpreted, reports of adverse events must be encoded in a consistent way. Regulatory agencies rely on guidelines to help determine the diagnosis of the adverse events. Manual application of these guidelines is expensive, time consuming, and open to logical errors. Representing these guidelines in a format amenable to automated processing can make this process more efficient. Methods and Findings Using the Brighton anaphylaxis case definition, we show that existing clinical guidelines used as standards in pharmacovigilance can be logically encoded using a formal representation such as the Adverse Event Reporting Ontology we developed. We validated the classification of vaccine adverse event reports using the ontology against existing rule-based systems and a manually curated subset of the Vaccine Adverse Event Reporting System. However, we encountered a number of critical issues in the formulation and application of the clinical guidelines. We report these issues and the steps being taken to address them in current surveillance systems, and in the terminological standards in use. Conclusions By standardizing and improving the reporting process, we were able to automate diagnosis confirmation. By allowing medical experts to prioritize reports, such a system can accelerate the identification of adverse reactions to vaccines and the response of regulatory agencies. This approach of combining ontology and semantic technologies can be used to improve other areas of vaccine adverse event report analysis and should inform both the design of clinical guidelines and how they are used in the future. Availability Sufficient material to reproduce our results is available, including documentation, ontology, code and datasets, at http://purl.obolibrary.org/obo/aero. PMID:24667848

  12. Analysis and visualization of single-trial event-related potentials

    NASA Technical Reports Server (NTRS)

    Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.

    2001-01-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image
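
    A condensed sketch of the two ingredients described above, ICA decomposition of single-trial epochs and an "ERP image" (single trials stacked as rows, sorted by a behavioral or physiological variable) for one component. Preprocessing, artifact handling and component selection, which the study treats carefully, are omitted; function and parameter names are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    def ica_erp_image(epochs, sort_values, component=0):
        """Decompose single-trial EEG epochs with ICA and build an ERP image
        for one component.

        epochs: array (n_trials, n_channels, n_samples)
        sort_values: one value per trial used to order the image rows
        """
        n_trials, n_ch, n_samp = epochs.shape
        data = np.transpose(epochs, (1, 0, 2)).reshape(n_ch, -1)    # channels x time
        sources = FastICA(n_components=n_ch, random_state=0).fit_transform(data.T).T
        comp = sources[component].reshape(n_trials, n_samp)
        return comp[np.argsort(sort_values)]                        # rows = sorted trials
    ```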

  13. Analysis and visualization of single-trial event-related potentials.

    PubMed

    Jung, T P; Makeig, S; Westerfield, M; Townsend, J; Courchesne, E; Sejnowski, T J

    2001-11-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image

  14. Determination of Main Periodicities in Solar Wind and Magnetosphere Data During HILDCAAs Events Using Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    de Souza, A. M.; Echer, E.; Bolzam, M. J. A.

    2015-12-01

    High-Intensity Long-Duration Continuous AE Activity events (HILDCAAs) were first identified by Tsurutani and Gonzalez (1987), when they studied geomagnetic storms with a recovery phase longer than what is generally observed. They used four criteria to define HILDCAA events: first, the AE index must reach at least 1000 nT at some point during the event; second, the event must be at least two days long; third, the AE index cannot drop below 200 nT for longer than two hours at a time; finally, the event must occur outside of the main phase of a geomagnetic storm. Although several works on HILDCAAs have been published recently, the main periodicities in solar wind and magnetosphere parameters during these events are still not well known. The aim of this work is to determine these periods. In order to conduct this study, the global wavelet spectrum was used to determine the main periods of HILDCAA events. The 1-minute AE index and the Bz component of the interplanetary magnetic field (IMF) were used to characterize the magnetosphere and the solar wind. We have used data from events that occurred between 1975 and 2011 for the AE index, and between 1995 and 2011 for the Bz component of the IMF (GSE and GSM coordinate systems). During HILDCAA events, the main periods found in the AE index were between 4 and 12 hours, corresponding to 50% of the total periods identified. For the Bz component, the main periods were ≤ 8 hours, independently of the coordinate system used. We conjecture that those periods can be associated with Alfvén waves, which present periods between 1 and 10 hours. These Alfvén waves are associated with coronal holes, because HILDCAA events occur more often in the descending phase of solar cycles, when high-speed streams emitted from coronal holes are dominant. Cross-wavelet analysis results between IMF Bz and AE are also presented and discussed.
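
    A small sketch of a global wavelet spectrum (time-averaged wavelet power per scale) for a 1-min series such as the AE index or IMF Bz, from which dominant periods can be read off as peaks. The wavelet choice (Morlet), the scale grid and the PyWavelets call are illustrative assumptions.

    ```python
    import numpy as np
    import pywt

    def global_wavelet_spectrum(signal, dt_minutes=1.0, max_period_hours=24.0):
        """Return periods [hours] and time-averaged wavelet power for a
        1-min time series; peaks indicate the dominant periodicities."""
        dt = dt_minutes / 60.0                                   # hours
        scales = np.geomspace(2.0, max_period_hours / dt, 150)
        coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=dt)
        power = np.abs(coeffs)**2
        return 1.0 / freqs, power.mean(axis=1)                   # periods, mean power
    ```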

  15. Final Report for Dynamic Models for Causal Analysis of Panel Data. Dynamic Analysis of Event Histories. Part III, Chapter 1.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; Hannan, Michael T.

    The document, part of a series of chapters described in SO 011 759, examines sociological research methods for the study of change. The advantages and procedures for dynamic analysis of event-history data (data giving the number, timing, and sequence of changes in a categorical dependent variable) are considered. The authors argue for grounding…

  16. Digital Instrumentation and Control Failure Events Derivation and Analysis by Frame-Based Technique

    SciTech Connect

    Hui-Wen Huang; Chunkuan Shih; Swu Yih; Yen-Chang Tzeng; Ming-Huei Chen

    2006-07-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure event derivation and analysis for a generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes (1) software CMF analysis for the major digital control systems; and (2) derivation of postulated ABWR digital I and C software failure events from actual non-ABWR digital I and C software failure events, which were reported to the LER of the USNRC or the IRS of the IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  17. Using Simple Statistical Analysis of Historical Data to Understand Wind Ramp Events

    SciTech Connect

    Kamath, C

    2010-01-29

    As renewable resources start providing an increasingly larger percentage of our energy needs, we need to improve our understanding of these intermittent resources so we can manage them better. In the case of wind resources, large unscheduled changes in the energy output, called ramp events, make it challenging to keep the load and the generation balanced. In this report, we show that simple statistical analysis of the historical data on wind energy generation can provide insights into these ramp events. In particular, this analysis can help answer questions such as the time period during the day when these events are likely to occur, the relative severity of positive and negative ramps, and the frequency of their occurrence. As there are several ways in which ramp events can be defined and counted, we also conduct a detailed study comparing different options. Our results indicate that the statistics are relatively insensitive to these choices, but depend on utility-specific factors, such as the magnitude of the ramp and the time interval over which this change occurs. These factors reflect the challenges faced by schedulers and operators in keeping the load and generation balanced and can change over the years. We conduct our analysis using data from wind farms in the Tehachapi Pass region in Southern California and the Columbia Basin region in Northern Oregon; while the results for other regions are likely to be different, the report describes the benefits of conducting simple statistical analysis on wind generation data and the insights that can be gained through such analysis.
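
    A minimal sketch of the kind of simple statistics the report describes: flag ramps as changes exceeding a magnitude threshold over a chosen time interval, then tabulate them by hour of day and sign. The threshold and window are utility-specific choices, as the report emphasizes; names below are illustrative.

    ```python
    import numpy as np
    import pandas as pd

    def ramp_table(power, steps_per_window=6, threshold_mw=200.0):
        """Tabulate wind ramp events by hour of day and sign. A ramp is defined
        here as a change in generation larger than `threshold_mw` over
        `steps_per_window` samples (e.g. six 10-minute samples = 1 hour).

        power: pandas Series of wind generation [MW] with a DatetimeIndex.
        """
        delta = power.diff(periods=steps_per_window).dropna()
        ramps = delta[delta.abs() >= threshold_mw]
        table = pd.DataFrame({"hour": ramps.index.hour,
                              "sign": np.where(ramps > 0, "up", "down")})
        return table.groupby(["hour", "sign"]).size().unstack(fill_value=0)
    ```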

  18. Ontology-based time information representation of vaccine adverse events in VAERS for temporal analysis

    PubMed Central

    2012-01-01

    Background The U.S. FDA/CDC Vaccine Adverse Event Reporting System (VAERS) provides a valuable data source for post-vaccination adverse event analyses. The structured data in the system has been widely used, but the information in the write-up narratives is rarely included in these kinds of analyses. In fact, the unstructured nature of the narratives makes the data embedded in them difficult to be used for any further studies. Results We developed an ontology-based approach to represent the data in the narratives in a “machine-understandable” way, so that it can be easily queried and further analyzed. Our focus is the time aspect in the data for time trending analysis. The Time Event Ontology (TEO), Ontology of Adverse Events (OAE), and Vaccine Ontology (VO) are leveraged for the semantic representation of this purpose. A VAERS case report is presented as a use case for the ontological representations. The advantages of using our ontology-based Semantic web representation and data analysis are emphasized. Conclusions We believe that representing both the structured data and the data from write-up narratives in an integrated, unified, and “machine-understandable” way can improve research for vaccine safety analyses, causality assessments, and retrospective studies. PMID:23256916

  19. Efficacy and adverse events of cold vs hot polypectomy: A meta-analysis

    PubMed Central

    Fujiya, Mikihiro; Sato, Hiroki; Ueno, Nobuhiro; Sakatani, Aki; Tanaka, Kazuyuki; Dokoshi, Tatsuya; Fujibayashi, Shugo; Nomura, Yoshiki; Kashima, Shin; Gotoh, Takuma; Sasajima, Junpei; Moriichi, Kentaro; Watari, Jiro; Kohgo, Yutaka

    2016-01-01

    AIM: To compare previously reported randomized controlled studies (RCTs) of cold and hot polypectomy, we systematically reviewed them to clarify the utility of cold polypectomy over hot polypectomy with respect to efficacy and adverse events. METHODS: A meta-analysis was conducted to evaluate the relative merits of cold and hot polypectomy for removing colon polyps. Published articles and abstracts from worldwide conferences were searched using the keywords “cold polypectomy”. RCTs that compared either or both the effects or adverse events of cold polypectomy with those of hot polypectomy were collected. The patients’ demographics, endoscopic procedures, No. of examined lesions, lesion size, macroscopic and histologic findings, rates of incomplete resection, bleeding amount, perforation, and length of procedure were extracted from each study. A forest plot analysis was used to verify the relative strength of the effects and adverse events of each procedure. A funnel plot was generated to assess the possibility of publication bias. RESULTS: Ultimately, six RCTs were selected. No significant differences were noted in the average lesion size (less than 10 mm) between the cold and hot polypectomy groups in each study. Further, the rates of complete resection and adverse events, including delayed bleeding, did not differ markedly between cold and hot polypectomy. The average procedural time in the cold polypectomy group was significantly shorter than in the hot polypectomy group. CONCLUSION: Cold polypectomy is a time-saving procedure for removing small polyps, with curability and safety markedly similar to those of hot polypectomy. PMID:27340361

  20. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    NASA Astrophysics Data System (ADS)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events and droughts or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal forecast prediction systems such as the UK MetOffice GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged yet. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in the world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations
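
    A common attribution statistic for such paired "as observed" and "world that might have been" ensembles is the ratio of exceedance probabilities; the sketch below (Python/NumPy) uses that generic measure with an assumed threshold and synthetic ensembles, not WWA output.

      import numpy as np

      def probability_ratio(factual, counterfactual, threshold):
          """Fraction of ensemble members exceeding `threshold` in the factual
          (observed-forcing) world divided by the same fraction in the
          counterfactual (natural-forcing) world."""
          p1 = np.mean(np.asarray(factual) >= threshold)
          p0 = np.mean(np.asarray(counterfactual) >= threshold)
          return p1 / p0 if p0 > 0 else np.inf

      # Illustrative ensembles of, e.g., seasonal-maximum temperature anomalies
      rng = np.random.default_rng(1)
      actual = rng.normal(1.0, 0.8, 10000)       # with anthropogenic forcing
      natural = rng.normal(0.0, 0.8, 10000)      # anthropogenic forcing removed
      print(probability_ratio(actual, natural, threshold=2.0))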

  1. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2016-06-01

    Mine systems such as the ventilation system, strata support system, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is preferably done during the planning and design stage. However, the existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system either at the component or the system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
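
    A minimal sketch of how fault-tree top-event probabilities can feed an event-tree branch calculation is shown below (Python); the gate structure and all probabilities are hypothetical and are not taken from the paper's mine-methane case study.

      # Minimal fault-tree / event-tree sketch with hypothetical probabilities.
      def or_gate(*p):
          """Probability that at least one independent basic event occurs."""
          q = 1.0
          for pi in p:
              q *= (1.0 - pi)
          return 1.0 - q

      def and_gate(*p):
          """Probability that all independent basic events occur."""
          q = 1.0
          for pi in p:
              q *= pi
          return q

      # Top hazard probabilities from two small fault trees (assumed values)
      p_ventilation_fail = or_gate(1e-3, 5e-4)     # fan failure OR duct blockage
      p_ignition = and_gate(2e-2, 1e-2)            # spark AND flammable mixture

      # Event tree: initiating event followed by failure of both barriers
      p_initiator = 1e-2
      p_explosion = p_initiator * p_ventilation_fail * p_ignition
      print(f"top hazard probability per demand: {p_explosion:.2e}")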

  2. Through the eyes of the other: using event analysis to build cultural competence.

    PubMed

    Kozub, Mary L

    2013-07-01

    Cultural competence requires more than the accumulation of information about cultural groups. An awareness of the nurse's own culture, beliefs, and values is considered by several transcultural nursing theorists to be essential to the development of cultural competence and the provision of quality patient care. Using Transformational Learning Theory, this article describes event analysis, an active learning tool that uses the nurse's own practice to explore multiple perspectives of an experience, with the goal of transforming the nurse's approach to diversity from an ethnocentric stance, to one of tolerance and consideration for the patient's needs, values, and beliefs with regard to quality of care. Furthermore, the application of the event analysis to multiple settings, including inpatient, educational, and administrative environments, is discussed. PMID:23545698

  3. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617

  4. Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.

    2010-12-01

    Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick-slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault elements show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length. Results are tabulated and then differenced with an expected correlation
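
    The windowed-association idea described above can be sketched directly: for each slip event on element A, check for an event on element B within a specified time window (Python/NumPy; the synthetic catalogs and window length are illustrative, not Virtual California output).

      import numpy as np

      def associated_fraction(times_a, times_b, window):
          """Fraction of events on element A that are followed by an event on
          element B within `window` time units (simple windowed association)."""
          times_b = np.sort(np.asarray(times_b))
          hits = 0
          for t in times_a:
              j = np.searchsorted(times_b, t)
              if j < len(times_b) and times_b[j] - t <= window:
                  hits += 1
          return hits / len(times_a)

      # Illustrative synthetic slip-time catalogs (years) for two fault elements
      rng = np.random.default_rng(2)
      a = np.sort(rng.uniform(0, 1000, 200))
      b = np.sort(rng.uniform(0, 1000, 180))
      print(associated_fraction(a, b, window=1.0))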

  5. Identification and Analysis of Storm Tracks Associated with Extreme Flood Events in Southeast and South Brazil

    NASA Astrophysics Data System (ADS)

    Lima, Carlos; Lopes, Camila

    2015-04-01

    Floods are the main natural disaster in Brazil, affecting practically all regions of the country and causing extensive economic damage and loss of life. In traditional hydrology, the study of floods is focused on a frequency analysis of the extreme events and on the fit of statistical models to define flood quantiles associated with pre-specified return periods or exceedance probabilities. The basic assumptions are randomness and temporal stationarity of the streamflow data. In this paper we seek to advance the traditional flood frequency studies by using the ideas developed in the area of flood hydroclimatology, which is defined as the study of climate in the flood framework, i.e., the understanding of long term changes in the frequency, magnitude, duration, location and seasonality of floods as driven by the interaction of regional and global patterns of the ocean and atmospheric circulation. That being said, flood events are not treated as random and stationary but as resulting from a causal chain, where exceptional floods in basins of different sizes are related to large-scale anomalies in the atmospheric and ocean circulation patterns. Hence, such studies enrich the classical assumption of stationary flood hazard adopted in most flood frequency studies through a formal consideration of the physical mechanisms responsible for the generation of extreme floods, which implies recognizing the natural climate variability due to persistent and oscillatory regimes (e.g. ENSO, NAO, PDO) in many temporal scales (interannual, decadal, etc), and climate fluctuations in response to anthropogenic changes in the atmosphere, soil use and vegetation cover. Under this framework and based on streamflow gauge and reanalysis data, we identify and analyze here the storm tracks that preceded extreme events of floods in key flood-prone regions of the country (e.g. Parana and Rio Doce River basins) with such events defined based on the magnitude, duration and volume of the

  6. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    PubMed Central

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H.; Madigan, David; Ryan, Patrick; Friedman, Carol

    2013-01-01

    Introduction Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis. PMID:22549283

  7. Novel data-mining methodologies for adverse drug event discovery and analysis.

    PubMed

    Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

    2012-06-01

    An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Datamining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis. PMID:22549283

  8. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the

  9. Analysis of the Impact of Climate Change on Extreme Hydrological Events in California

    NASA Astrophysics Data System (ADS)

    Ashraf Vaghefi, Saeid; Abbaspour, Karim C.

    2016-04-01

    Estimating magnitude and occurrence frequency of extreme hydrological events is required for taking preventive remedial actions against the impact of climate change on the management of water resources. Examples include the characterization of extreme rainfall events to predict urban runoff, determination of river flows, and the likely severity of drought events during the design life of a water project. In recent years California has experienced its most severe drought in recorded history, causing water stress, economic loss, and an increase in wildfires. In this paper we describe development of a Climate Change Toolkit (CCT) and demonstrate its use in the analysis of dry and wet periods in California for the years 2020-2050 and compare the results with the historic period 1975-2005. CCT provides four modules to: i) manage big databases such as those of Global Climate Models (GCMs), ii) perform bias correction using observed local climate data, iii) interpolate gridded climate data to finer resolution, and iv) calculate continuous dry- and wet-day periods based on rainfall, temperature, and soil moisture for analysis of drought and flooding risks. We used bias-corrected meteorological data of five GCMs for the extreme CO2 emission scenario rcp8.5 for California to analyze the trend of extreme hydrological events. The findings indicate that the frequency of dry periods will increase in the central and southern parts of California. The assessment of the number of wet days and the frequency of wet periods suggests an increased risk of flooding in the northern and north-western parts of California, especially in the coastal strip. Keywords: Climate Change Toolkit (CCT), Extreme Hydrological Events, California
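
    The dry- and wet-period module amounts to run-length analysis of a daily series; a minimal sketch is given below (Python/NumPy), where the 1 mm/day wet-day threshold and the synthetic rainfall series are assumptions for illustration, not the toolkit's settings.

      import numpy as np

      def spell_lengths(rain_mm, wet_threshold=1.0):
          """Return lists of consecutive dry-day and wet-day run lengths from a
          daily rainfall series (the wet-day threshold is an assumed convention)."""
          wet = np.asarray(rain_mm) >= wet_threshold
          dry_runs, wet_runs, run = [], [], 1
          for prev, cur in zip(wet[:-1], wet[1:]):
              if cur == prev:
                  run += 1
              else:
                  (wet_runs if prev else dry_runs).append(run)
                  run = 1
          (wet_runs if wet[-1] else dry_runs).append(run)
          return dry_runs, wet_runs

      rng = np.random.default_rng(3)
      daily_rain = rng.gamma(0.4, 5.0, 365)          # synthetic daily rainfall (mm)
      dry, wet = spell_lengths(daily_rain)
      print(max(dry), max(wet))                      # longest dry and wet spells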

  10. Analysis on Outcome of 3537 Patients with Coronary Artery Disease: Integrative Medicine for Cardiovascular Events

    PubMed Central

    Gao, Zhu-ye; Qiu, Yu; Jiao, Yang; Shang, Qing-hua; Shi, Da-zhuo

    2013-01-01

    Aims. To investigate the treatment of hospitalized patients with coronary artery disease (CAD) and the prognostic factors in Beijing, China. Materials and Methods. A multicenter prospective study was conducted through an integrative clinical and research platform at 12 hospitals in Beijing, China. The clinical information of 3537 hospitalized patients with CAD was collected from September 2009 to May 2011, and the efficacy of secondary prevention during one-year followup was evaluated. In addition, a logistic regression analysis was performed to identify factors with an independent impact on the prognosis. Results. The average age of all patients was 64.88 ± 11.97 years. Of them, 65.42% were males. The medicines for patients were as follows: antiplatelet drugs accounting for 91.97%, statins accounting for 83.66%, β-receptor blockers accounting for 72.55%, ACEI/ARB accounting for 58.92%, and revascularization (including PCI and CABG) accounting for 40.29%. The overall incidence of cardiovascular events was 13.26% (469/3537). The logistic stepwise regression analysis showed that heart failure (OR, 3.707, 95% CI = 2.756–4.986), age ≥ 65 years old (OR, 2.007, 95% CI = 1.587–2.53), and myocardial infarction (OR, 1.649, 95% CI = 1.322–2.057) were independent risk factors for cardiovascular events during the one-year followup period. Integrative medicine (IM) therapy showed a beneficial tendency toward decreasing the incidence of cardiovascular events, although no statistical significance was found (OR, 0.797, 95% CI = 0.613–1.036). Conclusions. Heart failure, age ≥ 65 years old, and myocardial infarction were associated with an increased incidence of cardiovascular events, and treatment with IM showed a tendency for decreasing incidence of cardiovascular events. PMID:23983773

  11. Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes

    NASA Astrophysics Data System (ADS)

    Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana

    2015-04-01

    The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, a usual tool in Statistical Process Control (SPC) but an unusual one in climate studies. Here, series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to determine whether or not there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme and normal observed rainfall days. The autocorrelation amongst maximum precipitation is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate the attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes along the rest of the years under study can then be monitored with such attributes control charts. The results of the application of this methodology show evidence of change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
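
    A classical attributes (p-type) control chart for an annual fraction of extreme days can be sketched as below (Python/NumPy). Note that these plain binomial limits ignore the autocorrelation that the paper's New Binomial Markov Extended Process is designed to capture, so this is only the standard SPC baseline; the baseline fraction is an assumed value.

      import numpy as np

      def p_chart_limits(p_bar, n, k=3.0):
          """Centre line and k-sigma control limits for a fraction-nonconforming
          (p) chart with subgroup size n (classical binomial approximation)."""
          sigma = np.sqrt(p_bar * (1.0 - p_bar) / n)
          lcl = max(0.0, p_bar - k * sigma)
          ucl = min(1.0, p_bar + k * sigma)
          return lcl, p_bar, ucl

      # Illustrative: fraction of extreme rainfall days per year, n = 365 days,
      # with a baseline fraction assumed from a 30-year calibration period.
      print(p_chart_limits(p_bar=0.02, n=365))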

  12. Cardiovascular Events Following Smoke-Free Legislations: An Updated Systematic Review and Meta-Analysis

    PubMed Central

    Jones, Miranda R.; Barnoya, Joaquin; Stranges, Saverio; Losonczy, Lia; Navas-Acien, Ana

    2014-01-01

    Background Legislations banning smoking in indoor public places and workplaces are being implemented worldwide to protect the population from secondhand smoke exposure. Several studies have reported reductions in hospitalizations for acute coronary events following the enactment of smoke-free laws. Objective We set out to conduct a systematic review and meta-analysis of epidemiologic studies examining how legislations that ban smoking in indoor public places impact the risk of acute coronary events. Methods We searched MEDLINE, EMBASE, and relevant bibliographies including previous systematic reviews for studies that evaluated changes in acute coronary events, following implementation of smoke-free legislations. Studies were identified through December 2013. We pooled relative risk (RR) estimates for acute coronary events comparing post- vs. pre-legislation using inverse-variance weighted random-effects models. Results Thirty-one studies providing estimates for 47 locations were included. The legislations were implemented between 1991 and 2010. Following the enactment of smoke-free legislations, there was a 12 % reduction in hospitalizations for acute coronary events (pooled RR: 0.88, 95 % CI: 0.85–0.90). Reductions were 14 % in locations that implemented comprehensive legislations compared to an 8 % reduction in locations that only had partial restrictions. In locations with reductions in smoking prevalence post-legislation above the mean (2.1 % reduction) there was a 14 % reduction in events compared to 10 % in locations below the mean. The RRs for acute coronary events associated with enacting smoke-free legislation were 0.87 vs. 0.89 in locations with smoking prevalence pre-legislation above and below the mean (23.1 %), and 0.87 vs. 0.89 in studies from the Americas vs. other regions. Conclusion The implementation of smoke-free legislations was related to reductions in acute coronary event hospitalizations in most populations evaluated. Benefits are greater
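
    The pooling described (inverse-variance weighted random-effects models on relative risks) is commonly implemented with the DerSimonian-Laird estimator on the log scale; the sketch below (Python/NumPy) uses that standard estimator with made-up study-level RRs, not the 31 studies in the review.

      import numpy as np

      def random_effects_pooled_rr(rr, ci_low, ci_high):
          """DerSimonian-Laird random-effects pooling of relative risks given
          point estimates and 95% confidence intervals."""
          y = np.log(rr)                                      # log relative risks
          se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
          w = 1.0 / se**2                                     # fixed-effect weights
          q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)  # Cochran's Q
          c = np.sum(w) - np.sum(w**2) / np.sum(w)
          tau2 = max(0.0, (q - (len(y) - 1)) / c)             # between-study variance
          w_star = 1.0 / (se**2 + tau2)                       # random-effects weights
          mu = np.sum(w_star * y) / np.sum(w_star)
          se_mu = np.sqrt(1.0 / np.sum(w_star))
          return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu)

      # Illustrative study-level relative risks and 95% CIs
      rr = np.array([0.85, 0.92, 0.80, 0.90])
      print(random_effects_pooled_rr(rr, rr * 0.9, rr * 1.1))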

  13. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    NASA Astrophysics Data System (ADS)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents, both for extreme and ordinary flood events, is a fundamental step to understand the process, but also to drive proper decisions on protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with the associated sediment-graph, of a flood that on the 14th of October 2014 caused the formation of a small debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to notice that a great part of the equations developed for the computation of bedload transport capacity, such as those of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events that heavily affect the river-bed armour. These formulas do not provide reliable results if used on events, like the one under analysis, that are not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations for assessing the sediment transport capacity overestimate the total volume of the event by several orders of magnitude. As a consequence, the following experimental bedload transport equation (D'Agostino and Lenzi, 1999), valid for ordinary flood events, has been applied (q: unit water discharge; qc: unit discharge at bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs = 0.04 (q - qc) S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above-cited equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
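
    A direct evaluation of the D'Agostino and Lenzi (1999) relation quoted above is shown below (Python); the hydraulic values supplied are hypothetical, not the reconstructed Rio Vanti values.

      def unit_bedload_rate(q, qc, slope):
          """Unit bedload rate qs = 0.04 * (q - qc) * S**1.5 (D'Agostino and
          Lenzi, 1999); returns 0 below the initiation discharge qc."""
          return 0.04 * max(0.0, q - qc) * slope**1.5

      # Hypothetical values: unit discharge 0.8 m2/s, initiation 0.3 m2/s, slope 0.15
      print(unit_bedload_rate(q=0.8, qc=0.3, slope=0.15))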

  14. Detailed chronological analysis of microevolution events in herds infected persistently by Mycobacterium bovis.

    PubMed

    Navarro, Yurena; Romero, Beatriz; Bouza, Emilio; Domínguez, Lucas; de Juan, Lucía; García-de-Viedma, Darío

    2016-02-01

    Various studies have analyzed microevolution events leading to the emergence of clonal variants in human infections by Mycobacterium tuberculosis. However, microevolution events in animal tuberculosis remain unknown. We performed a systematic analysis of microevolution events in eight herds that were chronically infected by Mycobacterium bovis for more than 12 months. We analyzed 88 animals using a systematic screening procedure based on discriminatory MIRU-VNTR genotyping at sequential time points during the infection. Microevolution was detected in half of the herds studied. Emergence of clonal variants did not require long infection periods or a high number of infected animals in the herd. Microevolution was not restricted to strains from specific spoligotypes, and the subtle variations detected involved different MIRU loci. The genetic locations of the subtle genotypic variations recorded in the clonal variants indicated potential functional significance. This finding was consistent with the dynamics of some clonal variants, which outcompeted the original strains, suggesting an advantageous phenotype. Our data constitute a first step in defining the thresholds of variability to be tolerated in molecular epidemiology studies of M. bovis. We could therefore ensure that related clonal variants emerging as a result of microevolution events are not going to be misinterpreted as unrelated isolates. PMID:26790941

  15. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    PubMed Central

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may mislead about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials including the recent RV144 trial exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  16. Urbanization and Fertility: An Event-History Analysis of Coastal Ghana

    PubMed Central

    WHITE, MICHAEL J.; MUHIDIN, SALUT; ANDRZEJEWSKI, CATHERINE; TAGOE, EVA; KNIGHT, RODNEY; REED, HOLLY

    2008-01-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field. PMID:19110898

  17. Urbanization and fertility: an event-history analysis of coastal Ghana.

    PubMed

    White, Michael J; Muhidin, Salut; Andrzejewski, Catherine; Tagoe, Eva; Knight, Rodney; Reed, Holly

    2008-11-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field. PMID:19110898

  18. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis has recently been developed for hydrological data. In particular, the copula model has been used as an effective method that places no limitation on the choice of marginal distributions. Time-series rainfall data can be partitioned into rainfall events using an inter-event time definition, and each rainfall event has a rainfall depth and a duration. In addition, changes in rainfall depth due to climate change have been studied recently. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed, and their performance has been investigated in many studies. In the current study, bivariate frequency analysis was performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data to consider the effect of climate change. The parameter of the copula model is estimated by the inference function for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used for the marginal distributions. As a result, the level curves of the copula model are obtained, and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.
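
    The Gumbel family is one member of the Archimedean class mentioned above and has a closed form; a minimal sketch evaluating the joint non-exceedance probability from the two marginal probabilities is given below (Python/NumPy), where the copula parameter and the marginal values are assumptions for illustration.

      import numpy as np

      def gumbel_copula(u, v, theta):
          """Gumbel (Archimedean) copula C(u, v) for theta >= 1."""
          s = (-np.log(u))**theta + (-np.log(v))**theta
          return np.exp(-s**(1.0 / theta))

      # Marginal non-exceedance probabilities of rainfall depth and duration
      # (e.g. from fitted Gumbel/GEV marginals) and an assumed dependence theta.
      u_depth, v_duration, theta = 0.9, 0.8, 2.0
      print(gumbel_copula(u_depth, v_duration, theta))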

  19. Mixed-effects Poisson regression analysis of adverse event reports: the relationship between antidepressants and suicide.

    PubMed

    Gibbons, Robert D; Segawa, Eisuke; Karabatsos, George; Amatya, Anup K; Bhaumik, Dulal K; Brown, C Hendricks; Kapur, Kush; Marcus, Sue M; Hur, Kwan; Mann, J John

    2008-05-20

    A new statistical methodology is developed for the analysis of spontaneous adverse event (AE) reports from post-marketing drug surveillance data. The method involves both empirical Bayes (EB) and fully Bayes estimation of rate multipliers for each drug within a class of drugs, for a particular AE, based on a mixed-effects Poisson regression model. Both parametric and semiparametric models for the random-effect distribution are examined. The method is applied to data from Food and Drug Administration (FDA)'s Adverse Event Reporting System (AERS) on the relationship between antidepressants and suicide. We obtain point estimates and 95 per cent confidence (posterior) intervals for the rate multiplier for each drug (e.g. antidepressants), which can be used to determine whether a particular drug has an increased risk of association with a particular AE (e.g. suicide). Confidence (posterior) intervals that do not include 1.0 provide evidence for either significant protective or harmful associations of the drug and the adverse effect. We also examine EB, parametric Bayes, and semiparametric Bayes estimators of the rate multipliers and associated confidence (posterior) intervals. Results of our analysis of the FDA AERS data revealed that newer antidepressants are associated with lower rates of suicide adverse event reports compared with older antidepressants. We recommend improvements to the existing AERS system, which are likely to improve its public health value as an early warning system. PMID:18404622

  20. Application of satellite remote-sensing data for source analysis of fine particulate matter transport events.

    PubMed

    Engel-Cox, Jill A; Young, Gregory S; Hoff, Raymond M

    2005-09-01

    Satellite sensors have provided new datasets for monitoring regional and urban air quality. Satellite sensors provide comprehensive geospatial information on air quality with both qualitative imagery and quantitative data, such as aerosol optical depth. Yet there has been limited application of these new datasets in the study of air pollutant sources relevant to public policy. One promising approach to more directly link satellite sensor data to air quality policy is to integrate satellite sensor data with air quality parameters and models. This paper presents a visualization technique to integrate satellite sensor data, ground-based data, and back trajectory analysis relevant to a new rule concerning the transport of particulate matter across state boundaries. Overlaying satellite aerosol optical depth data and back trajectories in the days leading up to a known fine particulate matter with an aerodynamic diameter of <2.5 microm (PM2.5) event may indicate whether transport or local sources appear to be most responsible for high PM2.5 levels in a certain location at a certain time. Events in five cities in the United States are presented as case studies. This type of analysis can be used to help understand the source locations of pollutants during specific events and to support regulatory compliance decisions in cases of long distance transport. PMID:16259433

  1. Final Report for Dynamic Models for Causal Analysis of Panel Data. Approaches to the Censoring Problem in Analysis of Event Histories. Part III, Chapter 2.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; Hannan, Michael T.

    The document, part of a series of chapters described in SO 011 759, considers the problem of censoring in the analysis of event-histories (data on dated events, including dates of change from one qualitative state to another). Censoring refers to the lack of information on events that occur before or after the period for which data are available.…

  2. Transcriptome Bioinformatical Analysis of Vertebrate Stages of Schistosoma japonicum Reveals Alternative Splicing Events.

    PubMed

    Wang, Xinye; Xu, Xindong; Lu, Xingyu; Zhang, Yuanbin; Pan, Weiqing

    2015-01-01

    Alternative splicing is a molecular process that contributes greatly to the diversification of the proteome and to gene functions. Understanding the mechanisms of stage-specific alternative splicing can provide a better understanding of the development of eukaryotes and the functions of different genes. Schistosoma japonicum is an infectious blood-dwelling trematode with a complex lifecycle that causes the tropical disease schistosomiasis. In this study, we analyzed the transcriptome of Schistosoma japonicum to discover alternative splicing events in this parasite, by applying RNA-seq to cDNA libraries of adults and schistosomula. Results were validated by RT-PCR and sequencing. We found 11,623 alternative splicing events among 7,099 protein-encoding genes, and the average proportion of alternative splicing events per gene was 42.14%. We showed that exon skipping is the most common type of alternative splicing event, as found in higher eukaryotes, whereas intron retention is the least common alternative splicing type. According to intron boundary analysis, the parasite possesses the same intron boundaries as other organisms, namely the classic "GT-AG" rule, and in alternatively spliced introns or exons this rule is less strict. We also attempted to detect alternative splicing events in genes encoding proteins with signal peptides and transmembrane helices, suggesting that alternative splicing could change the subcellular locations of specific gene products. Our results indicate that alternative splicing is prevalent in this parasitic worm, and that the worm is close to its hosts. The revealed secretome involved in alternative splicing implies a new perspective on understanding the interaction between the parasite and its host. PMID:26407301
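
    A common generic summary of an exon-skipping event from RNA-seq is the percent-spliced-in (PSI) value computed from junction-supporting reads; the sketch below (Python) shows that generic measure with made-up read counts and is not the authors' pipeline.

      def percent_spliced_in(inclusion_reads, skipping_reads):
          """PSI for a cassette exon: reads supporting the two inclusion junctions
          (averaged) relative to reads supporting the exon-skipping junction."""
          inc = sum(inclusion_reads) / len(inclusion_reads)
          total = inc + skipping_reads
          return inc / total if total > 0 else float("nan")

      # Made-up counts: upstream/downstream inclusion junctions vs. skip junction
      print(percent_spliced_in(inclusion_reads=[30, 26], skipping_reads=12))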

  3. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. Only regions where movement is occurring are analyzed to segment moving objects, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories: Persons and Vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
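
    One simple form of an adaptive background-subtraction step is a running-average model; the sketch below (Python/NumPy) uses an assumed learning rate and threshold and a synthetic infrared frame, and is not the authors' exact technique.

      import numpy as np

      def update_and_detect(frame, background, alpha=0.05, threshold=25.0):
          """Running-average background model: flag foreground pixels whose
          absolute difference from the background exceeds `threshold`, then
          blend the frame into the background with learning rate `alpha`."""
          diff = np.abs(frame.astype(float) - background)
          foreground = diff > threshold
          background = (1.0 - alpha) * background + alpha * frame
          return foreground, background

      # Illustrative synthetic frames
      rng = np.random.default_rng(4)
      bg = np.full((120, 160), 90.0)                 # initial background estimate
      frame = bg + rng.normal(0, 3, bg.shape)
      frame[40:60, 70:90] += 60                      # a warm moving object
      mask, bg = update_and_detect(frame, bg)
      print(mask.sum(), "foreground pixels")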

  4. Transcriptome Bioinformatical Analysis of Vertebrate Stages of Schistosoma japonicum Reveals Alternative Splicing Events

    PubMed Central

    Wang, Xinye; Xu, Xindong; Lu, Xingyu; Zhang, Yuanbin; Pan, Weiqing

    2015-01-01

    Alternative splicing is a molecular process that contributes greatly to the diversification of the proteome and to gene functions. Understanding the mechanisms of stage-specific alternative splicing can provide a better understanding of the development of eukaryotes and the functions of different genes. Schistosoma japonicum is an infectious blood-dwelling trematode with a complex lifecycle that causes the tropical disease schistosomiasis. In this study, we analyzed the transcriptome of Schistosoma japonicum to discover alternative splicing events in this parasite, by applying RNA-seq to cDNA libraries of adults and schistosomula. Results were validated by RT-PCR and sequencing. We found 11,623 alternative splicing events among 7,099 protein-encoding genes, and the average proportion of alternative splicing events per gene was 42.14%. We showed that exon skipping is the most common type of alternative splicing event, as found in higher eukaryotes, whereas intron retention is the least common alternative splicing type. According to intron boundary analysis, the parasite possesses the same intron boundaries as other organisms, namely the classic “GT-AG” rule, and in alternatively spliced introns or exons this rule is less strict. We also attempted to detect alternative splicing events in genes encoding proteins with signal peptides and transmembrane helices, suggesting that alternative splicing could change the subcellular locations of specific gene products. Our results indicate that alternative splicing is prevalent in this parasitic worm, and that the worm is close to its hosts. The revealed secretome involved in alternative splicing implies a new perspective on understanding the interaction between the parasite and its host. PMID:26407301

  5. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables

    PubMed Central

    Yin, Yu; Yao, Dezhong

    2016-01-01

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an “event of relation” with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals. PMID:27389921
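
    A minimal sketch of the kind of binomial test described above is given below (Python with SciPy >= 1.7); the counts and the 0.5 null proportion are illustrative assumptions, not the paper's exact construction.

      from scipy.stats import binomtest

      # Of 60 observed "events of relation", 41 showed the effect lagging the
      # putative cause by a non-zero delay; test against a 0.5 null proportion.
      # Both the counts and the null value are assumptions for illustration.
      result = binomtest(k=41, n=60, p=0.5, alternative="greater")
      print(result.pvalue)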

  6. Diagnostic analysis and spectral energetics of a blocking event in the GLAS climate model simulation

    NASA Technical Reports Server (NTRS)

    Chen, T.-C.; Shukla, J.

    1983-01-01

    A synoptic and spectral analysis of a blocking event is presented, with attention given to the temporal evolution, maintenance, and decay of the block. The GLAS numerical climate model was used to generate a blocking event by the introduction of SST anomalies. Wavenumbers 2 and 3 became stationary around their climatological locations, and their constructive interference produced persistent blocking ridges, one over the west coast of North America and the other over western Europe. Time variations of the kinetic and potential energies and of the energy conversions during the blocking were examined. Spectrally filtered Hovmoller diagrams were developed for the winter of 1976-77 and showed that long waves were stationary over most of the interval, which featured severe weather conditions.

  7. The May 17, 2012 solar event: back-tracing analysis and flux reconstruction with PAMELA

    NASA Astrophysics Data System (ADS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bravar, U.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Christian, E. C.; De Donato, C.; de Nolfo, G. A.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A. M.; Karelin, A. V.; Koldashov, S. V.; Koldobskiy, S.; Krutkov, S. Y.; Kvashnin, A. N.; Lee, M.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A. G.; Menn, W.; Mergè, M.; Mikhailov, V. V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S. B.; Ryan, J. M.; Sarkar, R.; Scotti, V.; Simon, M.; Sparvoli, R.; Spillantini, P.; Stochaj, S.; Stozhkov, Y. I.; Vacchi, A.; Vannuccini, E.; Vasilyev, G. I.; Voronov, S. A.; Yurkin, Y. T.; Zampa, G.; Zampa, N.; Zverev, V. G.

    2016-02-01

    The PAMELA space experiment is providing the first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data by the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  8. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables.

    PubMed

    Yin, Yu; Yao, Dezhong

    2016-01-01

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an "event of relation" with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals. PMID:27389921

  9. Non-parametric frequency analysis of extreme values for integrated disaster management considering probable maximum events

    NASA Astrophysics Data System (ADS)

    Takara, K. T.

    2015-12-01

    This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with a computer-intensive statistics (CIS) resampling method such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than a design level of flood control. Traditional parametric frequency analysis methods of extreme values include the following steps: Step 1: Collecting and checking extreme-value data; Step 2: Enumerating probability distributions that would fit the data well; Step 3: Parameter estimation; Step 4: Testing goodness of fit; Step 5: Checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: Selection of the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years. Overestimation examples are also demonstrated. The bootstrap resampling can perform bias correction for the NPM and can also give the estimation accuracy as the bootstrap standard error. The NPM has the advantage of avoiding various difficulties in the above-mentioned steps of the traditional PM. Probable maximum events are also incorporated into the NPM as an upper bound of the hydrological variable. Probable maximum precipitation (PMP) and probable maximum flood (PMF) can serve as new parameter values combined with the NPM. An idea of how to incorporate these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea stimulates a more integrated approach by geoscientists and statisticians and encourages practitioners to consider the worst cases of disasters in their disaster management planning and practices.
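
    The core of such a non-parametric analysis is an empirical quantile with a bootstrap standard error as the measure of estimation accuracy; a minimal sketch is given below (Python/NumPy), where the synthetic sample, quantile definition, and replicate count are illustrative assumptions.

      import numpy as np

      def bootstrap_quantile(sample, prob, n_boot=2000, seed=0):
          """Empirical quantile at non-exceedance probability `prob` plus its
          bootstrap standard error (resampling with replacement)."""
          rng = np.random.default_rng(seed)
          sample = np.asarray(sample, dtype=float)
          estimate = np.quantile(sample, prob)
          boot = [np.quantile(rng.choice(sample, size=sample.size, replace=True), prob)
                  for _ in range(n_boot)]
          return estimate, np.std(boot, ddof=1)

      # Illustrative annual-maximum rainfall sample, 120 "years"
      rng = np.random.default_rng(5)
      annual_max = rng.gumbel(loc=80, scale=25, size=120)
      q100, se = bootstrap_quantile(annual_max, prob=1 - 1 / 100)   # 100-year event
      print(round(q100, 1), round(se, 1))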

  10. Sensitivity to censored-at-random assumption in the analysis of time-to-event endpoints.

    PubMed

    Lipkovich, Ilya; Ratitch, Bohdana; O'Kelly, Michael

    2016-05-01

    Over the past years, significant progress has been made in developing statistically rigorous methods to implement clinically interpretable sensitivity analyses for assumptions about the missingness mechanism in clinical trials for continuous and (to a lesser extent) for binary or categorical endpoints. Studies with time-to-event outcomes have received much less attention. However, such studies can be similarly challenged with respect to the robustness and integrity of primary analysis conclusions when a substantial number of subjects withdraw from treatment prematurely prior to experiencing an event of interest. We discuss how the methods that are widely used for primary analyses of time-to-event outcomes could be extended in a clinically meaningful and interpretable way to stress-test the assumption of ignorable censoring. We focus on a 'tipping point' approach, the objective of which is to postulate sensitivity parameters with a clear clinical interpretation and to identify a setting of these parameters unfavorable enough towards the experimental treatment to nullify a conclusion that was favorable to that treatment. Robustness of primary analysis results can then be assessed based on clinical plausibility of the scenario represented by the tipping point. We study several approaches for conducting such analyses based on multiple imputation using parametric, semi-parametric, and non-parametric imputation models and evaluate their operating characteristics via simulation. We argue that these methods are valuable tools for sensitivity analyses of time-to-event data and conclude that the method based on piecewise exponential imputation model of survival has some advantages over other methods studied here. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26997353

  11. Computational analysis reveals a correlation of exon-skipping events with splicing, transcription and epigenetic factors.

    PubMed

    Ye, Zhenqing; Chen, Zhong; Lan, Xun; Hara, Stephen; Sunkel, Benjamin; Huang, Tim H-M; Elnitski, Laura; Wang, Qianben; Jin, Victor X

    2014-03-01

    Alternative splicing (AS), in higher eukaryotes, is one of the mechanisms of post-transcriptional regulation that generate multiple transcripts from the same gene. One particular mode of AS is the skipping event where an exon may be alternatively excluded or constitutively included in the resulting mature mRNA. Both transcript isoforms from this skipping event site, i.e. in which the exon is either included (inclusion isoform) or excluded (skipping isoform), are typically present in one cell, and maintain a subtle balance that is vital to cellular function and dynamics. However, how the prevailing conditions dictate which isoform is expressed and what biological factors might influence the regulation of this process remain areas requiring further exploration. In this study, we have developed a novel computational method, graph-based exon-skipping scanner (GESS), for de novo detection of skipping event sites from raw RNA-seq reads without prior knowledge of gene annotations, as well as for determining the dominant isoform generated from such sites. We have applied our method to publicly available RNA-seq data in GM12878 and K562 cells from the ENCODE consortium and experimentally validated several skipping site predictions by RT-PCR. Furthermore, we integrated other sequencing-based genomic data to investigate the impact of splicing activities, transcription factors (TFs) and epigenetic histone modifications on splicing outcomes. Our computational analysis found that splice sites within the skipping-isoform-dominated group (SIDG) tended to exhibit weaker MaxEntScan-calculated splice site strength around middle, 'skipping', exons compared to those in the inclusion-isoform-dominated group (IIDG). We further showed the positional preference pattern of splicing factors, characterized by enrichment in the intronic splice sites immediately bordering middle exons. Finally, our analysis suggested that different epigenetic factors may introduce a variable obstacle in the

  12. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    SciTech Connect

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J. )

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses, which include both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transients without scram (ATWS). Steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment bypass nature. After being benchmarked with the APETs, in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.

  13. Uncertainty Analysis of Climate Change Impact on Extreme Rainfall Events in the Apalachicola River Basin, Florida

    NASA Astrophysics Data System (ADS)

    Wang, D.; Hagen, S.; Bacopoulos, P.

    2011-12-01

    Climate change impact on the rainfall patterns during the summer season (May-August) in the Apalachicola River basin (Florida Panhandle coast) is assessed using ensemble regional climate models (RCMs). Rainfall data for both baseline and future years (30-year periods) are obtained from the North American Regional Climate Change Assessment Program (NARCCAP), where the A2 emission scenario is used. Trend analysis is conducted based on historical rainfall data from three weather stations. Two methods are used to assess the climate change impact on the rainfall intensity-duration-frequency (IDF) curves, i.e., the maximum intensity percentile-based method and the sequential bias correction and maximum intensity percentile-based method. As a preliminary result from one RCM, extreme rainfall intensity is found to increase significantly, with the increase becoming more pronounced with closer proximity to the coast. The projected changes in rainfall pattern (spatial and temporal, mean and extreme values) provide guidance for developing adaptation and mitigation strategies for water resources management and ecosystem protection. More rainfall events shift from July to June in the future period for all three stations; upstream, the variability in the timing of extreme rainfall increases and more extreme events occur in June and August instead of May. These temporal shifts of extreme rainfall events will increase the probability of simultaneous heavy rainfall upstream and downstream in June, during which flooding will be enhanced. The uncertainty analysis of the climate change impact on extreme rainfall events will be presented based on the simulations from the ensemble of RCMs.
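
    A rough illustration of the maximum-intensity percentile idea (not the NARCCAP processing chain): the sketch below extracts annual maximum running-mean intensities for several durations from synthetic baseline and future series and compares an upper percentile; all data, durations and the choice of the 90th percentile are assumptions.

    ```python
    # Hypothetical sketch of a "maximum intensity percentile-based" comparison:
    # compare upper percentiles of annual maximum rainfall intensity between a
    # baseline and a future RCM series (synthetic data, illustrative only).

    import numpy as np

    rng = np.random.default_rng(0)
    years = 30
    hours_per_season = 24 * 123                      # May-August

    baseline = rng.gamma(shape=0.6, scale=2.0, size=(years, hours_per_season))
    future = rng.gamma(shape=0.6, scale=2.4, size=(years, hours_per_season))

    def annual_max_intensity(series, duration_h):
        """Annual maxima of running-mean intensity over a given duration (hours)."""
        kernel = np.ones(duration_h) / duration_h
        maxima = []
        for year in series:
            running = np.convolve(year, kernel, mode="valid")
            maxima.append(running.max())
        return np.array(maxima)

    for duration in (1, 6, 24):
        base_max = annual_max_intensity(baseline, duration)
        fut_max = annual_max_intensity(future, duration)
        p90_change = np.percentile(fut_max, 90) / np.percentile(base_max, 90) - 1.0
        print(f"{duration:2d} h duration: change in 90th-percentile annual max = {p90_change:+.1%}")
    ```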

  14. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    NASA Astrophysics Data System (ADS)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, defined on the basis of PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths were recorded for natural mortality, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had both immediate and delayed effects on mortality.
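
    The time-stratified referent selection used in such case-crossover designs can be sketched as follows; the example data, haze threshold and dates are invented, and the actual study additionally fitted a conditional logistic regression rather than merely listing referent days.

    ```python
    # Illustrative sketch of time-stratified referent selection for a case-crossover
    # design: control days share the same weekday and calendar month as the case day.
    # (Synthetic example; the study itself fit a conditional logistic regression.)

    from datetime import date, timedelta

    def time_stratified_referents(case_day: date):
        """Return all other days in the same month with the same weekday."""
        referents = []
        d = case_day.replace(day=1)
        while d.month == case_day.month:
            if d.weekday() == case_day.weekday() and d != case_day:
                referents.append(d)
            d += timedelta(days=1)
        return referents

    # Hypothetical daily PM10 series and haze definition (> 100 ug/m3).
    pm10 = {date(2005, 8, 1) + timedelta(days=i): 60 + 10 * (i % 7) for i in range(31)}
    haze = {d: concentration > 100 for d, concentration in pm10.items()}

    case = date(2005, 8, 10)
    print("case day:", case, "haze:", haze[case])
    for ref in time_stratified_referents(case):
        print("referent:", ref, "haze:", haze[ref])
    ```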

  15. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.; Sandia National Labs., Albuquerque, NM )

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs.

  16. Time-frequency analysis of event-related potentials: a brief tutorial.

    PubMed

    Herrmann, Christoph S; Rach, Stefan; Vosskuhl, Johannes; Strüber, Daniel

    2014-07-01

    Event-related potentials (ERPs) reflect cognitive processes and are usually analyzed in the so-called time domain. Additional information on cognitive functions can be assessed when analyzing ERPs in the frequency domain and treating them as event-related oscillations (EROs). This procedure results in frequency spectra but lacks information about the temporal dynamics of EROs. Here, we describe a method-called time-frequency analysis-that allows analyzing both the frequency of an ERO and its evolution over time. In a brief tutorial, the reader will learn how to use wavelet analysis in order to compute time-frequency transforms of ERP data. Basic steps as well as potential artifacts are described. Rather than in terms of formulas, descriptions are in textual form (written text) with numerous figures illustrating the topics. Recommendations on how to present frequency and time-frequency data in journal articles are provided. Finally, we briefly review studies that have applied time-frequency analysis to mismatch negativity paradigms. The deviant stimulus of such a paradigm evokes an ERO in the theta frequency band that is stronger than for the standard stimulus. Conversely, the standard stimulus evokes a stronger gamma-band response than does the deviant. This is interpreted in the context of the so-called match-and-utilization model. PMID:24194116
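
    For readers who want a concrete starting point, here is a minimal Python sketch (not the tutorial's code) of a Morlet-wavelet time-frequency transform applied to a single synthetic ERP-like trial; the sampling rate, frequency range and number of cycles are illustrative assumptions.

    ```python
    # Minimal sketch of a Morlet-wavelet time-frequency transform of one
    # ERP-like trial; all parameters and the toy signal are illustrative.

    import numpy as np
    from scipy.signal import fftconvolve

    fs = 500.0                                   # sampling rate in Hz
    t = np.arange(-0.2, 0.8, 1.0 / fs)           # epoch from -200 ms to 800 ms
    signal = np.sin(2 * np.pi * 5 * t) * np.exp(-((t - 0.3) ** 2) / 0.02)  # toy "theta burst"

    def morlet_power(signal, fs, freqs, n_cycles=7):
        """Return a (n_freqs, n_times) matrix of wavelet power."""
        power = np.empty((len(freqs), len(signal)))
        for i, f in enumerate(freqs):
            sd = n_cycles / (2 * np.pi * f)                      # Gaussian SD in seconds
            wt = np.arange(-4 * sd, 4 * sd, 1.0 / fs)            # wavelet support
            wavelet = np.exp(2j * np.pi * f * wt) * np.exp(-wt ** 2 / (2 * sd ** 2))
            wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))     # unit-energy normalisation
            analytic = fftconvolve(signal, wavelet, mode="same") # complex analytic signal
            power[i] = np.abs(analytic) ** 2
        return power

    freqs = np.arange(2, 41, 1.0)                # 2-40 Hz
    tf_power = morlet_power(signal, fs, freqs)
    print(tf_power.shape)                        # (39, 500): frequencies x time points
    ```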

  17. An event-related analysis of P300 by simultaneous EEG/fMRI

    NASA Astrophysics Data System (ADS)

    Wang, Li-qun; Wang, Mingshi; Mizuhara, Hiroaki

    2006-09-01

    In this study, P300 induced by visual stimuli was examined with simultaneous EEG/fMRI. To combine the best temporal resolution with the best spatial resolution for estimating brain function, event-related analysis contributed to this methodological trial. A 64-channel MR-compatible EEG amplifier (BrainAmp, Brain Products GmbH, Germany) was used for measurement simultaneously with fMRI scanning. The reference channel is between Fz, Cz and Pz. The sampling rate of the raw EEG was 5 kHz, and MRI noise reduction was performed. EEG recording was synchronized with the MRI scan by our original stimulus system, and an oddball paradigm (four-oriented Landolt Ring presentation) was performed in the standard manner. After P300 segmentation, the timing of P300 was exported to event-related analysis of the fMRI data with SPM99 software. In the single-subject study, significant activations appeared in the left superior frontal region, Broca's area and on both sides of the parietal lobule when P300 occurred. This suggests that P300 may reflect an integration carried out by top-down signals from the frontal cortex to the parietal lobule, which regulates an attention/logical-judgment process. Compared with other current methods, event-related analysis by simultaneous EEG/fMRI is excellent in that it can describe the cognitive process realistically, unifying temporal and spatial information. It is expected that examination and demonstration of the obtained results will promote this powerful method.
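
    To make the event-related analysis step concrete, the sketch below builds an fMRI regressor by convolving hypothetical P300 onset times with a double-gamma hemodynamic response function; the TR, onset times and HRF parameters are assumptions, and this is not the SPM99 procedure used in the study.

    ```python
    # Hypothetical sketch: build an event-related fMRI regressor from P300 onset
    # times by convolving a stick function with a double-gamma HRF.
    # (Illustrative only; the study itself used SPM99.)

    import numpy as np
    from scipy.stats import gamma

    TR = 2.0                       # repetition time in seconds (assumed)
    n_scans = 150
    dt = 0.1                       # high-resolution grid for convolution

    def double_gamma_hrf(t):
        """Canonical-style HRF: peak around 5 s, undershoot around 15 s (assumed shape)."""
        peak = gamma.pdf(t, a=6, scale=1.0)
        undershoot = gamma.pdf(t, a=16, scale=1.0)
        return peak - 0.35 * undershoot

    t_hr = np.arange(0, n_scans * TR, dt)
    onsets = np.array([12.4, 31.0, 55.2, 78.9, 120.3, 201.7])   # P300 event times (s), invented
    stick = np.zeros_like(t_hr)
    stick[np.searchsorted(t_hr, onsets)] = 1.0

    hrf = double_gamma_hrf(np.arange(0, 32, dt))
    regressor_hr = np.convolve(stick, hrf)[: len(t_hr)]
    step = int(round(TR / dt))
    regressor = regressor_hr[::step]                             # sample at scan times

    print(regressor.shape)                                       # (150,)
    ```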

  18. Recent developments in methods of chemical analysis in investigations of firearm-related events.

    PubMed

    Zeichner, Arie

    2003-08-01

    A review of recent (approximately the last ten years) developments in the methods used for chemical analysis in investigations of firearm-related events is provided. This review discusses: examination of gunshot (primer) residues (GSR) and gunpowder (propellant) residues on suspects and their clothing; detection of firearm imprints on the hands of suspects; identification of bullet entry holes and estimation of shooting distance; linking weapons and/or fired ammunition to gunshot entries; and estimation of the time since discharge. PMID:12811451

  19. Using Fluctuation Analysis to Establish Causal Relations between Cellular Events without Experimental Perturbation

    PubMed Central

    Welf, Erik S.; Danuser, Gaudenz

    2014-01-01

    Experimental perturbations are commonly used to establish causal relationships between the molecular components of a pathway and their cellular functions; however, this approach suffers inherent limitations. Especially in pathways with a significant level of nonlinearity and redundancy among components, such perturbations induce compensatory responses that obscure the actual function of the targeted component in the unperturbed pathway. A complementary approach uses constitutive fluctuations in component activities to identify the hierarchy of information flow through pathways. Here, we review the motivation for using perturbation-free approaches and highlight recent advances made in using perturbation-free fluctuation analysis as a means to establish causality among cellular events. PMID:25468328
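
    A toy version of the underlying idea, with fully synthetic data, is sketched below: the lag that maximises the cross-correlation between two fluctuating activities suggests which one leads; the lag search, smoothing and noise levels are illustrative assumptions, not the authors' pipeline.

    ```python
    # Toy sketch of perturbation-free fluctuation analysis: the lag that maximises
    # the cross-correlation between two fluctuating activities suggests which one
    # leads the other (synthetic data, illustrative only).

    import numpy as np

    rng = np.random.default_rng(1)
    n, true_lag = 2000, 5
    raw = np.convolve(rng.normal(size=n + true_lag), np.ones(10) / 10, mode="same")
    upstream = raw[true_lag:]                          # leads
    downstream = raw[:n] + 0.05 * rng.normal(size=n)   # follows upstream by `true_lag` frames

    def best_lag(a, b, max_lag=20):
        """Lag (in frames) at which corr(a[t], b[t + lag]) is maximal."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        lags = list(range(-max_lag, max_lag + 1))
        corrs = []
        for lag in lags:
            if lag >= 0:
                corrs.append(np.mean(a[: len(a) - lag] * b[lag:]))
            else:
                corrs.append(np.mean(a[-lag:] * b[: len(b) + lag]))
        return lags[int(np.argmax(corrs))]

    print("downstream follows upstream by", best_lag(upstream, downstream), "frames")
    ```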

  20. Scientific Content Analysis (SCAN) Cannot Distinguish Between Truthful and Fabricated Accounts of a Negative Event.

    PubMed

    Bogaard, Glynis; Meijer, Ewout H; Vrij, Aldert; Merckelbach, Harald

    2016-01-01

    The Scientific Content Analysis (SCAN) is a verbal veracity assessment method that is currently used worldwide by investigative authorities. Yet, research investigating the accuracy of SCAN is scarce. The present study tested whether SCAN was able to accurately discriminate between true and fabricated statements. To this end, 117 participants were asked to write down one true and one fabricated statement about a recent negative event that happened in their lives. All statements were analyzed using 11 criteria derived from SCAN. Results indicated that SCAN was not able to correctly classify true and fabricated statements. Lacking empirical support, the application of SCAN in its current form should be discouraged. PMID:26941694

  2. High fold computer disk storage DATABASE for fast extended analysis of γ-rays events

    NASA Astrophysics Data System (ADS)

    Stézowski, O.; Finck, Ch.; Prévost, D.

    1999-03-01

    Recently, spectacular technical developments have been achieved to increase the resolving power of large γ-ray spectrometers. With these new eyes, physicists are able to study the intricate nature of atomic nuclei. Concurrently, more and more complex multidimensional analyses are needed to investigate very weak phenomena. In this article, we first present a software package (DATABASE) allowing high-fold coincidence γ-ray events to be stored on hard disk. Then a non-conventional method of analysis, the anti-gating procedure, is described. Two physical examples are given to explain how it can be used, and Monte Carlo simulations have been performed to test the validity of this method.
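
    The gating and anti-gating idea can be sketched with synthetic coincidence events as follows; the cascade energies, gate window and binning are invented, and this is not the DATABASE software described in the article.

    ```python
    # Toy sketch of gating and anti-gating on gamma-gamma coincidences: events are
    # lists of coincident gamma energies (keV); we project the spectrum of energies
    # seen together with (gate) or explicitly without (anti-gate) a chosen transition.
    # (Synthetic events; not the DATABASE software itself.)

    import numpy as np

    rng = np.random.default_rng(2)
    cascade = [200.0, 511.0, 847.0]          # hypothetical cascade of transitions
    events = []
    for _ in range(5000):
        hits = [e + rng.normal(0, 2) for e in cascade if rng.random() < 0.7]
        hits += list(rng.uniform(100, 1500, rng.integers(0, 3)))   # random background
        if len(hits) >= 2:
            events.append(hits)

    gate_lo, gate_hi = 505.0, 517.0          # gate on the 511 keV line
    bins = np.arange(0, 1500, 4.0)
    gated = np.zeros(len(bins) - 1)
    anti_gated = np.zeros(len(bins) - 1)

    for hits in events:
        in_gate = [e for e in hits if gate_lo <= e <= gate_hi]
        others = [e for e in hits if not (gate_lo <= e <= gate_hi)]
        target = gated if in_gate else anti_gated
        target += np.histogram(others, bins=bins)[0]

    print("counts in gated spectrum:     ", int(gated.sum()))
    print("counts in anti-gated spectrum:", int(anti_gated.sum()))
    ```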

  3. Recurrent event data analysis with intermittently observed time-varying covariates.

    PubMed

    Li, Shanshan; Sun, Yifei; Huang, Chiung-Yu; Follmann, Dean A; Krause, Richard

    2016-08-15

    Although recurrent event data analysis is a rapidly evolving area of research, rigorous studies on estimation of the effects of intermittently observed time-varying covariates on the risk of recurrent events have been lacking. Existing methods for analyzing recurrent event data usually require that the covariate processes be observed throughout the entire follow-up period. However, covariates are often observed periodically rather than continuously. We propose a novel semiparametric estimator for the regression parameters in the popular proportional rate model. The proposed estimator is based on an estimated score function in which we kernel smooth the mean covariate process. We show that the proposed semiparametric estimator is asymptotically unbiased and normally distributed, and we derive its asymptotic variance. Simulation studies are conducted to compare the performance of the proposed estimator with the simple method of carrying forward the last observed covariate values. The different methods are applied to an observational study designed to assess the effect of group A streptococcus on pharyngitis among school children in India. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26887664
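
    A minimal sketch of the kernel-smoothing ingredient, assuming a Gaussian kernel and invented visit data, is given below; it only shows how an intermittently observed covariate can be evaluated at arbitrary event times, not the paper's full estimated score function.

    ```python
    # Illustrative sketch: Nadaraya-Watson kernel smoothing of an intermittently
    # observed time-varying covariate, so it can be evaluated at any event time.
    # (Bandwidth and data are invented; this is not the paper's full estimator.)

    import numpy as np

    obs_times = np.array([0.0, 0.8, 2.1, 3.5, 5.0, 6.4])     # visit times
    obs_values = np.array([1.2, 1.5, 1.1, 0.9, 1.4, 1.6])    # covariate measured at visits

    def kernel_smooth(t, times, values, bandwidth=1.0):
        """Nadaraya-Watson estimate of the covariate at time t (Gaussian kernel)."""
        w = np.exp(-0.5 * ((t - times) / bandwidth) ** 2)
        return np.sum(w * values) / np.sum(w)

    event_times = [1.0, 2.7, 4.9]                             # recurrent event times
    for t in event_times:
        print(f"t = {t:.1f}: smoothed covariate = {kernel_smooth(t, obs_times, obs_values):.3f}")
    ```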

  4. Analysis of recurrent events with an associated informative dropout time: Application of the joint frailty model.

    PubMed

    Rogers, Jennifer K; Yaroshinsky, Alex; Pocock, Stuart J; Stokar, David; Pogoda, Janice

    2016-06-15

    This paper considers the analysis of a repeat event outcome in clinical trials of chronic diseases in the context of dependent censoring (e.g. mortality). It has particular application in the context of recurrent heart failure hospitalisations in trials of heart failure. Semi-parametric joint frailty models (JFMs) simultaneously analyse recurrent heart failure hospitalisations and time to cardiovascular death, estimating distinct hazard ratios whilst individual-specific latent variables induce associations between the two processes. A simulation study was carried out to assess the suitability of the JFM versus marginal analyses of recurrent events and cardiovascular death using standard methods. Hazard ratios were consistently overestimated when marginal models were used, whilst the JFM produced good, well-estimated results. An application to the Candesartan in Heart failure: Assessment of Reduction in Mortality and morbidity programme was considered. The JFM gave unbiased estimates of treatment effects in the presence of dependent censoring. We advocate the use of the JFM for future trials that consider recurrent events as the primary outcome. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. PMID:26751714
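
    The dependence that the joint frailty model is designed to capture can be illustrated with a toy simulation in which a shared gamma frailty inflates both the hospitalisation rate and the death hazard; all rates are invented, and fitting the JFM itself requires specialised software.

    ```python
    # Toy simulation of the shared-frailty idea behind the joint frailty model:
    # one gamma frailty per patient inflates both the recurrent hospitalisation
    # rate and the death hazard, inducing dependence between the two processes.
    # (All rates are invented; this does not fit the JFM.)

    import numpy as np

    rng = np.random.default_rng(4)
    n_patients, follow_up = 1000, 3.0              # follow-up in years
    base_hosp_rate, base_death_rate = 0.8, 0.15    # per year

    frailty = rng.gamma(shape=2.0, scale=0.5, size=n_patients)        # mean 1
    death_time = rng.exponential(1.0 / (base_death_rate * frailty))   # dependent dropout
    observed_time = np.minimum(death_time, follow_up)
    n_hosp = rng.poisson(base_hosp_rate * frailty * observed_time)    # recurrent events

    died = death_time < follow_up
    print("mean hospitalisations | died:     ", n_hosp[died].mean().round(2))
    print("mean hospitalisations | survived: ", n_hosp[~died].mean().round(2))
    ```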

  5. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has seen the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami that caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption that resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  6. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    NASA Astrophysics Data System (ADS)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and, as one possible originating cause of tsunamis, earthquakes, and propose to analyze Twitter messages for capturing testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for ascertaining the geographical origin of the tweets and thus for observing correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to or are located in the vicinity of specific points of interest such as the forecast points of the tsunami scenario. We also intend to use Twitter analysis for situation picture
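
    A deliberately simple sketch of a per-language relevance filter is shown below; the keyword lists are placeholders and the TRIDEC classifiers are certainly more sophisticated, so this only illustrates the filtering idea.

    ```python
    # Toy sketch (not the TRIDEC classifiers): a per-language keyword filter that
    # keeps tweets likely to describe an earthquake/tsunami event and drops the rest.
    # Keyword lists are illustrative placeholders, not the project's actual lexicons.

    KEYWORDS = {
        "en": {"earthquake", "tsunami", "aftershock", "tremor"},
        "tr": {"deprem", "tsunami"},
        "el": {"σεισμός", "τσουνάμι"},
        "ro": {"cutremur", "tsunami"},
    }

    def is_relevant(text: str, lang: str) -> bool:
        """Keep the tweet if any event keyword for its language appears in the text."""
        words = set(text.lower().split())
        return bool(words & KEYWORDS.get(lang, set()))

    tweets = [
        ("Strong earthquake felt in the city centre, people in the streets", "en"),
        ("Great concert tonight!", "en"),
        ("deprem hissedildi, herkes sokakta", "tr"),
    ]
    for text, lang in tweets:
        print(is_relevant(text, lang), "-", text)
    ```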

  7. Mechanical Engineering Safety Note: Analysis and Control of Hazards Associated with NIF Capacitor Module Events

    SciTech Connect

    Brereton, S

    2001-08-01

    the total free oil available in a capacitor (approximately 10,900 g), on the order of 5% or less. The estimates of module pressure were used to estimate the potential overpressure in the capacitor bays after an event. It was shown that the expected capacitor bay overpressure would be less than the structural tolerance of the walls. Thus, it does not appear necessary to provide any pressure relief for the capacitor bays. The ray tracing analysis showed the new module concept to be 100% effective at containing fragments generated during the events. The analysis demonstrated that all fragments would impact an energy absorbing surface on the way out of the module. Thus, there is high confidence that energetic fragments will not escape the module. However, since the module was not tested, it was recommended that a form of secondary containment on the walls of the capacitor bays (e.g., 1.0 inch of fire-retardant plywood) be provided. Any doors to the exterior of the capacitor bays should be of equivalent thickness of steel or suitably armed with a thickness of plywood. Penetrations in the ceiling of the interior bays (leading to the mechanical equipment room) do not require additional protection to form a secondary barrier. The mezzanine and the air handling units (penetrations lead directly to the air handling units) provide a sufficient second layer of protection.

  8. Analysis of phreatic events at Ruapehu volcano, New Zealand using a new SOM approach

    NASA Astrophysics Data System (ADS)

    Carniel, Roberto; Jolly, Arthur D.; Barbui, Luca

    2013-03-01

    We apply Self-Organising Maps (SOM) to assess the low-level seismic activity prior to small-scale phreatic events at Ruapehu volcano, New Zealand. The SOM approach allows automatic pattern recognition, virtually independent of a priori knowledge. Volcanic tremor spectra are randomly presented to the network in a competitive iterative training process, followed by a hierarchical clusterization of the SOM nodes. Spectra are then projected, ordered by time, to clusters on the map. A coherent time evolution of the data through the clusters can highlight the existence of different regimes and the transitions between them. Two Ruapehu events were examined: a phreatic event on 4 October 2006 which displaced the crater lake producing a 4 m high wave on the lake edge, and the more energetic 25 September 2007 phreatic eruption. The SOM analysis provides a classification of tremor spectral patterns that clusters into three regimes that we label by colours. The pattern for both eruptions is consistent with a pre-eruption spectral pattern including enhanced spectral energy in the range of 4 to 6 Hz, labelled 'green tremor'. This gives way to spectra having broader energy between 2 and 6 Hz, the so-called 'red tremor', just prior to the eruption. The post-eruption pattern includes spectral peaks at generally lower frequencies of 2 to 4 Hz, the so-called 'blue tremor'. Clusterization into only three groups yields highly non-unique solutions which cannot explain the variety of processes operating at Ruapehu over long time periods. Regardless, the approach highlights noteworthy similarities that may be explained by a pattern of slow pressurisation under a hydrothermal or magmatic seal ('green'), followed by seal failure ('red') and subsequent de-pressurisation ('blue') for the two events studied. Although the application shown here is limited, we think it demonstrates the power of this classification approach.
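
    A self-contained toy SOM, trained on synthetic tremor spectra, illustrates the competitive training and clustering idea; the map size, learning schedule and synthetic regimes are assumptions, not the configuration used in the study.

    ```python
    # Minimal self-organising map (SOM) sketch: competitive training of a small map
    # on synthetic "tremor spectra", illustrating the clustering idea only
    # (parameters and data are invented; this is not the authors' pipeline).

    import numpy as np

    rng = np.random.default_rng(3)
    n_spectra, n_bins = 600, 50
    freqs = np.linspace(0.5, 10.0, n_bins)

    def synthetic_spectrum(peak_hz):
        return np.exp(-0.5 * ((freqs - peak_hz) / 1.0) ** 2) + 0.05 * rng.random(n_bins)

    # Three synthetic regimes with spectral peaks near 3, 5 and 7 Hz
    data = np.array([synthetic_spectrum(p) for p in rng.choice([3.0, 5.0, 7.0], n_spectra)])
    data /= data.sum(axis=1, keepdims=True)                 # normalise each spectrum

    rows, cols = 4, 4
    weights = rng.random((rows * cols, n_bins))
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

    n_iter = 3000
    for it in range(n_iter):
        lr = 0.5 * (1 - it / n_iter)                        # decaying learning rate
        radius = 2.0 * (1 - it / n_iter) + 0.5              # decaying neighbourhood radius
        x = data[rng.integers(n_spectra)]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
        dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
        h = np.exp(-dist2 / (2 * radius ** 2))              # neighbourhood function
        weights += lr * h[:, None] * (x - weights)

    # Map every spectrum to its best-matching node
    assignments = np.argmin(((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2), axis=1)
    print("spectra per node:", np.bincount(assignments, minlength=rows * cols))
    ```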

  9. Analysis of the variation of the 0°C isothermal altitude during rainfall events

    NASA Astrophysics Data System (ADS)

    Zeimetz, Fränz; Garcìa, Javier; Schaefli, Bettina; Schleiss, Anton J.

    2016-04-01

    In numerous countries of the world (USA, Canada, Sweden, Switzerland, …), dam safety verifications for extreme floods are carried out with reference to the so-called Probable Maximum Flood (PMF). According to the World Meteorological Organization (WMO), this PMF is determined based on the PMP (Probable Maximum Precipitation). The PMF estimation is performed with a hydrological simulation model by routing the PMP. The PMP-PMF simulation is normally event-based; therefore, if no further information is known, the simulation needs assumptions concerning the initial soil conditions such as saturation or snow cover. In addition, temperature series are of interest for the PMP-PMF simulations. Temperature values can be deduced not only from direct measurements; using the temperature gradient method, the 0°C isothermal altitude can also provide temperature estimates at the ground. For practitioners, using the isothermal altitude to refer to temperature is convenient because a single value can characterise a large region under the assumption of a certain temperature gradient. This study analyses the evolution of the 0°C isothermal altitude during rainfall events, based on meteorological soundings from the two sounding stations Payerne (CH) and Milan (I). Furthermore, hourly rainfall and temperature data are available from 110 pluviometers spread over the Swiss territory. The analysis of the evolution of the 0°C isothermal altitude is undertaken for different precipitation durations based on the meteorological measurements mentioned above. The results show that, on average, the isothermal altitude tends to decrease during rainfall events and that the duration of the altitude loss is correlated with the duration of the rainfall. A significant difference in altitude loss appears when the soundings from Payerne and Milan are compared.
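
    The temperature gradient method mentioned above reduces to a one-line calculation; the sketch below assumes a constant lapse rate of 0.65 °C per 100 m, which is a common but assumed value.

    ```python
    # Simple sketch of the temperature-gradient method: given the 0°C isothermal
    # altitude, estimate the air temperature at a station elevation by assuming a
    # constant lapse rate (0.65 °C per 100 m is a common, assumed value).

    LAPSE_RATE = 0.0065          # °C per metre (assumed)

    def ground_temperature(isotherm_altitude_m, station_altitude_m, lapse_rate=LAPSE_RATE):
        """Temperature (°C) at the station if it is 0 °C at the isothermal altitude."""
        return (isotherm_altitude_m - station_altitude_m) * lapse_rate

    # Example: freezing level at 3200 m a.s.l., station at 1500 m a.s.l.
    print(f"{ground_temperature(3200, 1500):.1f} °C")   # ~11.1 °C
    ```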

  10. Post Tyrrhénian deformation analysis in the Sahel coast (Eastern Tunisia): seismotectonic events implication

    NASA Astrophysics Data System (ADS)

    Mejrei, H.; Ghribi, R.; Bouaziz, S.; Balescu, S.

    2012-04-01

    The eastern coast of Tunisia is characterized by Pleistocene coastal deposits considered a reference for interglacial high sea levels. In this region, the stratigraphy of the Tunisian Pleistocene deposits was first established on the basis of geomorphological, lithostratigraphic and biostratigraphic criteria and U/Th data. They have been subdivided into three superimposed formations, from oldest to youngest the Douira, Rejiche and Chebba formations, comprising coastal marine (Strombus bubonius), lagoonal and eolian sediments. These marine formations are organized into bars parallel to the present shoreline, unconformably overlying the Mio-Pliocene and "Villafranchian" deposits. A luminescence dating method (IRSL) applied to alkali feldspar grains from the two sandy marine units of the Douira formation demonstrates for the first time the presence of two successive interglacial high sea level events correlative with MIS 7 and MIS 9. These sandy marine units are separated by a major erosional surface and by a continental pedogenised loamy deposit related to a low sea level event which might be assigned to MIS 8. Variations in the height of these marine units (+13 to +32 m) along the Sahel coast reflect significant tectonic deformation and provide valuable geomorphological and tectonic markers. An extensive analysis of brittle deformation has been carried out at several sites. A detailed analysis of fracturing is based on studies of fault-slip data populations and joint sets. It allows reconstruction of post-Tyrrhenian stress regimes, which are characterized by N170-016 compression and N095-100 extension. In this paper, the combination of IRSL dating of these raised marine deposits with a reconstruction of the tectonic evolution, in terms of stress pattern evolution since the Tyrrhenian, allows us to establish an accurate recent tectonic calendar. These reconstructed events will be placed and discussed in the regional setting of seismotectonic activity of the north