Science.gov

Sample records for event analysis atheana

  1. The action characterization matrix: A link between HERA (Human Events Reference for ATHEANA) and ATHEANA (a technique for human error analysis)

    SciTech Connect

    Hahn, H.A.

    1997-12-22

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. ATHEANA is being developed in the context of nuclear power plant (NPP) PRAs, and much of the language used to describe the method and provide examples of its application is specific to that industry. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. Los Alamos National Laboratory's (LANL) Human Factors Group has recently joined the ATHEANA project team; LANL is responsible for further developing the database structure and for analyzing additional exemplar operational events for entry into the database. The Action Characterization Matrix (ACM) is conceived as a bridge between the HERA database structure and ATHEANA. Specifically, the ACM allows each unsafe action or human failure event to be characterized according to its representation along each of six different dimensions: system status, initiator status, unsafe action mechanism, information processing stage, equipment/material conditions, and performance shaping factors. This report describes the development of the ACM and provides details on the structure and content of its dimensions.
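
    The abstract names six ACM dimensions; as a concrete illustration, the sketch below shows one way a single unsafe action could be recorded along those dimensions. The class name and all field values are hypothetical and are not drawn from the HERA or ACM schema.

    ```python
    # Hypothetical record characterizing one unsafe action along the six ACM
    # dimensions named in the abstract; values are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class ActionCharacterization:
        system_status: str                  # e.g., at power, shutdown
        initiator_status: str               # e.g., pre-initiator, post-initiator
        unsafe_action_mechanism: str        # e.g., slip, lapse, mistake
        information_processing_stage: str   # e.g., detection, situation assessment
        equipment_material_conditions: str
        performance_shaping_factors: list[str]

    ua = ActionCharacterization(
        system_status="at power",
        initiator_status="post-initiator",
        unsafe_action_mechanism="mistake",
        information_processing_stage="situation assessment",
        equipment_material_conditions="misleading level indication",
        performance_shaping_factors=["time pressure", "ambiguous procedure step"],
    )
    print(ua)
    ```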

  2. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    SciTech Connect

    Forester, John A.; Bley, Dennis C.; Cooper, Susan E.; Kolaczkowski, Alan M.; Thompson, Catherine; Ramey-Smith, Ann; Wreathall, John

    2000-07-18

    This paper describes the most recent version of a human reliability analysis (HRA) method called "A Technique for Human Event Analysis" (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided, and critical and unique aspects of the revised method are discussed.

  3. Technical Basis and Implementation Guidelines for a Technique for Human Event Analysis (ATHEANA)

    DTIC Science & Technology

    2000-05-01

    retrospective analyses are powerful tools in illustrating and explaining ATHEANA principles and concepts. Also, the ATHEANA approach for retrospective ... the principle of confirmation bias. Once a hypothesis is generated to explain a set of findings, new findings are likely to be explained in terms of ... Experience Illustrating ATHEANA Principles each act. Finally, Figure 5.2c describes the dependencies among the four acts. These dependencies explain why

  4. Quantification results from an application of a new technique for human event analysis (ATHEANA) at a pressurized water reactor

    SciTech Connect

    Whitehead, D.W.; Kolaczkowski, A.M.; Thompson, C.M.

    1998-05-01

    This paper presents results from the quantification of the three human failure events (HFEs) identified using the ATHEANA methodology as discussed in an earlier companion paper presented at this conference. Sections describe the quantification task, important basic events, and the results obtained from quantifying the three HFEs that were identified -- the first two of which were simulated at the Seabrook Station Simulator.

  5. Human Events Reference for ATHEANA (HERA) Database Description and Preliminary User's Manual

    SciTech Connect

    Auflick, J.L.

    1999-08-12

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database of analyzed operational events, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  6. Human events reference for ATHEANA (HERA) database description and preliminary user's manual

    SciTech Connect

    Auflick, J.L.; Hahn, H.A.; Pond, D.J.

    1998-05-27

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  7. Discussion of Comments from a Peer Review of A Technique for Human Event Analysis (ATHEANA)

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Wreathall, J.

    1999-01-28

    In May of 1998, a technical basis and implementation guidelines document for A Technique for Human Event Analysis (ATHEANA) was issued as a draft report for public comment (NUREG-1624). In conjunction with the release of draft NUREG-1624, a peer review of the new human reliability analysis method, its documentation, and the results of an initial test of the method was held over a two-day period in June 1998 in Seattle, Washington. Four internationally known and respected experts in HRA or probabilistic risk assessment were selected to serve as the peer reviewers. In addition, approximately 20 other individuals with an interest in HRA and ATHEANA also attended the peer review and were invited to provide comments. The peer review team was asked to comment on any aspect of the method or the report in which improvements could be made and to discuss its strengths and weaknesses. They were asked to focus on two major aspects: Are the basic premises of ATHEANA on solid ground and is the conceptual basis adequate? Is the ATHEANA implementation process adequate given the description of the intended users in the documentation? The four peer reviewers asked questions and provided oral comments during the peer review meeting and provided written comments approximately two weeks after the completion of the meeting. This paper discusses their major comments.

  8. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-10-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  9. Results of a nuclear power plant Application of a new technique for human error analysis (ATHEANA)

    SciTech Connect

    Forester, J.A.; Whitehead, D.W.; Kolaczkowski, A.M.; Thompson, C.M.

    1997-10-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the "success" of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator "on shift" until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project.

  10. Results of a nuclear power plant application of A New Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Whitehead, D.W.; Forester, J.A.; Bley, D.C.

    1998-03-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the success of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator on shift until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project.

  11. Philosophy of ATHEANA

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Thompson, C.M.; Whitehead, D.W.; Wreathall, J.

    1999-03-24

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  12. A technique for human error analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  13. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    SciTech Connect

    Taylor, J.H.; Luckas, W.J.; Wreathall, J.

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions in PRAs that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  14. ATHEANA: "a technique for human error analysis" entering the implementation phase

    SciTech Connect

    Taylor, J.; O'Hara, J.; Luckas, W.

    1997-02-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions in PRAs that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled "Improved HRA Method Based on Operating Experience", is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97 ongoing).

  15. Concepts of event-by-event analysis

    SciTech Connect

    Stroebele, H.

    1995-07-15

    The particles observed in the final state of nuclear collisions can be divided into two classes: those which are susceptible to strong interactions and those which are not, like leptons and the photon. The bulk properties of the "matter" in the reaction zone may be read off the kinematical characteristics of the particles observable in the final state. These characteristics are strongly dependent on the last interaction these particles have undergone. In a densely populated reaction zone, strongly interacting particles will experience many collisions after they have been formed and before they emerge into the asymptotic final state. For the particles which are not sensitive to strong interactions, their formation is also their last interaction. Thus photons and leptons probe the period during which they are produced, whereas hadrons reflect the so-called freeze-out processes, which occur during the late stage in the evolution of the reaction when the population density becomes small and the mean free paths long. The disadvantage of the leptons and photons is their small production cross section; they cannot be used in an analysis of the characteristics of individual collision events, because the number of particles produced per event is too small. The hadrons, on the other hand, stem from the freeze-out period. Information from earlier periods requires multiparticle observables in the most general sense. It is one of the challenges of present-day high-energy nuclear physics to establish and understand global observables which differentiate between mere hadronic scenarios, i.e., superposition of hadronic interactions, and the formation of a partonic (short duration) steady state which can be considered a new state of matter, the Quark-Gluon Plasma.

  16. EVENT PLANNING USING FUNCTION ANALYSIS

    SciTech Connect

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  17. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  18. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
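
    A brute-force illustration of the rejection-sampling view of Bayesian updating that BUS builds on: draw from the prior, accept each draw with probability proportional to its likelihood, and evaluate the rare-event probability on the accepted posterior samples. The Poisson observation model, the prior, and all numbers below are assumptions made for the sketch; a real BUS application would replace the crude Monte Carlo with FORM, importance sampling, or Subset Simulation.

    ```python
    # Rejection-sampling Bayesian updating of a rare event probability (toy model).
    import numpy as np

    rng = np.random.default_rng(0)
    exposure_years, observed_events = 10.0, 2        # assumed observed data

    # Prior on log10 of the annual event rate.
    log10_rate = rng.normal(loc=-2.0, scale=1.0, size=200_000)
    rate = 10.0 ** log10_rate

    # Accept each prior sample with probability proportional to its Poisson likelihood.
    likelihood = np.exp(-rate * exposure_years) * (rate * exposure_years) ** observed_events
    accepted = rng.uniform(size=rate.size) < likelihood / likelihood.max()
    posterior_rate = rate[accepted]

    # Posterior estimate of the rare event "at least one occurrence next year".
    p_rare = 1.0 - np.exp(-posterior_rate)
    print(f"accepted samples: {posterior_rate.size}")
    print(f"posterior mean P(at least one event next year): {p_rare.mean():.4f}")
    ```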

  19. Human Reliability Analysis in the U.S. Nuclear Power Industry: A Comparison of Atomistic and Holistic Methods

    SciTech Connect

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-09-01

    A variety of methods have been developed to generate human error probabilities for use in the US nuclear power industry. When actual operations data are not available, it is necessary for an analyst to estimate these probabilities. Most approaches, including THERP, ASEP, SLIM-MAUD, and SPAR-H, feature an atomistic approach to characterizing and estimating error. The atomistic approach is based on the notion that events and their causes can be decomposed and individually quantified. In contrast, in the holistic approach, such as found in ATHEANA, the analysis centers on the entire event, which is typically quantified as an indivisible whole. The distinction between atomistic and holistic approaches is important in understanding the nature of human reliability analysis quantification and the utility and shortcomings associated with each approach.
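
    The contrast can be made concrete with a toy calculation: an atomistic method combines separately quantified sub-task error probabilities into an overall estimate, whereas a holistic method assigns a single probability to the entire event in its context. All numbers below are hypothetical placeholders, not values from THERP, SPAR-H, or ATHEANA.

    ```python
    # Toy contrast between atomistic and holistic quantification of one human
    # failure event (HFE); probabilities are invented for illustration.

    # Atomistic: decompose the HFE into sub-tasks and combine their individual
    # error probabilities (independence assumed here for simplicity).
    subtask_heps = {"read indication": 0.003, "diagnose condition": 0.010, "execute action": 0.002}
    p_all_subtasks_ok = 1.0
    for hep in subtask_heps.values():
        p_all_subtasks_ok *= (1.0 - hep)
    hfe_atomistic = 1.0 - p_all_subtasks_ok

    # Holistic: the whole event, in its error-forcing context, is judged as an
    # indivisible unit and assigned a single probability.
    hfe_holistic = 0.02

    print(f"atomistic estimate: {hfe_atomistic:.4f}")
    print(f"holistic estimate:  {hfe_holistic:.4f}")
    ```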

  20. Joint Attributes and Event Analysis for Multimedia Event Detection.

    PubMed

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm established on a correlation vector that correlates them to a target event. Consequently, we could incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  1. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  2. Interpretation Analysis as a Competitive Event.

    ERIC Educational Resources Information Center

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  3. Negated bio-events: analysis and identification

    PubMed Central

    2013-01-01

    Background: Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results: We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions: Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The

  4. Dynamic Event Tree Analysis Through RAVEN

    SciTech Connect

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations, several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) control logic driver for the new Thermo-Hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which the set of control logic laws (user defined) monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, the Dynamic PRA analysis, using Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.
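
    As a rough illustration of the event-tree side of the approach (this is not RAVEN code, and it omits the coupled plant simulation that makes a DET dynamic), the sketch below enumerates accident sequences from a list of timed branch points with invented probabilities.

    ```python
    # Static enumeration of sequences from timed branch points; a true dynamic
    # event tree would re-run the system simulation along each branch.
    from itertools import product

    branch_points = [
        (100.0, "diesel generator", [("starts", 0.95), ("fails to start", 0.05)]),
        (600.0, "aux feedwater",    [("runs", 0.98), ("fails to run", 0.02)]),
    ]

    sequences = []
    for outcomes in product(*[branches for _, _, branches in branch_points]):
        prob, path = 1.0, []
        for (time, component, _), (state, p) in zip(branch_points, outcomes):
            prob *= p
            path.append(f"t={time:.0f}s {component} {state}")
        sequences.append((prob, path))

    for prob, path in sorted(sequences, reverse=True):
        print(f"{prob:.4f}  " + " -> ".join(path))
    ```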

  5. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model, based on discrete event networks, for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. The system is structured, controlled, and analysed using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models can be run on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved with Petri nets; discrete event systems are a pragmatic tool for modelling industrial systems. To capture timing, the timed Petri model of the transport stream is divided into hierarchical levels and sections that are analysed successively. Simulation of the proposed robotic system with timed Petri nets offers the opportunity to examine its timing behaviour: from transport and transmission times measured on the spot, graphs are obtained showing the average time for transport activity for individual sets of finished products.
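
    As a minimal illustration of the discrete-event view described above (not the paper's robotic model), the sketch below executes a two-transition Petri net by firing enabled transitions until the net deadlocks.

    ```python
    # Tiny Petri-net executor: places hold tokens, a transition fires when all
    # of its input places hold enough tokens. The net itself is invented.
    marking = {"idle": 1, "busy": 0, "queue": 2, "done": 0}

    transitions = {
        "start job":  ({"idle": 1, "queue": 1}, {"busy": 1}),
        "finish job": ({"busy": 1},             {"idle": 1, "done": 1}),
    }

    def enabled(name):
        inputs, _ = transitions[name]
        return all(marking[p] >= n for p, n in inputs.items())

    def fire(name):
        inputs, outputs = transitions[name]
        for p, n in inputs.items():
            marking[p] -= n
        for p, n in outputs.items():
            marking[p] = marking.get(p, 0) + n

    # Fire enabled transitions until none remains (a simple deterministic run).
    while any(enabled(t) for t in transitions):
        fire(next(t for t in transitions if enabled(t)))
    print(marking)   # {'idle': 1, 'busy': 0, 'queue': 0, 'done': 2}
    ```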

  6. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
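
    A toy version of the cascading-tier idea can be written as a loop: trip an initiating line, redistribute its flow, trip any newly overloaded lines, and repeat. The three-line "network" and the equal-share redistribution rule below are invented for illustration and stand in for a real power-flow-based contingency engine.

    ```python
    # Toy cascading-outage loop; not the SERC model or any real contingency tool.
    lines = {"L1": {"cap": 100.0, "flow": 80.0, "in_service": True},
             "L2": {"cap": 100.0, "flow": 70.0, "in_service": True},
             "L3": {"cap": 100.0, "flow": 60.0, "in_service": True}}

    def cascade(initiating_line):
        tier, just_tripped = 0, [initiating_line]
        while just_tripped:
            tier += 1
            print(f"tier {tier}: tripped {just_tripped}")
            freed = sum(lines[name]["flow"] for name in just_tripped)
            for name in just_tripped:
                lines[name]["in_service"], lines[name]["flow"] = False, 0.0
            survivors = [name for name in lines if lines[name]["in_service"]]
            if not survivors:
                print("no lines left in service (blackout)")
                return
            for name in survivors:                 # crude equal-share redistribution
                lines[name]["flow"] += freed / len(survivors)
            just_tripped = [name for name in survivors
                            if lines[name]["flow"] > lines[name]["cap"]]

    cascade("L1")
    ```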

  7. Analysis of a Limb Eruptive Event

    NASA Astrophysics Data System (ADS)

    Kotrč, P.; Kupryakov, Yu. A.; Bárta, M.; Kashapova, L. K.; Liu, W.

    2016-04-01

    We present the analysis of an eruptive event that took place on the eastern limb on April 21, 2015, which was observed by the Ondřejov horizontal telescope and spectrograph. The eruption of the highly twisted prominence was followed by the onset of soft X-ray sources. We identified the structures observed in Hα spectra with the details on the Hα filtergrams and analyzed the evolution of Doppler component velocities. The timing and observed characteristics of the eruption were compared with the prediction of the model based on the twisting of the flux ropes and the kink/torus instability.

  8. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. A. Wasiolek

    2003-07-21

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  9. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  10. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report and so is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  11. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    PubMed

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2017-08-29

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  12. Analysis of seismic events in and near Kuwait

    SciTech Connect

    Harris, D B; Mayeda, K M; Rodgers, A J; Ruppert, S D

    1999-05-11

    Seismic data for events in and around Kuwait were collected and analyzed. The authors estimated event moment, focal mechanism and depth by waveform modeling. Results showed that reliable seismic source parameters for events in and near Kuwait can be estimated from a single broadband three-component seismic station. This analysis will advance understanding of earthquake hazard in Kuwait.

  13. Modelling recurrent events: a tutorial for analysis in epidemiology

    PubMed Central

    Amorim, Leila DAF; Cai, Jianwen

    2015-01-01

    In many biomedical studies, the event of interest can occur more than once in a participant. These events are termed recurrent events. However, the majority of analyses focus only on time to the first event, ignoring the subsequent events. Several statistical models have been proposed for analysing multiple events. In this paper we explore and illustrate several modelling techniques for analysis of recurrent time-to-event data, including conditional models for multivariate survival data (AG, PWP-TT and PWP-GT), marginal means/rates models, frailty and multi-state models. We also provide a tutorial for analysing such type of data, with three widely used statistical software programmes. Different approaches and software are illustrated using data from a bladder cancer project and from a study on lower respiratory tract infection in children in Brazil. Finally, we make recommendations for modelling strategy selection for analysis of recurrent event data. PMID:25501468
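
    For readers working in Python rather than the packages used in the tutorial, an Andersen-Gill style counting-process setup can be sketched with the lifelines package as shown below. The tiny data set is made up; each row is an at-risk interval in (start, stop] form with event = 1 marking a recurrence.

    ```python
    # Counting-process (AG-style) Cox model for recurrent events; illustrative only.
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    df = pd.DataFrame({
        "id":    [1, 1, 1, 2, 2, 3],
        "start": [0, 5, 9, 0, 7, 0],
        "stop":  [5, 9, 14, 7, 12, 10],
        "event": [1, 1, 0, 1, 0, 0],    # 1 = a recurrence ends this interval
        "treat": [1, 1, 1, 0, 0, 0],    # covariate of interest
    })

    ctv = CoxTimeVaryingFitter()
    ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()
    ```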

  14. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    As terrorism events occur with increasing frequency throughout the world, improving the response capability for social security incidents has become an important test of governments' governing ability. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are first discussed. A suitable event data table structure based on the "5W" theory is then designed. Six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results demonstrate the applicability of the methods.

  15. An Introduction to Event History Analysis.

    ERIC Educational Resources Information Center

    Kolstad, Andrew

    The theory of stochastic processes deals with systems that develop over time in accordance with probabilistic laws. The basic concepts involved in two types of continuous-time processes are the idea of a constant probability of occurrence in the point event process and the extensions necessary for the discrete state process. The required…

  16. Predicting analysis time in event-driven clinical trials with event-reporting lag.

    PubMed

    Wang, Jianming; Ke, Chunlei; Jiang, Qi; Zhang, Charlie; Snapinn, Steven

    2012-04-30

    For a clinical trial with a time-to-event primary endpoint, the rate of accrual of the event of interest determines the timing of the analysis, upon which significant resources and strategic planning depend. It is important to be able to predict the analysis time early and accurately. Currently available methods use either parametric or nonparametric models to predict the analysis time based on accumulating information about enrollment, event, and study withdrawal rates and implicitly assume that the available data are completely reported at the time of performing the prediction. This assumption, however, may not be true when it takes a certain amount of time (i.e., event-reporting lag) for an event to be reported, in which case, the data are incomplete for prediction. Ignoring the event-reporting lag could substantially impact the accuracy of the prediction. In this paper, we describe a general parametric model to incorporate event-reporting lag into analysis time prediction. We develop a prediction procedure using a Bayesian method and provide detailed implementations for exponential distributions. Some simulations were performed to evaluate the performance of the proposed method. An application to an on-going clinical trial is also described. Copyright © 2012 John Wiley & Sons, Ltd.
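
    The core idea can be sketched with a rough Monte Carlo: draw the event hazard from a posterior, simulate future event times for the patients still at risk, add a reporting lag to each, and read off when the target count of reported events is reached. The gamma posterior, exponential lags, and all numbers below are illustrative assumptions, not the paper's model or data.

    ```python
    # Monte Carlo prediction of analysis time with an event-reporting lag (toy model).
    import numpy as np

    rng = np.random.default_rng(1)
    n_at_risk, target_events, reported_so_far = 400, 150, 90
    mean_lag_years = 30 / 365.0                      # assumed mean reporting lag

    # Posterior for the event hazard (per patient-year), e.g. from earlier data.
    hazard_posterior = rng.gamma(shape=60, scale=1 / 300.0, size=5_000)

    predicted_times = []
    for lam in hazard_posterior:
        event_t = rng.exponential(1.0 / lam, size=n_at_risk)            # future event times
        report_t = event_t + rng.exponential(mean_lag_years, size=n_at_risk)
        needed = target_events - reported_so_far
        predicted_times.append(np.sort(report_t)[needed - 1])           # time the target is hit

    print(f"median predicted time to analysis: {np.median(predicted_times):.2f} years")
    print(f"80% interval: {np.percentile(predicted_times, [10, 90]).round(2)} years")
    ```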

  17. Web Video Event Recognition by Semantic Analysis From Ubiquitous Documents.

    PubMed

    Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng Tao

    2016-12-01

    In recent years, the task of event recognition from videos has attracted increasing interest in multimedia area. While most of the existing research was mainly focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyze video content without any prior knowledge. Therefore, synthesizing both the visual and semantic analysis is a natural way for video event understanding. In this paper, we study the problem of Web video event recognition, where Web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from Web videos. In order to compensate the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous Web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model, which explores the intrinsic correlation between the visual and textual cues of the videos to learn reliable classifiers. Extensive experiments on two real-world video data sets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of Web video event recognition.

  18. Web Video Event Recognition by Semantic Analysis from Ubiquitous Documents.

    PubMed

    Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng

    2016-09-27

    In recent years, the task of event recognition from videos has attracted increasing interest in multimedia area. While most of the existing research was mainly focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyse video content without any prior knowledge. Therefore, synthesizing both the visual and semantic analysis is a natural way for video event understanding. In this paper, we study the problem of web video event recognition, where web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from web videos. In order to compensate the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model which explores the intrinsic correlation between the visual and textual cues of the videos to learn reliable classifiers. Extensive experiments on two real-world video datasets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of web video event recognition.

  19. [Analysis of Spontaneously Reported Adverse Events].

    PubMed

    Nakamura, Mitsuhiro

    2016-01-01

    Observational studies are necessary for the evaluation of drug effectiveness in clinical practice. In recent years, the use of spontaneous reporting systems (SRS) for adverse drug reactions has increased and they have become an important resource for regulatory science. SRS are among the largest and best-known databases worldwide and are one of the primary tools used for postmarketing surveillance and pharmacovigilance. To analyze SRS, the US Food and Drug Administration Adverse Event Reporting System (FAERS) and the Japanese Adverse Drug Event Report Database (JADER) are reviewed. Authorized pharmacovigilance algorithms were used for signal detection, including the reporting odds ratio. An SRS is a passive reporting database and is therefore subject to numerous sources of selection bias, including overreporting, underreporting, and a lack of a denominator. Despite the inherent limitations of spontaneous reporting, SRS databases are a rich resource, and data mining provides powerful means of identifying potential associations between drugs and their adverse effects. Our results, which are based on the evaluation of SRS databases, provide essential knowledge that could improve our understanding of clinical issues.
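
    For reference, the reporting odds ratio mentioned above is computed from a 2x2 table of report counts; a minimal sketch with invented counts follows.

    ```python
    # Reporting odds ratio (ROR) with a 95% confidence interval; counts are invented.
    import math

    a = 40      # reports with the drug of interest and the adverse event of interest
    b = 960     # reports with the drug, other events
    c = 200     # reports with other drugs, the event of interest
    d = 48800   # reports with other drugs, other events

    ror = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    ci_low = math.exp(math.log(ror) - 1.96 * se_log)
    ci_high = math.exp(math.log(ror) + 1.96 * se_log)

    print(f"ROR = {ror:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
    # A common screening criterion is a lower confidence bound above 1
    # together with a minimum number of reports.
    ```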

  20. Peak event analysis: a novel empirical method for the evaluation of elevated particulate events

    PubMed Central

    2013-01-01

    Background: We report on a novel approach to the analysis of suspended particulate data in a rural setting in southern Ontario. Analyses of suspended particulate matter and associated air quality standards have conventionally focussed on 24-hour mean levels of total suspended particulates (TSP) and particulate matter <10 microns, <2.5 microns and <1 micron in diameter (PM10, PM2.5, PM1, respectively). Less emphasis has been placed on brief peaks in suspended particulate levels, which may pose a substantial nuisance, irritant, or health hazard. These events may also represent a common cause of public complaint and concern regarding air quality. Methods: Measurements of TSP, PM10, PM2.5, and PM1 levels were taken using an automated device following local complaints of dusty conditions in rural south-central Ontario, Canada. The data consisted of 126,051 by-minute TSP, PM10, PM2.5, and PM1 measurements between May and August 2012. Two analyses were performed and compared. First, conventional descriptive statistics were computed by month for TSP, PM10, PM2.5, and PM1, including mean values and percentiles (70th, 90th, and 95th). Second, a novel graphical analysis method, using density curves and line plots, was conducted to examine peak events occurring at or above the 99th percentile of per-minute TSP readings. We refer to this method as “peak event analysis”. Findings of the novel method were compared with findings from the conventional approach. Results: Conventional analyses revealed that mean levels of all categories of suspended particulates and suspended particulate diameter ratios conformed to existing air quality standards. Our novel methodology revealed extreme outlier events above the 99th percentile of readings, with peak PM10 and TSP levels over 20 and 100 times higher than the respective mean values. Peak event analysis revealed and described rare and extreme peak dust events that would not have been detected using conventional descriptive statistics
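
    A minimal version of the flagging step behind peak event analysis: take by-minute readings, flag values at or above the 99th percentile, and group consecutive flagged minutes into discrete peak events. The readings below are synthetic, not the Ontario monitoring data.

    ```python
    # Flag 99th-percentile peaks in synthetic by-minute TSP readings and group
    # consecutive flagged minutes into events.
    import numpy as np

    rng = np.random.default_rng(2)
    tsp = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)   # synthetic TSP, ug/m3
    tsp[5000:5010] *= 25                                    # inject one large dust event

    threshold = np.percentile(tsp, 99)
    flagged = tsp >= threshold

    events, start = [], None
    for minute, is_peak in enumerate(flagged):
        if is_peak and start is None:
            start = minute
        elif not is_peak and start is not None:
            events.append((start, minute - 1, tsp[start:minute].max()))
            start = None
    if start is not None:
        events.append((start, len(tsp) - 1, tsp[start:].max()))

    print(f"99th percentile threshold: {threshold:.1f}")
    print(f"number of peak events: {len(events)}")
    print(f"largest peak relative to the overall mean: {max(e[2] for e in events) / tsp.mean():.1f}x")
    ```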

  1. Event/Time/Availability/Reliability-Analysis Program

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.; Hoffman, D. J.; Carr, Thomas

    1994-01-01

    ETARA is an interactive, menu-driven program that performs simulations for analysis of reliability, availability, and maintainability. It was written to evaluate the performance of the electrical power system of Space Station Freedom, but the methodology and software can be applied to any system represented by a block diagram. The program is written in IBM APL.

  2. Event-based analysis of free-living behaviour.

    PubMed

    Granat, Malcolm H

    2012-11-01

    The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also on how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, dependent on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis on the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.
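
    A minimal sketch of the event-based representation: free-living data become a list of timed events, a simple hierarchy groups event types, and outcomes are computed over whichever event class the research question selects. The event types, hierarchy, and durations below are invented for illustration.

    ```python
    # Event-based summary of free-living activity using a toy two-level hierarchy.
    events = [
        {"type": "walking",  "duration_s": 95},
        {"type": "standing", "duration_s": 240},
        {"type": "sitting",  "duration_s": 3600},
        {"type": "walking",  "duration_s": 30},
        {"type": "lying",    "duration_s": 7200},
    ]

    hierarchy = {"walking": "upright", "standing": "upright",
                 "sitting": "sedentary", "lying": "sedentary"}

    def summarize(event_class):
        selected = [e for e in events if hierarchy[e["type"]] == event_class]
        return {"events": len(selected),
                "total_min": sum(e["duration_s"] for e in selected) / 60}

    print("upright:  ", summarize("upright"))
    print("sedentary:", summarize("sedentary"))
    ```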

  3. PLANETARY AND OTHER SHORT BINARY MICROLENSING EVENTS FROM THE MOA SHORT-EVENT ANALYSIS

    SciTech Connect

    Bennett, D. P.; Sumi, T.; Bond, I. A.; Ling, C. H.; Kamiya, K.; Abe, F.; Fukui, A.; Furusawa, K.; Itow, Y.; Masuda, K.; Matsubara, Y.; Miyake, N.; Muraki, Y.; Botzler, C. S.; Rattenbury, N. J.; Korpela, A. V.; Sullivan, D. J.; Kilmartin, P. M.; Ohnishi, K.; Saito, To.; Collaboration: MOA Collaboration; and others

    2012-10-01

    We present the analysis of four candidate short-duration binary microlensing events from the 2006-2007 MOA Project short-event analysis. These events were discovered as a by-product of an analysis designed to find short-timescale single-lens events that may be due to free-floating planets. Three of these events are determined to be microlensing events, while the fourth is most likely caused by stellar variability. For each of the three microlensing events, the signal is almost entirely due to a brief caustic feature with little or no lensing attributable mainly to the lens primary. One of these events, MOA-bin-1, is due to a planet, and it is the first example of a planetary event in which the stellar host is only detected through binary microlensing effects. The mass ratio and separation are q = (4.9 ± 1.4) × 10^-3 and s = 2.10 ± 0.05, respectively. A Bayesian analysis based on a standard Galactic model indicates that the planet, MOA-bin-1Lb, has a mass of m_p = 3.7 ± 2.1 M_Jup and orbits a star of M_* = 0.75 (+0.33/-0.41) M_Sun at a semimajor axis of a = 8.3 (+4.5/-2.7) AU. This is one of the most massive and widest separation planets found by microlensing. The scarcity of such wide-separation planets also has implications for interpretation of the isolated planetary mass objects found by this analysis. If we assume that we have been able to detect wide-separation planets with an efficiency at least as high as that for isolated planets, then we can set limits on the distribution of planets in wide orbits. In particular, if the entire isolated planet sample found by Sumi et al. consists of planets bound in wide orbits around stars, we find that it is likely that the median orbital semimajor axis is >30 AU.

  4. Video analysis of motor events in REM sleep behavior disorder.

    PubMed

    Frauscher, Birgit; Gschliesser, Viola; Brandauer, Elisabeth; Ulmer, Hanno; Peralta, Cecilia M; Müller, Jörg; Poewe, Werner; Högl, Birgit

    2007-07-30

    In REM sleep behavior disorder (RBD), several studies focused on electromyographic characterization of motor activity, whereas video analysis has remained more general. The aim of this study was to undertake a detailed and systematic video analysis. Nine polysomnographic records from 5 Parkinson patients with RBD were analyzed and compared with sex- and age-matched controls. Each motor event in the video during REM sleep was classified according to duration, type of movement, and topographical distribution. In RBD, a mean of 54 ± 23.2 events/10 minutes of REM sleep (total 1392) were identified and visually analyzed. Seventy-five percent of all motor events lasted <2 seconds. Of these events, 1,155 (83.0%) were classified as elementary, 188 (13.5%) as complex behaviors, 50 (3.6%) as violent, and 146 (10.5%) as vocalizations. In the control group, 3.6 ± 2.3 events/10 minutes (total 264) of predominantly elementary simple character (n = 240, 90.9%) were identified. Number and types of motor events differed significantly between patients and controls (P < 0.05). This study shows a very high number and great variety of motor events during REM sleep in symptomatic RBD. However, most motor events are minor, and violent episodes represent only a small fraction. Copyright 2007 Movement Disorder Society.

  5. Glaciological parameters of disruptive event analysis

    SciTech Connect

    Bull, C.

    1980-04-01

    The possibility of complete glaciation of the earth is small and probably need not be considered in the consequence analysis by the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program. However, within a few thousand years an ice sheet may well cover proposed waste disposal sites in Michigan. Those in the Gulf Coast region and New Mexico are unlikely to be ice covered. The probability of ice cover at Hanford in the next million years is finite, perhaps about 0.5. Sea level will fluctuate as a result of climatic changes. As ice sheets grow, sea level will fall. Melting of ice sheets will be accompanied by a rise in sea level. Within the present interglacial period there is a definite chance that the West Antarctic ice sheet will melt. Ice sheets are agents of erosion, and some estimates of the amount of material they erode have been made. As an average over the area glaciated by late Quaternary ice sheets, only a few tens of meters of erosion is indicated. There were perhaps 3 meters of erosion per glaciation cycle. Under glacial conditions the surface boundary conditions for ground water recharge will be appreciably changed. In future glaciations melt-water rivers generally will follow pre-existing river courses. Some salt dome sites in the Gulf Coast region could be susceptible to changes in the course of the Mississippi River. The New Mexico site, which is on a high plateau, seems to be immune from this type of problem. The Hanford Site is only a few miles from the Columbia River, and in the future, lateral erosion by the Columbia River could cause changes in its course. A prudent assumption in the AEGIS study is that the present interglacial will continue for only a limited period and that subsequently an ice sheet will form over North America. Other factors being equal, it seems unwise to site a nuclear waste repository (even at great depth) in an area likely to be glaciated.

  6. Human performance analysis of industrial radiography radiation exposure events

    SciTech Connect

    Reece, W.J.; Hill, S.G.

    1995-12-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  7. Median Analysis of Repeated Measures Associated with Recurrent Events in Presence of Terminal Event.

    PubMed

    Sundaram, Rajeshwari; Ma, Ling; Ghoshal, Subhashis

    2017-04-28

    Recurrent events are often encountered in medical follow-up studies. In addition, such recurrences have other quantities associated with them that are of considerable interest, for instance medical costs of the repeated hospitalizations and tumor size in cancer recurrences. These processes can be viewed as point processes, i.e. processes with an arbitrary positive jump at each recurrence. An analysis of the mean function for such point processes has been proposed in the literature. However, such point processes are often skewed, making the median a more appropriate measure than the mean. Furthermore, the analysis of recurrent event data is often complicated by the presence of death. We propose a semiparametric model for assessing the effect of covariates on the quantiles of the point processes. We investigate both the finite-sample and large-sample properties of the proposed estimators. We conclude with a real data analysis of the medical cost associated with the treatment of ovarian cancer.

  8. Contingency Horizon: on Private Events and the Analysis of Behavior.

    PubMed

    Leigland, Sam

    2014-05-01

    Skinner's radical behaviorism incorporates private events as biologically based phenomena that may play a functional role with respect to other (overt) behavioral phenomena. Skinner proposed four types of contingencies, here collectively termed the contingency horizon, which enable certain functional relations between private events and verbal behavior. The adequacy and necessity of this position has met renewed challenges from Rachlin's teleological behaviorism and Baum's molar behaviorism, both of which argue that all "mental" phenomena and terminology may be explained by overt behavior and environment-behavior contingencies extended in time. A number of lines of evidence are presented in making a case for the functional characteristics of private events, including published research from behavior analysis and general experimental psychology, as well as verbal behavior from a participant in the debate. An integrated perspective is offered that involves a multiscaled analysis of interacting public behaviors and private events.

  9. Bibliometric Analysis of Medication Errors and Adverse Drug Events Studies.

    PubMed

    Huang, Hung-Chi; Wang, Cheng-Hua; Chen, Pi-Ching; Lee, Yen-Der

    2015-07-31

    Medication errors and adverse drug events are a key concern of the health-care industry. The objectives of this study were to map the intellectual structure of the studies of medication errors and adverse drug events and to investigate the developing path of this literature and interrelationships among the main topics. The Web of Science database was searched for documentation of medication errors and adverse drug events from 1961 to 2013. The most cited articles and references were profiled and analyzed using HistCite software to draw a historiograph and Ucinet software to draw a sociogram. The database search revealed 3343 medication errors and 3342 adverse drug event documents. The most cited articles on medication errors focused on 3 key themes from 1961 to 2013, namely, medication errors in adult inpatients, computerized physician order entry in medication error studies, and medication errors in pediatric inpatients. The developing path for the most cited articles about adverse drug events from 1987 to 2013 was as follows: detection, analysis, effect, and prevention from adult inpatient to pediatric inpatient settings and from hospitalized care to ambulatory care. In addition, social network analysis based on the most cited references revealed a close relationship between medication errors and adverse drug events. The mapping results provide a valuable tool for researchers to access the literature in this field and can be used to help identify the direction of medication errors and adverse drug events research.

  10. Nonlinear Analysis for Event Forewarning (NLAfEW)

    SciTech Connect

    Hively, Lee Mizener

    2013-05-23

    The NLAfEW computer code analyses noisy, experimental data to forewarn of adverse events. The functionality of the analysis is as follows: it removes artifacts from the data, converts the continuous data values to discrete values, constructs time-delay embedding vectors, compares the unique nodes and links in one graph with those in another, and determines event forewarning on the basis of several successive occurrences of one (or more) of the dissimilarity measures above a threshold.
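
    A rough Python sketch of the same pipeline follows (artifact removal omitted); the symbol alphabet size, embedding dimension, and dissimilarity measure are illustrative choices, not the NLAfEW defaults.

```python
import numpy as np

def symbolize(x, n_symbols=4):
    """Convert continuous data to discrete symbols by equiprobable binning."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def embedding_graph(symbols, dim=3, lag=1):
    """Nodes are time-delay embedding vectors; links are observed transitions."""
    vectors = [tuple(symbols[i:i + dim * lag:lag])
               for i in range(len(symbols) - dim * lag + 1)]
    nodes = set(vectors)
    links = set(zip(vectors[:-1], vectors[1:]))
    return nodes, links

def dissimilarity(graph_a, graph_b):
    """Count nodes and links unique to one graph relative to the other."""
    nodes_a, links_a = graph_a
    nodes_b, links_b = graph_b
    return len(nodes_a ^ nodes_b) + len(links_a ^ links_b)

def forewarn(dissimilarities, threshold, n_successive=3):
    """Flag forewarning after several successive dissimilarities above a threshold."""
    run = 0
    for d in dissimilarities:
        run = run + 1 if d > threshold else 0
        if run >= n_successive:
            return True
    return False
```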

  11. Life events and psychosis: a review and meta-analysis.

    PubMed

    Beards, Stephanie; Gayer-Anderson, Charlotte; Borges, Susana; Dewey, Michael E; Fisher, Helen L; Morgan, Craig

    2013-07-01

    Recent models of psychosis implicate stressful events in its etiology. However, while evidence has accumulated for childhood trauma, the role of adult life events has received less attention. Therefore, a review of the existing literature on the relationship between life events and onset of psychotic disorder/experiences is timely. A search was conducted using PsychInfo, Medline, Embase, and Web of Science to identify studies of life events and the onset of psychosis or psychotic experiences within the general population. Given previous methodological concerns, this review included a novel quality assessment tool and focused on findings from the most robust studies. A meta-analysis was performed on a subgroup of 13 studies. Sixteen studies published between 1968 and 2012 were included. Of these, 14 reported positive associations between exposure to adult life events and subsequent onset of psychotic disorder/experiences. The meta-analysis yielded an overall weighted OR of 3.19 (95% CI 2.15-4.75). However, many studies were limited by small sample sizes and the use of checklist measures of life events, with no consideration of contextual influences on the meaning and interpretation of events. Few studies have assessed the role of adult life events in the onset of psychosis. There was some evidence that reported exposure to adult life events was associated with increased risk of psychotic disorder and subclinical psychotic experiences. However, the methodological quality of the majority of studies was low, which urges caution in interpreting the results and points toward a need for more methodologically robust studies.
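
    For readers unfamiliar with the pooling step of such a meta-analysis, a generic inverse-variance pooling of log odds ratios might look like the sketch below; the study values in the usage comment are hypothetical placeholders, not the reviewed studies, and the paper's actual model may differ (for example, random effects).

```python
import math

def pooled_or(odds_ratios, ci_lowers, ci_uppers):
    """Fixed-effect inverse-variance pooling of log odds ratios.

    Standard errors are recovered from the 95% confidence limits of each study.
    """
    log_ors = [math.log(o) for o in odds_ratios]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96)
           for l, u in zip(ci_lowers, ci_uppers)]
    weights = [1 / se ** 2 for se in ses]
    pooled_log = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# Hypothetical studies (OR, lower CI, upper CI):
# print(pooled_or([2.5, 3.8, 1.9], [1.2, 1.5, 0.9], [5.1, 9.4, 4.0]))
```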

  12. Analysis of damaging hydrogeological events in a Mediterranean region (Calabria)

    NASA Astrophysics Data System (ADS)

    Aceto, Luigi; Caloiero, Tommaso; Pasqua, A. A.; Petrucci, Olga

    2016-10-01

    Damaging Hydrogeological Events (DHEs) are periods of severe weather conditions affecting wide areas for several days, mainly causing damaging landslides and floods. In order to characterise the DHEs, we analysed the historical series of the events that affected a Mediterranean region (Calabria, southern Italy) throughout 92 years of observation. Depending on their magnitude, we classified the events as major catastrophic, catastrophic, extraordinary and ordinary. In winter events, the damaged areas and the damage were greater than in the other seasons. Nevertheless, the majority of the events took place in autumn, when, in addition to landslides, a relevant percentage of flash floods and floods also occurred. Results also show that the frequency of major catastrophic and catastrophic events has decreased since 1971, and that, in recent decades, Calabria has suffered damaging effects even though rainfall did not reach extreme characteristics. In fact, the duration of the triggering rain, the maximum daily rain of the events and the frequency of rainfall with high return periods show a decreasing pattern throughout the study period. As concerns the damaging phenomena, landslides were identified as the most frequent in every season and in every type of event; the eastern side of the region was the most frequently and heavily damaged. According to the literature, the trend in the number of victims per event is also decreasing. The proposed analysis can be applied to different study areas in order to assess the relative magnitude of DHEs and their evolution throughout the years. The classification criterion can be useful to compare different events for either scientific or insurance purposes, and to identify the typical rainfall-damage scenario of a study area.

  13. External events analysis for the Savannah River Site K reactor

    SciTech Connect

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10^-4 per year, of which seismic events are the major contributor (1.2 × 10^-4 per year). Fire-initiated events contribute 1.4 × 10^-7 per year, tornados 5.8 × 10^-7 per year, dam failures 1.5 × 10^-6 per year and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs.

  14. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
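
    A minimal sketch of the precursor coincidence rate at the heart of such an analysis, with a surrogate-based test against a homogeneous Poisson null, is given below; the tolerance window, lag, and surrogate scheme are user choices and represent only one of the null models mentioned above.

```python
import numpy as np

def precursor_coincidence_rate(events_a, events_b, delta_t, tau=0.0):
    """Fraction of events in series A preceded (within delta_t, at lag tau)
    by at least one event in series B."""
    events_a, events_b = np.asarray(events_a), np.asarray(events_b)
    hits = 0
    for t in events_a:
        lo, hi = t - tau - delta_t, t - tau
        if np.any((events_b >= lo) & (events_b <= hi)):
            hits += 1
    return hits / len(events_a)

def poisson_surrogate_test(events_a, events_b, delta_t, t_max, n_surr=1000, seed=0):
    """Compare the observed rate with rates from homogeneous Poisson surrogates of B."""
    rng = np.random.default_rng(seed)
    observed = precursor_coincidence_rate(events_a, events_b, delta_t)
    surrogate_rates = []
    for _ in range(n_surr):
        surr_b = np.sort(rng.uniform(0, t_max, size=len(events_b)))
        surrogate_rates.append(precursor_coincidence_rate(events_a, surr_b, delta_t))
    p_value = np.mean(np.array(surrogate_rates) >= observed)
    return observed, p_value
```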

  15. Hydrometeorological Analysis of Flooding Events in San Antonio, TX

    NASA Astrophysics Data System (ADS)

    Chintalapudi, S.; Sharif, H.; Elhassan, A.

    2008-12-01

    South Central Texas is particularly vulnerable to floods due to: proximity to a moist air source (the Gulf of Mexico); the Balcones Escarpment, which concentrates rainfall runoff; a tendency for synoptic-scale features to become cut off and stall over the area; and decaying tropical cyclones stalling over the area. San Antonio is the 7th largest city in the nation, lies in one of the most flash-flood-prone regions in North America, and has experienced a number of flooding events in the last decade (1998, 2002, 2004, and 2007). Research is being conducted to characterize the meteorological conditions that lead to these events and to apply the rainfall and watershed characteristics data to recreate the runoff events using a two-dimensional, physically based, distributed-parameter hydrologic model. The physically based, distributed-parameter Gridded Surface Subsurface Hydrologic Analysis (GSSHA) hydrological model was used for simulating the watershed response to these storm events. Finally, observed discharges were compared to GSSHA model discharges for these storm events. Analysis of some of these events will be presented.

  16. Poisson-event-based analysis of cell proliferation.

    PubMed

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture.
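
    A simplified sketch of the event-based bookkeeping described above, assuming mitotic events are given as time stamps in hours; the log-linear rate fit is a crude stand-in for the full nonhomogeneous Poisson analysis of the paper.

```python
import numpy as np

def interevent_statistics(event_times_h):
    """Mean and SEM of the times between successive observed mitoses (hours)."""
    times = np.sort(np.asarray(event_times_h))
    gaps = np.diff(times)
    return gaps.mean(), gaps.std(ddof=1) / np.sqrt(len(gaps))

def event_rate_trend(event_times_h, bin_width_h=4.0):
    """Log-linear fit of event counts per time bin; a positive slope is
    consistent with an event rate that increases exponentially over time."""
    times = np.sort(np.asarray(event_times_h))
    bins = np.arange(0, times.max() + bin_width_h, bin_width_h)
    counts, _ = np.histogram(times, bins=bins)
    centers = (bins[:-1] + bins[1:]) / 2
    mask = counts > 0
    slope, intercept = np.polyfit(centers[mask], np.log(counts[mask]), 1)
    return slope, intercept
```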

  17. Severe accident analysis using dynamic accident progression event trees

    NASA Astrophysics Data System (ADS)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce, and potentially phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The dynamic APET developed here provides prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. Also, the work is focused on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a
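
    A toy sketch of the branch bookkeeping a dynamic event tree requires: user-specified branch probabilities, tracking of cumulative branch probability, and truncation of low-probability branches. The driving severe accident code (MELCOR in the dissertation) is replaced by a fixed list of branch options, so this is an illustration of the idea only, not the ADAPT implementation.

```python
def expand(branch_prob, depth, branch_options, cutoff=1e-6, max_depth=4, history=()):
    """Recursively enumerate scenarios; prune branches whose cumulative
    probability falls below the truncation cutoff.

    branch_options: list of (label, conditional probability) pairs that a real
    tool would obtain from the accident-analysis code at each branching time.
    """
    if depth == max_depth:
        return [(history, branch_prob)]
    scenarios = []
    for label, p in branch_options:
        new_prob = branch_prob * p
        if new_prob < cutoff:
            continue  # truncation rule keeps the number of scenarios manageable
        scenarios.extend(
            expand(new_prob, depth + 1, branch_options, cutoff, max_depth,
                   history + (label,)))
    return scenarios

# Hypothetical two-way branching (e.g. "creep rupture occurs" vs "does not"):
# print(len(expand(1.0, 0, [("yes", 0.1), ("no", 0.9)])))
```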

  18. Predicting Retention of Drug Court Participants Using Event History Analysis

    ERIC Educational Resources Information Center

    Wolf, Elaine M.; Sowards, Kathryn A.; Wolf, Douglas A.

    2003-01-01

    This paper presents the results of a discrete-time event-history analysis of the relationships between client and program characteristics and the length and outcome of participation in a drug court program. We identify factors associated with both successful completion and premature termination. Having an African-American case manager, being…

  20. Service-Learning and Graduation: Evidence from Event History Analysis

    ERIC Educational Resources Information Center

    Yue, Hongtao; Hart, Steven M.

    2017-01-01

    This research employed Event History Analysis to understand how service-learning participation is related to students' graduation within six years. The longitudinal dataset includes 31,074 new undergraduate students who enrolled in a large western U.S. public university from Fall 2002 to Fall 2009. The study revealed that service-learning…
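
    Discrete-time event history analyses of this kind are commonly fitted as a logistic regression on a person-period data set; a minimal sketch follows, with invented variable names that are not taken from the study.

```python
import pandas as pd
import statsmodels.api as sm

def fit_discrete_time_hazard(person_period):
    """person_period: one row per student per academic year at risk, with a
    0/1 'graduated' outcome, a 'year' period indicator, and covariates such as
    'service_learning' (all column names are hypothetical)."""
    X = pd.get_dummies(person_period[["year", "service_learning"]],
                       columns=["year"], drop_first=True)
    X = sm.add_constant(X.astype(float))
    model = sm.Logit(person_period["graduated"].astype(float), X)
    return model.fit(disp=False)

# Hypothetical usage:
# result = fit_discrete_time_hazard(df)
# print(result.summary())
```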

  1. Offense Specialization of Arrestees: An Event History Analysis

    ERIC Educational Resources Information Center

    Lo, Celia C.; Kim, Young S.; Cheng, Tyrone C.

    2008-01-01

    The data set employed in the present study came from interviews with arrestees conducted between 1999 and 2001 as well as from their official arrest records obtained from jail administrators. A total of 238 arrestees ages 18 to 25 constituted the final sample. Event history analysis examined each arrestee's movement from periods of no arrests to…

  2. Content Analysis of Presidential Debates as Communication Events.

    ERIC Educational Resources Information Center

    Jackson-Beeck, Marilyn; Meadow, Robert G.

    This paper argues for the inclusion of content analysis in communication research designs and for the utility of this methodology in the examination of communication events. The first half of the paper describes four types of communication content that can be analyzed: unintentional messages, such as the verbal imagery used; unconscious speech,…

  3. Analysis of recurrent event data with incomplete observation gaps.

    PubMed

    Kim, Yang-Jin; Jhun, Myoungshic

    2008-03-30

    In analysis of recurrent event data, recurrent events are not completely experienced when the terminating event occurs before the end of a study. To make valid inference of recurrent events, several methods have been suggested for accommodating the terminating event (Statist. Med. 1997; 16:911-924; Biometrics 2000; 56:554-562). In this paper, our interest is to consider a particular situation, where intermittent dropouts result in observation gaps during which no recurrent events are observed. In this situation, risk status varies over time and the usual definition of risk variable is not applicable. In particular, we consider the case when information on the observation gap is incomplete, that is, the starting time of intermittent dropout is known but the terminating time is not available. This incomplete information is modeled in terms of an interval-censored mechanism. Our proposed method is applied to the study of the Young Traffic Offenders Program on conviction rates, wherein a certain proportion of subjects experienced suspensions with intermittent dropouts during the study.

  4. Industrial accidents triggered by flood events: analysis of past accidents.

    PubMed

    Cozzani, Valerio; Campedel, Michela; Renni, Elisabetta; Krausmann, Elisabeth

    2010-03-15

    Industrial accidents triggered by natural events (NaTech accidents) are a significant category of industrial accidents. Several specific elements that characterize NaTech events still need to be investigated. In particular, the damage mode of equipment and the specific final scenarios that may take place in NaTech accidents are key elements for the assessment of hazard and risk due to these events. In the present study, data on 272 NaTech events triggered by floods were retrieved from some of the major industrial accident databases. Data on final scenarios highlighted the presence of specific events, as those due to substances reacting with water, and the importance of scenarios involving consequences for the environment. This is mainly due to the contamination of floodwater with the hazardous substances released. The analysis of process equipment damage modes allowed the identification of the expected release extents due to different water impact types during floods. The results obtained were used to generate substance-specific event trees for the quantitative assessment of the consequences of accidents triggered by floods.

  5. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    SciTech Connect

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-11-22

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility.

  6. Bi-Level Semantic Representation Analysis for Multimedia Event Detection.

    PubMed

    Chang, Xiaojun; Ma, Zhigang; Yang, Yi; Zeng, Zhiqiang; Hauptmann, Alexander G

    2017-05-01

    Multimedia event detection has been one of the major endeavors in video event analysis. A variety of approaches have been proposed recently to tackle this problem. Among others, using semantic representation has been credited for its promising performance and desirable ability to support human-understandable reasoning. To generate semantic representations, we usually utilize several external image/video archives and apply the concept detectors trained on them to the event videos. Due to the intrinsic differences between these archives, the resulting representations presumably have different predictive capabilities for a certain event. Nevertheless, not much work is available for assessing the efficacy of semantic representation at the source level. On the other hand, it is plausible that some concepts are noisy for detecting a specific event. Motivated by these two shortcomings, we propose a bi-level semantic representation analysis method. At the source level, our method learns weights for the semantic representations obtained from different multimedia archives. Meanwhile, it restrains the negative influence of noisy or irrelevant concepts at the concept level. In addition, we particularly focus on efficient multimedia event detection with few positive examples, which is highly valuable in real-world scenarios. We perform extensive experiments on the challenging TRECVID MED 2013 and 2014 datasets with encouraging results that validate the efficacy of our proposed approach.

  7. Using the DOE Knowledge Base for Special Event Analysis

    SciTech Connect

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanos), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  8. An analysis of the 2016 Hitomi breakup event

    NASA Astrophysics Data System (ADS)

    Flegel, Sven; Bennett, James; Lachut, Michael; Möckel, Marek; Smith, Craig

    2017-04-01

    The breakup of Hitomi (ASTRO-H) on 26 March 2016 is analysed. Debris from the fragmentation is used to estimate the time of the event by propagating backwards and estimating the close approach with the parent object. Based on this method, the breakup event is predicted to have occurred at approximately 01:42 UTC on 26 March 2016. The Gaussian variation of parameters equations based on the instantaneous orbits at the predicted time of the event are solved to gain additional insight into the on-orbit position of Hitomi at the time of the event and to test an alternate approach of determining the event epoch and location. A conjunction analysis is carried out between Hitomi and all catalogued objects which were in orbit around the estimated time of the anomaly. Several debris objects have close approaches with Hitomi; however, there is no evidence that the breakup was caused by a catalogued object. Debris from both of the largest fragmentation events—the Iridium 33-Cosmos 2251 collision in 2009 and the intentional destruction of Fengyun 1C in 2007—is involved in close approaches with Hitomi, indicating the persistent threat these events pose to subsequent space missions. To quantify the magnitude of a potential conjunction, the fragmentation resulting from a collision with the debris is modelled using the EVOLVE-4 breakup model. The debris characteristics are estimated from two-line element data. This analysis is indicative of the threat to space assets that mission planners face due to the growing debris population. The impact of the actual event on the environment is investigated based on the debris associated with Hitomi which is currently contained in the United States Strategic Command's catalogue. A look at the active missions in the orbital vicinity of Hitomi reveals that the Hubble Space Telescope is among the spacecraft which may be immediately affected by the new debris.

  9. Time-quefrency analysis of overlapping similar microseismic events

    NASA Astrophysics Data System (ADS)

    Nagano, Koji

    2016-05-01

    In this paper, I describe a new technique to determine the interval between P-waves in similar, overlapping microseismic events. The similar microseismic events that occur with overlapping waveforms are called "proximate microseismic doublets" herein. Proximate microseismic doublets had been discarded in previous studies because we had not noticed their usefulness. Analysis of similar events can show relative locations of sources between them. Analysis of proximate microseismic doublets can provide more precise relative source locations because variation in the velocity structure has little influence on their relative travel times. It is necessary to measure the interval between the P-waves in the proximate microseismic doublets to determine their relative source locations. A "proximate microseismic doublet" is a pair of microseismic events in which the second event arrives before the attenuation of the first event. Cepstrum analysis can provide the interval even though the second event overlaps the first event. However, a cepstrum of a proximate microseismic doublet generally has two peaks, one representing the interval between the arrivals of the two P-waves, and the other representing the interval between the arrivals of the two S-waves. It is therefore difficult to determine the peak that represents the P-wave interval from the cepstrum alone. I used window functions in cepstrum analysis to isolate the first and second P-waves and to suppress the second S-wave. I change the length of the window function and calculate the cepstrum for each window length. The result is represented in a three-dimensional contour plot of length-quefrency-cepstrum data. The contour plot allows me to identify the cepstrum peak that represents the P-wave interval. The precise quefrency can be determined from a two-dimensional quefrency-cepstrum graph, provided that the length of the window is appropriately chosen. I have used both synthetic and field data to demonstrate that this
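
    A minimal numpy sketch of the window-length-dependent cepstrum computation described above; the taper, padding, and window choices are simplifications rather than the paper's exact processing.

```python
import numpy as np

def cepstrum(signal, nfft=4096):
    """Real cepstrum: inverse FFT of the log power spectrum (zero-padded)."""
    spectrum = np.abs(np.fft.rfft(signal, n=nfft))
    return np.fft.irfft(np.log(spectrum + 1e-12))

def length_quefrency_map(trace, dt, window_lengths, nfft=4096):
    """Cepstra of the start of the trace for a range of window lengths, as in
    the length-quefrency-cepstrum contour plot described above."""
    rows = []
    for n in window_lengths:
        segment = trace[:n] * np.hanning(n)  # taper to isolate early arrivals
        rows.append(cepstrum(segment, nfft=nfft))
    quefrency = np.arange(nfft) * dt
    return quefrency, np.array(rows)

# A peak shared across window lengths at quefrency q suggests the interval
# between the two overlapping P-wave arrivals is approximately q seconds.
```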

  10. Event-plane flow analysis without nonflow effects

    SciTech Connect

    Bilandzic, Ante; Kolk, Naomi van der; Ollitrault, Jean-Yves; Snellings, Raimond

    2011-01-15

    The event-plane method, which is widely used to analyze anisotropic flow in nucleus-nucleus collisions, is known to be biased by nonflow effects, especially at high p_t. Various methods (cumulants, Lee-Yang zeros) have been proposed to eliminate nonflow effects, but their implementation is tedious, which has limited their application so far. In this article, we show that the Lee-Yang zeros method can be recast in a form similar to the standard event-plane analysis. Nonflow correlations are strongly suppressed by using the information from the length of the flow vector, in addition to the event-plane angle. This opens the way to improved analyses of elliptic flow and azimuthally sensitive observables at the Relativistic Heavy Ion Collider and the Large Hadron Collider.

  11. Formal analysis of imprecise system requirements with Event-B.

    PubMed

    Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan

    2016-01-01

    Formal analysis of functional properties of system requirements needs precise descriptions. However, the stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. The Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for specification and verification of such requirements. First, we introduce a representation of imprecise requirements in the set theory. Then we make use of Event-B refinement providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of Crane Controller.

  12. Drug Discrimination and the Analysis of Private Events.

    PubMed

    Kangas, Brian D; Maguire, David R

    2016-11-01

    A defining feature of radical behaviorism is the explicit inclusion of private events as material phenomena within a science of behavior. Surprisingly, however, despite much theorizing, there is a notable paucity within behavior analysis of controlled experimentation and analysis of private events, especially in nonhuman animals. One technique that is amenable to the study of private events is drug discrimination. For over 40 years, drug discrimination procedures have been an incredibly effective tool providing a wealth of in vivo pharmacological information about drugs including receptor selectivity, potency, and efficacy. In addition, this procedure has provided important preclinical indications of abuse liability. However, despite its prowess as a pharmacologic tool, or perhaps because of it, empirical investigation of its parameters, procedural elements, and variants is not currently an active research domain. This review highlights the drug discrimination procedure as a powerful means to systematically investigate private events by using drugs as interoceptive stimuli. In addition to the opportunity to study privacy, empirical evaluation of the drug discrimination procedure will likely inform and improve the standard practice for future endeavors in basic and clinical pharmacology.

  13. Drug Discrimination and the Analysis of Private Events

    PubMed Central

    Kangas, Brian D.; Maguire, David R.

    2016-01-01

    A defining feature of radical behaviorism is the explicit inclusion of private events as material phenomena within a science of behavior. Surprisingly, however, despite much theorizing, there is a notable paucity within behavior analysis of controlled experimentation and analysis of private events, especially in nonhuman animals. One technique that is amenable to the study of private events is drug discrimination. For over 40 years, drug discrimination procedures have been an incredibly effective tool providing a wealth of in vivo pharmacological information about drugs including receptor selectivity, potency, and efficacy. In addition, this procedure has provided important preclinical indications of abuse liability. However, despite its prowess as a pharmacologic tool, or perhaps because of it, empirical investigation of its parameters, procedural elements, and variants is not currently an active research domain. This review highlights the drug discrimination procedure as a powerful means to systematically investigate private events by using drugs as interoceptive stimuli. In addition to the opportunity to study privacy, empirical evaluation of the drug discrimination procedure will likely inform and improve the standard practice for future endeavors in basic and clinical pharmacology. PMID:27928551

  14. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  15. An analysis of selected atmospheric icing events on test cables

    SciTech Connect

    Druez, J.; McComber, P.; Laflamme, J.

    1996-12-01

    In cold countries, the design of transmission lines and communication networks requires the knowledge of ice loads on conductors. Atmospheric icing is a stochastic phenomenon and therefore probabilistic design is used more and more for structure icing analysis. For strength and reliability assessments, a data base on atmospheric icing is needed to characterize the distributions of ice load and corresponding meteorological parameters. A test site where icing is frequent is used to obtain field data on atmospheric icing. This test site is located on the Mt. Valin, near Chicoutimi, Quebec, Canada. The experimental installation is mainly composed of various instrumented but non-energized test cables, meteorological instruments, a data acquisition system, and a video recorder. Several types of icing events can produce large ice accretions dangerous for land-based structures. They are rime due to in-cloud icing, glaze caused by freezing rain, wet snow, and mixtures of these types of ice. These icing events have very different characteristics and must be distinguished, before statistical analysis, in a data base on atmospheric icing. This is done by comparison of data from a precipitation gauge, an icing rate meter and a temperature sensor. An analysis of selected icing periods recorded on the cables of two perpendicular test lines during the 1992-1993 winter season is presented. Only significant icing events have been considered. A comparative analysis of the ice load on the four test cables is drawn from the data, and typical accretion and shedding parameters are calculated separately for icing events related to in-cloud icing and precipitation icing.

  16. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  17. Topological Analysis of Emerging Bipole Clusters Producing Violent Solar Events

    NASA Astrophysics Data System (ADS)

    Mandrini, C. H.; Schmieder, B.; Démoulin, P.; Guo, Y.; Cristiani, G. D.

    2014-06-01

    During the rising phase of Solar Cycle 24 tremendous activity occurred on the Sun with rapid and compact emergence of magnetic flux leading to bursts of flares (C to M and even X-class). We investigate the violent events occurring in the cluster of two active regions (ARs), NOAA numbers 11121 and 11123, observed in November 2010 with instruments onboard the Solar Dynamics Observatory and from Earth. Within one day the total magnetic flux increased by 70 % with the emergence of new groups of bipoles in AR 11123. From all the events on 11 November, we study, in particular, the ones starting at around 07:16 UT in GOES soft X-ray data and the brightenings preceding them. A magnetic-field topological analysis indicates the presence of null points, associated separatrices, and quasi-separatrix layers (QSLs) where magnetic reconnection is prone to occur. The presence of null points is confirmed by a linear and a non-linear force-free magnetic-field model. Their locations and general characteristics are similar in both modelling approaches, which supports their robustness. However, in order to explain the full extension of the analysed event brightenings, which are not restricted to the photospheric traces of the null separatrices, we compute the locations of QSLs. Based on this more complete topological analysis, we propose a scenario to explain the origin of a low-energy event preceding a filament eruption, which is accompanied by a two-ribbon flare, and a consecutive confined flare in AR 11123. The results of our topology computation can also explain the locations of flare ribbons in two other events, one preceding and one following the ones at 07:16 UT. Finally, this study provides further examples where flare-ribbon locations can be explained when compared to QSLs and only, partially, when using separatrices.

  18. Velocity analysis with local event slopes related probability density function

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Lu, Wenkai; Zhang, Yingqiang

    2015-12-01

    The macro velocity model plays a key role in seismic imaging and inversion. The performance of traditional velocity analysis methods is degraded by multiples and amplitude-versus-offset (AVO) anomalies. Local event slopes, containing the subsurface velocity information, have been widely used to accomplish common time-domain seismic processing, imaging and velocity estimation. In this paper, we propose a method for velocity analysis with a probability density function (PDF) related to local event slopes. We first estimate local event slopes with phase information in the Fourier domain. An adaptive filter is applied to improve the performance of the slope estimator in the low signal-to-noise ratio (SNR) situation. Second, the PDF is approximated with the histogram function, which is related to attributes derived from local event slopes. As a graphical representation of the data distribution, the histogram function can be computed efficiently. By locating the ray path of the first arrival on the semblance image under a straight-ray-segment assumption, automatic velocity picking is carried out to establish the velocity model. Unlike local event slopes based velocity estimation strategies such as averaging filters and image warping, the proposed method does not make the assumption that the errors of mapped velocity values are symmetrically distributed or that the variation of amplitude along the offset is slight. Extension of the method to prestack time-domain migration velocity estimation is also given. With synthetic and field examples, we demonstrate that our method can achieve high resolution, even in the presence of multiples, strong amplitude variations and polarity reversals.

  19. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of the LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009

  20. Advanced analysis and event reconstruction for the CTA Observatory

    NASA Astrophysics Data System (ADS)

    Becherini, Y.; Khélifi, B.; Pita, S.; Punch, M.; CTA Consortium

    2012-12-01

    The planned Cherenkov Telescope Array (CTA) is a future observatory for very-high-energy (VHE) gamma-ray astronomy composed of one site per hemisphere [1]. It aims at 10 times better sensitivity, a better angular resolution and wider energy coverage than current installations such as H.E.S.S., MAGIC and VERITAS. In order to achieve this level of performance, both the design of the telescopes and the analysis algorithms are being studied and optimized within the CTA Monte-Carlo working group. Here, we present ongoing work on the data analysis for both the event reconstruction (energy, direction) and gamma/hadron separation, carried out within the HAP (H.E.S.S. Analysis Package) software framework of the H.E.S.S. collaboration, for this initial study. The event reconstruction uses both Hillas-parameter-based algorithms and an improved version of the 3D-Model algorithm [2]. For the gamma/hadron discrimination, original and robust discriminant variables are used and treated with Boosted Decision Trees (BDTs) in the TMVA [3] (Toolkit for Multivariate Data Analysis) framework. With this advanced analysis, known as Paris-MVA [4], the sensitivity is improved by a factor of ~ 2 in the core range of CTA relative to the standard analyses. Here we present the algorithms used for the reconstruction and discrimination, together with the resulting performance characteristics, with good confidence, since the method has been successfully applied for H.E.S.S.

  1. Empirical Green's function analysis of recent moderate events in California

    USGS Publications Warehouse

    Hough, S.E.

    2001-01-01

    I use seismic data from portable digital stations and the broadband Terrascope network in southern California to investigate radiated earthquake source spectra and discuss the results in light of previous studies on both static stress drop and apparent stress. Applying the empirical Green's function (EGF) method to two sets of M 4-6.1 events, I obtain deconvolved source-spectra estimates and corner frequencies. The results are consistent with an ω^2 source model and constant Brune stress drop. However, consideration of the raw spectral shapes of the largest events provides evidence for a high-frequency decay more shallow than ω^2. The intermediate (~f^-1) slope cannot be explained plausibly with attenuation or site effects and is qualitatively consistent with a model incorporating directivity effects and a fractional stress-drop rupture process, as suggested by Haddon (1996). However, the results obtained in this study are not consistent with the model of Haddon (1996) in that the intermediate slope is not revealed with EGF analysis. This could reflect either bandwidth limitations inherent in EGF analysis or perhaps a rupture process that is not self-similar. I show that a model with an intermediate spectral decay can also reconcile the apparent discrepancy between the scaling of static stress drop and that of apparent stress drop for moderate-to-large events.

  2. Mixed-effects Poisson regression analysis of adverse event reports

    PubMed Central

    Gibbons, Robert D.; Segawa, Eisuke; Karabatsos, George; Amatya, Anup K.; Bhaumik, Dulal K.; Brown, C. Hendricks; Kapur, Kush; Marcus, Sue M.; Hur, Kwan; Mann, J. John

    2008-01-01

    A new statistical methodology is developed for the analysis of spontaneous adverse event (AE) reports from post-marketing drug surveillance data. The method involves both empirical Bayes (EB) and fully Bayes estimation of rate multipliers for each drug within a class of drugs, for a particular AE, based on a mixed-effects Poisson regression model. Both parametric and semiparametric models for the random-effect distribution are examined. The method is applied to data from Food and Drug Administration (FDA)’s Adverse Event Reporting System (AERS) on the relationship between antidepressants and suicide. We obtain point estimates and 95 per cent confidence (posterior) intervals for the rate multiplier for each drug (e.g. antidepressants), which can be used to determine whether a particular drug has an increased risk of association with a particular AE (e.g. suicide). Confidence (posterior) intervals that do not include 1.0 provide evidence for either significant protective or harmful associations of the drug and the adverse effect. We also examine EB, parametric Bayes, and semiparametric Bayes estimators of the rate multipliers and associated confidence (posterior) intervals. Results of our analysis of the FDA AERS data revealed that newer antidepressants are associated with lower rates of suicide adverse event reports compared with older antidepressants. We recommend improvements to the existing AERS system, which are likely to improve its public health value as an early warning system. PMID:18404622
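
    As a highly simplified illustration of empirical Bayes shrinkage of per-drug report-rate multipliers, the sketch below uses a gamma-Poisson model fitted by marginal maximum likelihood. It is not the mixed-effects Poisson regression model of the paper, and the counts in the usage comment are hypothetical.

```python
import numpy as np
from scipy import optimize, stats

def eb_rate_multipliers(counts, exposures):
    """counts: AE reports per drug; exposures: expected reports under the
    class-wide rate. Fits a gamma prior for the rate multipliers by marginal
    maximum likelihood and returns posterior-mean multipliers per drug."""
    counts = np.asarray(counts, dtype=float)
    exposures = np.asarray(exposures, dtype=float)

    def neg_marginal_loglik(params):
        a, b = np.exp(params)  # gamma shape and rate, kept positive
        # Gamma-Poisson marginal: y_i ~ NegBinom(a, b / (b + E_i))
        p = b / (b + exposures)
        return -np.sum(stats.nbinom.logpmf(counts, a, p))

    a, b = np.exp(optimize.minimize(neg_marginal_loglik, [0.0, 0.0]).x)
    # Conjugacy: multiplier_i | y_i ~ Gamma(a + y_i, b + E_i); return its mean.
    return (a + counts) / (b + exposures)

# Hypothetical usage:
# print(eb_rate_multipliers(counts=[12, 3, 30], exposures=[10.0, 8.0, 12.0]))
```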

  3. A Dendrochronological Analysis of Mississippi River Flood Events

    NASA Astrophysics Data System (ADS)

    Therrell, M. D.; Bialecki, M. B.; Peters, C.

    2012-12-01

    We used a novel tree-ring record of anatomically anomalous "flood rings" preserved in oak (Quercus sp.) trees growing downstream of the Mississippi and Ohio River confluence to identify spring (MAM) flood events on the lower Mississippi River from C.E. 1694-2009. Our chronology includes virtually all of the observed high-magnitude spring floods of the 20th century as well as similar flood events in prior centuries occurring on the Mississippi River adjacent to the Birds Point-New Madrid Floodway. A response index analysis indicates that over half of the floods identified caused anatomical injury to well over 50% of the sampled trees, and many of the greatest flood events are recorded by more than 80% of the trees at the site, including 100% of the trees in the great flood of 1927. Twenty-five of the 40 floods identified as flood rings in the tree-ring record occur during the instrumental observation period at New Madrid, Missouri (1879-2009), and comparison of the response index with average daily river stage height values indicates that the flood ring record can explain significant portions of the variance in both stage height (30%) and number of days in flood (40%) during spring flood events. The flood ring record also suggests that high-magnitude spring flooding is episodic and linked to basin-scale pluvial events driven by decadal-scale variability of the Pacific/North American pattern (PNA). This relationship suggests that the tree-ring record of flooding may also be used as a proxy record of atmospheric variability related to the PNA and related large-scale forcing.

  4. Precipitation analysis in the framework of flash flood events

    NASA Astrophysics Data System (ADS)

    Llasat, M. C.; Atencia, A.; Llasat-Botija, M.; Rigo, T.

    2009-04-01

    An analysis of all precipitation structures associated with flash flood events in Catalonia since 1996 has been carried out. For the most recent events, composite radar data from three C-band radars, with 6-minute temporal and 2x2 km spatial resolution, provided by the Catalan National Meteorological Service (METEOCAT), have been used, while historical information from the AEMET radar located in Barcelona has also been used. These data have been compared with 5-minute surface rainfall data from the SAIH and XEMA networks of the Generalitat of Catalonia. The different kinds of structures, distinguishing between the convective and stratiform parts, different Z-R calibrations, as well as the life cycle of the cells, have been considered and modelled. Results are being applied in the framework of the FLASH project.

  5. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    Dana L. Kelly; Dale M. Rasmuson

    2008-09-01

    This paper describes the approach taken by the U. S. Nuclear Regulatory Commission to the treatment of common-cause failure in probabilistic risk assessment of operational events. The approach is based upon the Basic Parameter Model for common-cause failure, and examples are illustrated using the alpha-factor parameterization, the approach adopted by the NRC in their Standardized Plant Analysis Risk (SPAR) models. The cases of a failed component (with and without shared common-cause failure potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g., failure to start and failure to run) is a new feature of this paper. These methods are being applied by the NRC in assessing the risk significance of operational events for the Significance Determination Process (SDP) and the Accident Sequence Precursor (ASP) program.
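
    A minimal sketch of the alpha-factor common-cause failure quantification from the Basic Parameter Model (non-staggered testing form) for a group of m redundant components; the alpha factors and total failure probability used below are hypothetical illustration values, not NRC SPAR data.

    from math import comb

    def alpha_factor_ccf(alphas, q_total):
        """Basic-event probabilities Q_k for a common-cause group of size m
        under the alpha-factor parameterization (non-staggered testing).

        alphas  : [alpha_1, ..., alpha_m], fractions of failure events involving
                  exactly k components (should sum to ~1.0)
        q_total : total failure probability of a single component
        """
        m = len(alphas)
        alpha_t = sum(k * a for k, a in enumerate(alphas, start=1))
        q = {}
        for k, a_k in enumerate(alphas, start=1):
            q[k] = k / comb(m - 1, k - 1) * (a_k / alpha_t) * q_total
        return q

    # Example: hypothetical 3-component group
    print(alpha_factor_ccf([0.95, 0.04, 0.01], q_total=1e-3))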

  6. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of optical flow orientation descriptor and a classification method. The histogram of optical flow orientation descriptor is described in detail; it captures the movement information of the global video frame or of the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
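
    A minimal, hedged sketch of the descriptor-plus-classifier idea: a magnitude-weighted histogram of optical-flow orientations fed through kernel PCA into a one-class SVM. The flow fields are synthetic, and chaining KPCA into the OC-SVM is one possible combination, not necessarily the authors' exact pipeline.

    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.svm import OneClassSVM

    def hof_descriptor(flow, n_bins=16):
        """Histogram of optical-flow orientations, weighted by magnitude.
        flow: (H, W, 2) array of per-pixel (dx, dy) displacements."""
        dx, dy = flow[..., 0].ravel(), flow[..., 1].ravel()
        mag = np.hypot(dx, dy)
        ang = np.arctan2(dy, dx)  # in [-pi, pi]
        hist, _ = np.histogram(ang, bins=n_bins, range=(-np.pi, np.pi), weights=mag)
        return hist / (hist.sum() + 1e-9)

    # Synthetic "normal" frames: coherent rightward motion; "abnormal": random motion
    rng = np.random.default_rng(0)
    normal = [np.stack([np.ones((32, 32)) + 0.1 * rng.standard_normal((32, 32)),
                        0.1 * rng.standard_normal((32, 32))], axis=-1) for _ in range(200)]
    abnormal = [rng.standard_normal((32, 32, 2)) for _ in range(20)]

    X_train = np.array([hof_descriptor(f) for f in normal[:150]])
    X_test = np.array([hof_descriptor(f) for f in normal[150:] + abnormal])

    kpca = KernelPCA(n_components=8, kernel="rbf").fit(X_train)
    ocsvm = OneClassSVM(nu=0.05, kernel="rbf").fit(kpca.transform(X_train))
    print(ocsvm.predict(kpca.transform(X_test)))  # +1 = normal, -1 = abnormal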

  7. Event-scale power law recession analysis: quantifying methodological uncertainty

    NASA Astrophysics Data System (ADS)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship
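
    A minimal sketch of fitting the power-law recession model -dQ/dt = aQ^b to a single event by log-log regression. The differencing and event-definition choices made here are exactly the kind of methodological decisions whose sensitivity the paper quantifies; the data are synthetic.

    import numpy as np

    def fit_recession(q, dt=1.0):
        """Fit -dQ/dt = a * Q^b to one recession event by ordinary least squares
        in log-log space. q: streamflow series sampled every dt (decreasing)."""
        q = np.asarray(q, dtype=float)
        dqdt = np.diff(q) / dt              # negative during recession
        qmid = 0.5 * (q[1:] + q[:-1])       # midpoint discharge
        mask = dqdt < 0
        b, log_a = np.polyfit(np.log(qmid[mask]), np.log(-dqdt[mask]), 1)
        return np.exp(log_a), b

    # Synthetic event generated from the exact power-law solution with b = 1.5
    t = np.arange(0, 20.0, 1.0)
    q0, a_true, b_true = 10.0, 0.05, 1.5
    q = (q0**(1 - b_true) + (b_true - 1) * a_true * t)**(1 / (1 - b_true))
    print(fit_recession(q))  # approximately (0.05, 1.5)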

  8. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting when it comes to providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
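
    A minimal sketch of a discrete event simulation of service requests contending for a fixed pool of virtual servers, using the simpy library. The service types, rates and capacity are hypothetical, and this queueing toy is only an illustration of the approach, not the authors' two-part model.

    import random
    import simpy

    SERVICE_MIX = {            # hypothetical request types: (mean service time, arrival weight)
        "web":   (0.5, 0.7),
        "batch": (4.0, 0.2),
        "db":    (1.0, 0.1),
    }

    def request_generator(env, servers, latencies, mean_interarrival=0.3):
        while True:
            yield env.timeout(random.expovariate(1.0 / mean_interarrival))
            kind = random.choices(list(SERVICE_MIX),
                                  weights=[w for _, w in SERVICE_MIX.values()])[0]
            env.process(handle_request(env, servers, latencies, kind))

    def handle_request(env, servers, latencies, kind):
        arrival = env.now
        with servers.request() as req:      # queue for a virtual server slot
            yield req
            yield env.timeout(random.expovariate(1.0 / SERVICE_MIX[kind][0]))
        latencies.append(env.now - arrival)

    random.seed(1)
    env = simpy.Environment()
    servers = simpy.Resource(env, capacity=8)   # provisioned capacity under study
    latencies = []
    env.process(request_generator(env, servers, latencies))
    env.run(until=1000)
    print(f"mean end-to-end latency: {sum(latencies) / len(latencies):.2f}")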

  9. Computational particle physics for event generators and data analysis

    NASA Astrophysics Data System (ADS)

    Perret-Gallix, Denis

    2013-08-01

    High-energy physics data analysis relies heavily on the comparison between experimental and simulated data as stressed lately by the Higgs search at LHC and the recent identification of a Higgs-like new boson. The first link in the full simulation chain is the event generation both for background and for expected signals. Nowadays event generators are based on the automatic computation of matrix element or amplitude for each process of interest. Moreover, recent analysis techniques based on the matrix element likelihood method assign probabilities for every event to belong to any of a given set of possible processes. This method originally used for the top mass measurement, although computing intensive, has shown its efficiency at LHC to extract the new boson signal from the background. Serving both needs, the automatic calculation of matrix element is therefore more than ever of prime importance for particle physics. Initiated in the 80's, the techniques have matured for the lowest order calculations (tree-level), but become complex and CPU time consuming when higher order calculations involving loop diagrams are necessary like for QCD processes at LHC. New calculation techniques for next-to-leading order (NLO) have surfaced making possible the generation of processes with many final state particles (up to 6). If NLO calculations are in many cases under control, although not yet fully automatic, even higher precision calculations involving processes at 2-loops or more remain a big challenge. After a short introduction to particle physics and to the related theoretical framework, we will review some of the computing techniques that have been developed to make these calculations automatic. The main available packages and some of the most important applications for simulation and data analysis, in particular at LHC will also be summarized (see CCP2012 slides [1]).

  10. Analysis of a new extreme precipitation event in Reykjavik

    NASA Astrophysics Data System (ADS)

    Ólafsson, Haraldur; Ágústsson, Hálfdán

    2013-04-01

    On 28-29 December 2012 a new precipitation record of 70.4 mm in 24 hours was set in Reykjavik, Iceland. This extreme event is explored by means of observations and by numerical simulations with different models and different initialization times. Several key factors in creating the precipitation extreme are identified: a) a slowly moving upper-level low with high values of vorticity and vorticity advection; b) a south-to-north low-level temperature gradient set up by cold advection in the wake of a surface low and warm advection in easterly flow over Iceland, enhanced by the topography (foehn). This temperature gradient leads to strong vertical wind shear, with very weak winds at the surface but up to 40 m/s from the SE in the upper troposphere. As there are no strong winds at low levels, there is hardly any precipitation shadow in Reykjavik, downstream of the Reykjanes mountains. In terms of considerable, but not extreme, precipitation, the event was in general reasonably well forecast 24 to 48 hours ahead. The above analysis leads to a method to identify extreme precipitation of this kind in large-scale models. The method will be used to investigate the frequency of similar events in future climate scenarios.

  11. Meta-Analysis of Rare Binary Adverse Event Data

    PubMed Central

    Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.

    2013-01-01

    We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for analysis of binary adverse event data. Special attention is paid to the case of rare adverse events which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of the heterogeneity of treatment effect across studies. We derive three new methods, simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrapping test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that in general, moment-based estimators of combined treatment effects and heterogeneity are biased and the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
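
    A minimal sketch of a generic moment-based random-effects (DerSimonian-Laird) meta-analysis of log odds ratios with a continuity correction for zero cells. This is the traditional approach whose small-sample bias for rare events the paper examines, not the authors' new estimators; the trial counts are hypothetical.

    import numpy as np

    def dl_meta_analysis(events_t, n_t, events_c, n_c, cc=0.5):
        """Moment-based random-effects pooling of log odds ratios (DL method)."""
        a = np.asarray(events_t, float) + cc
        b = np.asarray(n_t, float) - np.asarray(events_t, float) + cc
        c = np.asarray(events_c, float) + cc
        d = np.asarray(n_c, float) - np.asarray(events_c, float) + cc
        y = np.log(a * d / (b * c))                # per-study log odds ratio
        v = 1 / a + 1 / b + 1 / c + 1 / d          # within-study variances
        w = 1 / v
        mu_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - mu_fixed) ** 2)        # Cochran's Q statistic
        tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1 / (v + tau2)
        mu_re = np.sum(w_star * y) / np.sum(w_star)
        return mu_re, tau2, q

    # Hypothetical rare-event trials: (treatment events, treatment N, control events, control N)
    print(dl_meta_analysis([1, 0, 2, 1], [100, 120, 150, 90],
                           [3, 1, 2, 4], [100, 115, 140, 95]))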

  12. Bisphosphonates and Risk of Cardiovascular Events: A Meta-Analysis

    PubMed Central

    Kim, Dae Hyun; Rogers, James R.; Fulchino, Lisa A.; Kim, Caroline A.; Solomon, Daniel H.; Kim, Seoyoung C.

    2015-01-01

    Background and Objectives Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. Methods A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials with longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Results Absolute risks over 25–36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84–1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92–1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69–1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82–1.19]; I2 = 5.8%), and CV death (14 trials; 0.88 [0.72–1.07]; I2 = 0.0%) with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96–1.61]; I2 = 0.0%), not for oral bisphosphonates (26 trials; 1.02 [0.83–1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Conclusions Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large

  13. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    SciTech Connect

    Lambert, H; Johnson, G

    2001-07-20

    The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in Table 4-2 of the HEAF SAR (Ref. 1). Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree. Bold downward branches indicate paths leading to the accident. The basic causes include conditions, failures of administrative controls (procedural or human error events) or failures of engineered controls (hardware, software or equipment failure) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events needed to cause an accident. Event trees can address statistical dependency of events, such as a sequence of human error events performed by the same operator; in this case, dependent probabilities are used. Probabilities or frequencies are assigned to each branch. Another example of dependency would be when the same software is used to perform separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
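
    A toy illustration of the final step described above: the accident end-state frequency is the product of the operation frequency (first branch) and the probabilities of the branches along the path leading to the accident. All numbers are hypothetical.

    # Hypothetical event tree path: initiating frequency times branch probabilities
    init_freq = 50                       # operations per year (first branch)
    branches = {
        "admin_control_fails": 1e-2,         # procedural / human error
        "engineered_control_fails": 1e-3,    # hardware or software failure
        "enabling_condition_present": 0.1,   # fraction of time the condition exists
    }

    end_state_freq = init_freq
    for name, p in branches.items():
        end_state_freq *= p
    print(f"accident end-state frequency: {end_state_freq:.1e} per year")  # 5.0e-05 /yr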

  14. An analysis of three nuclear events in P-Tunnel

    SciTech Connect

    Fourney, W.L.; Dick, R.D.; Taylor, S.R.; Weaver, T.A.

    1994-05-03

    This report examines experimental results obtained from three P Tunnel events -- Mission Cyber, Disko Elm, and Distant Zenith. The objective of the study was to determine if there were any differences in the explosive source coupling for the three events. It was felt that Mission Cyber might not have coupled well because the ground motions recorded for that event were much lower than expected based on experience from N Tunnel. Detailed examination of the physical and chemical properties of the tuff in the vicinity of each explosion indicated only minor differences. In general, the core samples are strong and competent out to at least 60 m from each working point. Qualitative measures of core sample strength indicate that the strength of the tuff near Mission Cyber may be greater than indicated by results of static testing. Slight differences in mineralogic content and saturation of the Mission Cyber tuff were noted relative to the other two tests, but probably would not result in large differences in ground motions. Examination of scaled free-field stress and acceleration records collected by Sandia National Laboratory (SNL) indicated that Disko Elm showed the least scatter and Distant Zenith the most scatter. Mission Cyber measurements tend to lie slightly below those of Distant Zenith, but still within two standard deviations. Analysis of regional seismic data from networks operated by Lawrence Livermore National Laboratory (LLNL) and SNL also show no evidence of Mission Cyber coupling low relative to the other two events. The overall conclusion drawn from the study is that there were no basic differences in the way that the explosions coupled to the rock.

  15. Multivariate cluster analysis of forest fire events in Portugal

    NASA Astrophysics Data System (ADS)

    Tonini, Marj; Pereira, Mario; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

    Portugal is one of the major fire-prone European countries, mainly due to its favourable climatic, topographic and vegetation conditions. Compared to the other Mediterranean countries, the number of events registered here from 1980 up to now is the highest; likewise, with respect to burnt area, Portugal is the third most affected country. Portuguese mapped burnt areas are available from the website of the Institute for the Conservation of Nature and Forests (ICNF). This official geodatabase is the result of satellite measurements starting from the year 1990. The spatial information, delivered in shapefile format, provides a detailed description of the shape and the size of the area burnt by each fire, while the date/time information related to the fire ignition is restricted to the year of occurrence. In statistical terms, wildfires can be treated as a stochastic point process, where events are analysed as a set of geographical coordinates corresponding, for example, to the centroid of each burnt area. Analysis of the spatio-temporal pattern of stochastic point processes, including cluster analysis, is a basic procedure for discovering predisposing factors as well as for prevention and forecasting purposes. These kinds of studies are primarily focused on investigating the spatial cluster behaviour of environmental data sequences and/or mapping their distribution at different times. To include both dimensions (space and time), a comprehensive spatio-temporal analysis is needed. In the present study the authors attempt to verify whether, in the case of wildfires in Portugal, space and time act independently or whether, conversely, neighbouring events are also closer in time. We present an application of the spatio-temporal K-function to a long dataset (1990-2012) of mapped burnt areas. Moreover, the multivariate K-function allowed us to check for a possible difference in distribution between small and large fires. The final objective is to elaborate a 3D

  16. Human Reliability Analysis for Small Modular Reactors

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2012-06-01

    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first- generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second- generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized to SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provided preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  17. Analytic Perturbation Analysis of Discrete Event Dynamic Systems

    SciTech Connect

    Uryasev, S.

    1994-09-01

    This paper considers a new Analytic Perturbation Analysis (APA) approach for Discrete Event Dynamic Systems (DEDS) with discontinuous sample-path functions with respect to control parameters. The performance functions for DEDS usually are formulated as mathematical expectations, which can be calculated only numerically. APA is based on new analytic formulas for the gradients of expectations of indicator functions; therefore, it is called an analytic perturbation analysis. The gradient of performance function may not coincide with the expectation of a gradient of sample-path function (i.e., the interchange formula for the gradient and expectation sign may not be valid). Estimates of gradients can be obtained with one simulation run of the models.
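
    A small numerical illustration, assuming X ~ N(0,1), of why the interchange of gradient and expectation can fail for indicator sample-path functions, which is the situation APA addresses; this is only a motivating example, not the paper's estimator.

    import numpy as np

    # Performance function J(theta) = E[ 1{X <= theta} ] = F_X(theta) for X ~ N(0,1).
    # Its true gradient is the density f_X(theta), but the pathwise gradient of the
    # indicator is zero almost surely, so "expectation of the gradient" estimates 0.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(1_000_000)
    theta, h = 0.5, 1e-3

    finite_diff = (np.mean(x <= theta + h) - np.mean(x <= theta - h)) / (2 * h)
    pathwise = 0.0  # d/dtheta 1{x <= theta} = 0 for almost every fixed sample x
    density = np.exp(-theta**2 / 2) / np.sqrt(2 * np.pi)

    print(finite_diff, pathwise, density)  # finite_diff ~ density ~ 0.352, pathwise = 0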

  18. Some approaches to the analysis of recurrent event data.

    PubMed

    Clayton, D

    1994-01-01

    Methodological research in biostatistics has been dominated over the last twenty years by further development of Cox's regression model for life tables and of Nelder and Wedderburn's formulation of generalized linear models. In both of these areas the need to address the problems introduced by subject level heterogeneity has provided a major motivation, and the analysis of data concerning recurrent events has been widely discussed within both frameworks. This paper reviews this work, drawing together the parallel development of 'marginal' and 'conditional' approaches in survival analysis and in generalized linear models. Frailty models are shown to be a special case of a random effects generalization of generalized linear models, whereas marginal models for multivariate failure time data are more closely related to the generalized estimating equation approach to longitudinal generalized linear models. Computational methods for inference are discussed, including the Bayesian Markov chain Monte Carlo approach.

  19. Hydrological analysis of flash flood events in Slovakia

    NASA Astrophysics Data System (ADS)

    Horvát, Oliver; Hlavcová, Kamila; Kohnová, Silvia; Borga, Marco; Szolgay, Ján.

    2010-05-01

    The paper concentrates on an analysis of three major flash floods in Slovakia, which occurred during recent years and caused great damage to property and also loss of lives. The flash floods selected occurred on the 20th of July, 1998, in the Malá Svinka and Dubovický creek basins; the 24th of July, 2001, at Štrbský Creek; and the 19th of June, 2004, at the Turniansky Creek. A description of the basins along with the selected flash floods is set out, and the results of the post-survey reconstruction of the flash flood events are described. To understand rainfall-runoff processes during these extreme flash floods and to test uncertainty of post-survey analyses, runoff responses during selected major events were examined using the KLEM (Kinematic Local Excess Model) spatially-distributed hydrological model. The distributed hydrological model is based on the availability of raster information of the landscape's topography, the soil and vegetation properties, and radar rainfall data. In the model, the SCS-Curve Number procedure is applied on a grid for the spatially-distributed representation of runoff-generating processes. A description of the drainage system response is used for representing the runoff's routing. The simulated values achieved by the KLEM model were compared with the maximum peaks estimated on the basis of post-event surveying and the results achieved are summarized and discussed. The consistency of the estimated and simulated values by the KLEM model was evident both in time and space, and the methodology has shown its applicability for practical purposes.
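
    A minimal sketch of the SCS Curve Number direct-runoff computation applied per grid cell, of the kind used in KLEM-style distributed models; the curve numbers and storm depth below are hypothetical.

    def scs_runoff_depth(p_mm, cn, lam=0.2):
        """SCS Curve Number direct runoff depth (mm) for event rainfall p_mm.
        S is the potential maximum retention; Ia = lam * S is the initial abstraction."""
        s = 25400.0 / cn - 254.0
        ia = lam * s
        if p_mm <= ia:
            return 0.0
        return (p_mm - ia) ** 2 / (p_mm - ia + s)

    # Hypothetical grid cells with different land-use/soil curve numbers, 60 mm storm
    for cn in (55, 70, 85):
        print(cn, round(scs_runoff_depth(60.0, cn), 1), "mm")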

  20. Systematic Analysis of Adverse Event Reports for Sex Differences in Adverse Drug Events

    PubMed Central

    Yu, Yue; Chen, Jun; Li, Dingcheng; Wang, Liwei; Wang, Wei; Liu, Hongfang

    2016-01-01

    Increasing evidence has shown that sex differences exist in Adverse Drug Events (ADEs). Identifying these sex differences in ADEs could reduce the experience of ADEs for patients and could be conducive to the development of personalized medicine. In this study, we analyzed a normalized US Food and Drug Administration Adverse Event Reporting System (FAERS). The chi-squared test was conducted to discover which treatment regimens or drugs show sex differences in adverse events. Moreover, the reporting odds ratio (ROR) and P value were calculated to quantify the signals of sex differences for specific drug-event combinations. Logistic regression was applied to remove the confounding effect of the baseline sex difference of the events. We found that, among 668 drugs of the 20 most frequent treatment regimens in the United States, 307 drugs show sex differences in ADEs. In addition, we identified 736 unique drug-event combinations with significant sex differences. After removing the confounding effect of the baseline sex difference of the events, 266 combinations remained. Drug labels or previous studies verified some of them, while others warrant further investigation. PMID:27102014
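
    A minimal sketch of the reporting odds ratio and its 95% confidence interval for one drug-event combination split by sex; the counts are hypothetical.

    import numpy as np
    from scipy.stats import norm

    def reporting_odds_ratio(a, b, c, d):
        """ROR and 95% CI from a 2x2 table of report counts, e.g.
            a: drug & event, females     b: drug & other events, females
            c: drug & event, males       d: drug & other events, males"""
        ror = (a / b) / (c / d)
        se_log = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        z = norm.ppf(0.975)
        lo = np.exp(np.log(ror) - z * se_log)
        hi = np.exp(np.log(ror) + z * se_log)
        return ror, (lo, hi)

    print(reporting_odds_ratio(120, 880, 60, 940))  # hypothetical counts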

  1. A Descriptive Analysis of Prehospital Response to Hazardous Materials Events.

    PubMed

    Martin, Ashley J; Lohse, Christine M; Sztajnkrycer, Matthew D

    2015-10-01

    Little is known about the overall frequency of hazardous materials (HazMat) events in the United States and the nature of prehospital care for those exposed. The purpose of the current study was to perform a descriptive analysis of Emergency Medical Services (EMS) activations reported to a national EMS database. Analysis of the 2012 National EMS Information System (NEMSIS) Public Release Research Data Set v.2.2.1, containing EMS emergency response data submitted by 41 states, was conducted. Mandatory data elements E0207 (Type of Response Delay), E0208 (Type of Scene Delay), and E0209 (Type of Transport Delay) contained specific codes for HazMat events and were used to identify specific EMS activation records for subsequent analysis. Overlapping data elements were identified and combined in order to prevent duplicate entries. Descriptive analyses were generated from the NEMSIS Research Data Set. A total of 17,479,328 EMS activations were reported, of which 2,527 unique activations involved HazMat response. Mass-casualty incident was coded for 5.6% of activations. The most common level of prehospital care present on scene was Basic Life Support (BLS; 51.1%); 2.1% required aggressive Advanced Life Support (ALS) response. The most common locations for HazMat activations were homes (36.2%), streets or highways (26.3%), and health care facilities (11.6%). The primary symptoms observed by EMS personnel were pain (29.6%), breathing problems (12.2%), and change in responsiveness (9.6%). Two percent of HazMat activations involved cardiac arrest, with 21.7% occurring after EMS arrival. Delays in patient care included response delay, scene delay, and transport delay. Hazardous materials events are rare causes of EMS activation in the United States. The majority occur in non-industrial venues and involve two or fewer patients. Scene time frequently is delayed due to multiple barriers. Cardiac arrest is rare but occurred after EMS arrival in one-fifth of patients.

  2. Analysis of a temperature inversion event in the lower mesosphere

    NASA Astrophysics Data System (ADS)

    Liu, Han-Li; Meriwether, John W.

    2004-01-01

    Rayleigh lidar measurements of stratospheric and lower mesospheric temperatures obtained at the Urbana Atmospheric Observatory (40.1°N, 88.1°W) on 17/18 November 1997 revealed large temperature inversions at altitudes between 55 to 65 km. Prior to and during a large increase (by up to about 50K) in the amplitude of the mesosphere inversion layer (MIL), a clear and persistent vertical wave structure between 30 and 65 km was observed. The wave has a vertical wavelength of about 12 km and an apparent period of about 12 hours. However, the intrinsic characteristics of the wave are uncertain due to the lack of information regarding the background wind profile and the relative direction of the wave propagation vector with respect to the background wind vector. Two different cases, corresponding to small and large background wind speeds projected onto the horizontal direction of wave propagation, are studied numerically to represent two scenarios with different intrinsic wave characteristics devised to explain the development of the MIL event observed. When the projected background wind is small, the wave is likely to be an inertial-gravity wave. It is shown that the breaking of such a wave does not produce the large heating rate observed. However, the numerical modeling shows that such an inertial-gravity wave can modulate the stability of a separate internal gravity wave, and the breaking of this internal gravity wave produces a heating rate similar to the observed rate. When the projected background wind is large, however, the observed wave could be an internal gravity wave with a large intrinsic phase speed. The analysis shows that the breaking of this wave can generate a large heating rate and a MIL that is similar to the observed event. We close with a discussion of the observational implications of these two scenarios. Possible wave sources are also discussed, and it appears that the observed MIL event might be related to a developing frontal system in the

  3. Collective analysis of ORPS-reportable electrical events (June, 2005-August 2009)

    SciTech Connect

    Henins, Rita J; Hakonson - Hayes, Audrey C

    2010-01-01

    The analysis of LANL electrical events between June 30, 2005 and August 31, 2009 provides data that indicate some potential trends regarding ISM failure modes, activity types associated with reportable electrical events, and ORPS causal codes. This report discusses the identified potential trends for Shock events and compares attributes of the Shock events against Other Electrical events and overall ORPS-reportable events during the same time frame.

  4. Whole-Genome Analysis of Gene Conversion Events

    NASA Astrophysics Data System (ADS)

    Hsu, Chih-Hao; Zhang, Yu; Hardison, Ross; Miller, Webb

    Gene conversion events are often overlooked in analyses of genome evolution. In a conversion event, an interval of DNA sequence (not necessarily containing a gene) overwrites a highly similar sequence. The event creates relationships among genomic intervals that can confound attempts to identify orthologs and to transfer functional annotation between genomes. Here we examine 1,112,202 paralogous pairs of human genomic intervals, and detect conversion events in about 13.5% of them. Properties of the putative gene conversions are analyzed, such as the lengths of the paralogous pairs and the spacing between their sources and targets. Our approach is illustrated using conversion events in the beta-globin gene cluster.

  5. Cluster analysis of intermediate deep events in the southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2015-04-01

    The Hellenic subduction zone (HSZ) is the seismically most active region in Europe, where the oceanic African lithosphere is subducting beneath the continental Aegean plate. Although there are numerous studies of seismicity in the HSZ, very few focus on the eastern HSZ and the Wadati-Benioff zone of the subducting slab in that part of the HSZ. In order to gain a better understanding of the geodynamic processes in the region, a dense local seismic network is required. From September 2005 to March 2007, the temporary seismic network EGELADOS was deployed covering the entire HSZ. It consisted of 56 onshore and 23 offshore broadband stations, with the addition of 19 stations from GEOFON, NOA and MedNet to complete the network. Here, we focus on a cluster of intermediate-depth seismicity recorded by the EGELADOS network within the subducting African slab in the region of the Nysiros volcano. The cluster consists of 159 events at 80 to 190 km depth with magnitudes between 0.2 and 4.1 that were located using the nonlinear location tool NonLinLoc. A double-difference earthquake relocation using the HypoDD software is performed with both manual readings of onset times and differential travel times obtained by separate cross-correlation of P- and S-waveforms. Single-event locations are compared to relative relocations. The event hypocenters fall into a thin zone close to the top of the slab, defining its geometry with an accuracy of a few kilometers. At intermediate depth the slab is dipping towards the NW at an angle of about 30°, which means it is dipping more steeply than in the western part of the HSZ. The edge of the slab is clearly defined by an abrupt disappearance of intermediate-depth seismicity towards the NE. It is found approximately beneath the Turkish coastline. Furthermore, results of a cluster analysis based on the cross-correlation of three-component waveforms are shown as a function of frequency, and the spatio-temporal migration of the seismic activity is analysed.
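
    A minimal sketch of estimating a differential travel time from the cross-correlation of two waveforms, of the kind used to build differential-time input for HypoDD; the waveforms below are synthetic pulses.

    import numpy as np
    from scipy.signal import correlate, correlation_lags

    def differential_time(w1, w2, dt):
        """Relative delay (s) between two equal-length waveforms, estimated from
        the peak of their cross-correlation."""
        cc = correlate(w1 - w1.mean(), w2 - w2.mean(), mode="full")
        lags = correlation_lags(len(w1), len(w2), mode="full")
        return lags[np.argmax(cc)] * dt

    # Synthetic test: second waveform delayed by 0.15 s at 100 Hz sampling
    dt = 0.01
    t = np.arange(0, 5, dt)
    pulse = np.exp(-((t - 2.00) / 0.1) ** 2)
    delayed = np.exp(-((t - 2.15) / 0.1) ** 2)
    print(differential_time(delayed, pulse, dt))  # ~ +0.15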

  6. The Tunguska event and Cheko lake origin: dendrochronological analysis

    NASA Astrophysics Data System (ADS)

    Rosanna, Fantucci; Romano, Serra; Gunther, Kletetschka; Mario, Di Martino

    2015-07-01

    Dendrochronological research was carried out on 23 tree samples (Larix sibirica and Picea obovata) collected during the 1999 expedition at two locations, close to the epicentre zone and near Cheko lake (N 60°57', E 101°51'). Basal Area Increment (BAI) analysis has shown a general long growth suppression before 1908, the year of the Tunguska event (TE), followed by a sudden growth increase due to the diminished competition from trees that died in the event. In one group of trees, we detected a growth decrease for several years (due to damage to the trunk, branches and crown), followed by a growth increase during the following 4-14 years. We show that trees that germinated after the TE and grew in close proximity to Cheko lake (Cheko lake trees) had different behaviour patterns compared to trees growing further from Cheko lake, inside the forest (Forest trees). Cheko lake trees have shown a vigorous, continuous growth increase. Forest trees have shown vigorous growth during the first 10-30 years of age, followed by a period of suppressed growth. We interpret the suppressed growth as re-established competition with the surrounding trees. The Cheko lake pattern, however, is consistent with the formation of the lake at the time of the TE. This observation supports the hypothesis that Cheko lake formation is due to a fragment originating during the TE, creating a small impact crater in the permafrost and soft alluvial deposits of the Kimku River plain. This is further supported by the fact that Cheko lake has an elliptical shape elongated towards the epicentre of the TE.

  7. Chain of events analysis for a scuba diving fatality.

    PubMed

    Lippmann, John; Stevenson, Christopher; McD Taylor, David; Williams, Jo; Mohebbi, Mohammadreza

    2017-09-01

    A scuba diving fatality usually involves a series of related events culminating in death. Several studies have utilised a chain of events-type analysis (CEA) to isolate and better understand the accident sequence in order to facilitate the creation of relevant countermeasures. The aim of this research was to further develop and better define a process for performing a CEA to reduce potential subjectivity and increase consistency between analysts. To develop more comprehensive and better-defined criteria, existing criteria were modified and a template was created and tested using a CEA. Modifications comprised addition of a category for pre-disposing factors, expansion of criteria for the triggers and disabling agents present during the incident, and more specific inclusion criteria to better encompass a dataset of 56 fatalities. Four investigators (raters) used both the previous criteria and this template, in randomly assigned order, to examine a sample of 13 scuba diver deaths. Individual results were scored against the group consensus for the CEA. Raters' agreement consistency was compared using the Index of Concordance and intra-class correlation coefficients (ICC). The template is presented. The index of concordance between the raters increased from 62% (194⁄312) using the previous criteria to 82% (257⁄312) with use of this template indicating a substantially higher inter-rater agreement when allocating criteria. The agreement in scoring with and without template use was also quantified by ICC which were generally graded as low, illustrating a substantial change in consistency of scoring before and after template use. The template for a CEA for a scuba diving fatality improves consistency of interpretation between users and may improve comparability of diving fatality reports.

  8. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    SciTech Connect

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis and correctly partitions the error: it includes model error as a component of variance and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
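
    Minimal sketches of Fisher's and Tippett's combinations of single-phenomenology p-values into a joint screening test; the p-values below are hypothetical placeholders for, e.g., Ms:mb and depth tests.

    import numpy as np
    from scipy.stats import chi2

    def fisher_combined(p_values):
        """Fisher's method: X^2 = -2 * sum(ln p_i) ~ chi-square with 2k df under H0."""
        p = np.asarray(p_values, dtype=float)
        stat = -2.0 * np.log(p).sum()
        return stat, chi2.sf(stat, df=2 * len(p))

    def tippett_combined(p_values):
        """Tippett's method: combined p-value based on the smallest single-test p-value."""
        pmin, k = float(np.min(p_values)), len(p_values)
        return pmin, 1.0 - (1.0 - pmin) ** k

    print(fisher_combined([0.03, 0.20]))   # (combined statistic, combined p-value)
    print(tippett_combined([0.03, 0.20]))  # (min p, combined p-value)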

  9. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    PubMed

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and length of free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication related patient safety events and the models were compared. Well performing NLP models were generated to categorize medication related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
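
    A minimal, hedged sketch of one way to categorize free-text event reports: a TF-IDF bag-of-words with logistic regression evaluated by ROC AUC. This is a generic baseline, not the authors' NLP models, and the example texts and labels are made up.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    # Hypothetical free-text briefs labeled 1 = pharmacy delivery delay, 0 = other
    texts = [
        "medication not delivered from pharmacy for over two hours",
        "pyxis count did not match expected quantity",
        "patient received dose late, pharmacy delay reported",
        "prescriber entered wrong dose in order",
    ] * 25
    labels = [1, 0, 1, 0] * 25

    X_train, X_test, y_train, y_test = train_test_split(
        texts, labels, test_size=0.3, random_state=0, stratify=labels)

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression(max_iter=1000))
    model.fit(X_train, y_train)
    scores = model.predict_proba(X_test)[:, 1]
    print("ROC AUC:", roc_auc_score(y_test, scores))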

  10. Nurses' critical event risk assessments: a judgement analysis.

    PubMed

    Thompson, Carl; Bucknall, Tracey; Estabrookes, Carole A; Hutchinson, Alison; Fraser, Kim; de Vos, Rien; Binnecade, Jan; Barrat, Gez; Saunders, Jane

    2009-02-01

    To explore and explain nurses' use of readily available clinical information when deciding whether a patient is at risk of a critical event. Half of inpatients who suffer a cardiac arrest have documented but unacted upon clinical signs of deterioration in the 24 hours prior to the event. Nurses appear to be both misinterpreting and mismanaging the nursing-knowledge 'basics' such as heart rate, respiratory rate and oxygenation. Whilst many medical interventions originate from nurses, up to 26% of nurses' responses to abnormal signs result in delays of between one and three hours. A double system judgement analysis using Brunswik's lens model of cognition was undertaken with 245 Dutch, UK, Canadian and Australian acute care nurses. Nurses were asked to judge the likelihood of a critical event, 'at-risk' status, and whether they would intervene in response to 50 computer-presented clinical scenarios in which data on heart rate, systolic blood pressure, urine output, oxygen saturation, conscious level and oxygenation support were varied. Nurses were also presented with a protocol recommendation and also placed under time pressure for some of the scenarios. The ecological criterion was the predicted level of risk from the Modified Early Warning Score assessments of 232 UK acute care inpatients. Despite receiving identical information, nurses varied considerably in their risk assessments. The differences can be partly explained by variability in weightings given to information. Time and protocol recommendations were given more weighting than clinical information for key dichotomous choices such as classifying a patient as 'at risk' and deciding to intervene. Nurses' weighting of cues did not mirror the same information's contribution to risk in real patients. Nurses synthesized information in non-linear ways that contributed little to decisional accuracy. The low-moderate achievement (R(a)) statistics suggests that nurses' assessments of risk were largely inaccurate

  11. Analysis and Simulations of Space Radiation Induced Single Event Transients

    NASA Astrophysics Data System (ADS)

    Perez, Reinaldo

    2016-05-01

    Spacecraft electronics are affected by the space radiation environment. Among the different types of radiation effects that can affect spacecraft electronics are single event transients. The space environment is responsible for many of the single event transients that can upset the performance of spacecraft avionics hardware. In this paper we first explore the origins of single event transients, then explore the modeling of a single event transient in digital and analog circuits. The paper also addresses the concept of crosstalk that can develop among digital circuits in the presence of a SET event. The paper ends with a brief discussion of SET hardening. The goal of the paper is to provide methodologies for assessing single event transients and their effects so that spacecraft avionics engineers can develop either hardware or software countermeasures in their designs.

  12. Photographic Analysis Technique for Assessing External Tank Foam Loss Events

    NASA Technical Reports Server (NTRS)

    Rieckhoff, T. J.; Covan, M.; OFarrell, J. M.

    2001-01-01

    A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.

  13. Teleseismic Events Analysis with AQDB and ITAB stations, Brazil

    NASA Astrophysics Data System (ADS)

    Felício, L. D.; Vasconcello, E.; Assumpção, M.; Rodrigues, F.; Facincani, E.; Dias, F.

    2013-05-01

    This work surveys seismic activity originating mainly in the Andean region, at distances over 1500 km, recorded by the Brazilian seismographic stations AQDB and ITAB in 2012. The stations are located in the cities of Aquidauana and Itajai, both in the central-west region of Brazil, at coordinates -20°48'S, -55°70'W and -27°24'S, -52°13'W, respectively. We determined the magnitudes mb and Ms, the epicentral distances, and the experimental and theoretical P-wave arrival times (using the IASP91 model). With the programs SAC (SEISMIC ANALYSIS CODE), TAUP and Seisgram (Seismogram Viewer), it was possible to determine the mentioned magnitudes. We identified around twenty events for each station and correlated the calculated magnitudes (AQDB and ITAB) with the magnitude data published in the bulletin of the National Earthquake Information Center (NEIC). The linear regression shows that, for the two stations, the mb and Ms magnitudes are close to the values reported by the NEIC (correlations of 97.1% for mb and 96.5% for Ms). The P-wave arrival times at stations ITAB and AQDB show average deviations of 2.2 and 2.7 seconds, respectively; in other words, the difference between the experimental and theoretical P-wave times may be related to the position of each station and to the heterogeneity of the structure and composition of the rock massif in each region.

  14. Descriptive Analysis of Air Force Non-Fatal Suicide Events

    DTIC Science & Technology

    2006-07-01

    hospitalizations is therefore severely limited. RESULTS - Matched Records: As described in Table 1, the Capture dataset contained 1089 NFSE; of these, 658 (60.4%) had a corresponding entry in SADR. Table 1 defines the Capture dataset as SESS events matched to SADR data within +/- 1, 2, or 3 days of the event date (1089 records) and the Recapture dataset as SESS and SADR/SIDR records identified by E codes in the range E950-E959 (1842 records).

  15. Time to Tenure in Spanish Universities: An Event History Analysis

    PubMed Central

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  16. Time to tenure in Spanish universities: an event history analysis.

    PubMed

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility.

  17. Regional Frequency Analysis of extreme rainfall events, Tuscany (Italy)

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Chiarello, V.; Rossi, G.

    2014-12-01

    The assessment of extreme hydrological events at sites characterized by short time series, or where no data record exists, has mainly been obtained by regional models. Regional frequency analysis based on the index variable procedure is implemented here to describe the annual maximum of short-duration rainfall depth in the Tuscany region. The TCEV (Two Component Extreme Value) probability distribution is used within the procedure, with parameter estimation based on a three-level hierarchical approach. The methodology deals with the delineation of homogeneous regions, the identification of a robust regional frequency distribution and the assessment of the scale factor, i.e. the index rainfall. The data set includes the annual maximum of daily rainfall at 351 gauge stations with at least 30 years of records in the period 1916-2012, and the extreme rainfalls of short duration (1, 3, 6, 12 and 24 hours). Different subdivision hypotheses have been tested. A subdivision into four regions, coincident with four subregions, which takes into account the orography and the geomorphological and climatic peculiarities of the Tuscany region, has been adopted. In particular, for testing the regional homogeneity, the cumulative frequency distributions of the observed skewness and variation coefficients of the recorded time series are compared with the theoretical frequency distribution obtained through a Monte Carlo technique. The related L-skewness and L-variation coefficients are also examined. The Student t-test and the Wilcoxon test for the mean, as well as the χ2 test, were also applied. Further tests of the subdivision hypotheses have been made through the application of the discordancy D and heterogeneity H tests and the analysis of the observed and theoretical TCEV model growth curves. For each region the daily rainfall growth curve has been estimated. The growth curves for the hourly duration have been estimated when the daily rainfall growth curve
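
    A minimal sketch of computing sample L-moments and the L-CV and L-skewness ratios used in regional homogeneity screening (e.g. as inputs to discordancy and heterogeneity tests); the annual-maximum series below is synthetic.

    import numpy as np

    def sample_l_moments(x):
        """First three sample L-moments and the ratios L-CV (t) and L-skewness (t3)."""
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        j = np.arange(1, n + 1)
        b0 = x.mean()
        b1 = np.sum((j - 1) / (n - 1) * x) / n
        b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
        l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
        return l1, l2 / l1, l3 / l2   # mean, L-CV, L-skewness

    # Hypothetical annual-maximum daily rainfall series (mm)
    rng = np.random.default_rng(42)
    amax = rng.gumbel(loc=60, scale=20, size=50)
    print(sample_l_moments(amax))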

  18. BAYESIAN ANALYSIS OF REPEATED EVENTS USING EVENT-DEPENDENT FRAILTY MODELS: AN APPLICATION TO BEHAVIORAL OBSERVATION DATA

    PubMed Central

    Snyder, James

    2009-01-01

    In social interaction studies, one commonly encounters repeated displays of behaviors along with their duration data. Statistical methods for the analysis of such data use either parametric (e.g., Weibull) or semi-nonparametric (e.g., Cox) proportional hazard models, modified to include random effects (frailty) which account for the correlation of repeated occurrences of behaviors within a unit (dyad). However, dyad-specific random effects by themselves are not able to account for the ordering of event occurrences within dyads. The occurrence of an event (behavior) can make further occurrences of the same behavior to be more or less likely during an interaction. This paper develops event-dependent random effects models for analyzing repeated behaviors data using a Bayesian approach. The models are illustrated by a dataset relating to emotion regulation in families with children who have behavioral or emotional problems. PMID:20161593

  19. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    NASA Astrophysics Data System (ADS)

    Fedosimova, Anastasiya; Gaitinov, Adigam; Grushevskaya, Ekaterina; Lebedev, Igor

    2017-06-01

    In this work we study the peculiarities of multiparticle production in interactions of asymmetric nuclei in order to search for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of an analysis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC) and of mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method in accordance with the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: events of LC type are processes of full destruction of the projectile nucleus, in which multicharged fragments are absent, whereas in events of mixed type several multicharged fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions; the mean multiplicity of LC-type events is significantly higher than in mixed-type events. From the dependence of multiplicity on the number of target-nucleus fragments for events of various types, it is revealed that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e. nuclei with strongly asymmetric volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multicharged fragments.
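
    A minimal sketch of a rescaled-range (R/S) estimate of the Hurst exponent for a generic series; in an event-by-event analysis this would be applied to fluctuations of the pseudorapidity distribution, but the white-noise series used here is only a placeholder.

    import numpy as np

    def hurst_rs(series, min_window=8):
        """Hurst exponent via rescaled-range (R/S) analysis.
        H ~ 0.5: uncorrelated; H > 0.5: long-range correlations."""
        x = np.asarray(series, dtype=float)
        n = len(x)
        windows = np.unique(np.logspace(np.log10(min_window),
                                        np.log10(n // 2), 10).astype(int))
        rs = []
        for w in windows:
            vals = []
            for start in range(0, n - w + 1, w):
                seg = x[start:start + w]
                dev = np.cumsum(seg - seg.mean())
                r, s = dev.max() - dev.min(), seg.std()
                if s > 0:
                    vals.append(r / s)
            rs.append(np.mean(vals))
        h, _ = np.polyfit(np.log(windows), np.log(rs), 1)
        return h

    rng = np.random.default_rng(3)
    print(hurst_rs(rng.standard_normal(2048)))  # close to 0.5 for white noise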

  20. An integrated system for hydrological analysis of flood events

    NASA Astrophysics Data System (ADS)

    Katsafados, Petros; Chalkias, Christos; Karymbalis, Efthymios; Gaki-Papanastassiou, Kalliopi; Mavromatidis, Elias; Papadopoulos, Anastasios

    2010-05-01

    The significant increase of extreme flood events during recent decades has led to an urgent social and economic demand for improved prediction and sustainable prevention. Remedial actions require accurate estimation of the spatiotemporal variability of runoff volume and local peaks, which can be analyzed through integrated simulation tools. Although such advanced modeling systems allow the investigation of the dynamics controlling the behavior of these complex processes, they can also be used as early warning systems. Moreover, simulation is assumed to be the appropriate method to derive quantitative estimates of various atmospheric and hydrologic parameters, especially in the absence of reliable and accurate measurements of precipitation and flow rates. Such sophisticated techniques enable flood risk assessment and improve decision-making support for protection actions. This study presents an integrated system for the simulation of the essential atmospheric and soil parameters in the context of hydrological flood modeling. The system consists of two main cores: a numerical weather prediction model coupled with a geographical information system for the accurate simulation of groundwater advection and rainfall-runoff estimation. Synoptic and mesoscale atmospheric motions are simulated with a non-hydrostatic limited-area model on a very high resolution domain of integration. The model includes advanced schemes for the description of microphysics and surface-layer physics as well as for the estimation of the longwave and shortwave radiation budget. It is also fully coupled with a land-surface model in order to resolve the surface heat fluxes and simulate the air-land energy exchange processes. Detailed atmospheric and soil parameters derived from the atmospheric model are used as input data for the GIS-based runoff modeling. Geographical information system (GIS) technology is used for further hydrological analysis and estimation of direct

  1. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  2. Fault Tree Analysis: An Operations Research Tool for Identifying and Reducing Undesired Events in Training.

    ERIC Educational Resources Information Center

    Barker, Bruce O.; Petersen, Paul D.

    This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault-tree analysis investigates potentially undesirable events and then looks for sequences of failures that would lead to their occurrence. Relationships among these events are symbolized by AND or OR logic gates, AND being used when single events must coexist to…

  3. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
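
    As a rough sketch of the underlying idea (not the authors' implementation), the snippet below fits a small latent Dirichlet allocation model over tokenized network-log records with scikit-learn; the log lines, topic count, and any flagging rule built on the topic mixtures are hypothetical.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Hypothetical network-log records (tokens drawn from IPs, ports, protocols, sizes)
    logs = [
        "src 10.0.0.5 dst 192.168.1.9 port 443 proto tcp bytes 1200",
        "src 10.0.0.5 dst 203.0.113.7 port 22 proto tcp bytes 9800000",
        "src 10.0.0.8 dst 192.168.1.9 port 80 proto tcp bytes 600",
    ]

    # Bag-of-words representation of each log record
    vectorizer = CountVectorizer(token_pattern=r"[^\s]+")
    X = vectorizer.fit_transform(logs)

    # Fit a small LDA model; each record becomes a mixture over latent "activity" topics
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    theta = lda.fit_transform(X)

    # Records whose topic mixture is dominated by a rarely used topic could be flagged (toy rule)
    for record, weights in zip(logs, theta):
        print(weights.round(2), record)
    ```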

  4. Subchannel analysis of multiple CHF events. [PWR; BWR

    SciTech Connect

    Reddy, D.G.; Fighetti, C.F.

    1982-08-01

    The phenomenon of multiple CHF events in rod bundle heat transfer tests, referring to the occurrence of CHF on more than one rod or at more than one location on one rod is examined. The adequacy of some of the subchannel CHF correlations presently used in the nuclear industry in predicting higher order CHF events is ascertained based on local coolant conditions obtained with the COBRA IIIC subchannel code. The rod bundle CHF data obtained at the Heat Transfer Research Facility of Columbia University are examined for multiple CHF events using a combination of statistical analyses and parametric studies. The above analyses are applied to the study of three data sets of tests simulating both PWR and BWR reactor cores with uniform and non-uniform axial heat flux distributions. The CHF correlations employed in this study include: (1) CE-1 correlation, (2) B and W-2 correlation, (3) W-3 correlation, and (4) Columbia correlation.

  5. CDAW 9 analysis of magnetospheric events on May 3, 1986 - Event C

    NASA Technical Reports Server (NTRS)

    Baker, D. N.; Pulkkinen, T. I.; Mcpherron, R. L.; Craven, J. D.; Frank, L. A.; Elphinstone, R. D.; Murphree, J. S.; Fennell, J. F.; Lopez, R. E.; Nagai, T.

    1993-01-01

    An intense geomagnetic substorm event on May 3, 1986, occurring toward the end of a strong storm period, is studied. The auroral electrojet indices and global imaging data from both the Northern and Southern Hemispheres clearly revealed the growth phase and expansion phase development for a substorm with an onset at 0111 UT. An ideally located constellation of four spacecraft allowed detailed observation of the substorm growth phase in the near-tail region. A realistic time-evolving magnetic field model provided a global representation of the field configuration throughout the growth and early expansion phase of the substorm. Evidence of a narrowly localized substorm onset region in the near-earth tail is found. This region spread rapidly eastward and poleward after the 0111 UT onset. The results are consistent with a model of late growth phase formation of a magnetic neutral line. This reconnection region caused plasma sheet current diversion before the substorm onset and eventually led to cross-tail current disruption at the time of the substorm onset.

  6. Analysis of Cumulus Solar Irradiance Reflectance (CSIR) Events

    NASA Technical Reports Server (NTRS)

    Laird, John L.; Harshvardhan

    1996-01-01

    Clouds are extremely important with regard to the transfer of solar radiation at the earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using Yankee Environmental Systems UVA-1 and UVB-1 pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 W m^-2 and 0.069 W m^-2 were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed.

  7. Analysis of cumulus solar irradiance reflectance (CSIR) events

    NASA Astrophysics Data System (ADS)

    Laird, John L.; Harshvardhan

    Clouds are extremely important with regard to the transfer of solar radiation at Earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When Sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using UVA and UVB pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 W m^-2 and 0.0169 W m^-2 were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of Sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed.

  8. The design and analysis of randomized trials with recurrent events.

    PubMed

    Cook, R J

    1995-10-15

    This paper describes a method for planning the duration of a randomized parallel group study in which the response of interest is a potentially recurrent event. At the design stage we assume patients accrue at a constant rate, we model events via a homogeneous Poisson process, and we utilize an independent exponential censoring mechanism to reflect loss to follow-up. We derive the appropriate study duration to ensure satisfaction of power requirements for the effect size of interest under a Poisson regression model. An application to a kidney transplant study illustrates the potential savings of the Poisson-based design relative to a design based on the time to the first event. Revised design criteria are also derived to accommodate overdispersed Poisson count data. We examine the frequency properties of two non-parametric tests recently proposed by Lawless and Nadeau for trials based on the above design criteria. In simulation studies involving homogeneous and non-homogeneous Poisson processes they performed well with respect to their type I error rate and power. Results from supplementary simulation studies indicate that these tests are also robust to extra-Poisson variation and to clustering in the event times, making these tests attractive in their generality. We illustrate both tests by application to data from a completed kidney transplant study.
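
    The sketch below illustrates, under simplified assumptions that are not the paper's exact derivation, how a study duration could be chosen for a two-arm Poisson recurrent-event comparison: with equal allocation, the variance of the estimated log rate ratio is roughly the sum of the reciprocals of the expected event counts per arm, and the follow-up time is increased until the usual power condition holds. All rates, sample sizes, and the loop grid are hypothetical.

    ```python
    import math
    from scipy.stats import norm

    def required_duration(rate_ctrl, rate_ratio, n_per_arm, alpha=0.05, power=0.8):
        """Smallest follow-up time T (per subject) meeting a crude power requirement
        for comparing two Poisson event rates with equal allocation (illustrative only)."""
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        log_theta = math.log(rate_ratio)
        T = 0.1
        while True:
            e_ctrl = n_per_arm * T * rate_ctrl                # expected events, control arm
            e_trt = n_per_arm * T * rate_ctrl * rate_ratio    # expected events, treatment arm
            var_log_rr = 1.0 / e_ctrl + 1.0 / e_trt           # approx. variance of log rate ratio
            if log_theta ** 2 / var_log_rr >= z ** 2:
                return T
            T += 0.1

    # Hypothetical: control rate 2 events/patient-year, 25% reduction, 100 patients per arm
    print(required_duration(rate_ctrl=2.0, rate_ratio=0.75, n_per_arm=100))
    ```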

  9. Further Evaluation of Antecedent Social Events during Functional Analysis

    ERIC Educational Resources Information Center

    Kuhn, David E.; Hardesty, Samantha L.; Luczynski, Kevin

    2009-01-01

    The value of a reinforcer may change based on antecedent events, specifically the behavior of others (Bruzek & Thompson, 2007). In the current study, we examined the effects of manipulating the behavior of the therapist on problem behavior while all dimensions of reinforcement were held constant. Both participants' levels of problem behaviors…

  10. Awareness and analysis of a significant event by general practitioners: a cross sectional survey.

    PubMed

    Bowie, P; McKay, J; Norrie, J; Lough, M

    2004-04-01

    To determine the extent to which general practitioners (GPs) were aware of a recent significant event and whether a structured analysis of this event was undertaken to minimise the perceived risk of recurrence. Cross sectional survey using a postal questionnaire. Greater Glasgow primary care trust. 466 principals in general practice from 188 surgeries. GPs' self-reported personal and practice characteristics, awareness of a recent significant event, participation in the structured analysis of the identified significant event, perceived chance of recurrence, forums for discussing and analysing significant events, and levels of primary care team involvement. Four hundred and sixty six GPs (76%) responded to the survey. GPs from single handed practices were less likely to respond than those in multi-partner training and non-training practices. 401 (86%) reported being aware of a recent significant event; lack of awareness was clearly associated with GPs from non-training practices. 219 (55%) had performed all the necessary stages of a structured analysis (as determined by the authors) of the significant event. GPs from training practices were more likely to report participation in the structured analysis of the recent event, to perceive the chance of this event recurring as "nil" or "very low", and to report significant event discussions taking place. Most GPs were aware of a recent significant event and participated in the structured analysis of this event. The wider primary care team participated in the analysis process where GPs considered this involvement relevant. There is variation in the depth of and approach to significant event analysis within general practice, which may have implications for the application of the technique as part of the NHS quality agenda.

  11. Event-related complexity analysis and its application in the detection of facial attractiveness.

    PubMed

    Deng, Zhidong; Zhang, Zimu

    2014-11-01

    In this study, an event-related complexity (ERC) analysis method is proposed and used to explore the neural correlates of facial attractiveness detection in the context of a cognitive experiment. The ERC method gives a quantitative index for measuring the diverse brain activation properties that represent the neural correlates of event-related responses. This analysis reveals distinct effects of facial attractiveness processing and also provides further information that could not have been achieved from event-related potential alone.

  12. Novel adverse events of bevacizumab in the US FDA adverse event reporting system database: a disproportionality analysis.

    PubMed

    Shamloo, Behrooz K; Chhabra, Pankdeep; Freedman, Andrew N; Potosky, Arnold; Malin, Jennifer; Weiss Smith, Sheila

    2012-06-01

    Bevacizumab is the first in its class, vascular endothelial growth factor (VEGF) inhibitor that was initially approved by the US FDA in 2004 for the treatment of metastatic colon cancer and other solid tumors. Preapproval clinical trials, particularly for oncology drugs, are limited in their ability to detect certain adverse effects and, therefore, the FDA and pharmaceutical sponsors collect and monitor reports of adverse events (AEs) following approval. The purpose of this study was to screen the FDA's Adverse Event Reporting System (AERS) database for novel AEs that may be attributed to bevacizumab. The FDA AERS database was used to identify all AE reports for bevacizumab from February 2004 to September 2009. Disproportionality analysis was conducted for bevacizumab against all other drugs in the background by setting statistical significance at proportional reporting ratio (PRR) ≥2, observed case count ≥3 and chi-square ≥4. Subsequent clinical evaluation was performed to determine the clinical relevance of the findings and to group related events. A total of 523 Preferred Terms (PTs) were disproportionally reported; following clinical review 63 (12%) were found to be both unlabelled and of clinical importance. These PTs were grouped into 15 clinical disorder groups. Among the clinical disorders, electrolyte abnormalities had the greatest number of reports (n = 426) followed by cardiovascular events (n = 421), gastrointestinal events (n = 345), nervous system disorders (n = 106) and pneumonitis (n = 96). On sensitivity analysis, a number of clinically important unlabelled disorders, such as necrotizing fasciitis, vessel wall disorders, arrhythmia and conduction disorder and autoimmune thrombocytopenia still met the statistical significance criteria. During the study period, out of 12 010 AE reports mentioning bevacizumab, it was listed as the suspect drug in 94.2% of the reports. Our disproportionality analysis identified many events
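
    A minimal sketch of the disproportionality screen described above (PRR >= 2, case count >= 3, chi-square >= 4), applied to a hypothetical 2x2 table of report counts; the numbers are invented and not taken from AERS.

    ```python
    from scipy.stats import chi2_contingency

    # Hypothetical counts: rows = (drug of interest, all other drugs),
    # columns = (reports with the event of interest, reports without it)
    a, b = 30, 9_970      # drug of interest: event / no event
    c, d = 400, 499_600   # all other drugs: event / no event

    prr = (a / (a + b)) / (c / (c + d))
    chi2_stat, _, _, _ = chi2_contingency([[a, b], [c, d]], correction=False)

    # Screening rule quoted in the abstract: PRR >= 2, observed case count >= 3, chi-square >= 4
    signal = prr >= 2 and a >= 3 and chi2_stat >= 4
    print(f"PRR = {prr:.2f}, chi2 = {chi2_stat:.1f}, signal = {signal}")
    ```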

  13. Links between Characteristics of Collaborative Peer Video Analysis Events and Literacy Teachers' Outcomes

    ERIC Educational Resources Information Center

    Arya, Poonam; Christ, Tanya; Chiu, Ming

    2015-01-01

    This study examined how characteristics of Collaborative Peer Video Analysis (CPVA) events are related to teachers' pedagogical outcomes. Data included 39 transcribed literacy video events, in which 14 in-service teachers engaged in discussions of their video clips. Emergent coding and Statistical Discourse Analysis were used to analyze the data.…

  15. Analysis and RHBD technique of single event transients in PLLs

    NASA Astrophysics Data System (ADS)

    Zhiwei, Han; Liang, Wang; Suge, Yue; Bing, Han; Shougang, Du

    2015-11-01

    Single-event transient susceptibility of phase-locked loops has been investigated. The charge pump is the most sensitive component of the PLL to SET, and it is hard to mitigate this effect at the transistor level. A test circuit was designed on a 65 nm process using a new system-level radiation-hardening-by-design technique. Heavy-ion testing was used to evaluate the radiation hardness. Analyses and discussion of the feasibility of this method are also presented.

  16. Probability distribution analysis of observational extreme events and model evaluation

    NASA Astrophysics Data System (ADS)

    Yu, Q.; Lau, A. K. H.; Fung, J. C. H.; Tsang, K. T.

    2016-12-01

    Earth's surface temperatures in 2015 were the warmest since modern record-keeping began in 1880, according to the latest study. In contrast, cold weather occurred in many regions of China in January 2016 and brought the first snowfall in 67 years to Guangzhou, the capital city of Guangdong province. To understand the changes in extreme weather events as well as to project their future scenarios, this study uses statistical models to analyze multiple climate datasets. We first use a Granger-causality test to identify the attribution of global mean temperature rise and extreme temperature events to CO2 concentration. The four statistical moments (mean, variance, skewness, kurtosis) of the daily maximum temperature distribution are investigated for global climate observational, reanalysis (1961-2010) and model data (1961-2100). Furthermore, we introduce a new tail index based on the four moments, which is a more robust index for measuring extreme temperatures. Our results show that the CO2 concentration can provide information about the time series of mean and extreme temperature, but not vice versa. Based on our new tail index, we find that, beyond mean and variance, skewness is an important indicator that should be considered in estimating extreme temperature changes and in model evaluation. Among the 12 climate model datasets we investigate, the fourth version of the Community Climate System Model (CCSM4) from the National Center for Atmospheric Research performs well on the new index we introduce, which indicates that the model has a substantial capability to project future changes of extreme temperature in the 21st century. The method also shows its ability to measure extreme precipitation/drought events. In the future we will introduce a new diagram to systematically evaluate the performance of the four statistical moments in climate model output; moreover, the human and economic impacts of extreme weather events will also be assessed.
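
    As a small illustration of the moment-based summary described above, the sketch below computes the four statistical moments of a synthetic daily-maximum-temperature series; the "tail" summary shown is only a placeholder exceedance rate, not the paper's tail index.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic daily maximum temperature series (°C) at one grid point (not real data)
    rng = np.random.default_rng(0)
    tmax = rng.normal(loc=28.0, scale=4.0, size=3650) + rng.gamma(1.0, 1.5, size=3650)

    mean = tmax.mean()
    variance = tmax.var(ddof=1)
    skewness = stats.skew(tmax)
    kurt = stats.kurtosis(tmax)   # excess kurtosis

    # Placeholder "tail" summary (not the paper's index): exceedance rate of the 99th percentile
    tail_rate = np.mean(tmax > np.percentile(tmax, 99))

    print(f"mean={mean:.2f}, var={variance:.2f}, skew={skewness:.2f}, "
          f"kurtosis={kurt:.2f}, P(exceed 99th pct)={tail_rate:.3f}")
    ```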

  17. Observation and Analysis of Jovian and Saturnian Satellite Mutual Events

    NASA Technical Reports Server (NTRS)

    Tholen, David J.

    2001-01-01

    The main goal of this research was to acquire high time resolution photometry of satellite-satellite mutual events during the equatorial plane crossing for Saturn in 1995 and Jupiter in 1997. The data would be used to improve the orbits of the Saturnian satellites to support Cassini mission requirements, and also to monitor the secular acceleration of Io's orbit to compare with heat flow measurements.

  18. Further evaluation of antecedent social events during functional analysis.

    PubMed

    Kuhn, David E; Hardesty, Samantha L; Luczynski, Kevin

    2009-01-01

    The value of a reinforcer may change based on antecedent events, specifically the behavior of others (Bruzek & Thompson, 2007). In the current study, we examined the effects of manipulating the behavior of the therapist on problem behavior while all dimensions of reinforcement were held constant. Both participants' levels of problem behaviors increased as a function of the altered behavior of the therapist without direct manipulation of states of satiation or deprivation.

  19. Events in time: Basic analysis of Poisson data

    SciTech Connect

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.
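
    A minimal sketch of the kind of point estimate and exact (chi-square based) confidence interval for a Poisson occurrence rate that such a report covers; the event count and exposure time are hypothetical.

    ```python
    from scipy.stats import chi2

    def poisson_rate_ci(n_events, exposure, conf=0.95):
        """Point estimate and exact two-sided confidence interval for a Poisson rate."""
        alpha = 1 - conf
        rate = n_events / exposure
        lower = chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure) if n_events > 0 else 0.0
        upper = chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / (2 * exposure)
        return rate, lower, upper

    # Hypothetical: 7 events observed over 3.5 unit-years of exposure
    print(poisson_rate_ci(7, 3.5))
    ```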

  20. Narrative, Event-Structure Analysis, and Causal Interpretation in Historical Sociology.

    ERIC Educational Resources Information Center

    Griffin, Larry J.

    1993-01-01

    Contends that recent developments in historical sociology emphasize the centrality of temporality to analysis and explanation. Illustrates how computer-assisted analysis of qualitative narrative can be used to develop generalizable causal interpretation of events. (CFR)

  1. Analysis of the September 2010 Los Angeles Extreme Heating Event

    NASA Astrophysics Data System (ADS)

    King, K. C.; Kaplan, M. L.; Smith, C.; Tilley, J.

    2015-12-01

    The Southern California coastal region has a temperate climate; however, there are days with extreme heating when temperatures may reach above 37°C, stressing the region's power grid, leading to health issues, and creating environments susceptible to fires. These extreme localized heating events occur over a short period, from a few hours to one or two days, and may or may not occur in conjunction with high winds. The Santa Ana winds are a well-studied example of this type of phenomenon. On September 27, 2010, Los Angeles, CA (LA), reached a record maximum temperature of 45°C during an extreme heating event that was not a Santa Ana event. We analyzed the event using observations, reanalysis data, and mesoscale simulations with the Weather Research and Forecasting Model (WRF) to understand the mechanisms of extreme heating and provide guidance on forecasting similar events. On 26 September 2010, a large synoptic ridge overturned and broke over the midwestern United States (US), driving momentum and internal energy to the southwest. A large pool of hot air at mid-levels over the Four Corners region also shifted west, moving into southern California by 26 September. This hot air resided over the LA basin, just above the surface, by 00 GMT on 27 September. At this time, the pressure gradient at low levels was weak. Based on WRF model and wind profiler/RASS observations, we propose that separate mountain-plains solenoids (MPS) occurred on both 26 and 27 September. The MPS on 26 September moved the hot air into place just above the surface over the LA basin. Overnight, the hot air was trapped near the surface by the action of gravity waves in conjunction with orographic density currents and remnant migrating solenoids that formed over the mountains surrounding LA. When the MPS formed during the late morning of the 27th, the descending return-branch flow plus surface sensible heating provided a mechanism to move the heat to the surface, leading to record temperatures.

  2. Applying Association Rule of the Data Mining Method for the Network Event Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Wankyung; Soh, Wooyoung

    2007-12-01

    Network event analysis gives useful information on the network status that helps protect against attacks. It involves finding sets of frequently used packet information, such as IP addresses, and requires real-time processing by its nature. This paper applies association rules to network event analysis. Association rules, originally used for data mining, can be applied to find frequent item sets. So, if frequent item sets occur on networks, the information system can infer that there may be a threat. But existing association-rule algorithms such as Apriori are not suitable for analyzing network events in real time due to their high CPU and memory usage and thus low processing speed. This paper develops a network event audit module by applying association rules to network events using a new algorithm instead of the Apriori algorithm. Test results show that the application of the new algorithm gives drastically lower usage of both CPU and memory for network event analysis compared with the existing Apriori algorithm.
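
    As a toy illustration of frequent-item-set mining over network events (a single-pass pair count, not the authors' new algorithm and not Apriori), consider the sketch below; the events and support threshold are hypothetical.

    ```python
    from collections import Counter
    from itertools import combinations

    # Hypothetical network events, each reduced to a set of packet attributes
    events = [
        {"src:10.0.0.5", "dst:203.0.113.7", "port:22"},
        {"src:10.0.0.5", "dst:203.0.113.7", "port:443"},
        {"src:10.0.0.8", "dst:192.168.1.9", "port:80"},
        {"src:10.0.0.5", "dst:203.0.113.7", "port:22"},
    ]

    min_support = 2
    pair_counts = Counter()
    for ev in events:
        for pair in combinations(sorted(ev), 2):
            pair_counts[pair] += 1

    # Frequent attribute pairs indicate recurring flows that may be worth flagging for review
    frequent = {pair: n for pair, n in pair_counts.items() if n >= min_support}
    print(frequent)
    ```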

  3. Analysis of Adverse Events in Identifying GPS Human Factors Issues

    NASA Technical Reports Server (NTRS)

    Adams, Catherine A.; Hwoschinsky, Peter V.; Adams, Richard J.

    2004-01-01

    The purpose of this study was to analyze GPS-related adverse events such as accidents and incidents (A/I), Aviation Safety Reporting System (ASRS) reports, and Pilot Deviations (PDs) to create a framework for developing a human factors risk awareness program. Although the occurrence of directly related GPS accidents is small, the frequency of PDs and ASRS reports indicates there is a growing problem with situational awareness in terminal airspace related to different types of GPS operational issues. This paper addresses the findings of the preliminary research and provides a brief discussion of some of the literature on related GPS and automation issues.

  4. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    D. M. Rasmuson; D. L. Kelly

    2008-06-01

    This paper reviews the basic concepts of modelling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group.

  5. You Never Walk Alone: Recommending Academic Events Based on Social Network Analysis

    NASA Astrophysics Data System (ADS)

    Klamma, Ralf; Cuong, Pham Manh; Cao, Yiwei

    Combining Social Network Analysis and recommender systems is a challenging research field. In scientific communities, recommender systems have been applied to provide useful tools for papers and books as well as for expert finding. However, academic events (conferences, workshops, international symposia, etc.) are an important driving force for cooperation among research communities. We present an SNA-based approach to the academic event recommendation problem. Scientific community analysis and visualization are performed to provide insight into the communities of event series. A prototype is implemented based on the data from DBLP and EventSeer.net, and the results are examined in order to validate the approach.

  6. Identification and analysis of alternative splicing events in Phaseolus vulgaris and Glycine max.

    PubMed

    Iñiguez, Luis P; Ramírez, Mario; Barbazuk, William B; Hernández, Georgina

    2017-08-22

    The vast diversification of proteins in eukaryotic cells has been related with multiple transcript isoforms from a single gene that result in alternative splicing (AS) of primary transcripts. Analysis of RNA sequencing data from expressed sequence tags and next generation RNA sequencing has been crucial for AS identification and genome-wide AS studies. For the identification of AS events from the related legume species Phaseolus vulgaris and Glycine max, 157 and 88 publicly available RNA-seq libraries, respectively, were analyzed. We identified 85,570 AS events from P. vulgaris in 72% of expressed genes and 134,316 AS events in 70% of expressed genes from G. max. These were categorized in seven AS event types with intron retention being the most abundant followed by alternative acceptor and alternative donor, representing ~75% of all AS events in both plants. Conservation of AS events in homologous genes between the two species was analyzed where an overrepresentation of AS affecting 5'UTR regions was observed for certain types of AS events. The conservation of AS events was experimentally validated for 8 selected genes, through RT-PCR analysis. The different types of AS events also varied by relative position in the genes. The results were consistent in both species. The identification and analysis of AS events are first steps to understand their biological relevance. The results presented here from two related legume species reveal high conservation, over ~15-20 MY of divergence, and may point to the biological relevance of AS.

  7. Analysis of broadband seismograms from selected IASPEI events

    USGS Publications Warehouse

    Choy, G.L.; Engdahl, E.R.

    1987-01-01

    Broadband seismograms of body waves that are flat to displacement and velocity in the frequency range from 0.01 to 5.0 Hz can now be routinely obtained for most earthquakes of magnitude greater than about 5.5. These records are obtained either directly or through multichannel deconvolution of waveforms from digitally recording seismograph stations. In contrast to data from conventional narrowband seismographs, broadband records have sufficient frequency content to define the source-time functions of body waves, even for shallow events for which the source functions of direct and surface-reflected phases may overlap. Broadband seismograms for selected IASPEI events are systematically analysed to identify depth phases and the presence of subevents. The procedure results in improved estimates of focal depth, identification of subevents in complex earthquakes, and better resolution of focal mechanisms. We propose that it is now possible for reporting agencies, such as the National Earthquake Information Center, to use broadband digital waveforms routinely in the processing of earthquake data. ?? 1987.

  8. Chemical supply chain modeling for analysis of homeland security events

    SciTech Connect

    Ehlen, Mark A.; Sun, Amy C.; Pepple, Mark A.; Eidson, Eric D.; Jones, Brian S.

    2013-09-06

    The potential impacts of man-made and natural disasters on chemical plants, complexes, and supply chains are of great importance to homeland security. To be able to estimate these impacts, we developed an agent-based chemical supply chain model that includes chemical plants with enterprise operations such as purchasing, production scheduling, and inventories; merchant chemical markets; and multi-modal chemical shipments. Large-scale simulations of chemical-plant activities and supply chain interactions, running on desktop computers, are used to estimate the scope and duration of disruptive-event impacts, and overall system resilience, based on the extent to which individual chemical plants can adjust their internal operations (e.g., production mixes and levels) versus their external interactions (market sales and purchases, and transportation routes and modes). To illustrate how the model estimates the impacts of a hurricane disruption, a simple example model centered on 1,4-butanediol is presented.

  9. Analysis of hypoglycemic events using negative binomial models.

    PubMed

    Luo, Junxiang; Qu, Yongming

    2013-01-01

    Negative binomial regression is a standard model to analyze hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, to which negative binomial regression is not robust because of its strong model assumptions. In some literature, it has been suggested to correct the standard error of the maximum likelihood estimator by introducing an overdispersion factor, which can be estimated by the deviance or Pearson chi-square. We proposed conducting the negative binomial regression using sandwich estimation to calculate the covariance matrix of the parameter estimates, together with the Pearson overdispersion correction (denoted NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification, and that estimation efficiency is improved by adjusting for baseline hypoglycemia.
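
    A minimal sketch, assuming statsmodels, of a negative binomial event-count regression refit with a robust (sandwich) covariance in the spirit of the NBSP idea; the simulated data, covariates, and dispersion value are illustrative only, and the Pearson overdispersion correction step is not reproduced here.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Simulated hypoglycemic event counts with one treatment arm and a baseline covariate
    rng = np.random.default_rng(1)
    n_obs = 400
    treat = rng.integers(0, 2, size=n_obs)
    baseline = rng.poisson(2, size=n_obs)
    mu = np.exp(0.5 + 0.3 * baseline - 0.4 * treat)
    y = rng.negative_binomial(n=2, p=2 / (2 + mu))   # overdispersed counts with mean mu

    X = sm.add_constant(np.column_stack([treat, baseline]))

    # Negative binomial GLM with a fixed dispersion, refit with a sandwich (HC0) covariance
    model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5))
    result = model.fit(cov_type="HC0")
    print(result.summary())
    ```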

  10. Glaucoma progression detection: agreement, sensitivity, and specificity of expert visual field evaluation, event analysis, and trend analysis.

    PubMed

    Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier

    2013-01-01

    To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. Kappa agreement coefficient between methods and sensitivity and specificity for each method using expert opinion as gold standard were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years but only 3 cases showed progression with all 3 methods. Kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
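
    As a small illustration of the agreement and accuracy summaries used above, the sketch below computes Cohen's kappa, sensitivity, and specificity from hypothetical per-eye progression calls, with expert opinion as the reference standard; the counts are invented.

    ```python
    import numpy as np

    # Hypothetical progression calls for 37 eyes: 1 = progressed, 0 = stable
    expert = np.array([1]*7 + [0]*30)
    gpa_event = np.array([1]*5 + [0]*2 + [1]*1 + [0]*29)   # 5 TP, 2 FN, 1 FP, 29 TN

    def kappa(a, b):
        """Cohen's kappa for two binary raters."""
        po = np.mean(a == b)
        pe = np.mean(a) * np.mean(b) + (1 - np.mean(a)) * (1 - np.mean(b))
        return (po - pe) / (1 - pe)

    tp = np.sum((gpa_event == 1) & (expert == 1))
    tn = np.sum((gpa_event == 0) & (expert == 0))
    sensitivity = tp / np.sum(expert == 1)
    specificity = tn / np.sum(expert == 0)
    print(f"kappa={kappa(expert, gpa_event):.2f}, "
          f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
    ```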

  11. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    SciTech Connect

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data were also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information intensive tasks.
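
    A minimal sketch of deriving event dwell times from millisecond-level time stamps and splitting them at the thresholds mentioned above; the time stamps are hypothetical.

    ```python
    import numpy as np

    # Hypothetical workstation event time stamps in milliseconds
    timestamps_ms = np.array([0, 800, 2300, 2900, 15400, 16100, 30100, 31000])

    # Dwell time of each event = gap until the next time-stamped event, in seconds
    dwell_s = np.diff(timestamps_ms) / 1000.0

    short = dwell_s[dwell_s < 2.0]     # interpreted above as search/filtering activity
    long_ = dwell_s[dwell_s > 10.0]    # interpreted above as information evaluation
    print("dwell times (s):", dwell_s)
    print("short events:", short, "long events:", long_)
    ```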

  12. Chemical supply chain modeling for analysis of homeland security events

    DOE PAGES

    Ehlen, Mark A.; Sun, Amy C.; Pepple, Mark A.; ...

    2013-09-06

    The potential impacts of man-made and natural disasters on chemical plants, complexes, and supply chains are of great importance to homeland security. To be able to estimate these impacts, we developed an agent-based chemical supply chain model that includes chemical plants with enterprise operations such as purchasing, production scheduling, and inventories; merchant chemical markets; and multi-modal chemical shipments. Large-scale simulations of chemical-plant activities and supply chain interactions, running on desktop computers, are used to estimate the scope and duration of disruptive-event impacts, and overall system resilience, based on the extent to which individual chemical plants can adjust their internal operations (e.g., production mixes and levels) versus their external interactions (market sales and purchases, and transportation routes and modes). To illustrate how the model estimates the impacts of a hurricane disruption, a simple example model centered on 1,4-butanediol is presented.

  13. Genome-Wide Analysis of Polyadenylation Events in Schmidtea mediterranea

    PubMed Central

    Lakshmanan, Vairavan; Bansal, Dhiru; Kulkarni, Jahnavi; Poduval, Deepak; Krishna, Srikar; Sasidharan, Vidyanand; Anand, Praveen; Seshasayee, Aswin; Palakodeti, Dasaradhi

    2016-01-01

    In eukaryotes, 3′ untranslated regions (UTRs) play important roles in regulating posttranscriptional gene expression. The 3′UTR is defined by regulated cleavage/polyadenylation of the pre-mRNA. The advent of next-generation sequencing technology has now enabled us to identify these events on a genome-wide scale. In this study, we used poly(A)-position profiling by sequencing (3P-Seq) to capture all poly(A) sites across the genome of the freshwater planarian, Schmidtea mediterranea, an ideal model system for exploring the process of regeneration and stem cell function. We identified the 3′UTRs for ∼14,000 transcripts and thus improved the existing gene annotations. We found 97 transcripts, which are polyadenylated within an internal exon, resulting in the shrinking of the ORF and loss of a predicted protein domain. Around 40% of the transcripts in planaria were alternatively polyadenylated (ApA), resulting either in an altered 3′UTR or a change in coding sequence. We identified specific ApA transcript isoforms that were subjected to miRNA mediated gene regulation using degradome sequencing. In this study, we also confirmed a tissue-specific expression pattern for alternate polyadenylated transcripts. The insights from this study highlight the potential role of ApA in regulating the gene expression essential for planarian regeneration. PMID:27489207

  14. Analysis of sequential events in intestinal absorption of folylpolyglutamate

    SciTech Connect

    Darcy-Vrillon, B.; Selhub, J.; Rosenberg, I.H.

    1988-09-01

    Although it is clear that the intestinal absorption of folylpolyglutamates is associated with hydrolysis to monoglutamyl folate, the precise sequence and relative velocity of the events involved in this absorption are not fully elucidated. In the present study, we used biosynthetic, radiolabeled folylpolyglutamates purified by affinity chromatography to analyze the relationship of hydrolysis and transport in rat jejunal loops in vivo. Absorption was best described by a series of first-order processes: luminal hydrolysis to monoglutamyl folate followed by tissue uptake of the product. The rate of hydrolysis in vivo was twice as high as the rate of transport. The latter value was identical to that measured for folic acid administered separately. The relevance of this sequential model was confirmed by data obtained using inhibitors of the individual steps in absorption of ''natural'' folate. Heparin and sulfasalazine were both effective in decreasing absorption. The former affected hydrolysis solely, whereas the latter acted as a competitive inhibitor of transport of monoglutamyl folate. These studies confirm that hydrolysis is obligatory and that the product is subsequently taken up by a transport process, common to monoglutamyl folates, that is the rate-determining step in transepithelial absorption.

  15. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang

    2009-09-18

    Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.

  16. An analysis of fog events at Belgrade International Airport

    NASA Astrophysics Data System (ADS)

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.

  17. Twelve Tips for Promoting Significant Event Analysis To Enhance Reflection in Undergraduate Medical Students.

    ERIC Educational Resources Information Center

    Henderson, Emma; Berlin, Anita; Freeman, George; Fuller, Jon

    2002-01-01

    Points out the importance of the facilitation of reflection and development of reflective abilities in professional development and describes 12 tips for undergraduate medical students to increase their abilities of writing reflective and creative event analysis. (Author/YDS)

  18. Subjective well-being and adaptation to life events: a meta-analysis.

    PubMed

    Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E

    2012-03-01

    Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on affective and cognitive well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to 4 family events (marriage, divorce, bereavement, childbirth) and 4 work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being and that for most events the effects of life events on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given.

  19. Spectral analysis of snoring events from an Emfit mattress.

    PubMed

    Perez-Macias, Jose Maria; Viik, Jari; Varri, Alpo; Himanen, Sari-Leena; Tenhunen, Mirja

    2016-12-01

    The aim of this study is to explore the capability of an Emfit (electromechanical film transducer) mattress to detect snoring (SN) by analyzing the spectral differences between normal breathing (NB) and SN. Episodes of representative NB and SN of a maximum of 10 min were visually selected for analysis from 33 subjects. To define the bands of interest, we studied the statistical differences in the power spectral density (PSD) between both breathing types. Three bands were selected for further analysis: 6-16 Hz (BW1), 16-30 Hz (BW2) and 60-100 Hz (BW3). We characterized the differences between NB and SN periods in these bands using a set of spectral features estimated from the PSD. We found that 15 out of the 29 features reached statistical significance with the Mann-Whitney U-test. Diagnostic properties for each feature were assessed using receiver operating characteristic analysis. According to our results, the highest diagnostic performance was achieved using the power ratio between BW2 and BW3 (0.85 area under the receiver operating curve, 80% sensitivity, 80% specificity and 80% accuracy). We found that there are significant differences in the defined bands between the NB and SN periods. A peak was found in BW3 for SN epochs, which was best detected using power ratios. Our work suggests that it is possible to detect snoring with an Emfit mattress. The mattress-type movement sensors are inexpensive and unobtrusive, and thus provide an interesting tool for sleep research.
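
    A minimal sketch, assuming SciPy, of estimating the power spectral density of a mattress-sensor segment with Welch's method and forming the BW2/BW3 band power ratio highlighted above; the signal, sampling rate, and component frequencies are synthetic.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 250.0                                   # hypothetical sampling rate (Hz)
    t = np.arange(0, 60, 1 / fs)
    rng = np.random.default_rng(0)
    # Synthetic segment: low-frequency breathing movement plus a snoring-like component near 80 Hz
    seg = (np.sin(2 * np.pi * 0.3 * t)
           + 0.2 * np.sin(2 * np.pi * 80 * t)
           + 0.05 * rng.standard_normal(t.size))

    f, psd = welch(seg, fs=fs, nperseg=int(4 * fs))

    def band_power(f, psd, lo, hi):
        mask = (f >= lo) & (f < hi)
        return np.trapz(psd[mask], f[mask])

    bw2 = band_power(f, psd, 16, 30)    # BW2: 16-30 Hz
    bw3 = band_power(f, psd, 60, 100)   # BW3: 60-100 Hz
    print("BW2/BW3 power ratio:", bw2 / bw3)
    ```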

  20. Efficient stochastic sensitivity analysis of discrete event systems

    SciTech Connect

    Plyasunov, Sergey . E-mail: teleserg@uclink.berkeley.edu; Arkin, Adam P. . E-mail: aparkin@lbl.gov

    2007-02-10

    Sensitivity analysis quantifies the dependence of a system's behavior on the parameters that could possibly affect the dynamics. Calculation of sensitivities of stochastic chemical systems using Kinetic Monte Carlo and finite-difference-based methods is not only computationally intensive, but direct calculation of sensitivities by finite-difference-based methods of parameter perturbations converges very poorly. In this paper we develop an approach to this issue using a method based on the Girsanov measure transformation for jump processes to smooth the estimate of the sensitivity coefficients and make this estimation more accurate. We demonstrate the method with simple examples and discuss its appropriate use.

  1. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility at the Nevada Test Site for nuclear explosive operations called the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated into the understanding of those scenarios, and assuring that adequate positive measures are in place to control the risk associated with them.

  2. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    PubMed

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study.

  3. Analysis of upwelling event in Southern Makassar Strait

    NASA Astrophysics Data System (ADS)

    Utama, F. G.; Atmadipoera, A. S.; Purba, M.; Sudjono, E. H.; Zuraida, R.

    2017-01-01

    The southeast monsoon (SEM) winds that blow over the southern Makassar Strait generate a coastal upwelling phenomenon. One year of wind data, together with CTD data from the MAJAFLOX cruise, is used to analyze the upwelling in this region. During the 2015 SEM, the southeasterly wind speed averaged about 6 m/s, with the highest speeds appearing in August and September. Using Ekman theory for the upwelling during this monsoon period, we estimate that the Ekman transport was about 8.50 m2/s directed offshore (to the southwest); the upwelled water originated from deeper layers near the coastal area, with a vertical velocity of about 6.87 x 10-5 m/s - 7.84 x 10-5 m/s; and the Ekman layer depth in the upwelling region was approximately 60 m. These results are in good agreement with the CTD observations.
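
    A back-of-the-envelope sketch of the Ekman transport estimate quoted above; the drag coefficient, densities, and latitude are textbook-style assumptions rather than values from the study, so the resulting number differs from the 8.50 m2/s reported.

    ```python
    import numpy as np

    # Assumed constants (illustrative, not from the study)
    rho_air = 1.22        # air density, kg m^-3
    rho_sea = 1025.0      # seawater density, kg m^-3
    c_drag = 1.3e-3       # dimensionless bulk drag coefficient
    omega = 7.292e-5      # Earth's rotation rate, rad s^-1
    lat_deg = -5.0        # approximate latitude of the southern Makassar Strait

    wind_speed = 6.0                                  # m s^-1, mean SEM wind quoted above
    tau = rho_air * c_drag * wind_speed ** 2          # wind stress, N m^-2
    f = 2 * omega * np.sin(np.radians(lat_deg))       # Coriolis parameter, s^-1

    # Ekman transport per unit width, expressed as a volume transport (m^2 s^-1)
    ekman_transport = tau / (rho_sea * abs(f))
    print(f"wind stress = {tau:.3f} N/m^2, Ekman transport = {ekman_transport:.2f} m^2/s")
    ```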

  4. Identifying causes of adverse events detected by an automated trigger tool through in-depth analysis.

    PubMed

    Muething, S E; Conway, P H; Kloppenborg, E; Lesko, A; Schoettker, P J; Seid, M; Kotagal, U

    2010-10-01

    To describe how in-depth analysis of adverse events can reveal underlying causes. Triggers for adverse events were developed using the hospital's computerised medical record (naloxone for opiate-related oversedation and administration of a glucose bolus while on insulin for insulin-related hypoglycaemia). Triggers were identified daily. Based on information from the medical record and interviews, a subject expert determined if an adverse drug event had occurred and then conducted a real-time analysis to identify event characteristics. Expert groups, consisting of frontline staff and specialist physicians, examined event characteristics and determined the apparent cause. 30 insulin-related hypoglycaemia events and 34 opiate-related oversedation events were identified by the triggers over 16 and 21 months, respectively. In the opinion of the experts, patients receiving continuous-infusion insulin and those receiving dextrose only via parenteral nutrition were at increased risk for insulin-related hypoglycaemia. Lack of standardisation in insulin-dosing decisions and variation regarding when and how much to adjust insulin doses in response to changing glucose levels were identified as common causes of the adverse events. Opiate-related oversedation events often occurred within 48 h of surgery. Variation in pain management in the operating room and post-anaesthesia care unit was identified by the experts as potential causes. Variations in practice, multiple services writing orders, multidrug regimens and variations in interpretation of patient assessments were also noted as potential contributing causes. Identification of adverse drug events through an automated trigger system, supplemented by in-depth analysis, can help identify targets for intervention and improvement.

  5. Analysis of "never events" following adult cardiac surgical procedures in the United States.

    PubMed

    Robich, Michael P; Krafcik, Brianna M; Shah, Nishant K; Farber, Alik; Rybin, Denis; Siracuse, Jeffrey J

    2017-10-01

    This study was conducted to determine the risk factors, nature, and outcomes of "never events" following open adult cardiac surgical procedures. Understanding these events can reduce their occurrence and thereby improve patient care, quality metrics, and cost reduction. "Never events" were documented for patients included in the Nationwide Inpatient Sample who underwent coronary artery bypass grafting, heart valve repair/replacement, or thoracic aneurysm repair between 2003 and 2011. These events included air embolism, catheter-based urinary tract infection (UTI), pressure ulcer, falls/trauma, blood incompatibility, vascular catheter infection, poor glucose control, foreign object retention, wrong-site surgery, and mediastinitis. Analysis included characterization of preoperative demographics, comorbidities, and outcomes for patients sustaining never events, and multivariate analysis of predictive risk factors and outcomes. A total of 588,417 patients meeting inclusion criteria were identified. Of these, never events occurred in 4377 cases. The majority of events were in-hospital falls, vascular catheter infections, and complications of poor glucose control. Rates of falls, catheter-based UTIs, and glucose control complications increased between 2009 and 2011 compared with 2003-2008. Analysis revealed increased hospital length of stay, hospital charges, and mortality in patients who suffered a never event compared with those who did not. This study establishes a baseline never event rate after cardiac surgery. Adverse patient outcomes and increased resource utilization resulting from never events emphasize the need for quality improvement surrounding them. A better understanding of individual patient characteristics for those at risk can help in developing protocols to decrease occurrence rates.

  6. An analysis of post-event processing in social anxiety disorder.

    PubMed

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

    Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construction of repetitive self-focused thoughts that pertain to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  7. Constraining shallow seismic event depth via synthetic modeling for Expert Technical Analysis at the IDC

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.

    2015-12-01

    Event depth is an important criterion in seismic event screening at the International Data Centre (IDC) of the CTBTO. However, a thorough determination of event depth can usually be achieved only through special analysis, because the IDC's Event Definition Criteria depend, in particular, on depth estimation uncertainties. As a result, a large number of events in the Reviewed Event Bulletin have their depth constrained to the surface. When the true origin depth is greater than is plausible for a nuclear test (3 km based on existing observations), this results in a heavier workload to manually distinguish between shallow and deep events. In addition, the IDC depth criterion is not applicable to events with a small t(pP-P) travel-time difference, which is the case for a nuclear test. Since the shape of the first few seconds of signal from very shallow events is highly sensitive to the presence of the depth phase, cross-correlation between observed and synthetic seismograms can provide an estimate of event depth and thereby extend the screening process. We exercised this approach mainly with events at teleseismic and, in part, regional distances. We found that the approach can be very efficient for seismic event screening, with certain caveats related mostly to poorly defined crustal models at source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the rays studied. We examined a wide set of historical records of nuclear explosions, including the so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary-phase approximation implemented in Robert Herrmann's hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified for complex source topography.
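
    A toy sketch of the depth-by-correlation idea described above: a crude P + pP synthetic is generated for a range of trial depths and correlated with an "observed" trace, and the depth that maximizes the correlation is taken as the estimate. The pulse shape, velocity, amplitudes, and noise level are invented for illustration and have nothing to do with the hudson96 or generalized-ray synthetics used in the study.

      import numpy as np

      def synthetic(depth_km, n=400, dt=0.05, vp=6.0, fc=1.5):
          """Toy teleseismic synthetic: direct P pulse plus a surface-reflected
          depth phase (pP) delayed by ~2*depth/vp and flipped in polarity."""
          t = np.arange(n) * dt
          wavelet = lambda t0: np.exp(-((t - t0) * fc * np.pi)**2) * np.cos(2 * np.pi * fc * (t - t0))
          return wavelet(5.0) - 0.7 * wavelet(5.0 + 2.0 * depth_km / vp)

      rng = np.random.default_rng(3)
      observed = synthetic(1.2) + 0.1 * rng.standard_normal(400)   # "true" depth 1.2 km

      trial_depths = np.arange(0.2, 5.01, 0.1)
      cc = [np.corrcoef(observed, synthetic(h))[0, 1] for h in trial_depths]
      best = trial_depths[int(np.argmax(cc))]
      print(f"best-fitting depth ~ {best:.1f} km (cc = {max(cc):.2f})")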

  8. Generalized enrichment analysis improves the detection of adverse drug events from the biomedical literature.

    PubMed

    Winnenburg, Rainer; Shah, Nigam H

    2016-06-23

    Identification of associations between marketed drugs and adverse events from the biomedical literature assists drug safety monitoring efforts. Assessing the significance of such literature-derived associations and determining the granularity at which they should be captured remains a challenge. Here, we assess how defining a selection of adverse event terms from MeSH, based on information content, can improve the detection of adverse events for drugs and drug classes. We analyze a set of 105,354 candidate drug-adverse event pairs extracted from article indexes in MEDLINE. First, we harmonize extracted adverse event terms by aggregating them into higher-level MeSH terms based on the terms' information content. Then, we determine statistical enrichment of adverse events associated with drugs and drug classes using a conditional hypergeometric test that adjusts for dependencies among associated terms. We compare our results with methods based on disproportionality analysis (proportional reporting ratio, PRR) and quantify the improvement in signal detection with our generalized enrichment analysis (GEA) approach using a gold standard of drug-adverse event associations spanning 174 drugs and four events. For single drugs, the best GEA method (Precision: .92/Recall: .71/F1-measure: .80) outperforms the best PRR-based method (.69/.69/.69) on all four adverse event outcomes in our gold standard. For drug classes, our GEA performs similarly (.85/.69/.74) when increasing the level of abstraction for adverse event terms. Finally, on examining the 1609 individual drugs in our MEDLINE set that map to chemical substances in ATC, we find signals for 1379 drugs (10,122 unique adverse event associations) on applying GEA with p < 0.005. We present an approach based on generalized enrichment analysis that can be used to detect associations between drugs, drug classes, and adverse events at a given level of granularity, while at the same time correcting for known dependencies among the associated terms.
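
    The abstract benchmarks its enrichment approach against disproportionality analysis; the sketch below shows the proportional reporting ratio (PRR) side of such a comparison for a single drug-event pair, using invented report counts and the commonly used log-scale standard error.

      import math

      def proportional_reporting_ratio(a, b, c, d):
          """PRR for one drug-event pair from a 2x2 contingency table.

          a: reports mentioning the drug and the adverse event
          b: reports mentioning the drug and any other event
          c: reports mentioning other drugs and the adverse event
          d: reports mentioning other drugs and other events
          Returns (PRR, 95% CI lower, 95% CI upper) on the ratio scale."""
          prr = (a / (a + b)) / (c / (c + d))
          # Standard error of ln(PRR), as commonly used in pharmacovigilance
          se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
          lo = math.exp(math.log(prr) - 1.96 * se_log)
          hi = math.exp(math.log(prr) + 1.96 * se_log)
          return prr, lo, hi

      # Hypothetical counts for one drug-event pair (not from the study)
      print(proportional_reporting_ratio(a=40, b=960, c=200, d=99800))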

  9. Extreme rainfall analysis based on precipitation events classification in Northern Italy

    NASA Astrophysics Data System (ADS)

    Campo, Lorenzo; Fiori, Elisabetta; Molini, Luca

    2016-04-01

    Extreme rainfall statistical analysis comprises a consolidated family of techniques for studying the frequency and the statistical properties of high-intensity meteorological events. These techniques are well established and include standard approaches such as fitting the GEV (Generalized Extreme Value) or TCEV (Two-Component Extreme Value) probability distribution to the data recorded by a given raingauge at a given location. Regionalization techniques, which aim to spatialize the analysis over medium-to-large regions, are also well established and operationally used. In this work a novel procedure is proposed to statistically characterize rainfall extremes in a given region based on an "event-based" approach. Given a temporal sequence of continuous rain maps, an "event" is defined as an aggregate, continuous in time and space, of cells whose rainfall height exceeds a certain threshold. Based on this definition it is possible to classify, over a given region and period, a population of events and to characterize them with a number of statistics, such as total volume, maximum spatial extent, duration, and average intensity. The population of events so obtained is the input to a novel extreme-value characterization technique: for a given spatial scale, a moving-window analysis is performed and all the events that fall in the window are analysed from an extreme-value point of view. For each window, the extreme annual events are considered: maximum total volume, maximum spatial extent, maximum intensity, and maximum duration are each subjected to an extreme-value analysis and the corresponding probability distributions are fitted. In this way the analysis statistically characterizes the most intense events and, at the same time, spatializes these rainfall characteristics, exploring their variability in space. This methodology was employed on rainfall fields obtained by interpolation of
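
    A minimal sketch of the kind of extreme-value fit mentioned above: annual maxima of an event statistic (here, hypothetical event total rainfall for one analysis window) are fitted with a GEV distribution and a return level is read off. The data are invented, and scipy's genextreme parameterization is used.

      import numpy as np
      from scipy import stats

      # Hypothetical annual maxima (mm) of event total rainfall for one spatial window;
      # in the study these would come from the event-based classification of rain maps.
      annual_max = np.array([61., 74., 55., 90., 68., 120., 83., 77., 95., 102.,
                             58., 66., 88., 71., 130., 64., 79., 99., 85., 73.])

      # Fit a Generalized Extreme Value distribution (scipy's genextreme uses
      # c = -xi, i.e. the negative of the usual shape parameter).
      c, loc, scale = stats.genextreme.fit(annual_max)

      # 50-year return level: the value exceeded with probability 1/50 in a year
      return_level_50 = stats.genextreme.ppf(1.0 - 1.0/50.0, c, loc=loc, scale=scale)
      print(f"shape (scipy c) = {c:.3f}, loc = {loc:.1f}, scale = {scale:.1f}")
      print(f"50-year return level ~ {return_level_50:.1f} mm")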

  10. Catchment process affecting drinking water quality, including the significance of rainfall events, using factor analysis and event mean concentrations.

    PubMed

    Cinque, Kathy; Jayasuriya, Niranjali

    2010-12-01

    To ensure the protection of drinking water, an understanding of the catchment processes that can affect water quality is important, as it enables targeted catchment management actions to be implemented. In this study, factor analysis (FA) and comparison of event mean concentrations (EMCs) with baseline values were used to assess the relationships between water quality parameters and to link those parameters to processes within an agricultural drinking water catchment. FA found that 55% of the variance in the water quality data could be explained by the first factor, which was dominated by parameters usually associated with erosion. Inclusion of pathogenic indicators in an additional FA showed that Enterococcus and Clostridium perfringens (C. perfringens) were also related to the erosion factor. Analysis of the EMCs found that most parameters were significantly higher during periods of rainfall runoff. This study shows that the most dominant processes in an agricultural catchment are surface runoff and erosion. It also shows that it is these processes that mobilise pathogenic indicators and that they are therefore most likely to influence the transport of pathogens. Catchment management efforts need to focus on reducing the effect of these processes on water quality.

  11. Analysis of single events in ultrarelativistic nuclear collisions: A new method to search for critical fluctuations

    SciTech Connect

    Stock, R.

    1995-07-15

    The upcoming generation of experiments with ultrarelativistic heavy nuclear projectiles, at the CERN SPS and at RHIC and LHC, will confront researchers with several thousand identified hadrons per event, suitable detectors provided. An analysis of individual events becomes meaningful for a multitude of hadronic signals thought to reveal a transient deconfinement phase transition, or the related critical precursor fluctuations. Transverse momentum spectra, the kaon-to-pion ratio, and pionic Bose-Einstein correlations are examined, showing how to separate the extreme, probably rare candidate events from the bulk of average events. These observables can already be investigated with the Pb beam of the SPS. The author then discusses single-event signals that add to the above at RHIC and LHC energies: kaon interferometry, rapidity fluctuations, and jet and γ production.

  12. Stochastic Generation of Drought Events using Reconstructed Annual Streamflow Time Series from Tree Ring Analysis

    NASA Astrophysics Data System (ADS)

    Lopes, A.; Dracup, J. A.

    2011-12-01

    The statistical analysis of multiyear drought events in streamflow records is often restricted by sample size, since only a small number of drought events can be extracted from common river flow time series. An alternative to those conventional datasets is the use of paleohydrologic data, such as streamflow time series reconstructed from tree-ring analysis. In this study, we analyze the statistical characteristics of drought events present in a 1439-year time series of reconstructed annual streamflow at the Feather River inflow to the Oroville reservoir, California. Probabilistic distributions were used to describe the duration and severity of drought events, and the results were compared with previous studies that used only the observed streamflow data. Finally, a stochastic simulation model was developed to synthetically generate sequences of drought and high-flow events with the same characteristics as the paleohydrologic record. The long-term mean flow was used as the single truncation level to define 248 drought events and 248 high-flow events with specific duration and severity. The longest drought and high-flow events lasted 13 years (1471 to 1483) and 9 years (1903 to 1911), respectively. A strong relationship between event duration and severity was found for both drought and high-flow events, so the longest droughts also corresponded to the most severe ones. The events were therefore classified by duration (in years), and probability distributions were fitted to the frequency distribution of drought and high-flow severity for each duration. As a result, it was found that the gamma distribution describes well the frequency distribution of drought severities for all durations. For high-flow events, the exponential distribution is more adequate for one-year events, while the gamma distribution is better suited for longer events. These distributions can be used to estimate the recurrence time of drought events according to

  13. Markov chains and semi-Markov models in time-to-event analysis

    PubMed Central

    Abner, Erin L.; Charnigo, Richard J.; Kryscio, Richard J.

    2014-01-01

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields. PMID:24818062
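
    As a small illustration of the multi-state view described above, the sketch below propagates state-occupancy probabilities through a discrete-time three-state (illness-death) Markov chain with hypothetical transition probabilities; death is the absorbing state, so competing risks and repeated transitions between the transient states are handled naturally.

      import numpy as np

      # Three-state discrete-time Markov chain (illness-death model):
      # states 0 = healthy, 1 = ill, 2 = dead (absorbing).
      # Hypothetical one-year transition probabilities; rows sum to 1.
      P = np.array([[0.90, 0.07, 0.03],
                    [0.10, 0.75, 0.15],
                    [0.00, 0.00, 1.00]])

      p = np.array([1.0, 0.0, 0.0])   # everyone starts healthy
      for year in range(1, 11):
          p = p @ P                    # propagate the state-occupancy probabilities one step
          print(f"year {year:2d}: healthy={p[0]:.3f}  ill={p[1]:.3f}  dead={p[2]:.3f}")

      # "Survival" at 10 years = probability of not having entered the absorbing state
      print(f"10-year survival ~ {1.0 - p[2]:.3f}")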

  14. An Application of Fuzzy Fault Tree Analysis to Uncontained Events of an Aero-Engine Rotor

    NASA Astrophysics Data System (ADS)

    Li, Yanfeng; Huang, Hong-Zhong; Zhu, Shun-Peng; Liu, Yu; Xiao, Ning-Cong

    2012-12-01

    Fault tree analysis is an important tool for system reliability analysis. In this article, a fuzzy fault tree analysis of uncontained events of an aero-engine rotor is performed. In addition, a new methodology based on fuzzy set theory is used in the fault tree analysis to quantify the failure probabilities of basic events. The theory of fuzzy fault trees is introduced first. The fault tree for uncontained events of an aero-engine rotor is then established, in which the descending method is used to determine the minimal cut sets. Furthermore, an interval representation and calculation strategy is presented, using symmetrical L-R type fuzzy numbers to describe the failure probabilities, and the resulting fault tree is analyzed quantitatively in a case study.

  15. Regression analysis of mixed recurrent-event and panel-count data

    PubMed Central

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L.

    2014-01-01

    In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1–42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. PMID:24648408

  16. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2017-04-01

    We analyze a cluster of intermediate deep events in the eastern part of the Hellenic subduction zone (HSZ), recorded during the deployment of the temporary seismic network EGELADOS, in order to gain a better understanding of geodynamic processes in the HSZ, in particular in its eastern part. The cluster consists of 159 events at 80 to 200 km depth with local magnitudes ranging from 0.2 to 4.1. Using three-component similarity analysis, both the spatial and the temporal clustering of the recorded events are studied. The waveform cross-correlation was performed for all event combinations using data recorded at 45 onshore stations. The cross-correlation coefficients at single stations show a decrease in similarity with increasing epicentral distance, as well as the effect of local heterogeneities at particular stations, which cause noticeable differences in waveform similarity. Highly similar events, however, tend to occur in a preferred depth range between 120 and 150 km. The double-difference earthquake relocation software HypoDD was used to relocate the events, and the results are compared with previously obtained single-event locations calculated using the nonlinear location tool NonLinLoc and station corrections. For the relocation, both differential traveltimes obtained by separate cross-correlation of P- and S-waveforms and manual readings of onset times are used. It is shown that after relocation the inter-event distances for highly similar events are reduced. By comparing the results of the cluster analysis with results obtained from synthetic catalogs, in which the event rate, the number of aftershocks, and the occurrence times of the aftershocks are varied, it is shown that the event-time distribution nearly follows a random Poisson distribution with a slightly increasing event rate, without indications of substantial inter-event triggering. The spatial distribution of the cluster can be modelled by a two
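
    A minimal sketch of the single-station waveform-similarity calculation underlying the analysis above: the maximum normalized cross-correlation over a small lag window is computed for every event pair to build a similarity matrix. The traces are synthetic noise with one deliberately similar pair; the lag range, trace length, and absence of band-pass filtering are simplifications, not choices from the study.

      import numpy as np

      def max_normalized_cc(x, y, max_lag):
          """Maximum normalized cross-correlation between two traces over +/- max_lag samples."""
          best = -1.0
          for lag in range(-max_lag, max_lag + 1):
              if lag >= 0:
                  a, b = x[lag:], y[:len(y) - lag]
              else:
                  a, b = x[:len(x) + lag], y[-lag:]
              n = min(len(a), len(b))
              a, b = a[:n] - a[:n].mean(), b[:n] - b[:n].mean()
              denom = np.sqrt((a**2).sum() * (b**2).sum())
              if denom > 0:
                  best = max(best, float((a * b).sum() / denom))
          return best

      # Hypothetical event waveforms at one station (rows = events, columns = samples)
      rng = np.random.default_rng(0)
      waveforms = rng.standard_normal((4, 500))
      waveforms[1] = 0.8 * waveforms[0] + 0.2 * rng.standard_normal(500)  # one similar pair

      n_ev = len(waveforms)
      sim = np.eye(n_ev)
      for i in range(n_ev):
          for j in range(i + 1, n_ev):
              sim[i, j] = sim[j, i] = max_normalized_cc(waveforms[i], waveforms[j], max_lag=20)
      print(np.round(sim, 2))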

  17. [Analysis of the impact of two typical air pollution events on the air quality of Nanjing].

    PubMed

    Wang, Fei; Zhu, Bin; Kang, Han-Qing; Gao, Jin-Hui; Wang, Yin; Jiang, Qi

    2012-10-01

    Nanjing and the surrounding area experienced two consecutive serious air pollution events from late October to early November 2009. The first event was long-lasting haze pollution, and the second resulted from the mixed impact of crop residue burning and local transportation. The effects of regional transport and local sources on the two events were investigated by cluster analysis, using surface meteorological observations, the air pollution index, satellite remote sensing of fire hot spots, and a back-trajectory model. The results showed that accumulation-mode aerosol number concentrations were higher than those of any other aerosol mode in the two pollution processes. The peak of the aerosol particle number concentration shifted to larger particle sizes compared with previous studies in this area. The SO4(2-)/NO3(-) ratio was 1.30 in the first event and 0.99 in the second, indicating that stationary sources were more important than traffic sources in the first event and the reverse in the second. Affected by local sources from the east and south, particles below 0.1 µm gradually accumulated during the first event. The second event was mainly affected by short-distance transport from the northeast and by local sources from the southwest and especially the south; the concentration of aerosol particles was higher from this direction than from others, indicating that the crop residue burning sources were mainly in this direction.

  18. Making sense of root cause analysis investigations of surgery-related adverse events.

    PubMed

    Cassin, Bryce R; Barach, Paul R

    2012-02-01

    This article discusses the limitations of root cause analysis (RCA) for surgical adverse events. Making sense of adverse events involves an appreciation of the unique features in a problematic situation, which resist generalization to other contexts. The top priority of adverse event investigations must be to inform the design of systems that help clinicians to adapt and respond effectively in real time to undesirable combinations of design, performance, and circumstance. RCAs can create opportunities in the clinical workplace for clinicians to reflect on local barriers and identify enablers of safe and reliable outcomes. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Coarse-grained event tree analysis for quantifying Hodgkin-Huxley neuronal network dynamics.

    PubMed

    Sun, Yi; Rangan, Aaditya V; Zhou, Douglas; Cai, David

    2012-02-01

    We present an event tree analysis for studying the dynamics of Hodgkin-Huxley (HH) neuronal networks. Our study relies on a coarse-grained projection onto event trees, and onto the event chains that compose these trees, using a statistical collection of spatial-temporal sequences of relevant physiological observables (such as spiking sequences of multiple neurons). This projection can retain information about network dynamics covering multiple features, swiftly and robustly. We demonstrate that, even for small differences in inputs, some dynamical regimes of HH networks contain sufficiently high-order statistics, as reflected in the event chains within the event tree analysis; the analysis is therefore effective in discriminating small differences in inputs. Moreover, we use event trees to analyze results computed with an efficient library-based numerical method proposed in our previous work, in which a pre-computed high-resolution library of typical neuronal trajectories during the interval of an action potential (spike) allows us to avoid resolving the spikes in detail. In this way, we can evolve the HH networks using time steps one order of magnitude larger than those typically needed to resolve the trajectories without the library, while achieving comparable statistical accuracy in terms of average firing rate and power spectra of voltage traces. Our numerical simulation results show that the library method is efficient in the sense that the results generated with much larger time steps contain sufficiently high-order statistical structure of firing events, similar to that obtained with a regular HH solver. We use our event tree analysis to demonstrate these statistical similarities.

  20. Single-Event Correlation Analysis of Quantum Key Distribution with Single-Photon Sources

    NASA Astrophysics Data System (ADS)

    Dong, Shangli; Wang, Xiaobo; Zhang, Guofeng; Xiao, Liantuan; Jia, Suotang

    2010-04-01

    Multiphoton emissions allow efficient eavesdropping strategies that threaten the security of quantum key distribution. In this paper, we theoretically discuss the photon correlations between authorized partners in the case of practical single-photon sources that include a multiphoton background. To investigate the feasibility of intercept-resend attacks, the cross-correlations and the maximum intercept-resend ratio caused by the background signal are determined using single-event correlation analysis based on single-event detection.

  1. Rare-event Analysis and Computational Methods for Stochastic Systems Driven by Random Fields

    DTIC Science & Technology

    2014-12-29

    This research develops asymptotic theories and numerical methods for computing rare-event probabilities associated with random fields and the associated...dynamics, neuroscience, fiber optics, astronomy, civil engineering, engineering design, ocean-earth sciences, and so forth. We perform risk analysis...of such systems by investigating the asymptotic behavior of certain interesting rare events.

  2. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    PubMed

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd.

  3. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template ('parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of 'child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of the stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for a significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant increase (more than fourfold) in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
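
    A toy version of the matched-filtering step described above: single-channel normalized cross-correlation functions between a "parent" template and continuous data are stacked across channels, and samples exceeding a median-absolute-deviation threshold are flagged as candidate "child" detections. The channel count, threshold, and synthetic data are illustrative choices, not those of the study.

      import numpy as np

      def sliding_ncc(data, template):
          """Normalized cross-correlation of a template against a continuous trace."""
          n = len(template)
          t = (template - template.mean()) / (template.std() * n)
          out = np.empty(len(data) - n + 1)
          for i in range(len(out)):
              win = data[i:i + n]
              s = win.std()
              out[i] = 0.0 if s == 0 else np.dot(t, (win - win.mean()) / s)
          return out

      rng = np.random.default_rng(1)
      n_ch, n_samp, n_tmpl = 6, 4000, 200
      templates = rng.standard_normal((n_ch, n_tmpl))           # 'parent' waveforms
      data = rng.standard_normal((n_ch, n_samp))                 # continuous records
      data[:, 2500:2700] += 0.5 * templates                      # buried 'child' event

      # Stack single-channel correlation functions and declare detections above
      # a threshold expressed in median absolute deviations of the stack.
      stack = np.mean([sliding_ncc(data[ch], templates[ch]) for ch in range(n_ch)], axis=0)
      mad = np.median(np.abs(stack - np.median(stack)))
      detections = np.where(stack > np.median(stack) + 8 * mad)[0]
      print("candidate detection samples:", detections[:10], "peak cc:", round(float(stack.max()), 2))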

  4. Hemorrhagic events in cancer patients treated with aflibercept: a meta-analysis.

    PubMed

    Peng, Ling; Bu, Zhibin; Zhou, Yun; Ye, Xianghua; Liu, Junfang; Zhao, Qiong

    2014-09-01

    Aflibercept (Ziv-aflibercept, VEGF Trap, AVE005) is an engineered protein that functions as a decoy receptor to bind vascular endothelial growth factor A (VEGF-A). Hemorrhagic events, including epistaxis, gastrointestinal bleeding, and pulmonary bleeding, are among its major adverse effects, but their incidence rate and overall risk have not been systematically studied. Therefore, we conducted a meta-analysis of published clinical trials to investigate the incidence and relative risk of hemorrhagic events in cancer patients treated with aflibercept. Electronic databases including PubMed, Embase, the Cochrane databases, and American Society of Clinical Oncology abstracts were searched. Eligible studies were phase II and III prospective clinical trials of cancer patients treated with aflibercept that reported a toxicity profile for hemorrhagic events. Overall incidence rates, relative risks (RR), and 95% confidence intervals (CI) were calculated using fixed- or random-effects models depending on the heterogeneity of the included studies. A total of 4,538 patients with a variety of solid tumors from 13 prospective clinical trials were included in the meta-analysis. The overall incidences of all-grade and high-grade hemorrhagic events in cancer patients were 22.1% (95% CI, 16.5-29.7%) and 4.2% (95% CI, 3.9-4.6%), respectively. The relative risks of hemorrhagic events with aflibercept compared to control were increased for all-grade (RR = 2.63; 95% CI, 2.07-3.34) and high-grade (RR = 2.45; 95% CI, 1.62-3.72) events. The risk of developing high-grade hemorrhagic events with aflibercept was comparable to that of bevacizumab (RR = 1.26; 95% CI, 0.89-1.79). Aflibercept is associated with an increased risk of hemorrhagic events in patients with solid tumors. Close monitoring and management of hemorrhagic events are recommended.
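
    A minimal sketch of the pooling step in such a meta-analysis: per-study relative risks and confidence intervals (invented here, not the trials in the paper) are combined on the log scale with inverse-variance weights. A random-effects model, as used in the paper when heterogeneity is present, would additionally add a between-study variance term to each weight's denominator.

      import numpy as np

      # Hypothetical per-study RR and 95% CI values (not the trials in the paper)
      rr = np.array([2.1, 3.0, 1.8, 2.9, 2.4])
      ci_lo = np.array([1.2, 1.6, 0.9, 1.5, 1.3])
      ci_hi = np.array([3.7, 5.6, 3.6, 5.6, 4.4])

      log_rr = np.log(rr)
      se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE recovered from the CI
      w = 1.0 / se**2                                      # inverse-variance weights

      # Fixed-effect pooled estimate on the log scale
      pooled = np.sum(w * log_rr) / np.sum(w)
      pooled_se = np.sqrt(1.0 / np.sum(w))
      print(f"pooled RR = {np.exp(pooled):.2f} "
            f"(95% CI {np.exp(pooled - 1.96*pooled_se):.2f}-{np.exp(pooled + 1.96*pooled_se):.2f})")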

  5. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides, and volcanic eruptions. It lies in the intertropical convergence zone (ITCZ) and has a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and precipitation is observed on about 70% of the days in a year. This rain, which favors the formation of large masses of clouds, together with macroclimatic phenomena such as the El Niño Southern Oscillation, has historically caused great impacts in the region (Vélez et al, 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data for up to eight climate variables. Some of these stations hold 10 years of historical data. Until now, however, this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical rain events in an urban area of Manizales and to investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. In particular, it helped to identify the influence of different meteorological variables in triggering rainfall events in hazardous areas such as the city of Manizales.

  6. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, M.; Meier, T. M.; Becker, D.; Brüstle, A.

    2015-12-01

    In order to gain a better understanding of geodynamic processes in the Hellenic subduction zone (HSZ), in particular in its eastern part, we analyze a cluster of intermediate deep events in the region of Nisyros volcano. The events were recorded by the temporary seismic network EGELADOS, deployed from September 2005 to March 2007. The network covered the entire Hellenic subduction zone and consisted of 23 offshore and 56 onshore broadband stations, complemented by 19 permanent stations from NOA, GEOFON and MedNet. The cluster of intermediate deep seismicity consists of 159 events with local magnitudes ranging from 0.2 to 4.1 at depths from 80 to 200 km. The events occur close to the top of the slab, within a zone about 30 km thick. The spatio-temporal clustering is studied using three-component similarity analysis. Single-event locations obtained using the nonlinear location tool NonLinLoc are compared to relative relocations calculated using the double-difference earthquake relocation software HypoDD. The relocation is performed both with manual readings of onset times and with differential traveltimes obtained by separate cross-correlation of P- and S-waveforms. The three-component waveform cross-correlation was performed for all events using data from 45 stations. The results of the similarity analysis are shown as a function of frequency for individual stations and averaged over the network. Average similarities between the waveforms of all event pairs reveal a low number of highly similar events but a large number of moderate similarities. Interestingly, the single-station similarities between event pairs show (1) in general decreasing similarity with increasing epicentral distance, (2) reduced similarities for paths crossing boundaries of slab segments, and (3) the influence of strong local heterogeneity, which leads to a considerable reduction of waveform similarities, e.g. in the center of the Santorini volcano.

  7. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2016-04-01

    In order to gain a better understanding of geodynamic processes in the Hellenic subduction zone (HSZ), in particular in its eastern part, we analyze a cluster of intermediate deep events in the region of Nisyros volcano. The cluster, recorded during the deployment of the temporary seismic network EGELADOS, consists of 159 events at 80 to 200 km depth with local magnitudes ranging from 0.2 to 4.1. The network itself consisted of 56 onshore and 23 offshore broadband stations, complemented by 19 permanent stations from NOA, GEOFON and MedNet. It was deployed from September 2005 to March 2007 and covered the entire HSZ. Here, both the spatial and the temporal clustering of the recorded events are studied using three-component similarity analysis. The waveform cross-correlation was performed for all event combinations using data recorded at 45 onshore stations. The results are shown as a function of frequency for individual stations and as values averaged over the network. The cross-correlation coefficients at single stations show decreasing similarity with increasing epicentral distance, as well as the effect of local heterogeneities at particular stations, which cause noticeable differences in waveform similarity. Event relocation was performed using the double-difference earthquake relocation software HypoDD, and the results are compared with previously obtained single-event locations calculated using the nonlinear location tool NonLinLoc and station corrections. For the relocation, both differential travel times obtained by separate cross-correlation of P- and S-waveforms and manual readings of onset times are used. It is shown that after the relocation the inter-event distances for highly similar events are reduced. By comparing the results of the cluster analysis with results obtained from synthetic catalogs, in which the event rate, the proportion of aftershocks, and their occurrence times are varied, it is shown that the event

  8. Statistical analysis of solar energetic particle events and related solar activity

    NASA Astrophysics Data System (ADS)

    Dierckxsens, Mark; Patsou, Ioanna; Tziotziou, Kostas; Marsh, Michael; Lygeros, Nik; Crosby, Norma; Dalla, Silvia; Malandraki, Olga

    2013-04-01

    The FP7 COMESEP (COronal Mass Ejections and Solar Energetic Particles: forecasting the space weather impact) project is developing tools for forecasting geomagnetic storms and solar energetic particle (SEP) radiation storms. Here we present preliminary results on a statistical analysis of SEP events and their parent solar activity during Solar Cycle 23. The work aims to identify correlations between solar events and SEP events relevant for space weather, as well as to quantify SEP event probabilities for use within the COMESEP alert system. The data sample covers the SOHO era and is based on the SEPEM reference event list [http://dev.sepem.oma.be/]. Events are subdivided if separate enhancements are observed in higher energy channels as defined for the list of Cane et al (2010). Energetic Storm Particle (ESP) enhancements during these events are identified by associating ESP-like increases in the proton channels with shocks detected in ACE and WIND data. Their contribution has been estimated and subtracted from the proton fluxes. Relationships are investigated between solar flare parameters such as X-ray intensity and heliographic location on the one hand, and the probability of occurrence and strength of energetic proton flux increases on the other hand. The same exercise is performed using the velocity and width of coronal mass ejections to examine their SEP productiveness. Relationships between solar event characteristics and SEP event spectral indices and fluences are also studied, as well as enhancements in heavy ion fluxes measured by the SIS instrument on board the ACE spacecraft during the same event periods. This work has received funding from the European Commission FP7 Project COMESEP (263252).

  9. SMART Analysis Of A He II Explosive Event Observed With MOSES

    NASA Astrophysics Data System (ADS)

    Fox, Lewis; Kankelborg, C. C.; Thomas, R. J.

    2010-05-01

    Analysis of data from the MOSES (Multi-Order Solar EUV Spectrograph) sounding rocket has shown a Transition Region Explosive Event (TREE) in He II 304 Å with an unusual structure that defies conventional models of explosive events; the outflow jets are not collinear or anti-parallel. Results from our preliminary analysis of this event, performed using a tomographic parallax technique but without full inversions, are reported in a paper submitted to the Astrophysical Journal, presently under revision. Early results of inversions using the Smoothed Multiplicative Algebraic Reconstruction Technique (SMART), reported at SPD 2009, showed qualitative agreement with the parallax analysis results but disagreement in the magnitude of the Doppler velocities. We address this discrepancy with further refinement of the inversion technique and show how the discrepancy in velocity magnitude can be understood.

  10. Sources of Error and the Statistical Formulation of M_S:m_b Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H_0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh-wave magnitude (denoted M_S) is a measure of later-arriving surface-wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this formulation, the 2009 announced nuclear weapon test by the Democratic People's Republic of Korea fails to reject the null hypothesis H_0: explosion characteristics.

  11. Multivariate Statistical Modelling of Compound Events via Pair-Copula Constructions: Analysis of Floods in Ravenna

    NASA Astrophysics Data System (ADS)

    Bevacqua, Emanuele; Maraun, Douglas; Hobæk Haff, Ingrid; Widmann, Martin; Vrac, Mathieu

    2017-04-01

    Compound events are multivariate extreme events in which the individual contributing variables may not be extreme themselves, but their joint, dependent occurrence causes an extreme impact. Conventional univariate statistical analysis cannot give accurate information regarding the multivariate nature of these events. We develop a conceptual model, implemented via pair-copula constructions, which allows for the quantification of the risk associated with compound events in present-day and future climate, as well as of the uncertainty estimates around such risk. The model includes meteorological predictors which provide insight into both the involved physical processes and the temporal variability of compound events. Moreover, the model provides multivariate statistical downscaling of compound events. Downscaling of compound events is required to extend their risk assessment to the past or future climate, where climate models either do not simulate realistic values of the local variables driving the events, or do not simulate them at all. Based on the developed model, we study compound floods, i.e. joint storm surge and high river runoff, in Ravenna (Italy). To explicitly quantify the risk, we define the impact of compound floods as a function of sea and river levels. We use meteorological predictors to extend the analysis to the past and obtain a more robust risk analysis. We quantify the uncertainties of the risk analysis and observe that they are very large owing to the shortness of the available data, though this may also be the case in other studies where uncertainties have not been estimated. Ignoring the dependence between sea and river levels would result in an underestimation of risk; in particular, the expected return period of the highest compound flood observed increases from about 20 to 32 years when switching from the dependent to the independent case.
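
    A small Monte Carlo sketch of why the dependence between the two drivers matters, using a plain bivariate Gaussian copula with a hypothetical correlation rather than the pair-copula construction of the study: the probability that surge and runoff jointly exceed their 95th percentiles is compared against the independence assumption.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      rho, p, n = 0.5, 0.95, 200_000     # hypothetical dependence, threshold, sample size

      # Sample from a bivariate normal and map each margin to uniform: a Gaussian copula.
      z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
      u = stats.norm.cdf(z)

      joint = np.mean((u[:, 0] > p) & (u[:, 1] > p))
      print(f"joint exceedance, dependent case  : {joint:.4f}  (return period ~ {1/joint:.0f} units)")
      print(f"joint exceedance, independent case: {(1-p)**2:.4f}  (return period = {1/(1-p)**2:.0f} units)")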

  12. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    PubMed Central

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD events risk (0.91, 0.85 to 0.97) per 0.1 mEq/L (P_nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P_nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD events risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480

  13. Complete dose analysis of the November 12, 1960 solar cosmic ray event.

    PubMed

    Masley, A J; Goedeke, A D

    1963-01-01

    A detailed analysis of the November 12, 1960 solar cosmic ray event is presented as an integrated space flux and dose. This event is probably the most interesting solar cosmic ray event studied to date. Direct measurements were made of solar protons from 10 MeV to 6 GeV. During the double-peaked high-energy part of the event, evidence is presented for the trapping of relativistic particles in a magnetic cloud. The proton energy spectrum is divided into three energy intervals, with a separate power-law exponent and time profile carried through for each. Also included in the analysis are the results of rocket measurements, which determined the spectrum down to 10 MeV twice during the event, balloon results from Fort Churchill and Minneapolis, earth satellite measurements, neutron monitors in New Hampshire and at both the North and South Pole, and riometer results from Alaska and Kiruna, Sweden. The results are given in Table 1 [see text]. The results of our analyses of other solar cosmic ray events are also included, with a general discussion of solar flare hazards in space.

  14. Determination of microseismic event azimuth from S-wave splitting analysis

    NASA Astrophysics Data System (ADS)

    Yuan, Duo; Li, Aibing

    2017-02-01

    P-wave hodogram analysis has been the only reliable method to obtain microseismic event azimuths for one-well monitoring. However, microseismic data usually have weak or even no P-waves due to near double-couple focal mechanisms and limited ray path coverage, which causes large uncertainties in determined azimuths and event locations. To solve this problem, we take advantage of S-waves, which are often much stronger than P waves in microseismic data, and determine event azimuths by analyzing S-wave splitting data. This approach utilizes the positive correlation between the accuracy of event azimuth and the effectiveness of measuring S-wave splitting parameters and finds the optimal azimuth through a grid search. We have demonstrated that event azimuths can be well constrained from S-wave splitting analysis using both synthetic and field microseismic data. This method is less sensitive to noise than the routine P-wave hodogram method and provides a new way of determining microseismic event azimuths.

  15. Regression analysis of mixed panel count data with dependent terminal events.

    PubMed

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types of the data earlier, may occur and furthermore, as with the first two types of the data, there may exist a dependent terminal event, which may preclude the occurrences of recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event and an estimating equation-based approach is proposed for estimation of regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to a childhood cancer study that motivated this study. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Analysis of Extreme Hydrologic Events in the NOAA National Water Model

    NASA Astrophysics Data System (ADS)

    Gochis, D. J.; Cosgrove, B.; McCreight, J. L.; Dugger, A. L.; Yu, W.; Yates, D. N.; Karsten, L. R.; Rafieeinasab, A.; Sampson, K. M.; Pan, L.; Liu, Y.

    2016-12-01

    The NOAA National Water Model is a new, continental-scale, operational, hydrologic prediction system designed to provide improved situational awareness and forecast guidance for flood and flash flood events. It operates at a spatial resolution designed to capture 'neighborhood'-scale hydrologic impacts related to flooding, such as streamflow rate and velocity in local channel reaches, as well as other flood-related characteristics such as soil saturation and local inundation. The system began operational analysis and forecast production in May 2016. In this presentation, a focused analysis of the capability of the National Water Model to forecast extreme hydrologic events associated with heavy rainfall and flooding is presented. A statistical and case-study approach is used to quantify to what degree the new system captures infrequent, extreme events and to identify dominant sources of error and limits of predictability in predicting such events. These analyses highlight the critical role of skillful precipitation prediction in forecasting such events, as well as in identifying underlying sources of error in model structure, antecedent hydrologic conditions, and water management representation. Lastly, a pair of case studies of extreme event predictions will be presented which highlight new development efforts underway to improve the period of skillful lead-time prediction.

  17. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    SciTech Connect

    Lisbeth A. Mitchell

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report presents the analysis of occurrence reports and deficiency reports (including not-reportable events) identified at INL during the period October 2012 through September 2013.

  18. EVENT TREE ANALYSIS AT THE SAVANNAH RIVER SITE: A CASE HISTORY

    SciTech Connect

    Williams, R

    2009-05-25

    At the Savannah River Site (SRS), a Department of Energy (DOE) installation in west-central South Carolina, there is a unique geologic stratum at depth that has the potential to cause surface settlement resulting from a seismic event. In the past, the stratum in question has been remediated via pressure grouting; however, the benefits of remediation have always been debatable. Recently, SRS has attempted to frame the issue in terms of risk via an event tree, or logic tree, analysis. This paper describes that analysis, including the input data required.
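
    A minimal sketch of the event-tree quantification implied above: an initiating-event frequency is multiplied along each branch to give end-state frequencies, which sum back to the initiator. The branch probabilities below are invented for illustration and are not SRS values.

      # Minimal event tree quantification sketch (illustrative numbers only):
      # a seismic initiating event followed by two branch points, with end states.
      # Each path frequency is the product of the initiator frequency and its
      # conditional branch probabilities.
      p_initiator = 1.0e-4          # annual frequency of the seismic initiating event
      p_settles = 0.3               # probability the susceptible stratum settles, given the event
      p_damage = 0.2                # probability settlement damages the facility, given settlement

      paths = {
          "no settlement":          p_initiator * (1 - p_settles),
          "settlement, no damage":  p_initiator * p_settles * (1 - p_damage),
          "settlement with damage": p_initiator * p_settles * p_damage,
      }
      for outcome, freq in paths.items():
          print(f"{outcome:24s}: {freq:.2e} per year")
      print(f"{'total (equals initiator)':24s}: {sum(paths.values()):.2e} per year")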

  19. FFTF Event Fact Sheet root cause analysis calendar year 1985 through 1988

    SciTech Connect

    Griffin, G.B.

    1988-12-01

    The Event Fact Sheets written from January 1985 through mid August 1988 were reviewed to determine their root causes. The review group represented many of the technical disciplines present in plant operation. The review was initiated as an internal critique aimed at maximizing the "lessons learned" from the event reporting system. The root causes were subjected to a Pareto analysis to determine the significant causal factor groups. Recommendations for correction of the high frequency causal factors were then developed and presented to the FFTF Plant management. In general, the distributions of the causal factors were found to closely follow the industry averages. The impacts of the events were also studied and it was determined that we generally report events of a level of severity below that of the available studies. Therefore it is concluded that the recommendations for corrective action are ones to improve the overall quality of operations and not to correct significant operational deficiencies. 17 figs.
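
    A small sketch of the Pareto step described above: causal-factor tallies (hypothetical, not the FFTF counts) are sorted and cumulative percentages computed, so the "vital few" factors that account for most events can be read off.

      from collections import Counter

      # Hypothetical root-cause tallies from event fact sheets (not the FFTF data)
      causes = Counter({
          "procedure deficiency": 34, "personnel error": 28, "equipment failure": 19,
          "work organization": 11, "design deficiency": 7, "external": 3,
      })

      total = sum(causes.values())
      cumulative = 0
      print(f"{'causal factor':22s} {'count':>5s} {'cum %':>6s}")
      for cause, count in causes.most_common():
          cumulative += count
          print(f"{cause:22s} {count:5d} {100 * cumulative / total:6.1f}")
      # The 'vital few' are the top factors that together explain ~80% of all events.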

  20. Analysis of Pressurized Water Reactor Primary Coolant Leak Events Caused by Thermal Fatigue

    SciTech Connect

    C. L. Atwood; V. N. Shah; W. J. Galyean

    1999-09-01

    We present statistical analyses of pressurized water reactor (PWR) primary coolant leak events caused by thermal fatigue, and discuss their safety significance. Our worldwide data contain 13 leak events (through-wall cracking) in 3509 reactor-years, all in stainless steel piping with diameter less than 25 cm. Several types of data analysis show that the frequency of leak events (events per reactor-year) is increasing with plant age, and the increase is statistically significant. When an exponential trend model is assumed, the leak frequency is estimated to double every 8 years of reactor age, although this result should not be extrapolated to plants much older than 25 years. Difficulties in arresting this increase include lack of quantitative understanding of the phenomena causing thermal fatigue, lack of understanding of crack growth, and difficulty in detecting existing cracks.
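
    A one-line consequence of the exponential trend model quoted above: a doubling time of 8 years corresponds to a growth rate of ln(2)/8 per year of reactor age. The baseline frequency used below is hypothetical, not a value from the study.

      import numpy as np

      # Exponential trend model: lambda(t) = lambda0 * exp(beta * t), t = reactor age in years.
      beta = np.log(2.0) / 8.0     # per year, from "doubles every 8 years"
      lambda0 = 1.0e-3             # hypothetical leak frequency per reactor-year at age 0

      for age in (0, 8, 16, 24):
          print(f"age {age:2d} y: predicted frequency = {lambda0 * np.exp(beta * age):.2e} per reactor-year")
      # As cautioned in the abstract, extrapolation much beyond ~25 years is not supported.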

  1. Low time resolution analysis of polar ice cores cannot detect impulsive nitrate events

    NASA Astrophysics Data System (ADS)

    Smart, D. F.; Shea, M. A.; Melott, A. L.; Laird, C. M.

    2014-12-01

    Ice cores are archives of climate change and possibly large solar proton events (SPEs). Wolff et al. (2012) used a single event, a nitrate peak in the GISP2-H core, which McCracken et al. (2001a) associated in time with the poorly quantified 1859 Carrington event, to discredit SPE-produced, impulsive nitrate deposition in polar ice. This is not the ideal test case. We critique the Wolff et al. analysis and demonstrate that the data they used cannot detect impulsive nitrate events because of resolution limitations. We suggest reexamination of the top of the Greenland ice sheet at key intervals over the last two millennia with attention to fine resolution and replicate sampling of multiple species. This will allow further insight into polar depositional processes on a subseasonal scale, including atmospheric sources, transport mechanisms to the ice sheet, postdepositional interactions, and a potential SPE association.

  2. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    SciTech Connect

    Attrill, Gemma D. R.

    2010-07-20

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of purely an MHD wave interpretation for the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions and result from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.
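
    The difference between running-difference and base-difference images is the crux of the reinterpretation above. A minimal numpy sketch of the two constructions, using synthetic frames (a moving bright front over a quiet background) rather than real EUV data:

      import numpy as np

      def running_difference(frames):
          # Each frame minus the immediately preceding frame; a moving bright front
          # leaves a dark 'ghost' at its previous position, which can mimic reflections.
          return [frames[i] - frames[i - 1] for i in range(1, len(frames))]

      def base_difference(frames, base_index=0):
          # Each frame minus a fixed pre-event frame, so only genuine changes relative
          # to the quiet corona appear.
          base = frames[base_index]
          return [f - base for f in frames[base_index + 1:]]

      # Synthetic stand-in for co-aligned EUV frames: quiet background, then a front
      # that brightens a new strip of pixels in each later frame.
      rng = np.random.default_rng(0)
      frames = [rng.normal(100.0, 1.0, size=(64, 64))]          # pre-event frame
      for step in range(4):
          img = rng.normal(100.0, 1.0, size=(64, 64))
          img[30:34, 10 + 8 * step: 14 + 8 * step] += 50.0      # front moves to the right
          frames.append(img)

      rd = running_difference(frames)
      bd = base_difference(frames)
      # The running difference of frame 3 contains a strong negative ghost; the base
      # difference of the same frame does not.
      print(rd[2].min(), bd[2].min())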

  3. Dispelling Illusions of Reflection: A New Analysis of the 2007 May 19 Coronal "Wave" Event

    NASA Astrophysics Data System (ADS)

    Attrill, Gemma D. R.

    2010-07-01

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of purely an MHD wave interpretation for the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified "reflections" are actually optical illusions and result from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  4. Differential gene expression analysis and network construction of recurrent cardiovascular events.

    PubMed

    Liao, Jiangquan; Chen, Zhong; He, Qinghong; Liu, Yongmei; Wang, Jie

    2016-02-01

    Recurrent cardiovascular events are central to prevention and treatment strategies in patients who have experienced primary cardiovascular events. However, susceptibility to recurrent cardiovascular events varies among patients, so personalized treatment and prognosis prediction are needed. Microarray profiles of samples from patients with acute myocardial infarction (AMI), with or without recurrent cardiovascular events, were obtained from the Gene Expression Omnibus database. Bioinformatics analyses, including Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) analyses, were used to identify genes and pathways specifically associated with recurrent cardiovascular events. A protein-protein interaction (PPI) network was constructed and visualized. A total of 1,329 genes were differentially expressed between the two groups of samples. Among them, 1,023 differentially expressed genes (DEGs; 76.98%) were upregulated in the recurrent cardiovascular events group and 306 DEGs (23.02%) were downregulated. Significantly enriched GO terms for molecular functions were nucleotide binding and nucleic acid binding, for biological processes were signal transduction and regulation of transcription (DNA-dependent), and for cellular components were cytoplasm and nucleus. The most significant pathway in our KEGG analysis was Pathways in cancer (P=0.000336681), and regulation of actin cytoskeleton was also significantly enriched (P=0.00165229). In the PPI network, the significant hub nodes were GNG4, MAPK8, PIK3R2, EP300, CREB1 and PIK3CB. The present study demonstrated the underlying molecular differences between patients with AMI with and without recurrent cardiovascular events, including DEGs, their biological functions, signaling pathways and key genes in the PPI network. With the use of bioinformatics and genomics, these findings can be used to investigate the pathological mechanism and to improve the prevention and treatment of recurrent cardiovascular events.
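
    Enrichment P-values such as those quoted for the KEGG pathways above are typically obtained from a hypergeometric (or equivalent Fisher) test. A sketch of that calculation for a single term; the term size and overlap count are illustrative, and the abstract does not state which enrichment statistic was actually used:

      from scipy.stats import hypergeom

      # Illustrative numbers for one GO term / KEGG pathway.
      background_genes = 20000     # genes measured on the array (assumed)
      term_genes = 350             # background genes annotated to the term (assumed)
      deg_total = 1329             # differentially expressed genes, as reported above
      deg_in_term = 45             # DEGs annotated to the term (assumed)

      # Probability of observing at least deg_in_term annotated genes if the
      # deg_total DEGs were drawn at random from the background.
      p_value = hypergeom.sf(deg_in_term - 1, background_genes, term_genes, deg_total)
      print(f"enrichment p-value = {p_value:.3g}")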

  5. Two Point Autocorrelation Analysis of Auger Highest Energy Events Backtracked in Galactic Magnetic Field

    NASA Astrophysics Data System (ADS)

    Petrov, Yevgeniy

    2009-10-01

    Searches for sources of the highest-energy cosmic rays traditionally have included looking for clusters of event arrival directions on the sky. The smallest cluster is a pair of events falling within some angular window. In contrast to the standard two point (2-pt) autocorrelation analysis, this work takes into account the influence of the galactic magnetic field (GMF). The highest-energy events, those above 50 EeV, collected by the surface detector of the Pierre Auger Observatory between January 1, 2004 and May 31, 2009 are used in the analysis. Assuming protons as primaries, events are backtracked through the BSS/S, BSS/A, ASS/S and ASS/A versions of the Harari-Mollerach-Roulet (HMR) model of the GMF. For each version of the model, a 2-pt autocorrelation analysis is applied to the backtracked events and to 105 isotropic Monte Carlo realizations weighted by the Auger exposure. Scans in energy, angular separation window, and model parameters reveal clustering at different angular scales. Small-angle clustering at 2-3 deg is particularly interesting and is compared across the different field scenarios. The strength of the autocorrelation signal at those angular scales differs between BSS and ASS versions of the HMR model. The BSS versions of the model tend to defocus protons as they arrive at Earth, whereas the ASS versions, on the contrary, tend to focus them.
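
    A sketch of the core 2-pt autocorrelation step described above: count event pairs separated by less than a chosen angle and compare the count with isotropic Monte Carlo skies. Exposure weighting and the magnetic backtracking itself are omitted, and the directions below are random stand-ins for the backtracked Auger events:

      import numpy as np

      def angular_separation(ra1, dec1, ra2, dec2):
          # Great-circle separation (radians) between two directions given in radians.
          cos_sep = (np.sin(dec1) * np.sin(dec2)
                     + np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
          return np.arccos(np.clip(cos_sep, -1.0, 1.0))

      def pairs_within(ra, dec, max_sep_deg):
          # Number of event pairs separated by less than max_sep_deg.
          count = 0
          for i in range(len(ra)):
              for j in range(i + 1, len(ra)):
                  sep = np.degrees(angular_separation(ra[i], dec[i], ra[j], dec[j]))
                  if sep < max_sep_deg:
                      count += 1
          return count

      rng = np.random.default_rng(1)

      def isotropic_sky(n):
          # n directions drawn uniformly on the sphere (no exposure weighting here).
          return rng.uniform(0.0, 2 * np.pi, n), np.arcsin(rng.uniform(-1.0, 1.0, n))

      # 'Data': one isotropic realization standing in for the backtracked events.
      ra_data, dec_data = isotropic_sky(60)
      observed = pairs_within(ra_data, dec_data, max_sep_deg=3.0)

      # Chance probability from isotropic Monte Carlo skies with the same number of events.
      mc_counts = [pairs_within(*isotropic_sky(60), 3.0) for _ in range(100)]
      p_chance = np.mean([c >= observed for c in mc_counts])
      print(f"pairs within 3 deg: {observed}, chance probability ~ {p_chance:.2f}")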

  6. TRACEing the roots: a diagnostic "Tool for Retrospective Analysis of Critical Events".

    PubMed

    Hannawa, Annegret F; Roter, Debra L

    2013-11-01

    The lack of interdisciplinary clarity in the conceptualization of medical errors discourages effective incident analysis, particularly in the event of harmless outcomes. This manuscript integrates communication competence theory, the criterion of reasonability, and a typology of human error into a theoretically grounded Tool for Retrospective Analysis of Critical Events (TRACE) to overcome this limitation. A conceptual matrix synthesizing foundational elements pertinent to critical incident analysis from the medical, legal, bioethical and communication literature was developed. Vetting of the TRACE through focus groups and interviews was conducted to assure utility. The interviews revealed that TRACE may be useful in clinical settings, contributing uniquely to the current literature by framing critical incidents in regard to theory and the primary clinical contexts within which errors may occur. TRACE facilitates a comprehensive, theoretically grounded analysis of clinical performance, and identifies the intrapersonal and interpersonal factors that contribute to critical events. The TRACE may be used as (1) the means for a comprehensive, detailed analysis of human performance across five clinical practice contexts, (2) an objective "fact-check" after a critical event, (3) a heuristic tool to prevent critical incidents, and (4) a data-keeping system for quality improvement. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  7. Brain Network Activation Analysis Utilizing Spatiotemporal Features for Event Related Potentials Classification

    PubMed Central

    Stern, Yaki; Reches, Amit; Geva, Amir B.

    2016-01-01

    The purpose of this study was to introduce an improved tool for automated classification of event-related potentials (ERPs) using spatiotemporally parcellated events incorporated into a functional brain network activation (BNA) analysis. The auditory oddball ERP paradigm was selected to demonstrate and evaluate the improved tool. Methods: The ERPs of each subject were decomposed into major dynamic spatiotemporal events. Then, a set of spatiotemporal events representing the group was generated by aligning and clustering the spatiotemporal events of all individual subjects. The temporal relationship between the common group events generated a network, which is the spatiotemporal reference BNA model. Scores were derived by comparing each subject's spatiotemporal events to the reference BNA model and were then entered into a support vector machine classifier to classify subjects into relevant subgroups. The reliability of the BNA scores (test-retest repeatability using intraclass correlation) and their utility as a classification tool were examined in the context of Target-Novel classification. Results: BNA intraclass correlation values of repeatability ranged between 0.51 and 0.82 for the known ERP components N100, P200, and P300. Classification accuracy was high when the trained data were validated on the same subjects for different visits (AUCs 0.93 and 0.95). The classification accuracy remained high for a test group recorded at a different clinical center with a different recording system (AUCs 0.81, 0.85 for 2 visits). Conclusion: The improved spatiotemporal BNA analysis demonstrates high classification accuracy. The BNA analysis method holds promise as a tool for diagnosis, follow-up and drug development associated with different neurological conditions. PMID:28066224

  8. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    NASA Astrophysics Data System (ADS)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth climate. Among the foremost results reported, network analyses of the synchronicity of extreme events as captured by the so-called event synchronization have been proposed to be powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme flooding. Rooted in the analysis of spike train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically - or even dynamically - interrelated) or not without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in the case of the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in the case of climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctively different type of distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two
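
    A minimal sketch of the event coincidence rate mentioned above as the alternative measure: the fraction of events at one site that are matched by an event at the other site within a tolerance window. The event times and window below are illustrative:

      import numpy as np

      def event_coincidence_rate(times_a, times_b, delta_t):
          # Fraction of events in series A with at least one event in series B
          # inside the symmetric window [t - delta_t, t + delta_t].
          times_b = np.asarray(times_b, dtype=float)
          hits = sum(np.any(np.abs(times_b - t) <= delta_t) for t in times_a)
          return hits / len(times_a) if len(times_a) else np.nan

      # Illustrative event days (days on which daily precipitation exceeded its
      # local 95th percentile at each of two sites).
      site_a = [12, 47, 101, 154, 200, 233]
      site_b = [13, 50, 150, 201, 260]

      print(event_coincidence_rate(site_a, site_b, delta_t=3))   # prints 0.5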

  9. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    ERIC Educational Resources Information Center

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  11. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    USDA-ARS?s Scientific Manuscript database

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...
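
    Statistically, an artificially terminated event is a right-censored duration. One standard way to retain such observations is shown below as a plain-numpy Kaplan-Meier estimate with a censoring flag; this is offered only as an illustration, since the truncated abstract does not state which option the authors recommend:

      import numpy as np

      def kaplan_meier(durations, completed):
          # durations: observed behavior durations; completed: 1 if the behavior ended
          # naturally, 0 if the recording was artificially terminated (right-censored).
          durations = np.asarray(durations, dtype=float)
          completed = np.asarray(completed, dtype=int)
          surv, s = [], 1.0
          for t in np.unique(durations[completed == 1]):
              at_risk = np.sum(durations >= t)
              ended = np.sum((durations == t) & (completed == 1))
              s *= 1.0 - ended / at_risk
              surv.append((t, s))
          return surv

      # Illustrative waveform durations (seconds); 0 marks observations cut off by the
      # experimenter rather than ended by the insect.
      durations = [120, 300, 450, 450, 600, 800, 950]
      completed = [1,   1,   1,   0,   1,   0,   1]
      for t, s in kaplan_meier(durations, completed):
          print(f"P(duration > {t:.0f} s) = {s:.2f}")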

  12. Literacy Events in the Homes, Churches, and Classroom of Bilingual Kindergartners: An Ethnographic Analysis.

    ERIC Educational Resources Information Center

    de Acosta, Martha; Volk, Diana

    This study describes and analyzes the emerging literacy of kindergartners in one bilingual classroom, and focuses on three Spanish-dominant Puerto Rican children in that class. Using a qualitative approach, the study investigated emergent literacy in the classroom, home, and church contexts. The unit of analysis was the literacy event, any…

  13. Effects of Interventions on Relapse to Narcotics Addiction: An Event-History Analysis.

    ERIC Educational Resources Information Center

    Hser, Yih-Ing; And Others

    1995-01-01

    Event-history analysis was applied to the life history data of 581 male narcotics addicts to specify the concurrent, postintervention, and durational effects of social interventions on relapse to narcotics use. Results indicate the advisability of supporting methadone maintenance with other prevention strategies. (SLD)

  14. Rare events analysis of temperature chaos in the Sherrington-Kirkpatrick model

    NASA Astrophysics Data System (ADS)

    Billoire, Alain

    2014-04-01

    We investigate the question of temperature chaos in the Sherrington-Kirkpatrick spin glass model, applying a recently proposed rare events based data analysis method to existing Monte Carlo data. Thanks to this new method, temperature chaos is now observable for this model, even with the limited size systems that can currently be simulated.

  15. Mixed Methods Analysis of Medical Error Event Reports: A Report from the ASIPS Collaborative

    DTIC Science & Technology

    2005-05-01

    correct classification rate of 71.1 percent. In summary, the quantitative analysis sensitized us to look for deeper understanding of how communication ... issues influenced diagnostic testing errors, at what point in the procedure event chain the error occurred, how and why it occurred, and why harm

  16. Pitfalls in Pathways: Some Perspectives on Competing Risks Event History Analysis in Education Research

    ERIC Educational Resources Information Center

    Scott, Marc A.; Kennedy, Benjamin B.

    2005-01-01

    A set of discrete-time methods for competing risks event history analysis is presented. The approach used is accessible to the practitioner and the article describes the strengths, weaknesses, and interpretation of both exploratory and model-based tools. These techniques are applied to the impact of "nontraditional" enrollment features (working,…

  17. Leveraging Researcher Reflexivity to Consider a Classroom Event over Time: Reflexive Discourse Analysis of "What Counts"

    ERIC Educational Resources Information Center

    Anderson, Kate T.

    2017-01-01

    This article presents a reflexive and critical discourse analysis of classroom events that grew out of a cross-cultural partnership with a secondary school teacher in Singapore. I aim to illuminate how differences between researcher and teacher assumptions about what participation in classroom activities should look like came into high relief when…

  18. Uniting Secondary and Postsecondary Education: An Event History Analysis of State Adoption of Dual Enrollment Policies

    ERIC Educational Resources Information Center

    Mokher, Christine G.; McLendon, Michael K.

    2009-01-01

    This study, as the first empirical test of P-16 policy antecedents, reports the findings from an event history analysis of the origins of state dual enrollment policies adopted between 1976 and 2005. First, what characteristics of states are associated with the adoption of these policies? Second, to what extent do conventional theories on policy…

  20. Bayesian Estimation and Testing in Random Effects Meta-analysis of Rare Binary Adverse Events.

    PubMed

    Bai, Ou; Chen, Min; Wang, Xinlei

    Meta-analysis has been widely applied to rare adverse event data because it is very difficult to reliably detect the effect of a treatment on such events in an individual clinical study. However, it is known that standard meta-analysis methods are often biased, especially when the background incidence rate is very low. A recent work by Bhaumik et al. (2012) proposed new moment-based approaches under a natural random effects model, to improve estimation and testing of the treatment effect and the between-study heterogeneity parameter. It has been demonstrated that for rare binary events, their methods have superior performance to commonly-used meta-analysis methods. However, their comparison does not include any Bayesian methods, although Bayesian approaches are a natural and attractive choice under the random-effects model. In this paper, we study a Bayesian hierarchical approach to estimation and testing in meta-analysis of rare binary events using the random effects model in Bhaumik et al. (2012). We develop Bayesian estimators of the treatment effect and the heterogeneity parameter, as well as hypothesis testing methods based on Bayesian model selection procedures. We compare them with the existing methods through simulation. A data example is provided to illustrate the Bayesian approach as well.
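
    A sketch of one possible Bayesian hierarchical formulation for rare binary adverse events (a binomial likelihood per arm, study-specific baseline log-odds, and a normally distributed treatment effect with between-study heterogeneity tau), written against the PyMC v4+ API. The priors, counts, and parameterization are illustrative assumptions, not the model fitted in the paper:

      import numpy as np
      import pymc as pm

      # Illustrative per-study counts: events / arm size in control and treatment arms.
      ctrl_events, ctrl_n = np.array([1, 0, 2, 1]), np.array([200, 150, 300, 250])
      trt_events,  trt_n  = np.array([3, 1, 4, 2]), np.array([210, 160, 290, 260])
      n_studies = len(ctrl_n)

      with pm.Model():
          mu_base = pm.Normal("mu_base", 0.0, 2.0)                   # mean control log-odds
          base = pm.Normal("base", mu_base, 1.0, shape=n_studies)    # study baselines
          theta = pm.Normal("theta", 0.0, 1.0)                        # overall log odds ratio
          tau = pm.HalfNormal("tau", 1.0)                             # between-study heterogeneity
          effect = pm.Normal("effect", theta, tau, shape=n_studies)   # study-level effects

          pm.Binomial("y_ctrl", n=ctrl_n, p=pm.math.invlogit(base), observed=ctrl_events)
          pm.Binomial("y_trt", n=trt_n, p=pm.math.invlogit(base + effect), observed=trt_events)

          trace = pm.sample(1000, tune=1000, chains=2, progressbar=False)

      print(float(trace.posterior["theta"].mean()), float(trace.posterior["tau"].mean()))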

  1. Statistical analysis of mixed recurrent event data with application to cancer survivor study.

    PubMed

    Zhu, Liang; Tong, Xingwei; Zhao, Hui; Sun, Jianguo; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L

    2013-05-20

    Event history studies occur in many fields including economics, medical studies, and social science. In such studies concerning some recurrent events, two types of data have been extensively discussed in the literature. One is recurrent event data that arise if study subjects are monitored or observed continuously. In this case, the observed information provides the times of all occurrences of the recurrent events of interest. The other is panel count data, which occur if the subjects are monitored or observed only periodically. This can happen if the continuous observation is too expensive or not practical, and in this case, only the numbers of occurrences of the events between subsequent observation times are available. In this paper, we discuss a third type of data, which is a mixture of recurrent event and panel count data and for which there exists little literature. For regression analysis of such data, we present a marginal mean model and propose an estimating equation-based approach for estimation of regression parameters. We conduct a simulation study to assess the finite sample performance of the proposed methodology, and the results indicate that it works well for practical situations. Finally, we apply it to a motivating study on childhood cancer survivors. Copyright © 2012 John Wiley & Sons, Ltd.

  2. Statistical analysis of mixed recurrent event data with application to cancer survivor study

    PubMed Central

    Zhu, Liang; Tong, Xingwei; Zhao, Hui; Sun, Jianguo; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L.

    2014-01-01

    Event history studies occur in many fields including economics, medical studies and social science. In such studies concerning some recurrent events, two types of data have been extensively discussed in the literature. One is recurrent event data that arise if study subjects are monitored or observed continuously. In this case, the observed information provides the times of all occurrences of the recurrent events of interest. The other is panel count data, which occur if the subjects are monitored or observed only periodically. This can happen if the continuous observation is too expensive or not practical and in this case, only the numbers of occurrences of the events between subsequent observation times are available. In this paper, we discuss a third type of data, which is a mixture of recurrent event and panel count data and for which there exists little literature. For regression analysis of such data, a marginal mean model is presented and we propose an estimating equation-based approach for estimation of regression parameters. A simulation study is conducted to assess the finite sample performance of the proposed methodology and indicates that it works well for practical situations. Finally it is applied to a motivating study on childhood cancer survivors. PMID:23139023

  3. Adverse events with bismuth salts for Helicobacter pylori eradication: Systematic review and meta-analysis

    PubMed Central

    Ford, Alexander C; Malfertheiner, Peter; Giguère, Monique; Santana, José; Khan, Mostafizur; Moayyedi, Paul

    2008-01-01

    AIM: To assess the safety of bismuth used in Helicobacter pylori (H pylori) eradication therapy regimens. METHODS: We conducted a systematic review and meta-analysis. MEDLINE and EMBASE were searched (up to October 2007) to identify randomised controlled trials comparing bismuth with placebo or no treatment, or bismuth salts in combination with antibiotics as part of eradication therapy with the same dose and duration of antibiotics alone or, in combination, with acid suppression. Total numbers of adverse events were recorded. Data were pooled and expressed as relative risks with 95% confidence intervals (CI). RESULTS: We identified 35 randomised controlled trials containing 4763 patients. There were no serious adverse events occurring with bismuth therapy. There was no statistically significant difference detected in total adverse events with bismuth [relative risk (RR) = 1.01; 95% CI: 0.87-1.16], specific individual adverse events, with the exception of dark stools (RR = 5.06; 95% CI: 1.59-16.12), or adverse events leading to withdrawal of therapy (RR = 0.86; 95% CI: 0.54-1.37). CONCLUSION: Bismuth for the treatment of H pylori is safe and well-tolerated. The only adverse event occurring significantly more commonly was dark stools. PMID:19109870
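
    Pooled relative risks like those quoted above can be reproduced in spirit with fixed-effect inverse-variance pooling of log relative risks; the 2x2 counts below are illustrative, and the review may have used a different pooling model:

      import numpy as np

      def pooled_relative_risk(trials):
          # Fixed-effect inverse-variance pooling of log relative risks.
          # trials: (events_bismuth, n_bismuth, events_control, n_control) per trial.
          log_rr, weights = [], []
          for a, n1, c, n2 in trials:
              log_rr.append(np.log((a / n1) / (c / n2)))
              var = 1 / a - 1 / n1 + 1 / c - 1 / n2      # variance of log(RR)
              weights.append(1 / var)
          log_rr, weights = np.array(log_rr), np.array(weights)
          pooled = np.sum(weights * log_rr) / np.sum(weights)
          se = 1 / np.sqrt(np.sum(weights))
          return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

      # Illustrative counts (adverse events / patients, bismuth arm vs. comparator arm).
      trials = [(12, 80, 10, 78), (20, 150, 22, 148), (5, 60, 6, 62), (15, 120, 13, 118)]
      rr, lo, hi = pooled_relative_risk(trials)
      print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")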

  4. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    PubMed

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

    Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trials patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data was obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight different age groups to support cross-age group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years-old group was 31.41%, approximately 1.51 times higher (P=.04) than the young adult group (20-29 years) at 20.76%. The second-highest group is the 50-59 years-old group with an incidence rate of 30.09%, significantly higher (P<.001) when compared with the lowest incidence in the 20-29 years-old group. The adverse event diversity also increased with increase in patient age. Clinical studies that recruited older patients (older than 40 years) were more
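
    A sketch of the kind of cross-group comparison described above, with the 0-9 year group compared against the 20-29 year reference group. The abstract reports t tests on summarized data; the sketch below substitutes a two-proportion z-test on counts chosen to match the quoted percentages (the denominators are assumed):

      from statsmodels.stats.proportion import proportions_ztest

      # Illustrative counts: participants with adverse events / total participants.
      affected_children, total_children = 31410, 100000   # 0-9 years  (~31.41%)
      affected_ref, total_ref           = 20760, 100000   # 20-29 years (~20.76%)

      stat, p_value = proportions_ztest(
          count=[affected_children, affected_ref],
          nobs=[total_children, total_ref],
      )
      print(f"z = {stat:.1f}, p = {p_value:.3g}")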

  5. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data

    PubMed Central

    Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-01-01

    Background Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. Objectives The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trials patients and participants. Methods Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data was obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight different age groups to support cross-age group comparison. Results In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years-old group was 31.41%, approximately 1.51 times higher (P=.04) than the young adult group (20-29 years) at 20.76%. The second-highest group is the 50-59 years-old group with an incidence rate of 30.09%, significantly higher (P<.001) when compared with the lowest incidence in the 20-29 years-old group. The adverse event diversity also increased with increase in patient age. Clinical studies that recruited older

  6. An Analysis of the Muon-Like Events as the Fully Contained Events in the Super-Kamiokande through the Computer Numerical Experiment

    NASA Astrophysics Data System (ADS)

    Konishi, E.; Minorikawa, Y.; Galkin, V.I.; Ishiwata, M.; Nakamura, I.; Takahashi, N.; Kato, M.; Misaki, A.

    We analyze the muon-like events (single ring images) in the Super-Kamiokande (SK) by the Computer Numerical Experiment. Assuming the neutrino oscillation parameters obtained by the SK, which characterize the type of the neutrino oscillation, we reproduce the zenith angle distribution of the muon-like events and compare it with the real distribution obtained by the SK. We also carry out the L/E analysis of the muon-like events by the Computer Numerical Experiment and compare it with that by the SK.
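
    The L/E analysis referred to above rests on the two-flavor muon-neutrino survival probability, P = 1 - sin^2(2θ) sin^2(1.27 Δm^2 L/E). A short worked sketch; the oscillation parameter values are typical atmospheric values used only for illustration, not those assumed by the authors:

      import numpy as np

      def muon_survival_probability(l_over_e, sin2_2theta=1.0, delta_m2=2.4e-3):
          # Two-flavor survival probability P(nu_mu -> nu_mu);
          # l_over_e in km/GeV, delta_m2 in eV^2.
          return 1.0 - sin2_2theta * np.sin(1.27 * delta_m2 * l_over_e) ** 2

      for l_over_e in (10, 100, 500, 1000, 5000):
          p = muon_survival_probability(l_over_e)
          print(f"L/E = {l_over_e:5d} km/GeV -> P(survival) = {p:.2f}")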

  7. The Time-Scaling Issue in the Frequency Analysis of Multidimensional Extreme Events

    NASA Astrophysics Data System (ADS)

    Gonzalez, J.; Valdes, J. B.

    2004-05-01

    Extreme events such as droughts appear as periods of time during which water availability departs exceptionally from normal conditions. Several characteristics of this departure from normality are important in analyzing drought recurrence frequency (e.g., magnitude, maximum intensity, duration, severity). In this kind of problem, the time scale applied in the analysis may become an issue when conventional frequency analysis approaches, generally based on run theory, are applied. Usually only one or two main event characteristics are used, and when the time scale changes by orders of magnitude the derived frequency changes significantly, so the events are poorly characterized. For example, short time scales emphasize characteristics such as intensity, whereas long time scales emphasize magnitude. That variability may be overcome using a new approach in which events are treated as multidimensional in time. This is studied in this work by comparing analyses using the conventional approach and the new multidimensional approach, over time scales ranging from daily to decadal. The main outcome of the study is the improved performance of the multidimensional technique, with which the frequency remains well characterized even across time scales differing by orders of magnitude. The ability to implicitly incorporate all event features in the time distribution makes it possible to characterize the events independently of the time scale, provided the scale does not hide the extreme features.

  8. Synoptic Analysis of 2000-2005 Significant Snowfall Events on Mt. Kilimanjaro

    NASA Astrophysics Data System (ADS)

    Chan, R. Y.; Ammann, C. M.; Yin, J. H.

    2005-12-01

    Ice cores recovered from Mt. Kilimanjaro provide evidence of significant climate changes in the East African region over the past 10,000 years. However, the atmospheric processes that lead to snowfall on Kilimanjaro are poorly understood. Earlier studies have suggested that East African climate is dominated by the seasonal shift of the tropical precipitation bands, yet the key factors causing interannual precipitation variability remain unclear, particularly for the long rains season (March-May). To advance the understanding of modern East African climate, this study used data from a new station on top of Mt. Kilimanjaro and put them into a regional atmospheric circulation context. First, the in situ data were compared to global analysis products for testing their representation of the East African region. Overall, these data showed similar activity during corresponding days of snowfall on Kilimanjaro, indicating that snowfall events are likely related to regional precipitation. Second, the use of various global (re-) analysis products allowed the examination of commonalities between individual precipitation events on Kilimanjaro, and helped to identify the key precipitation-causing processes. Results showed distinct seasonality in precipitation, propagating from west to east during the long rains and east to west during the short rains (October-December). In addition, high magnitude snowfall events occurred under the conditions of low wind speed and high humidity. Therefore, high magnitude events may be a result of local convection, and these events may represent certain atmospheric conditions favorable for snowfall accumulation, but their representation of overall regional rainfall totals remains uncertain.

  9. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
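
    A generic sketch of the clustering-based anomaly detection idea described above, using DBSCAN from scikit-learn and treating points left outside every cluster (label -1) as outliers; this is a stand-in illustration, not the project's actual algorithm, and the spatio-temporal grouping of outliers into events is omitted:

      import numpy as np
      from sklearn.cluster import DBSCAN

      rng = np.random.default_rng(42)

      # Illustrative 'satellite' samples: two normal modes plus a few anomalous points.
      normal = np.vstack([
          rng.normal([0.0, 0.0], 0.3, size=(300, 2)),
          rng.normal([3.0, 3.0], 0.3, size=(300, 2)),
      ])
      anomalies = rng.uniform(-2.0, 6.0, size=(8, 2))
      data = np.vstack([normal, anomalies])

      # Points that DBSCAN cannot assign to any dense cluster get the label -1.
      labels = DBSCAN(eps=0.4, min_samples=10).fit_predict(data)
      outliers = labels == -1
      print(f"flagged {outliers.sum()} of {len(data)} samples as anomalous")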

  10. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.

    2009-04-01

    This is a report for task one of the tail event analysis project for BPA. A tail event refers to the situation in a power system in which unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, so that the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; such events are therefore referred to as tail events. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) A large mismatch between generation and load can be caused by load forecast error, wind forecast error, generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and resources should be procured accordingly to meet both requirements. The resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and by wind can both lead to serious issues; (4) A look-ahead tool that evaluates the system balancing requirement during real-time operations and compares it with available system resources should be very helpful to system operators in anticipating similar events and planning ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.

  11. Is the efficacy of antidepressants in panic disorder mediated by adverse events? A mediational analysis.

    PubMed

    Bighelli, Irene; Borghesani, Anna; Barbui, Corrado

    2017-01-01

    It has been hypothesised that the perception of adverse events in placebo-controlled antidepressant clinical trials may induce patients to conclude that they have been randomized to the active arm of the trial, leading to the breaking of blind. This may enhance the expectancies for improvement and the therapeutic response. The main objective of this study is to test the hypothesis that the efficacy of antidepressants in panic disorder is mediated by the perception of adverse events. The present analysis is based on a systematic review of published and unpublished randomised trials comparing antidepressants with placebo for panic disorder. The Baron and Kenny approach was applied to investigate the mediational role of adverse events in the relationship between antidepressants treatment and efficacy. Fourteen placebo-controlled antidepressants trials were included in the analysis. We found that: (a) antidepressants treatment was significantly associated with better treatment response (β = 0.127, 95% CI 0.04 to 0.21, p = 0.003); (b) antidepressants treatment was not associated with adverse events (β = 0.094, 95% CI -0.05 to 0.24, p = 0.221); (c) adverse events were negatively associated with treatment response (β = 0.035, 95% CI -0.06 to -0.05, p = 0.022). Finally, after adjustment for adverse events, the relationship between antidepressants treatment and treatment response remained statistically significant (β = 0.122, 95% CI 0.01 to 0.23, p = 0.039). These findings do not support the hypothesis that the perception of adverse events in placebo-controlled antidepressant clinical trials may lead to the breaking of blind and to an artificial inflation of the efficacy measures. Based on these results, we argue that the moderate therapeutic effect of antidepressants in individuals with panic disorder is not an artefact, therefore reflecting a genuine effect that doctors can expect to replicate under real-world conditions.
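
    A sketch of the Baron and Kenny steps listed above ((a) treatment on response, (b) treatment on adverse events, (c) adverse events on response, then treatment on response adjusted for adverse events), run with statsmodels OLS; the data here are simulated for illustration only, whereas the actual analysis was performed on data from the 14 placebo-controlled trials:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)

      # Simulated trial-arm data: treatment indicator, adverse-event rate, response rate.
      n = 28
      treatment = rng.integers(0, 2, n)
      adverse = 0.20 + 0.00 * treatment + rng.normal(0, 0.05, n)       # no effect on AEs
      response = 0.30 + 0.12 * treatment - 0.10 * adverse + rng.normal(0, 0.05, n)
      df = pd.DataFrame({"treatment": treatment, "adverse": adverse, "response": response})

      step_a = smf.ols("response ~ treatment", df).fit()               # total effect
      step_b = smf.ols("adverse ~ treatment", df).fit()                # treatment -> mediator
      step_c = smf.ols("response ~ adverse", df).fit()                 # mediator -> outcome
      step_d = smf.ols("response ~ treatment + adverse", df).fit()     # adjusted direct effect

      for name, model in [("a", step_a), ("b", step_b), ("c", step_c), ("adjusted", step_d)]:
          print(name, dict(model.params.round(3)))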

  12. A cross-sectional analysis of pharmaceutical industry-funded events for health professionals in Australia.

    PubMed

    Fabbri, Alice; Grundy, Quinn; Mintzes, Barbara; Swandari, Swestika; Moynihan, Ray; Walkom, Emily; Bero, Lisa A

    2017-06-30

    To analyse patterns and characteristics of pharmaceutical industry sponsorship of events for Australian health professionals and to understand the implications of recent changes in transparency provisions that no longer require reporting of payments for food and beverages. Cross-sectional analysis. 301 publicly available company transparency reports downloaded from the website of Medicines Australia, the pharmaceutical industry trade association, covering the period from October 2011 to September 2015. Forty-two companies sponsored 116 845 events for health professionals, on average 608 per week with 30 attendees per event. Events typically included a broad range of health professionals: 82.0% included medical doctors, including specialists and primary care doctors, and 38.3% trainees. Oncology, surgery and endocrinology were the most frequent clinical areas of focus. Most events (64.2%) were held in a clinical setting. The median cost per event was $A263 (IQR $A153-1195) and over 90% included food and beverages. Over this 4-year period, industry-sponsored events were widespread and pharmaceutical companies maintained a high frequency of contact with health professionals. Most events were held in clinical settings, suggesting a pervasive commercial presence in everyday clinical practice. Food and beverages, known to be associated with changes to prescribing practice, were almost always provided. New Australian transparency provisions explicitly exclude meals from the reporting requirements; thus, a large proportion of potentially influential payments from pharmaceutical companies to health professionals will disappear from public view. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Root cause analysis of serious adverse events among older patients in the Veterans Health Administration.

    PubMed

    Lee, Alexandra; Mills, Peter D; Neily, Julia; Hemphill, Robin R

    2014-06-01

    Preventable adverse events are more likely to occur among older patients because of the clinical complexity of their care. The Veterans Health Administration (VHA) National Center for Patient Safety (NCPS) stores data about serious adverse events when a root cause analysis (RCA) has been performed. A primary objective of this study was to describe the types of adverse events occurring among older patients (age ≥ 65 years) in Department of Veterans Affairs (VA) hospitals. Secondary objectives were to determine the underlying reasons for the occurrence of these events and report on effective action plans that have been implemented in VA hospitals. In a retrospective, cross-sectional review, RCA reports were reviewed and outcomes reported using descriptive statistics for all VA hospitals that conducted an RCA for a serious geriatric adverse event from January 2010 to January 2011 that resulted in sustained injury or death. The search produced 325 RCA reports on VA patients (age ≥ 65 years). Falls (34.8%), delays in diagnosis and/or treatment (11.7%), unexpected death (9.9%), and medication errors (9.0%) were the most commonly reported adverse events among older VA patients. Communication was the most common underlying reason for these events, representing 43.9% of reported root causes. Approximately 40% of implemented action plans were judged by local staff to be effective. The RCA process identified falls and communication as important themes in serious adverse events. Concrete actions, such as process standardization and changes to communication, were reported by teams to yield some improvement. However, fewer than half of the action plans were reported to be effective. Further research is needed to guide development and implementation of effective action plans.

  14. Impact of adverse events on prescribing warfarin in patients with atrial fibrillation: matched pair analysis.

    PubMed

    Choudhry, Niteesh K; Anderson, Geoffrey M; Laupacis, Andreas; Ross-Degnan, Dennis; Normand, Sharon-Lise T; Soumerai, Stephen B

    2006-01-21

    To quantify the influence of physicians' experiences of adverse events in patients with atrial fibrillation who were taking warfarin. Population based, matched pair before and after analysis. Database study in Ontario, Canada. The physicians of patients with atrial fibrillation admitted to hospital for adverse events (major haemorrhage while taking warfarin and thromboembolic strokes while not taking warfarin). Pairs of other patients with atrial fibrillation treated by the same physicians were selected. Odds of receiving warfarin by matched pairs of a given physician's patients (one treated after and one treated before the event) were compared, with adjustment for stroke and bleeding risk factors that might also influence warfarin use. The odds of prescriptions for angiotensin converting enzyme (ACE) inhibitor before and after the event was assessed as a neutral control. For the 530 physicians who had a patient with an adverse bleeding event (exposure) and who treated other patients with atrial fibrillation during the 90 days before and the 90 days after the exposure, the odds of prescribing warfarin was 21% lower for patients after the exposure (adjusted odds ratio 0.79, 95% confidence interval 0.62 to 1.00). Greater reductions in warfarin prescribing were found in analyses with patients for whom more time had elapsed between the physician's exposure and the patient's treatment. There were no significant changes in warfarin prescribing after a physician had a patient who had a stroke while not on warfarin or in the prescribing of ACE inhibitors by physicians who had patients with either bleeding events or strokes. A physician's experience with bleeding events associated with warfarin can influence prescribing warfarin. Adverse events that are possibly associated with underuse of warfarin may not affect subsequent prescribing.
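
    The estimate above hinges on discordant matched pairs: for each physician, one patient treated after the adverse event is paired with one treated before, and only pairs in which exactly one member received warfarin carry information. A sketch of the unadjusted conditional (McNemar-type) odds ratio; the counts are invented so that the point estimate lands near the adjusted 0.79 reported above, and the study's adjustment for stroke and bleeding risk factors is omitted:

      import numpy as np

      # Discordant matched pairs (illustrative counts):
      #   before_only = pairs where only the 'before' patient received warfarin
      #   after_only  = pairs where only the 'after' patient received warfarin
      before_only, after_only = 96, 76

      odds_ratio = after_only / before_only                 # conditional OR, after vs. before
      se_log_or = np.sqrt(1 / before_only + 1 / after_only)
      ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
      print(f"matched-pair OR = {odds_ratio:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")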

  15. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
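
    A sketch of the inlier/outlier segmentation machinery described above: summarize subsequences of the feature time series with simple statistical models, build an affinity matrix from pairwise similarities, and read outlier scores off the dominant eigenvector. Here each window is summarized only by its mean and standard deviation on synthetic 1-D data, so this illustrates the mechanics rather than the authors' exact framework:

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic 1-D 'audio feature' series: quiet background with two short bursts.
      series = rng.normal(0.0, 1.0, 600)
      series[200:220] += 6.0
      series[430:450] -= 6.0

      # Summarize non-overlapping windows by a simple statistical model (mean, std).
      win = 20
      windows = series.reshape(-1, win)
      stats = np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

      # Affinity matrix from pairwise distances between the window summaries.
      dist = np.linalg.norm(stats[:, None, :] - stats[None, :, :], axis=-1)
      affinity = np.exp(-dist ** 2 / (2 * np.median(dist) ** 2))

      # The dominant eigenvector of the affinity matrix gives low weight to windows
      # that are dissimilar from the background process; use that as an outlier score.
      _, eigvecs = np.linalg.eigh(affinity)
      dominant = np.abs(eigvecs[:, -1])
      outlier_score = 1.0 - dominant / dominant.max()
      print("most outlying windows:", np.argsort(outlier_score)[-3:])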

  16. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    NASA Astrophysics Data System (ADS)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated over a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we use the concept of a rain event. Indeed, the discrete and intermittent nature of rain processes makes some features inadequate when they are defined over a fixed duration: long integration times (hour, day) mix rainy and clear-air periods in the same sample, while small integration times (seconds, minutes) lead to noisy data with a great sensitivity to detector characteristics. Analyzing whole rain events instead of individual short samples of fixed duration helps clarify relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses part of the intra-event variability due to measurement uncertainties and allows the focus to be placed on physical processes. An algorithm based on a genetic algorithm (GA) and self-organizing maps (SOM) is developed to obtain a parsimonious characterization of rain events using a minimal set of variables. The use of a self-organizing map is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space while preserving the initial space topology as much as possible, in an unsupervised way. The obtained SOM reveals the dependencies between variables and consequently allows redundant variables to be removed, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analyzed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature. This confirms

  17. Analysis of the longitudinal dependence of the downstream fluence of large solar energetic proton events

    NASA Astrophysics Data System (ADS)

    Pacheco, Daniel; Sanahuja, Blai; Aran, Angels; Agueda, Neus; Jiggens, Piers

    2016-07-01

    Simulations of the solar energetic particle (SEP) intensity-time profiles are needed to estimate the radiation environment for interplanetary missions. At present, the physics-based models applied for such a purpose, and including a moving source of particles, are not able to model the portion of the SEP intensity enhancement occurring after the coronal/interplanetary shock crossing by the observer (a.k.a. the downstream region). This is the case, for example, of the shock-and-particle model used to build the SOLPENCO2 code. SOLPENCO2 provides the statistical modelling tool developed in the ESA/SEPEM project for interplanetary missions with synthetic SEP event simulations for virtual spacecraft located at heliocentric distances between 0.2 AU and 1.6 AU (http://dev.sepem.oma.be/). In this work we present an analysis of 168 individual SEP events observed at 1 AU from 1988 to 2013. We identify the solar eruptive phenomena associated with these SEP events, as well as the in-situ passage of interplanetary shocks. For each event, we quantify the amount of fluence accounted in the downstream region, i.e. after the passage of the shock, at the 11 SEPEM reference energy channels (i.e., from 5 to 300 MeV protons). First, from the subset of SEP events simultaneously detected by near Earth spacecraft (using SEPEM reference data) and by one of the STEREO spacecraft, we select those events for which the downstream region can be clearly determined. From the 8 selected multi-spacecraft events, we find that the western observations of each event have a smaller downstream contribution than their eastern counterparts, and that the downstream-to-total fluence ratio of these events decreases as a function of the energy. Hence, there is a variation of the downstream fluence with the heliolongitude in SEP events. Based on this result, we study the variation of the downstream-to-total fluence ratios of the total set of individual events. We confirm the eastern-to-western decrease of the

  18. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
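
    The simulation loop at the heart of any such tool is a time-ordered event queue processed until empty. A minimal Python sketch using heapq; it only loosely mirrors the invocation/effect/delay vocabulary of the patented tool:

      import heapq

      class DiscreteEventSimulator:
          # Tiny discrete-event engine: (time, sequence, action) entries are processed
          # in time order until the event queue is empty.
          def __init__(self):
              self.queue = []
              self.now = 0.0
              self._seq = 0

          def schedule(self, delay, action):
              heapq.heappush(self.queue, (self.now + delay, self._seq, action))
              self._seq += 1

          def run(self):
              while self.queue:
                  self.now, _, action = heapq.heappop(self.queue)
                  action(self)

      # Continuous behavior modeled discretely: a tank level sampled every 5 s, with an
      # 'effect' (valve closes) scheduled with a delay once a threshold is crossed.
      level = {"value": 0.0}

      def fill_step(sim):
          level["value"] += 1.0
          print(f"t={sim.now:4.1f}  level={level['value']:.1f}")
          if level["value"] >= 3.0:
              sim.schedule(2.0, lambda s: print(f"t={s.now:4.1f}  valve closed"))
          else:
              sim.schedule(5.0, fill_step)

      sim = DiscreteEventSimulator()
      sim.schedule(0.0, fill_step)
      sim.run()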

  19. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest, a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  20. 76 FR 67764 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...-xxxx, Revision 0, "Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and... at (301) 492-3446. FOR FURTHER INFORMATION CONTACT: Song-Hua Shen, Division of Risk Analysis,...

  1. Peri-event cross-correlation over time for analysis of interactions in neuronal firing.

    PubMed

    Paiva, António R C; Park, Il; Sanchez, Justin C; Príncipe, José C

    2008-01-01

    Several methods have been described in the literature to verify the presence of couplings between neurons in the brain. In this paper we introduce the peri-event cross-correlation over time (PECCOT) to describe the interaction between two neurons as a function of the event onset. Instead of averaging over time, the PECCOT averages the cross-correlation over instances of the event. As a consequence, the PECCOT is able to characterize the interactions among neurons over time with high temporal resolution. To illustrate the method, the PECCOT is applied to a simulated dataset and to the analysis of synchrony in recordings of a rat performing a go/no go behavioral lever press task. We verify the presence of synchrony before the lever press time and its suppression afterwards.
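
    A sketch of the peri-event averaging idea behind the PECCOT: bin the two spike trains in a window locked to each event onset, correlate them at a range of lags, and average over event instances rather than over the whole recording. Binning, normalization, and the exact estimator are simplified assumptions here, and the spike times are simulated:

      import numpy as np

      def peri_event_crosscorr(spikes_a, spikes_b, event_times,
                               window=1.0, bin_size=0.01, max_lag_bins=20):
          # Cross-correlation between the two binned spike trains computed separately
          # around each event onset and then averaged over events (not over time).
          edges = np.arange(-window, window + bin_size, bin_size)
          lags = np.arange(-max_lag_bins, max_lag_bins + 1)
          per_event = []
          for t0 in event_times:
              a, _ = np.histogram(np.asarray(spikes_a) - t0, bins=edges)
              b, _ = np.histogram(np.asarray(spikes_b) - t0, bins=edges)
              a, b = a - a.mean(), b - b.mean()
              denom = len(a) * a.std() * b.std()
              if denom == 0:
                  continue
              # Positive lag means neuron B follows neuron A (circular shift; fine for a sketch).
              per_event.append([np.sum(a * np.roll(b, -lag)) / denom for lag in lags])
          return lags * bin_size, np.mean(per_event, axis=0)

      # Simulated data: neuron B tends to fire ~20 ms after neuron A around each lever press.
      rng = np.random.default_rng(7)
      events = np.arange(5.0, 50.0, 5.0)
      spikes_a = np.sort(np.concatenate([events - 0.10 + rng.normal(0, 0.01, len(events)),
                                         rng.uniform(0, 55, 200)]))
      spikes_b = np.sort(np.concatenate([events - 0.08 + rng.normal(0, 0.01, len(events)),
                                         rng.uniform(0, 55, 200)]))

      lag_s, cc = peri_event_crosscorr(spikes_a, spikes_b, events)
      print(f"peak correlation at lag {lag_s[np.argmax(cc)] * 1000:.0f} ms")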

  2. The use of significant event analysis and personal development plans in developing CPD: a pilot study.

    PubMed

    Wright, P D; Franklin, C D

    2007-07-14

    This paper describes the work undertaken by the Postgraduate Primary Care Trust (PCT) Dental Tutor for South Yorkshire and East Midlands Regional Postgraduate Dental Education Office during the first year of a two-year pilot. The tutor has special responsibility for facilitating the writing of Personal Development Plans (PDPs) and the introduction of Significant Event Analysis to the 202 general dental practitioners in the four Sheffield PCTs. Data were collected on significant events and the educational needs highlighted as a result. A hands-on workshop format was used in small practice groups and 45% of Sheffield general dental practitioners now have written PDPs compared with a 16% national average. A library of significant events has also been collated from the data collected.

  3. Increased risk of cerebrovascular events in patients with cancer treated with bevacizumab: a meta-analysis.

    PubMed

    Zuo, Pei-Yuan; Chen, Xing-Lin; Liu, Yu-Wei; Xiao, Chang-Liang; Liu, Cheng-Yun

    2014-01-01

    Arterial ischemia and hemorrhage are associated with bevacizumab, an inhibitor of vascular endothelial growth factor that is widely used to treat many types of cancers. As specific types of arterial ischemia and hemorrhage, cerebrovascular events such as central nervous system (CNS) ischemic events and CNS hemorrhage are serious adverse events. However, increased cerebrovascular events have not been uniformly reported by previous studies. New randomized controlled trials (RCTs) have been reported in recent years and we therefore conducted an up-to-date meta-analysis of RCTs to fully characterize the risk of cerebrovascular events with bevacizumab. We searched the databases of PubMed, Web of Science, and the American Society of Clinical Oncology conferences to identify relevant clinical trials up to February 2014. Eligible studies included prospective RCTs that directly compared patients with cancer treated with and without bevacizumab. A total of 12,917 patients from 17 RCTs were included in our analysis. Patients treated with bevacizumab had a significantly increased risk of cerebrovascular events compared with patients treated with control medication, with a relative risk of 3.28 (95% CI, 1.97-5.48). The risks of CNS ischemic events and CNS hemorrhage were increased compared with control, with RRs of 3.22 (95% CI, 1.71-6.07) and 3.09 (95% CI, 1.36-6.99), respectively. Risk varied with the bevacizumab dose, with RRs of 3.97 (95% CI, 2.15-7.36) and 1.96 (95% CI, 0.76-5.06) at 5 and 2.5 mg/kg/week, respectively. Higher risks were observed in patients with metastatic colorectal cancer (RR, 6.42; 95% CI, 1.76-35.57), and no significant risk was observed in other types of tumors. In conclusion, the addition of bevacizumab significantly increased the risk of cerebrovascular events compared with controls, including CNS ischemic events and CNS hemorrhage. The risk may vary with bevacizumab dose and tumor type.
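
    The pooled relative risks quoted above follow the standard meta-analytic recipe of inverse-variance weighting of per-trial log relative risks. The sketch below illustrates that recipe with made-up 2x2 counts; these are not the data behind the reported RR of 3.28, and the original analysis may have used a different (e.g. random-effects) model.

        import math

        # Hypothetical per-trial counts: (events_bev, n_bev, events_ctrl, n_ctrl).
        trials = [
            (12, 400, 4, 410),
            (9,  350, 3, 340),
            (15, 600, 5, 590),
        ]

        log_rrs, weights = [], []
        for e1, n1, e0, n0 in trials:
            rr = (e1 / n1) / (e0 / n0)                   # per-trial relative risk
            var = 1/e1 - 1/n1 + 1/e0 - 1/n0              # variance of log(RR)
            log_rrs.append(math.log(rr))
            weights.append(1 / var)                      # inverse-variance weight

        pooled_log = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
        se = math.sqrt(1 / sum(weights))
        rr, lo, hi = (math.exp(x) for x in (pooled_log, pooled_log - 1.96 * se, pooled_log + 1.96 * se))
        print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")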

  4. Increased Risk of Cerebrovascular Events in Patients with Cancer Treated with Bevacizumab: A Meta-Analysis

    PubMed Central

    Liu, Yu-Wei; Xiao, Chang-Liang; Liu, Cheng-Yun

    2014-01-01

    Arterial ischemia and hemorrhage are associated with bevacizumab, an inhibitor of vascular endothelial growth factor that is widely used to treat many types of cancers. As specific types of arterial ischemia and hemorrhage, cerebrovascular events such as central nervous system (CNS) ischemic events and CNS hemorrhage are serious adverse events. However, increased cerebrovascular events have not been uniformly reported by previous studies. New randomized controlled trials (RCTs) have been reported in recent years and we therefore conducted an up-to-date meta-analysis of RCTs to fully characterize the risk of cerebrovascular events with bevacizumab. We searched the databases of PubMed, Web of Science, and the American Society of Clinical Oncology conferences to identify relevant clinical trials up to February 2014. Eligible studies included prospective RCTs that directly compared patients with cancer treated with and without bevacizumab. A total of 12,917 patients from 17 RCTs were included in our analysis. Patients treated with bevacizumab had a significantly increased risk of cerebrovascular events compared with patients treated with control medication, with a relative risk of 3.28 (95% CI, 1.97–5.48). The risks of CNS ischemic events and CNS hemorrhage were increased compared with control, with RRs of 3.22 (95% CI, 1.71–6.07) and 3.09 (95% CI, 1.36–6.99), respectively. Risk varied with the bevacizumab dose, with RRs of 3.97 (95% CI, 2.15–7.36) and 1.96 (95% CI, 0.76–5.06) at 5 and 2.5 mg/kg/week, respectively. Higher risks were observed in patients with metastatic colorectal cancer (RR, 6.42; 95% CI, 1.76–35.57), and no significant risk was observed in other types of tumors. In conclusion, the addition of bevacizumab significantly increased the risk of cerebrovascular events compared with controls, including CNS ischemic events and CNS hemorrhage. The risk may vary with bevacizumab dose and tumor type. PMID:25025282

  5. Root Cause Analysis of Ambulatory Adverse Drug Events That Present to the Emergency Department.

    PubMed

    Gertler, Sarah A; Coralic, Zlatan; López, Andrea; Stein, John C; Sarkar, Urmimala

    2016-09-01

    Adverse drug events (ADEs) among patients self-administering medications in home/community settings are a common cause of emergency department (ED) visits, but the causes of these ambulatory ADEs remain unclear. Root cause analysis, rarely applied in outpatient settings, may reveal the underlying factors that contribute to adverse events. To elicit patient and provider perspectives on ambulatory ADEs and apply root cause analysis methodology to identify cross-cutting themes among these events. Emergency department clinical pharmacists screened, identified, and enrolled a convenience sample of adult patients 18 years or older who presented to a single, urban, academic ED with symptoms or diagnoses consistent with suspected ADEs. Semistructured phone interviews were conducted with the patients and their providers. We conducted a qualitative analysis. We applied a prespecified version of the injury prevention framework (deductive coding), identifying themes relating to the agent (drug), host (patient), and environment (social and health systems). These themes were used to construct a root cause analysis for each ADE. From 18 interviews overall, we identified the following themes within the injury prevention framework. Agent factors included high-risk drugs, narrow therapeutic indices, and uncommon severe effects. Host factors included patient capacity or understanding of how to use medications, awareness of side effects, mistrust of the medical system, patients with multiple comorbidities, difficult risk-benefit assessments, and high health-care users. Environmental factors included lack of social support, and health systems issues included access to care, encompassing medication availability, access to specialists, and a lack of continuity and communication among prescribing physicians. Root cause analysis revealed multiple underlying factors relating to agent, host, and environment for each event. Patient and physician perspectives can inform a root cause analysis

  6. The logic of surveillance guidelines: an analysis of vaccine adverse event reports from an ontological perspective.

    PubMed

    Courtot, Mélanie; Brinkman, Ryan R; Ruttenberg, Alan

    2014-01-01

    When increased rates of adverse events following immunization are detected, regulatory action can be taken by public health agencies. However, to be interpreted, reports of adverse events must be encoded in a consistent way. Regulatory agencies rely on guidelines to help determine the diagnosis of the adverse events. Manual application of these guidelines is expensive, time consuming, and open to logical errors. Representing these guidelines in a format amenable to automated processing can make this process more efficient. Using the Brighton anaphylaxis case definition, we show that existing clinical guidelines used as standards in pharmacovigilance can be logically encoded using a formal representation such as the Adverse Event Reporting Ontology we developed. We validated the classification of vaccine adverse event reports using the ontology against existing rule-based systems and a manually curated subset of the Vaccine Adverse Event Reporting System. However, we encountered a number of critical issues in the formulation and application of the clinical guidelines. We report these issues and the steps being taken to address them in current surveillance systems, and in the terminological standards in use. By standardizing and improving the reporting process, we were able to automate diagnosis confirmation. By allowing medical experts to prioritize reports, such a system can accelerate the identification of adverse reactions to vaccines and the response of regulatory agencies. This approach of combining ontology and semantic technologies can be used to improve other areas of vaccine adverse event report analysis and should inform both the design of clinical guidelines and how they are used in the future. Sufficient material to reproduce our results is available, including documentation, ontology, code and datasets, at http://purl.obolibrary.org/obo/aero.

  7. Shared parameter models for the joint analysis of longitudinal data and event times.

    PubMed

    Vonesh, Edward F; Greene, Tom; Schluchter, Mark D

    2006-01-15

    Longitudinal studies often gather joint information on time to some event (survival analysis, time to dropout) and serial outcome measures (repeated measures, growth curves). Depending on the purpose of the study, one may wish to estimate and compare serial trends over time while accounting for possibly non-ignorable dropout or one may wish to investigate any associations that may exist between the event time of interest and various longitudinal trends. In this paper, we consider a class of random-effects models known as shared parameter models that are particularly useful for jointly analysing such data; namely repeated measurements and event time data. Specific attention will be given to the longitudinal setting where the primary goal is to estimate and compare serial trends over time while adjusting for possible informative censoring due to patient dropout. Parametric and semi-parametric survival models for event times together with generalized linear or non-linear mixed-effects models for repeated measurements are proposed for jointly modelling serial outcome measures and event times. Methods of estimation are based on a generalized non-linear mixed-effects model that may be easily implemented using existing software. This approach allows for flexible modelling of both the distribution of event times and of the relationship of the longitudinal response variable to the event time of interest. The model and methods are illustrated using data from a multi-centre study of the effects of diet and blood pressure control on progression of renal disease, the modification of diet in renal disease study.

  8. Analysis and visualization of single-trial event-related potentials

    NASA Technical Reports Server (NTRS)

    Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.

    2001-01-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image
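
    A minimal sketch of the two steps described above, using scikit-learn's FastICA on simulated single-trial epochs and then building an "ERP image" by sorting single-trial component activations by latency. The simulated sources, mixing matrix, and component indexing are illustrative assumptions, not the authors' pipeline or data.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(1)
        n_trials, n_chan, n_time = 60, 8, 200

        # Simulated single-trial EEG: one "ERP" source with trial-varying latency and
        # one blink-like artifact source, projected to 8 channels plus noise.
        mix = rng.normal(size=(n_chan, 2))
        latencies = rng.integers(80, 120, size=n_trials)
        epochs = np.zeros((n_trials, n_chan, n_time))
        for i, lat in enumerate(latencies):
            erp = np.exp(-0.5 * ((np.arange(n_time) - lat) / 10.0) ** 2)
            blink = (rng.random() < 0.3) * np.exp(-0.5 * ((np.arange(n_time) - 50) / 5.0) ** 2)
            sources = np.vstack([erp, blink])
            epochs[i] = mix @ sources + 0.1 * rng.normal(size=(n_chan, n_time))

        # ICA is fit on the concatenated single-trial data (time points x channels).
        X = epochs.transpose(0, 2, 1).reshape(-1, n_chan)
        ica = FastICA(n_components=2, random_state=0)
        S = ica.fit_transform(X).reshape(n_trials, n_time, 2)

        # "ERP image": single-trial activations of one component, trials sorted by latency
        # (which component is the ERP and which the artifact is not fixed a priori).
        comp = S[:, :, 0]
        erp_image = comp[np.argsort(latencies)]
        print(erp_image.shape)  # (60, 200) -- rows = sorted trials, columns = time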

  10. Accuracy analysis of measurements on a stable power-law distributed series of events

    NASA Astrophysics Data System (ADS)

    Matthews, J. O.; Hopcraft, K. I.; Jakeman, E.; Siviour, G. B.

    2006-11-01

    We investigate how finite measurement time limits the accuracy with which the parameters of a stably distributed random series of events can be determined. The model process is generated by timing the emigration of individuals from a population that is subject to deaths and a particular choice of multiple immigration events. This leads to a scale-free discrete random process where customary measures, such as mean value and variance, do not exist. However, converting the number of events occurring in fixed time intervals to a 1-bit 'clipped' process allows the construction of well-behaved statistics that still retain vestiges of the original power-law and fluctuation properties. These statistics include the clipped mean and correlation function, from measurements of which both the power-law index of the distribution of events and the time constant of its fluctuations can be deduced. We report here a theoretical analysis of the accuracy of measurements of the mean of the clipped process. This indicates that, for a fixed experiment time, the error on measurements of the sample mean is minimized by an optimum choice of the number of samples. It is shown furthermore that this choice is sensitive to the power-law index and that the approach to Poisson statistics is dominated by rare events or 'outliers'. Our results are supported by numerical simulation.
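
    A minimal numerical sketch of the clipping step: interval counts drawn from a heavy-tailed distribution are reduced to a 1-bit record of whether any event occurred, after which a well-behaved mean and autocorrelation can be estimated. The generating process here is a plain Pareto draw, not the death/multiple-immigration population model of the paper, so it illustrates only the clipping and estimation mechanics.

        import numpy as np

        rng = np.random.default_rng(2)

        # Heavy-tailed counts of events in fixed time intervals (illustrative only).
        counts = np.floor(rng.pareto(1.5, size=100000)).astype(int)

        # 1-bit "clipping": did at least one event occur in the interval?
        clipped = (counts > 0).astype(float)
        clipped_mean = clipped.mean()

        def clipped_autocorr(x, max_lag=20):
            """Autocorrelation of the clipped series up to max_lag (naive estimator)."""
            x = x - x.mean()
            denom = np.dot(x, x)
            return np.array([np.dot(x[:-k or None], x[k:]) / denom for k in range(max_lag + 1)])

        print(f"clipped mean = {clipped_mean:.3f}")
        print(clipped_autocorr(clipped, max_lag=5))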

  11. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    PubMed

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  12. Digital Instrumentation and Control Failure Events Derivation and Analysis by Frame-Based Technique

    SciTech Connect

    Hui-Wen Huang; Chunkuan Shih; Swu Yih; Yen-Chang Tzeng; Ming-Huei Chen

    2006-07-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure events derivation and analysis for generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical model is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failure of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can then be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, chapter 15 accident analysis of generic SAR, and the reported NPP I and C software failure events. The case study of this research includes (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derivation from the actual happening of non-ABWR digital I and C software failure events, which were reported to LER of USNRC or IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  13. A Unified Scenario of Near-Earth Substorm Onset: Analysis of THEMIS Events

    NASA Astrophysics Data System (ADS)

    Zhu, P.; Raeder, J.; Bhattacharjee, A.; Germaschewski, K.; Hegna, C.

    2008-12-01

    We propose an alternative scenario for the substorm onset process, based on ideal ballooning stability analysis of the near-Earth plasma sheet during recent THEMIS substorm events. In this scenario, the ballooning instability is initiated by the magnetic reconnection in the near-Earth plasma sheet, which in turn directly contributes to the trigger of a full onset. Using the solar wind data from WIND satellite observation for the substorm event as an input at dayside, we reconstructed a sequence of global magnetospheric configurations around the substorm onset by means of OpenGGCM simulation. These simulations have reproduced most of the salient features, including the onset timing, observed in the THEMIS substorm events [Raeder et al, 2008]. The ballooning instability criterion and growth rate are evaluated for the near-Earth plasma sheet region where the configuration satisfies a quasi-static equilibrium condition. Our analysis of the evolution of the near-Earth magnetotail region during the substorm events reveals a correlation between the breaching of the ballooning stability condition and the substorm onset in both temporal and spatial domains. The analysis suggests that the Earthward bulk plasma flow induced by the reconnection event in the near-Earth plasma sheet leads to the pressure build-up and creates a favorable condition for the initiation of the ballooning instability in that same region. This new alternative scenario further elaborates earlier conjectures on the roles of reconnection and ballooning instability [Bhattacharjee et al, 1998], and has the potential to integrate both the near-Earth neutral-line model [McPherron et al, 1973] and the near-Earth current-sheet-disruption model [Lui et al, 1988] into a unified model of the near-Earth substorm onset. Research supported by U.S. NSF Grant No. ATM-0542954.

  14. Efficacy and adverse events of cold vs hot polypectomy: A meta-analysis

    PubMed Central

    Fujiya, Mikihiro; Sato, Hiroki; Ueno, Nobuhiro; Sakatani, Aki; Tanaka, Kazuyuki; Dokoshi, Tatsuya; Fujibayashi, Shugo; Nomura, Yoshiki; Kashima, Shin; Gotoh, Takuma; Sasajima, Junpei; Moriichi, Kentaro; Watari, Jiro; Kohgo, Yutaka

    2016-01-01

    AIM: To systematically review previously reported randomized controlled trials (RCTs) of cold and hot polypectomy and to clarify the utility of cold polypectomy over hot polypectomy with respect to efficacy and adverse events. METHODS: A meta-analysis was conducted to evaluate the predominance of cold and hot polypectomy for removing colon polyps. Published articles and abstracts from worldwide conferences were searched using the keywords “cold polypectomy”. RCTs that compared either or both the effects or adverse events of cold polypectomy with those of hot polypectomy were collected. The patients’ demographics, endoscopic procedures, No. of examined lesions, lesion size, macroscopic and histologic findings, rates of incomplete resection, bleeding amount, perforation, and length of procedure were extracted from each study. A forest plot analysis was used to verify the relative strength of the effects and adverse events of each procedure. A funnel plot was generated to assess the possibility of publication bias. RESULTS: Ultimately, six RCTs were selected. No significant differences were noted in the average lesion size (less than 10 mm) between the cold and hot polypectomy groups in each study. Further, the rates of complete resection and adverse events, including delayed bleeding, did not differ markedly between cold and hot polypectomy. The average procedural time in the cold polypectomy group was significantly shorter than in the hot polypectomy group. CONCLUSION: Cold polypectomy is a time-saving procedure for removing small polyps with markedly similar curability and safety to hot polypectomy. PMID:27340361

  15. The association between B vitamins supplementation and adverse cardiovascular events: a meta-analysis

    PubMed Central

    Li, Wen-Feng; Zhang, Dan-Dan; Xia, Ji-Tian; Wen, Shan-Fan; Guo, Jun; Li, Zi-Cheng

    2014-01-01

    This study explores the association between adverse cardiovascular events and B vitamin supplementation. RevMan 5.1 and Stata 11.0 software were used for the meta-analysis. The number of cardiovascular events was collected and analyzed using odds ratios (ORs) and 95% confidence intervals in a fixed-effects or a random-effects model, as appropriate. The analysis includes 15 studies comprising 37,358 subjects (experimental group: 19,601; control group: 17,757). The pooled OR was 1.01 (95% CI = 0.96-1.06, P > 0.05) for the experimental group (B vitamin supplementation) vs. the control group (placebo or regular treatment), which suggests no significant difference in the overall number of cardiovascular events between the two groups. Further subgroup analyses likewise found no significant differences between the two groups. No publication bias was detected by Egger’s linear regression test (P > 0.05). Our results indicate that the number of cardiovascular events in the B vitamin supplementation group is comparable to that in the placebo or regular treatment group; further studies are therefore necessary. PMID:25232372

  16. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    NASA Astrophysics Data System (ADS)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events and droughts or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal forecast prediction systems such as the UK MetOffice GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged yet. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in the world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations
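
    The core comparison described above (event probability in the world as observed versus the counterfactual world without anthropogenic forcing) reduces to a probability ratio between two ensembles. The sketch below uses made-up gamma-distributed ensembles and an arbitrary threshold; it is not output from the forecasting or climate models named above.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical ensembles of a seasonal-maximum rainfall index (mm):
        # "actual" = current climate as observed, "natural" = counterfactual world
        # with the anthropogenic forcing removed. Values are invented for illustration.
        actual  = rng.gamma(shape=8.0, scale=12.0, size=10000)
        natural = rng.gamma(shape=8.0, scale=11.0, size=10000)

        threshold = 140.0                       # event definition, e.g. the observed extreme

        p1 = np.mean(actual  >= threshold)      # probability of the event with human influence
        p0 = np.mean(natural >= threshold)      # probability without human influence

        pr  = p1 / p0                           # probability ratio
        far = 1.0 - p0 / p1                     # fraction of attributable risk
        print(f"P1={p1:.4f}  P0={p0:.4f}  PR={pr:.2f}  FAR={far:.2f}")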

  17. Single Particle Analysis by Combined Chemical Imaging to Study Episodic Air Pollution Events in Vienna

    NASA Astrophysics Data System (ADS)

    Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne

    2017-04-01

    The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on a regional and continental scale. The identification of major aerosol constituents for basic source apportionment and air quality issues needs a high analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on a time-scale of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time and related sample amount necessary to apply the full range of bulk analytical methods needed for chemical characterization. Additionally, morphological and single particle features are hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a complementary technique to bulk analytical methods, chemical imaging offers a new way to study air pollution events by obtaining major aerosol constituents with single particle features at high temporal resolutions and small sample volumes. The analysis of the chemical imaging datasets is assisted by multivariate statistics with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel approach in chemical imaging is combined chemical imaging or so-called multisensor hyperspectral imaging, involving elemental imaging (electron microscopy-based energy dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy) and mass spectrometric imaging (Time-of-Flight Secondary Ion Mass Spectrometry) with subsequent combined multivariate analytics. Combined chemical imaging of precipitated aerosol particles will be demonstrated by the following examples of air pollution events in Vienna: Exceptional episodic events like the transformation of Saharan dust by the impact of the city of Vienna will be discussed and compared to samples obtained at a high alpine

  18. Fixation based event-related fmri analysis: using eye fixations as events in functional magnetic resonance imaging to reveal cortical processing during the free exploration of visual images.

    PubMed

    Marsman, Jan Bernard C; Renken, Remco; Velichkovsky, Boris M; Hooymans, Johanna M M; Cornelissen, Frans W

    2012-02-01

    Eye movements, comprising predominantly fixations and saccades, are known to reveal information about perception and cognition, and they provide an explicit measure of attention. Nevertheless, fixations have not been considered as events in the analyses of data obtained during functional magnetic resonance imaging (fMRI) experiments. Most likely, this is due to their brevity and statistical properties. Despite these limitations, we used fixations as events to model brain activation in a free viewing experiment with standard fMRI scanning parameters. First, we found that fixations on different objects in different task contexts resulted in distinct cortical patterns of activation. Second, using multivariate pattern analysis, we showed that the BOLD signal revealed meaningful information about the task context of individual fixations and about the object being inspected during these fixations. We conclude that fixation-based event-related (FIBER) fMRI analysis creates new pathways for studying human brain function by enabling researchers to explore natural viewing behavior.

  19. Analysis of intensity time profile of solar proton events in SOLAR CYCLE 23

    NASA Astrophysics Data System (ADS)

    Ochelkov, Yurij

    The intensity time profiles of proton fluxes in SEP events of SOLAR CYCLE 23 were studied using data from GOES. In our study, the time interval in which the proton flux intensity changes from 1/10 of the peak intensity to the peak intensity, both in the rise and decay stages, is considered the main phase of the time evolution of an SEP event. Studying the time profiles of this phase is very important to help us understand which propagation mechanism determines the proton peak intensities in different events and to establish an SEP event classification. We propose the following parameters for the quantitative analysis of time profiles: 1) time width parameters of the profile, such as the mean width of the intensity profile on a logarithmic scale for the main phase and for the rise and decay stages of the main phase, and the same parameters for the stage in which the flux changes from 1/3.16 of the peak intensity to the peak intensity; 2) propagation parameters of the profile, such as the ratio of the time to maximum (the time interval between the injection onset time and the peak intensity time) to each of the time width parameters. We calculated the theoretical values of all these parameters for diffusive propagation with instantaneous and prolonged injection and for different dependences of the diffusion coefficient on distance. We constructed the distribution of the propagation parameter for the main phase for all events of SOLAR CYCLE 23 and found that the distribution has a peak near the value 0.43 for fluxes with proton energies greater than 60 and 100 MeV. Roughly 20% of events have a propagation parameter equal to 0.43 within an accuracy of 10%. We propose to consider such a time profile as a basic time profile. It is formed by a diffusive process with a diffusion coefficient that depends linearly on distance and with instantaneous injection. In our view, the deviation from the basic profile for the great bulk of events is due to prolonged injection from shock waves and proton trapping in structures of the heliosphere. Practically all profiles for proton energy greater than 10 MeV are

  20. Identification and Analysis of Storm Tracks Associated with Extreme Flood Events in Southeast and South Brazil

    NASA Astrophysics Data System (ADS)

    Lima, Carlos; Lopes, Camila

    2015-04-01

    Floods are the main natural disaster in Brazil, affecting practically all regions of the country and causing substantial economic damage and loss of life. In traditional hydrology, the study of floods is focused on a frequency analysis of the extreme events and on the fit of statistical models to define flood quantiles associated with pre-specified return periods or exceedance probabilities. The basic assumptions are randomness and temporal stationarity of the streamflow data. In this paper we seek to advance the traditional flood frequency studies by using the ideas developed in the area of flood hydroclimatology, which is defined as the study of climate in the flood framework, i.e., the understanding of long term changes in the frequency, magnitude, duration, location and seasonality of floods as driven by the interaction of regional and global patterns of the ocean and atmospheric circulation. That being said, flood events are not treated as random and stationary but as resulting from a causal chain, where exceptional floods in water basins of different sizes are related to large-scale anomalies in the atmospheric and ocean circulation patterns. Hence, such studies enrich the classical assumption of stationary flood hazard adopted in most flood frequency studies through a formal consideration of the physical mechanisms responsible for the generation of extreme floods, which implies recognizing the natural climate variability due to persistent and oscillatory regimes (e.g. ENSO, NAO, PDO) on many temporal scales (interannual, decadal, etc), and climate fluctuations in response to anthropogenic changes in the atmosphere, soil use and vegetation cover. Under this framework and based on streamflow gauge and reanalysis data, we identify and analyze here the storm tracks that preceded extreme events of floods in key flood-prone regions of the country (e.g. Parana and Rio Doce River basins) with such events defined based on the magnitude, duration and volume of the

  1. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2017-04-01

    Mine systems such as the ventilation system, strata support system, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is best carried out during the planning and design stage. However, the existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying the system failure probability, and redundancy is allocated to the system at either the component or system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
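
    A minimal sketch of how fault-tree top-event probabilities can feed event-tree branch probabilities, assuming independent basic events. The gates, failure probabilities, initiating-event frequency, and sequence names are hypothetical and are not the mine methane model from the paper.

        # Minimal integrated event-tree / fault-tree calculation (illustrative only).

        def and_gate(*p):           # all inputs must fail
            out = 1.0
            for x in p:
                out *= x
            return out

        def or_gate(*p):            # at least one input fails (independence assumed)
            out = 1.0
            for x in p:
                out *= (1.0 - x)
            return 1.0 - out

        # Fault tree: ventilation safety function fails on demand, e.g. two redundant
        # fans plus a common power supply (hypothetical failure probabilities).
        p_fan_a, p_fan_b, p_power = 0.02, 0.02, 0.005
        p_ventilation_fails = or_gate(and_gate(p_fan_a, p_fan_b), p_power)

        # Fault tree: gas monitoring fails to alarm (hypothetical values).
        p_monitoring_fails = or_gate(0.01, 0.003)

        # Event tree: initiating event "methane inflow" with an assumed annual frequency;
        # branches use the fault-tree top probabilities.
        f_initiator = 0.1
        sequences = {
            "controlled":        f_initiator * (1 - p_ventilation_fails),
            "alarmed, degraded": f_initiator * p_ventilation_fails * (1 - p_monitoring_fails),
            "explosion hazard":  f_initiator * p_ventilation_fails * p_monitoring_fails,
        }
        for name, freq in sequences.items():
            print(f"{name:>18s}: {freq:.2e} per year")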

  2. Through the eyes of the other: using event analysis to build cultural competence.

    PubMed

    Kozub, Mary L

    2013-07-01

    Cultural competence requires more than the accumulation of information about cultural groups. An awareness of the nurse's own culture, beliefs, and values is considered by several transcultural nursing theorists to be essential to the development of cultural competence and the provision of quality patient care. Using Transformational Learning Theory, this article describes event analysis, an active learning tool that uses the nurse's own practice to explore multiple perspectives of an experience, with the goal of transforming the nurse's approach to diversity from an ethnocentric stance, to one of tolerance and consideration for the patient's needs, values, and beliefs with regard to quality of care. Furthermore, the application of the event analysis to multiple settings, including inpatient, educational, and administrative environments, is discussed.

  3. Contextual determinants of condom use among female sex exchangers in East Harlem, NYC: an event analysis.

    PubMed

    McMahon, James M; Tortu, Stephanie; Pouget, Enrique R; Hamid, Rahul; Neaigus, Alan

    2006-11-01

    Recent studies have revealed a variety of contexts involving HIV risk behaviors among women who exchange sex for money or drugs. Event analysis was used to identify the individual, relationship, and contextual factors that contribute to these high-risk sex exchange practices. Analyses were conducted on data obtained from 155 drug-using women who reported details of their most recent sex exchange event with male clients. The majority of sex exchange encounters (78%) involved consistent condom use. In multivariable analysis, protective behavior was associated primarily with situational and relationship variables, such as exchange location, substance use, sexual practices, and respondent/client discussion and control. In order to inform HIV prevention programs targeted to women sex exchangers, further research is needed on the contextual determinants of risk, especially with regard to condom-use negotiation and factors involving substance use that adversely affect women's ability to manage protective behavior in the context of sex exchange.

  4. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2016-06-01

    Mine systems such as the ventilation system, strata support system, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is best carried out during the planning and design stage. However, the existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying the system failure probability, and redundancy is allocated to the system at either the component or system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.

  5. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617

  6. A meta-analysis of transcatheter closure of patent foramen ovale versus medical therapy for prevention of recurrent thromboembolic events in patients with cryptogenic cerebrovascular events.

    PubMed

    Pineda, Andrés M; Nascimento, Francisco O; Yang, Solomon C; Kirtane, Ajay J; Sommer, Robert J; Beohar, Nirat

    2013-11-15

    We sought to perform a meta-analysis of randomized controlled trials (RCTs) comparing percutaneous patent-foramen-ovale (PFO) closure with medical therapy for preventing recurrent thromboembolic events after cryptogenic stroke. Observational studies suggested that transcatheter PFO closure decreases recurrent events after cryptogenic stroke; however, three recent RCTs failed to demonstrate such benefit. Trials were identified from the PubMed and Cochrane databases. Primary endpoint was the composite of transient ischemic attack (TIA) and ischemic cerebrovascular events (CVA). Both intention-to-treat (ITT) and as-treated analyses (AT) were performed. Three RCTs met inclusion criteria. The pooled data provided 2,303 patients, of which 1,150 were in the PFO closure group and 1,153 in the medical therapy group. In the ITT analysis, there were 43 events (3.7%) of the composite end point in the closure group compared with 61 events (5.3%) in the medical therapy group, with a trend in favor of the PFO closure (OR = 0.70; 95% CI, 0.47-1.05, P = 0.08). The incidences of TIA, ischemic CVA, and bleeding were not statistically different between the groups. There was a trend for the more frequent occurrence of atrial fibrillation in the PFO closure group (OR = 3.29; 95% CI, 0.86-12.60, P = 0.08). In the AT analysis, the composite end point was significantly less frequent in the PFO closure group (OR = 0.62; 95% CI, 0.41-0.94, P = 0.02). In this meta-analysis of contemporary RCTs, successful transcatheter closure of PFO might be more effective than medical therapy alone for the prevention of recurrent thromboembolic events. Copyright © 2013 Wiley Periodicals, Inc.

  7. Time Series Modeling of Army Mission Command Communication Networks: An Event-Driven Analysis

    DTIC Science & Technology

    2013-06-01

    critical events. In a detailed analysis of the email corpus of the Enron Corporation, Diesner and Carley (2005; see also Murshed et al. 2007) found that...established contacts and formal roles. The Enron crisis is instructive as a network with a critical period of failure. Other researchers have also found...Diesner, J., Frantz, T. L., & Carley, K. M. (2005). Communication networks from the Enron email corpus “It’s always about the people. Enron is no

  8. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    PubMed Central

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H.; Madigan, David; Ryan, Patrick; Friedman, Carol

    2013-01-01

    Introduction: Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis. PMID:22549283

  9. Novel data-mining methodologies for adverse drug event discovery and analysis.

    PubMed

    Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

    2012-06-01

    An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Datamining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis.

  10. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the

  11. Analysis of 415 adverse events in dental practice in Spain from 2000 to 2010

    PubMed Central

    Perea-Pérez, Bernardo; Labajo-González, Elena; Santiago-Sáez, Andrés; Albarrán-Juan, Elena; Villa-Vigil, Alfonso

    2014-01-01

    Introduction: The effort to increase patient safety has become one of the main focal points of all health care professions, despite the fact that, in the field of dentistry, initiatives have come late and been less ambitious. The main objective of patient safety is to avoid preventable adverse events to the greatest extent possible and to limit the negative consequences of those which are unpreventable. Therefore, it is essential to ascertain what adverse events occur in each dental care activity in order to study them in-depth and propose measures for prevention. Objectives: To ascertain the characteristics of the adverse events which originate from dental care, to classify them in accordance with type and origin, to determine their causes and consequences, and to detect the factors which facilitated their occurrence. Material and Methods: This study includes the general data from the series of adverse dental events of the Spanish Observatory for Dental Patient Safety (OESPO) after the study and analysis of 4,149 legal claims (both in and out of court) based on dental malpractice from the years 2000 to 2010 in Spain. Results: Implant treatments, endodontics and oral surgery display the highest frequencies of adverse events in this series (25.5%, 20.7% and 20.4% respectively). Likewise, according to the results, up to 44.3% of the adverse events which took place were due to predictable and preventable errors and complications. Conclusions: A very significant percentage were due to foreseeable and preventable errors and complications that should not have occurred. Key words: Patient safety, adverse event, medical care risk, dentistry. PMID:24880444

  12. Neural network approach in multichannel auditory event-related potential analysis.

    PubMed

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.
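
    A minimal sketch of the classification step, assuming a small feed-forward network (scikit-learn's MLPClassifier) applied to synthetic P300-like features; the feature definitions, group differences, and network size are invented for illustration and do not reproduce the authors' network, channels, or data.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)

        # Synthetic stand-in for multichannel P300 features (amplitude, latency, amplitude):
        # hypothetical controls have larger, earlier P300 than hypothetical patients.
        n = 200
        controls = np.column_stack([rng.normal(10, 2, n), rng.normal(320, 25, n), rng.normal(9, 2, n)])
        patients = np.column_stack([rng.normal(6, 2, n), rng.normal(380, 30, n), rng.normal(5, 2, n)])
        X = np.vstack([controls, patients])
        y = np.array([0] * n + [1] * n)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0))
        clf.fit(X_train, y_train)
        print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")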

  13. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    SciTech Connect

    Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2016-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.

  14. Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes

    NASA Astrophysics Data System (ADS)

    Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana

    2015-04-01

    The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, a usual tool in Statistical Process Control (SPC) but an unusual one in climate studies. Here, series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to determine whether or not there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme observed rainfall days and normal observed rainfall days. The autocorrelation amongst maximum precipitation is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes over the remaining years under study can then be monitored with such attributes control charts. The results of the application of this methodology show evidence of a change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
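
    A minimal sketch of an attributes (p) control chart for the annual fraction of extreme rainfall days, with specification limits fixed from the first 30 years. The synthetic data and the classical binomial limits are illustrative only and omit the Markov correction for autocorrelation that the study introduces.

        import numpy as np

        rng = np.random.default_rng(5)

        # Synthetic record: 79 years x 365 daily precipitation values (mm), illustrative only.
        years = np.arange(1931, 2010)
        daily = rng.gamma(shape=0.6, scale=6.0, size=(years.size, 365))

        # Specification limit from the first 30 years: a day is "extreme" if it exceeds
        # the 99th percentile of the baseline daily precipitation (assumed criterion).
        baseline = daily[:30]
        limit = np.quantile(baseline[baseline > 0], 0.99)

        extreme_days = (daily > limit).sum(axis=1)           # extreme days per year
        n = 365
        p_bar = extreme_days[:30].mean() / n                 # baseline fraction

        # Classical p-chart limits (no autocorrelation adjustment).
        sigma = np.sqrt(p_bar * (1 - p_bar) / n)
        ucl, lcl = p_bar + 3 * sigma, max(p_bar - 3 * sigma, 0.0)

        fractions = extreme_days / n
        out_of_control = years[(fractions > ucl) | (fractions < lcl)]
        print(f"p-bar={p_bar:.4f}  LCL={lcl:.4f}  UCL={ucl:.4f}")
        print("years signalling a change:", out_of_control)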

  15. Analysis of the Impact of Climate Change on Extreme Hydrological Events in California

    NASA Astrophysics Data System (ADS)

    Ashraf Vaghefi, Saeid; Abbaspour, Karim C.

    2016-04-01

    Estimating the magnitude and occurrence frequency of extreme hydrological events is required for taking preventive remedial actions against the impact of climate change on the management of water resources. Examples include: characterization of extreme rainfall events to predict urban runoff, determination of river flows, and the likely severity of drought events during the design life of a water project. In recent years California has experienced its most severe drought in recorded history, causing water stress, economic loss, and an increase in wildfires. In this paper we describe the development of a Climate Change Toolkit (CCT) and demonstrate its use in the analysis of dry and wet periods in California for the years 2020-2050 and compare the results with the historic period 1975-2005. CCT provides four modules to: i) manage big databases such as those of Global Climate Models (GCMs), ii) make bias corrections using observed local climate data, iii) interpolate gridded climate data to finer resolution, and iv) calculate continuous dry- and wet-day periods based on rainfall, temperature, and soil moisture for analysis of drought and flooding risks. We used bias-corrected meteorological data of five GCMs for the extreme CO2 emission scenario RCP8.5 for California to analyze the trend of extreme hydrological events. The findings indicate that the frequency of dry periods will increase in the central and southern parts of California. The assessment of the number of wet days and the frequency of wet periods suggests an increased risk of flooding in the northern and north-western parts of California, especially in the coastal strip. Keywords: Climate Change Toolkit (CCT), Extreme Hydrological Events, California
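
    A minimal sketch of the dry/wet-period calculation in module iv) above, computing continuous dry- and wet-day spells from a daily rainfall series; the synthetic data and the 1 mm dry-day threshold are assumptions, not the definitions implemented in the Climate Change Toolkit.

        import numpy as np

        rng = np.random.default_rng(6)

        # Synthetic daily rainfall series (mm), illustrative only.
        rain = rng.gamma(shape=0.5, scale=4.0, size=365 * 30) * (rng.random(365 * 30) < 0.35)
        dry = rain < 1.0                     # assumed threshold: a "dry day" gets < 1 mm

        def spell_lengths(mask):
            """Lengths of consecutive runs of True in a boolean series."""
            lengths, run = [], 0
            for flag in mask:
                if flag:
                    run += 1
                elif run:
                    lengths.append(run)
                    run = 0
            if run:
                lengths.append(run)
            return np.array(lengths)

        dry_spells = spell_lengths(dry)
        wet_spells = spell_lengths(~dry)
        print(f"longest dry period: {dry_spells.max()} days")
        print(f"dry spells of 20+ days: {(dry_spells >= 20).sum()}")
        print(f"mean wet-spell length: {wet_spells.mean():.1f} days")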

  16. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    NASA Astrophysics Data System (ADS)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents, both for extreme and ordinary flood events, is a fundamental step not only to understand the process but also to drive proper decisions on protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with the associated sediment-graph, of a flood that on the 14th of October, 2014 caused the formation of a small debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to note that a large part of the equations developed for computing bedload transport capacity, such as those of Schoklitsch (1962) or Smart and Jaeggi (1983), focus on extraordinary events that heavily affect the river-bed armour. These formulas do not provide reliable results if used on events, like the one under analysis, that are not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations for assessing sediment transport capacity overestimate the total volume of the event by several orders of magnitude. As a consequence, the following experimental bedload transport equation (D'Agostino and Lenzi, 1999), valid for ordinary flood events, has been applied (q: unit water discharge; qc: unit discharge of bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs ≈ 0.04 (q - qc) S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above-cited equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
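
    A back-of-the-envelope sketch of how the cited ordinary-event formula can be integrated over a hydrograph is given below. The unit-discharge hydrograph, critical unit discharge, slope and active channel width are assumed values chosen only to make the example run; they are not the Rio Vanti data.

        import numpy as np

        # Illustrative use of qs ≈ 0.04 * (q - qc) * S**1.5 (D'Agostino and Lenzi, 1999).
        dt = 600.0                                  # hydrograph time step [s]
        t = np.arange(0.0, 6 * 3600.0, dt)          # a 6-hour event (assumed)
        q_peak, spread = 2.0, 3600.0                # assumed peak unit discharge [m2/s]
        q = q_peak * np.exp(-((t - 2 * 3600.0) ** 2) / (2 * spread ** 2))

        qc = 0.6     # unit discharge of bedload transport initiation [m2/s], assumed
        S = 0.20     # thalweg slope [-], assumed

        qs = 0.04 * np.clip(q - qc, 0.0, None) * S ** 1.5   # unit bedload rate [m2/s]

        width = 3.0  # active channel width [m], assumed
        volume = np.sum(qs) * dt * width                    # event bedload volume [m3]
        print(f"event bedload volume ≈ {volume:.0f} m3")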

  17. Analysis and modeling of a hail event consequences on a building portfolio

    NASA Astrophysics Data System (ADS)

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

    North-West Switzerland was affected by a severe hailstorm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 Cantons (out of 26) with public insurance. The aim of this project is to benefit from the collected insurance data to better understand and estimate the risk of such events. In a first step, a simple hail event simulator, which had been developed for a previous hail episode, is modified. The geometric properties of the storm are derived from the maximum intensity radar image by means of a set of 2D Gaussians, instead of using 1D Gaussians on profiles as in the previous version. The tool is then tested on this new event in order to establish its ability to give a fast damage estimate based on the radar image and on building values and locations. In a further step, the geometrical properties are used to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, it is expected that the orientation of the roof influences the aging of the material by controlling the frequency and amplitude of thaw-freeze cycles, thereby changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures. This information is then compared with insurance claims. A last part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the
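
    The kind of fast, first-order damage estimate described in this record can be sketched as follows: a single 2D Gaussian footprint stands in for the radar-derived storm intensity, and a vulnerability curve converts intensity into a damage ratio per building. Every parameter, the building portfolio and the vulnerability function are hypothetical placeholders, not the calibrated Aargau model.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical building portfolio: coordinates [km] and insured values [EUR].
        xy = rng.uniform(0.0, 50.0, size=(500, 2))
        values = rng.lognormal(mean=12.5, sigma=0.5, size=500)

        # One 2D Gaussian footprint for the maximum radar intensity (assumed parameters).
        centre = np.array([25.0, 30.0])
        sigma = np.array([8.0, 4.0])
        d = (xy - centre) / sigma
        intensity = np.exp(-0.5 * np.sum(d ** 2, axis=1))   # peak normalised to 1

        # Hypothetical vulnerability curve: mean damage ratio vs. hail intensity.
        damage_ratio = np.clip(0.3 * intensity ** 2, 0.0, 1.0)

        expected_loss = np.sum(damage_ratio * values)
        print(f"expected portfolio loss ≈ EUR {expected_loss:,.0f}")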

  18. Stressful life events during adolescence and risk for externalizing and internalizing psychopathology: a meta-analysis.

    PubMed

    March-Llanes, Jaume; Marqués-Feixa, Laia; Mezquita, Laura; Fañanás, Lourdes; Moya-Higueras, Jorge

    2017-05-13

    The main objective of the present research was to analyze the relations between stressful life events and the externalizing and internalizing spectra of psychopathology using meta-analytical procedures. After removing duplicates, a total of 373 papers were found in a literature search using several bibliographic databases, such as PsycINFO, Medline, Scopus, and Web of Science. Twenty-seven studies were selected for the meta-analysis after applying inclusion and exclusion criteria in successive phases. The statistical procedure was performed using a random/mixed-effects model based on the correlations found in the studies. Significant positive correlations were found in cross-sectional and longitudinal studies. A transactional effect was thus found in the present study: stressful life events could be a cause, but also a consequence, of psychopathological spectra. The level of controllability of the life events did not affect the results. Special attention should be given to the use of stressful life events in gene-environment interaction and correlation studies, and also for clinical purposes.

  19. Identification and analysis of alternative splicing events conserved in human and mouse

    PubMed Central

    Yeo, Gene W.; Van Nostrand, Eric; Holste, Dirk; Poggio, Tomaso; Burge, Christopher B.

    2005-01-01

    Alternative pre-mRNA splicing affects a majority of human genes and plays important roles in development and disease. Alternative splicing (AS) events conserved since the divergence of human and mouse are likely of primary biological importance, but relatively few of such events are known. Here we describe sequence features that distinguish exons subject to evolutionarily conserved AS, which we call alternative conserved exons (ACEs), from other orthologous human/mouse exons and integrate these features into an exon classification algorithm, acescan. Genome-wide analysis of annotated orthologous human–mouse exon pairs identified ≈2,000 predicted ACEs. Alternative splicing was verified in both human and mouse tissues by using an RT-PCR-sequencing protocol for 21 of 30 (70%) predicted ACEs tested, supporting the validity of a majority of acescan predictions. By contrast, AS was observed in mouse tissues for only 2 of 15 (13%) tested exons that had EST or cDNA evidence of AS in human but were not predicted ACEs, and AS was never observed for 11 negative control exons in human or mouse tissues. Predicted ACEs were much more likely to preserve the reading frame and less likely to disrupt protein domains than other AS events and were enriched in genes expressed in the brain and in genes involved in transcriptional regulation, RNA processing, and development. Our results also imply that the vast majority of AS events represented in the human EST database are not conserved in mouse. PMID:15708978

  20. Analysis on proton fluxes during several solar events with the PAMELA experiment

    NASA Astrophysics Data System (ADS)

    Martucci, Matteo

    2015-04-01

    The production of charged particles during solar events has been widely modelled in past decades. The satellite-borne PAMELA experiment has been continuously collecting data since 2006. This apparatus is designed to study charged particles in the cosmic radiation. The combination of a permanent magnet, a silicon strip spectrometer and a silicon-tungsten imaging calorimeter, together with the redundancy of the instrumentation, allows very precise studies of the physics of cosmic rays in a wide energy range and with high statistics. This makes PAMELA a very suitable instrument for Solar Energetic Particle (SEP) observations. Not only does it span the energy range between ground-based neutron monitor data and the observations of SEPs from space, but PAMELA also carries out the first direct measurements of the composition for the highest-energy SEP events. PAMELA has registered many SEP events in solar cycle 24, offering unique opportunities to address the question of high-energy SEP origin. A preliminary analysis of proton spectra during several events of the 24th solar cycle is presented.

  1. Grieving experiences amongst adolescents orphaned by AIDS: Analysis from event history calendars.

    PubMed

    Thupayagale-Tshweneagae, Gloria

    2012-09-07

    Mental health is an essential component of adolescent health and wellbeing. Mental health practitioners assess adolescents' mental health status to identify possible issues that may lead to mental health problems. However, very few of the tools used to assess the mental health status of adolescents include assessment for grieving and coping patterns. The current tools used for assessing an individual's mental health are lengthy and not comprehensive. The purpose of this study was to assess grieving patterns of adolescents orphaned by AIDS and to appraise the usefulness of an event history calendar as an assessment tool for identifying grieving experiences, in order to guide and support these adolescents through the grieving process. One hundred and two adolescents aged 14-18 years, who had been orphaned by AIDS, completed an event history calendar, reviewed it with the researcher and reported their perceptions of it. Thematic analysis of the event history calendar content revealed that it is an effective, time-efficient, adolescent-friendly tool that facilitated identification and discussion of the orphaned adolescents' grieving patterns. Crying, isolation, silence and violent outbursts were the main grieving patterns reported by adolescents orphaned by AIDS. The researcher recommends use of the event history calendar for identification of orphaned adolescents' grieving experiences. Early identification would enable mental health practitioners to support them in order to prevent the occurrence of mental illness due to maladaptive grieving.

  2. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data.

    PubMed

    Saramago, Pedro; Chuang, Ling-Hsiang; Soares, Marta O

    2014-09-10

    Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest.

  3. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data

    PubMed Central

    2014-01-01

    Background Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. Methods This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Results Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. Conclusions The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest. PMID:25209121
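
    The individual-patient half of the model described in the two records above rests on fitting a parametric time-to-healing distribution to censored data. The sketch below shows only that piece, a Weibull maximum-likelihood fit to simulated right-censored healing times; the joint synthesis with aggregate event counts and the trial data themselves are not reproduced, and all numbers are invented.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(7)
        true_shape, true_scale = 1.4, 20.0
        t_heal = true_scale * rng.weibull(true_shape, size=200)   # healing times [weeks]
        t_cens = rng.uniform(5.0, 40.0, size=200)                 # censoring times
        time = np.minimum(t_heal, t_cens)
        event = (t_heal <= t_cens).astype(float)                  # 1 = healed, 0 = censored

        def neg_log_lik(params):
            # Weibull log-likelihood with right censoring, parameters on the log scale.
            k, lam = np.exp(params)
            z = time / lam
            log_pdf = np.log(k / lam) + (k - 1.0) * np.log(z) - z ** k
            log_surv = -z ** k
            return -np.sum(event * log_pdf + (1.0 - event) * log_surv)

        res = minimize(neg_log_lik, x0=[0.0, np.log(time.mean())], method="Nelder-Mead")
        shape_hat, scale_hat = np.exp(res.x)
        print(f"Weibull shape ≈ {shape_hat:.2f}, scale ≈ {scale_hat:.1f} weeks")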

  4. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    PubMed Central

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may mislead about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials including the recent RV144 trial exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  5. Multi-spacecraft solar energetic particle analysis of FERMI gamma-ray flare events within the HESPERIA H2020 project

    NASA Astrophysics Data System (ADS)

    Tziotziou, Kostas; Malandraki, Olga; Valtonen, Eino; Heber, Bernd; Zucca, Pietro; Klein, Karl-Ludwig; Vainio, Rami; Tsiropoula, Georgia; Share, Gerald

    2017-04-01

    Multi-spacecraft observations of solar energetic particle (SEP) events are important for understanding the acceleration processes and the interplanetary propagation of particles released during eruptive events. In this work, we have carefully studied 25 gamma-ray flare events observed by FERMI and investigated possible associations with SEP-related events observed with STEREO and L1 spacecraft in the heliosphere. A data-driven velocity dispersion analysis (VDA) and Time-Shifting Analysis (TSA) are used for deriving the release times of protons and electrons at the Sun and for comparing them with the respective times stemming from the gamma-ray event analysis and their X-ray signatures, in an attempt to interconnect the SEPs and Fermi events and better understand the physics involved. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
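
    The velocity dispersion analysis (VDA) mentioned above rests on the assumption that the first-arriving particles travel scatter-free, so the onset time observed at the spacecraft follows t_onset = t_release + L / v(E): a straight-line fit of onset time against 1/v yields the solar release time (intercept) and the apparent path length L (slope). The sketch below illustrates that fit on invented energies and onset times; it is not the HESPERIA pipeline.

        import numpy as np

        c = 2.998e8        # speed of light [m/s]
        au = 1.496e11      # astronomical unit [m]
        m_p = 938.272      # proton rest energy [MeV]

        energies = np.array([20.0, 40.0, 80.0, 160.0, 320.0])   # kinetic energy [MeV]
        gamma = 1.0 + energies / m_p
        beta = np.sqrt(1.0 - 1.0 / gamma ** 2)
        inv_v = 1.0 / (beta * c)                                 # [s/m]

        # Hypothetical onset times [s after an arbitrary reference], with noise.
        true_release, true_path = 300.0, 1.2 * au
        rng = np.random.default_rng(3)
        onsets = true_release + true_path * inv_v + rng.normal(0.0, 30.0, energies.size)

        slope, intercept = np.polyfit(inv_v, onsets, 1)
        print(f"release time ≈ {intercept:.0f} s, path length ≈ {slope / au:.2f} AU")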

  6. The impact of dropouts on the analysis of dose-finding studies with recurrent event data.

    PubMed

    Akacha, Mouna; Benda, Norbert

    2010-07-10

    This work is motivated by dose-finding studies, where the number of events per subject within a specified study period forms the primary outcome. The aim of the considered studies is to identify the target dose for which the new drug can be shown to be as effective as a competitor medication. Given a pain-related outcome, we expect a considerable number of patients to drop out before the end of the study period. The impact of missingness on the analysis, and models for the missingness process, must be carefully considered. The recurrent events are modeled as over-dispersed Poisson process data, with dose as the regressor. Additional covariates may be included. Constant and time-varying rate functions are examined. Based on these models, the impact of missingness on the precision of the target dose estimation is evaluated. Diverse models for the missingness process are considered, including dependence on covariates and number of events. The performances of five different analysis methods are assessed via simulations: a complete case analysis; two analyses using different single imputation techniques; a direct-likelihood analysis and an analysis using pattern-mixture models. The target dose estimation is robust if the same missingness process holds for the target dose group and the active control group. Furthermore, we demonstrate that this robustness is lost as soon as the missingness mechanisms for the active control and the target dose differ. Of the methods explored, the direct-likelihood approach performs best, even when a missing not at random mechanism holds. Copyright 2010 John Wiley & Sons, Ltd.
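
    A small simulation in the spirit of the setting above can make the ingredients concrete: recurrent events generated as over-dispersed Poisson counts via a gamma frailty, with a dropout time truncating each subject's observation period. The dose levels, rates, dispersion and dropout distribution are all assumed for illustration and are not the authors' simulation design.

        import numpy as np

        rng = np.random.default_rng(5)
        n_subjects, study_weeks = 200, 12
        dose = rng.choice([0.0, 1.0, 2.0, 4.0], size=n_subjects)    # hypothetical doses

        base_rate = 1.5 * np.exp(-0.3 * dose)          # events/week, decreasing with dose
        frailty = rng.gamma(shape=2.0, scale=0.5, size=n_subjects)  # over-dispersion

        # Dropout truncates follow-up; counts accrue only until dropout.
        followup = np.minimum(rng.exponential(scale=20.0, size=n_subjects), study_weeks)
        counts = rng.poisson(base_rate * frailty * followup)

        for d in np.unique(dose):
            m = dose == d
            rate = (counts[m] / followup[m]).mean()
            print(f"dose {d}: observed mean events/week = {rate:.2f}")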

  7. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    PubMed

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive or intensified passive surveillance from 2007 to 2012, a descriptive analysis was performed, and national and regional rates of YFV-associated neurotropic and neurological autoimmune disease, together with reporting rate ratios and their respective 95% confidence intervals, were calculated for first-time vaccinees stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND analyzed to date. Rates are similar to those of other recent studies, but in this study the age group from 5 to 9 years had the highest risk. As neurological adverse events have in general a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of the risk of infection by yellow fever virus.

  8. Analysis of Loss-of-Offsite-Power Events 1997-2015

    SciTech Connect

    Johnson, Nancy Ellen; Schroeder, John Alton

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.

  9. Application of satellite remote-sensing data for source analysis of fine particulate matter transport events.

    PubMed

    Engel-Cox, Jill A; Young, Gregory S; Hoff, Raymond M

    2005-09-01

    Satellite sensors have provided new datasets for monitoring regional and urban air quality. They provide comprehensive geospatial information on air quality with both qualitative imagery and quantitative data, such as aerosol optical depth. Yet there has been limited application of these new datasets in the study of air pollutant sources relevant to public policy. One promising approach to more directly link satellite sensor data to air quality policy is to integrate satellite sensor data with air quality parameters and models. This paper presents a visualization technique to integrate satellite sensor data, ground-based data, and back trajectory analysis relevant to a new rule concerning the transport of particulate matter across state boundaries. Overlaying satellite aerosol optical depth data and back trajectories in the days leading up to a known event of fine particulate matter with an aerodynamic diameter of <2.5 μm (PM2.5) may indicate whether transport or local sources appear to be most responsible for high PM2.5 levels in a certain location at a certain time. Events in five cities in the United States are presented as case studies. This type of analysis can be used to help understand the source locations of pollutants during specific events and to support regulatory compliance decisions in cases of long-distance transport.

  10. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis of hydrological data has been developing rapidly in recent years. In particular, the copula model has been used as an effective method because it places no restriction on the choice of marginal distributions. Time-series rainfall data can be separated into rainfall events using an inter-event time definition, and each rainfall event is characterized by its depth and duration. In addition, changes in rainfall depth have been studied recently due to climate change. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed, and their performance has been investigated in many studies. In the current study, bivariate frequency analysis is performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data to account for the effect of climate change. The copula parameter is estimated by the inference function for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used for the marginal distributions. As a result, the level curves of the copula model are obtained, and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.
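
    The bivariate step can be illustrated with the closed-form Gumbel-Hougaard copula, one member of the Archimedean family used above: C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)). The sketch below evaluates the copula and an 'AND' joint return period; the dependence parameter, marginal non-exceedance probabilities and mean inter-event time are assumed values, whereas the study estimates them from hourly data via IFM.

        import numpy as np

        def gumbel_copula(u, v, theta):
            """Gumbel-Hougaard copula CDF, theta >= 1 (theta = 1 is independence)."""
            s = (-np.log(u)) ** theta + (-np.log(v)) ** theta
            return np.exp(-s ** (1.0 / theta))

        theta = 2.5   # assumed dependence parameter
        u = 0.98      # non-exceedance probability of the event depth (assumed)
        v = 0.95      # non-exceedance probability of the event duration (assumed)

        joint_cdf = gumbel_copula(u, v, theta)
        p_and = 1.0 - u - v + joint_cdf        # P(depth exceeded AND duration exceeded)
        mean_interarrival = 0.1                # mean time between events [years], assumed

        print(f"C(u, v) = {joint_cdf:.4f}")
        print(f"joint 'AND' return period ≈ {mean_interarrival / p_and:.1f} years")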

  11. Image Analysis Algorithms for Immunohistochemical Assessment of Cell Death Events and Fibrosis in Tissue Sections

    PubMed Central

    Krajewska, Maryla; Smith, Layton H.; Rong, Juan; Huang, Xianshu; Hyer, Marc L.; Zeps, Nikolajs; Iacopetta, Barry; Linke, Steven P.; Olson, Allen H.; Reed, John C.; Krajewski, Stan

    2009-01-01

    Cell death is of broad physiological and pathological importance, making quantification of biochemical events associated with cell demise a high priority for experimental pathology. Fibrosis is a common consequence of tissue injury involving necrotic cell death. Using tissue specimens from experimental mouse models of traumatic brain injury, cardiac fibrosis, and cancer, as well as human tumor specimens assembled in tissue microarray (TMA) format, we undertook computer-assisted quantification of specific immunohistochemical and histological parameters that characterize processes associated with cell death. In this study, we demonstrated the utility of image analysis algorithms for color deconvolution, colocalization, and nuclear morphometry to characterize cell death events in tissue specimens: (a) subjected to immunostaining for detecting cleaved caspase-3, cleaved poly(ADP-ribose)-polymerase, cleaved lamin-A, phosphorylated histone H2AX, and Bcl-2; (b) analyzed by terminal deoxyribonucleotidyl transferase–mediated dUTP nick end labeling assay to detect DNA fragmentation; and (c) evaluated with Masson's trichrome staining. We developed novel algorithm-based scoring methods and validated them using TMAs as a high-throughput format. The proposed computer-assisted scoring methods for digital images by brightfield microscopy permit linear quantification of immunohistochemical and histochemical stainings. Examples are provided of digital image analysis performed in automated or semiautomated fashion for successful quantification of molecular events associated with cell death in tissue sections. (J Histochem Cytochem 57:649–663, 2009) PMID:19289554

  12. Ensemble-based analysis of extreme precipitation events from 2007-2011

    NASA Astrophysics Data System (ADS)

    Lynch, Samantha

    From 2007 to 2011, 22 widespread, multiday rain events occurred across the United States. This study makes use of the European Centre for Medium-Range Weather Forecasts (ECMWF), the National Centers for Environmental Prediction (NCEP), and the United Kingdom Met Office (UKMET) ensemble prediction systems (EPS) in order to assess their forecast skill for these 22 widespread precipitation events. Overall, the ECMWF had a skillful forecast for almost every event, with the exception of the 25-30 June 2007 event, a mesoscale convective vortex (MCV) over the southern plains of the United States. Additionally, the ECMWF EPS generally outperformed both the NCEP and UKMET EPS. To better evaluate the ECMWF, two widespread, multiday precipitation events were selected for closer examination: 29 April-4 May 2010 and 23-28 April 2011. The 29 April-4 May 2010 case study used ECMWF ensemble forecasts to explore the processes responsible for the development and maintenance of a multiday precipitation event that occurred in early May 2010, due to two successive quasi-stationary mesoscale convective systems. Locations in central Tennessee accumulated more than 483 millimeters (19 inches) of rain and the city of Nashville experienced a historic flash flood. Differences between ensemble members that correctly predicted heavy precipitation and those that did not were examined in order to identify the processes that were favorable or detrimental to the system's development. Statistical analysis was used to determine how synoptic-scale flows were correlated with area-averaged precipitation. For this particular case, the distribution of precipitation was found to be closely related to the strength of an upper-level trough in the central United States and an associated surface cyclone, with a weaker trough and cyclone being associated with more precipitation in the area of interest. The 23-28 April 2011 case study also used ECMWF ensemble forecasts to explore the processes

  13. Analysis of mutual events of Galilean satellites observed from VBO during 2014-2015

    NASA Astrophysics Data System (ADS)

    Vasundhara, R.; Selvakumar, G.; Anbazhagan, P.

    2017-06-01

    Results of analysis of 23 events of the 2014-2015 mutual event series from the Vainu Bappu Observatory are presented. Our intensity distribution model for the eclipsed/occulted satellite is based on the criterion that it simulates a rotational light curve that matches the ground-based light curve. Dichotomy in the scattering characteristics of the leading and trailing sides explains the basic shape of the rotational light curves of Europa, Ganymede and Callisto. In the case of Io, the albedo map (courtesy United States Geological Survey) along with global values of scattering parameters works well. Mean values of residuals in (O - C) along and perpendicular to the track are found to be -3.3 and -3.4 mas, respectively, compared to 'L2' theory for the seven 2E1/2O1 events. The corresponding rms values are 8.7 and 7.8 mas, respectively. For the five 1E3/1O3 events, the along and perpendicular to the track mean residuals are 5.6 and 3.2 mas, respectively. The corresponding rms residuals are 6.8 and 10.5 mas, respectively. We compare the results using the chosen model (Model 1) with a uniform but limb-darkened disc (Model 2). The residuals with Model 2 of the 2E1/2O1 and 1E3/1O3 events indicate a bias along the satellite track. The extent and direction of bias are consistent with the shift of the light centre from the geometric centre. Results using Model 1, which intrinsically takes into account the intensity distribution, show no such bias.

  14. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. To segment moving objects, only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories: Persons and Vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person" and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.

  15. The May 17, 2012 Solar Event: Back-Tracing Analysis and Flux Reconstruction with PAMELA

    NASA Technical Reports Server (NTRS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bravar, U.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Christian, Eric R.

    2016-01-01

    The PAMELA space experiment is providing the first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data by the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  16. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    PubMed

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  17. Video analysis of dust events in full-tungsten ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Brochard, F.; Shalpegin, A.; Bardin, S.; Lunt, T.; Rohde, V.; Briançon, J. L.; Pautasso, G.; Vorpahl, C.; Neu, R.; The ASDEX Upgrade Team

    2017-03-01

    Fast video data recorded during seven consecutive operation campaigns (2008-2012) in full-tungsten ASDEX Upgrade have been analyzed with an algorithm developed to automatically detect and track dust particles. A total of 2425 discharges have been analyzed, corresponding to 12 204 s of plasma operation. The analysis aimed at precisely identifying and sorting the discharge conditions responsible for dust generation or remobilization. Dust rates are found to be significantly lower than in tokamaks with carbon PFCs. Significant dust events occur mostly during off-normal plasma phases such as disruptions, particularly those preceded by vertical displacement events (VDEs). Dust rates are also increased, though to a lesser extent, during type-I ELMy H-modes. The influences of disruption energy, heating scenario, vessel venting and vessel vibrations are also presented.

  18. The May 17, 2012 solar event: back-tracing analysis and flux reconstruction with PAMELA

    NASA Astrophysics Data System (ADS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bravar, U.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Christian, E. C.; De Donato, C.; de Nolfo, G. A.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A. M.; Karelin, A. V.; Koldashov, S. V.; Koldobskiy, S.; Krutkov, S. Y.; Kvashnin, A. N.; Lee, M.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A. G.; Menn, W.; Mergè, M.; Mikhailov, V. V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S. B.; Ryan, J. M.; Sarkar, R.; Scotti, V.; Simon, M.; Sparvoli, R.; Spillantini, P.; Stochaj, S.; Stozhkov, Y. I.; Vacchi, A.; Vannuccini, E.; Vasilyev, G. I.; Voronov, S. A.; Yurkin, Y. T.; Zampa, G.; Zampa, N.; Zverev, V. G.

    2016-02-01

    The PAMELA space experiment is providing the first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data by the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  19. Forward flux sampling-type schemes for simulating rare events: efficiency analysis.

    PubMed

    Allen, Rosalind J; Frenkel, Daan; ten Wolde, Pieter Rein

    2006-05-21

    We analyze the efficiency of several simulation methods which we have recently proposed for calculating rate constants for rare events in stochastic dynamical systems in or out of equilibrium. We derive analytical expressions for the computational cost of using these methods and for the statistical error in the final estimate of the rate constant for a given computational cost. These expressions can be used to determine which method to use for a given problem, to optimize the choice of parameters, and to evaluate the significance of the results obtained. We apply the expressions to the two-dimensional nonequilibrium rare event problem proposed by Maier and Stein [Phys. Rev. E 48, 931 (1993)]. For this problem, our analysis gives accurate quantitative predictions for the computational efficiency of the three methods.

  20. The May 17, 2012 Solar Event: Back-Tracing Analysis and Flux Reconstruction with PAMELA

    NASA Technical Reports Server (NTRS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; hide

    2016-01-01

    The PAMELA space experiment is providing the first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data by the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  1. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables

    PubMed Central

    Yin, Yu; Yao, Dezhong

    2016-01-01

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an “event of relation” with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals. PMID:27389921

  2. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables.

    PubMed

    Yin, Yu; Yao, Dezhong

    2016-07-08

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an "event of relation" with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals.
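
    A loose sketch of the binomial-test idea described in the two records above: out of n detected 'events of relation' between two signals, k show a non-zero cause-to-effect delay, and under a no-directionality null both delay types are taken as equally likely. The counts and the one-sided alternative are hypothetical choices for illustration, not the authors' exact test construction.

        from scipy.stats import binomtest

        k_nonzero_delay = 42   # hypothetical count of relation events with non-zero delay
        n_events = 60          # hypothetical total number of relation events

        result = binomtest(k_nonzero_delay, n_events, p=0.5, alternative="greater")
        print(f"p-value for a directed (non-zero delay) relation: {result.pvalue:.4f}")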

  3. Non-parametric frequency analysis of extreme values for integrated disaster management considering probable maximum events

    NASA Astrophysics Data System (ADS)

    Takara, K. T.

    2015-12-01

    This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with computer-intensive statistics (CIS) resampling methods such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than the design level of flood control. Traditional parametric frequency analysis methods for extreme values include the following steps: Step 1: Collecting and checking extreme-value data; Step 2: Enumerating probability distributions that would fit the data well; Step 3: Parameter estimation; Step 4: Testing goodness of fit; Step 5: Checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: Selection of the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years. Overestimation examples are also demonstrated. Bootstrap resampling can provide bias correction for the NPM and can also express the estimation accuracy as the bootstrap standard error. The NPM has the advantage of avoiding various difficulties that arise in the above-mentioned steps of the traditional PM. Probable maximum events are also incorporated into the NPM as an upper bound of the hydrological variable. Probable maximum precipitation (PMP) and probable maximum flood (PMF) can be combined with the NPM as new parameter values. An idea of how to incorporate these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea stimulates a more integrated approach by geoscientists and statisticians and encourages practitioners to consider worst-case disasters in their disaster management planning and practices.
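
    A minimal sketch of the non-parametric route outlined above: the T-year quantile is read directly from the empirical distribution of an annual-maximum sample, with no distribution fitting, and the bootstrap supplies a standard error for that estimate. The synthetic sample below stands in for a record of more than 100 years of annual maximum rainfall.

        import numpy as np

        rng = np.random.default_rng(11)
        annual_max = rng.gumbel(loc=120.0, scale=35.0, size=120)   # hypothetical [mm]

        T = 100.0
        prob = 1.0 - 1.0 / T             # non-exceedance probability of the T-year event
        q_hat = np.quantile(annual_max, prob)

        # Bootstrap resampling gives the estimation accuracy as a standard error.
        boot = np.array([
            np.quantile(rng.choice(annual_max, size=annual_max.size, replace=True), prob)
            for _ in range(2000)
        ])
        print(f"100-year quantile ≈ {q_hat:.1f} mm, bootstrap SE ≈ {boot.std(ddof=1):.1f} mm")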

  4. Systematic identification and analysis of frequent gene fusion events in metabolic pathways

    DOE PAGES

    Henry, Christopher S.; Lerma-Ortiz, Claudia; Gerdes, Svetlana Y.; ...

    2016-06-24

    Here, gene fusions are the most powerful type of in silico-derived functional associations. However, many fusion compilations were made when <100 genomes were available, and algorithms for identifying fusions need updating to handle the current avalanche of sequenced genomes. The availability of a large fusion dataset would help probe functional associations and enable systematic analysis of where and why fusion events occur. As a result, here we present a systematic analysis of fusions in prokaryotes. We manually generated two training sets: (i) 121 fusions in the model organism Escherichia coli; (ii) 131 fusions found in B vitamin metabolism. These sets were used to develop a fusion prediction algorithm that captured the training set fusions with only 7 % false negatives and 50 % false positives, a substantial improvement over existing approaches. This algorithm was then applied to identify 3.8 million potential fusions across 11,473 genomes. The results of the analysis are available in a searchable database. A functional analysis identified 3,000 reactions associated with frequent fusion events and revealed areas of metabolism where fusions are particularly prevalent. In conclusion, customary definitions of fusions were shown to be ambiguous, and a stricter one was proposed. Exploring the genes participating in fusion events showed that they most commonly encode transporters, regulators, and metabolic enzymes. The major rationales for fusions between metabolic genes appear to be overcoming pathway bottlenecks, avoiding toxicity, controlling competing pathways, and facilitating expression and assembly of protein complexes. Finally, our fusion dataset provides powerful clues to decipher the biological activities of domains of unknown function.

  5. Systematic identification and analysis of frequent gene fusion events in metabolic pathways

    SciTech Connect

    Henry, Christopher S.; Lerma-Ortiz, Claudia; Gerdes, Svetlana Y.; Mullen, Jeffrey D.; Colasanti, Ric; Zhukov, Aleksey; Frelin, Oceane; Thiaville, Jennifer J.; Zallot, Remi; Niehaus, Thomas D.; Hasnain, Ghulam; Conrad, Neal; Hanson, Andrew D.; de Crecy-Lagard, Valerie

    2016-06-24

    Here, gene fusions are the most powerful type of in silico-derived functional associations. However, many fusion compilations were made when <100 genomes were available, and algorithms for identifying fusions need updating to handle the current avalanche of sequenced genomes. The availability of a large fusion dataset would help probe functional associations and enable systematic analysis of where and why fusion events occur. As a result, here we present a systematic analysis of fusions in prokaryotes. We manually generated two training sets: (i) 121 fusions in the model organism Escherichia coli; (ii) 131 fusions found in B vitamin metabolism. These sets were used to develop a fusion prediction algorithm that captured the training set fusions with only 7 % false negatives and 50 % false positives, a substantial improvement over existing approaches. This algorithm was then applied to identify 3.8 million potential fusions across 11,473 genomes. The results of the analysis are available in a searchable database. A functional analysis identified 3,000 reactions associated with frequent fusion events and revealed areas of metabolism where fusions are particularly prevalent. In conclusion, customary definitions of fusions were shown to be ambiguous, and a stricter one was proposed. Exploring the genes participating in fusion events showed that they most commonly encode transporters, regulators, and metabolic enzymes. The major rationales for fusions between metabolic genes appear to be overcoming pathway bottlenecks, avoiding toxicity, controlling competing pathways, and facilitating expression and assembly of protein complexes. Finally, our fusion dataset provides powerful clues to decipher the biological activities of domains of unknown function.

  6. Seamless Level 2/Level 3 probabilistic risk assessment using dynamic event tree analysis

    NASA Astrophysics Data System (ADS)

    Osborn, Douglas Matthew

    The current approach to Level 2 and Level 3 probabilistic risk assessment (PRA) using the conventional event-tree/fault-tree methodology requires pre-specification of the order in which events occur, which may vary significantly in the presence of uncertainties. Manual preparation of input data to evaluate the possible scenarios arising from these uncertainties may also lead to errors from faulty or incomplete input preparation, and their execution using serial runs may lead to computational challenges. A methodology has been developed for Level 2 analysis using dynamic event trees (DETs) that removes these limitations with systematic and mechanized quantification of the impact of aleatory uncertainties on possible consequences and their likelihoods. The methodology is implemented using the Analysis of Dynamic Accident Progression Trees (ADAPT) software. For the purposes of this work, aleatory uncertainties are defined as those arising from the stochastic nature of the processes under consideration, such as the variability of weather, in which the probability of weather patterns is predictable but the conditions at the time of the accident are a matter of chance. Epistemic uncertainties are regarded as those arising from the uncertainty in the model (system code) input parameters (e.g., friction or heat transfer correlation parameters). This work conducts a seamless Level 2/3 PRA using a DET analysis. The research helps to quantify and potentially reduce the magnitude of the source term uncertainty currently experienced in Level 3 PRA. Current techniques have been demonstrated with aleatory uncertainties for environmental releases of radioactive materials. This research incorporates epistemic and aleatory uncertainties in a phenomenologically consistent manner through the use of DETs. The DETs were determined using the ADAPT framework and linking ADAPT with MELCOR, MELMACCS, and the MELCOR Accident Consequence Code System, Version 2. Aleatory and epistemic uncertainties incorporated

  7. Uncertainty Analysis of Climate Change Impact on Extreme Rainfall Events in the Apalachicola River Basin, Florida

    NASA Astrophysics Data System (ADS)

    Wang, D.; Hagen, S.; Bacopoulos, P.

    2011-12-01

    The impact of climate change on rainfall patterns during the summer season (May-August) in the Apalachicola River basin (Florida Panhandle coast) is assessed using ensemble regional climate models (RCMs). Rainfall data for both baseline and future years (30-year periods) are obtained from the North American Regional Climate Change Assessment Program (NARCCAP), where the A2 emission scenario is used. Trend analysis is conducted based on historical rainfall data from three weather stations. Two methods are used to assess the climate change impact on the rainfall intensity-duration-frequency (IDF) curves: the maximum intensity percentile-based method, and the sequential bias correction plus maximum intensity percentile-based method. As a preliminary result from one RCM, extreme rainfall intensity is found to increase significantly, with the increase being more dramatic closer to the coast. The projected rainfall pattern changes (spatial and temporal, mean and extreme values) provide guidance for developing adaptation and mitigation strategies for water resources management and ecosystem protection. More rainfall events move from July to June in future years for all three stations; upstream, the variability in the timing of extreme rainfall increases and more extreme events are shown to occur in June and August instead of May. These temporal shifts of extreme rainfall events will increase the probability of simultaneous heavy rainfall upstream and downstream in June, during which flooding will be enhanced. The uncertainty analysis of the climate change impact on extreme rainfall events will be presented based on the simulations from the ensemble of RCMs.

  8. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    SciTech Connect

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J. )

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses, which include both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transients without scram (ATWS). Steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment bypass nature. After being benchmarked with the APETs, in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.

  9. Meta-Analysis of Relation of Vital Exhaustion to Cardiovascular Disease Events.

    PubMed

    Cohen, Randy; Bavishi, Chirag; Haider, Syed; Thankachen, Jincy; Rozanski, Alan

    2017-04-15

    To assess the net impact of vital exhaustion on cardiovascular events and all-cause mortality, we conducted a systematic search of PubMed, EMBASE, and PsychINFO (through April 2016) to identify all studies which investigated the relation between vital exhaustion (VE) and health outcomes. Inclusion criteria were as follows: (1) a cohort study (prospective cohort or historical cohort) consisting of adults (>18 years); (2) at least 1 self-reported or interview-based assessment of VE or exhaustion; (3) evaluated the association between vital exhaustion or exhaustion and relevant outcomes; and (4) reported adjusted risk estimates of vital exhaustion/exhaustion for outcomes. Maximally adjusted effect estimates with 95% CIs along with variables used for adjustment in multivariate analysis were also abstracted. Primary study outcome was cardiovascular events. Secondary outcomes were stroke and all-cause mortality. Seventeen studies (19 comparisons) with a total of 107,175 participants were included in the analysis. Mean follow-up was 6 years. VE was significantly associated with an increased risk for cardiovascular events (relative risk 1.53, 95% CI 1.28 to 1.83, p <0.001) and all-cause mortality (relative risk 1.48, 95% CI 1.28 to 1.72, p <0.001). VE also showed a trend for increased incident stroke (relative risk 1.46, 95% CI 0.97 to 2.21, p = 0.07). Subgroup analyses yielded similar results. VE is a significant risk factor for cardiovascular events, comparable in potency to common psychosocial risk factors. Our results imply a need to more closely study VE, and potentially related states of exhaustion, such as occupational burnout.
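
    The generic machinery behind a pooled estimate such as RR 1.53 (95% CI 1.28 to 1.83) is inverse-variance weighting of the study-level log relative risks. The sketch below shows that arithmetic on three invented studies using simple fixed-effect pooling; the published analysis pools 17 studies and uses random-effects methods, so this is an illustration of the principle only.

        import numpy as np

        rr = np.array([1.4, 1.7, 1.5])          # hypothetical study relative risks
        ci_low = np.array([1.1, 1.2, 1.0])
        ci_high = np.array([1.8, 2.4, 2.2])

        log_rr = np.log(rr)
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE from the 95% CI width
        w = 1.0 / se ** 2                                     # inverse-variance weights

        pooled = np.sum(w * log_rr) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
        print(f"pooled RR ≈ {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")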

  10. Final Report for Dynamic Models for Causal Analysis of Panel Data. Approaches to the Censoring Problem in Analysis of Event Histories. Part III, Chapter 2.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; Hannan, Michael T.

    The document, part of a series of chapters described in SO 011 759, considers the problem of censoring in the analysis of event-histories (data on dated events, including dates of change from one qualitative state to another). Censoring refers to the lack of information on events that occur before or after the period for which data are available.…
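
    As a small, hedged illustration of how right-censoring enters event-history estimation (a generic Kaplan-Meier sketch, not the chapter's own methods or data), censored cases in the example below leave the risk set without contributing an event.

    import numpy as np

    durations = np.array([2.0, 3.0, 3.0, 5.0, 8.0, 9.0, 12.0])   # illustrative durations
    observed = np.array([1, 1, 0, 1, 0, 1, 1])                   # 0 = censored before the event

    order = np.argsort(durations, kind="stable")
    durations, observed = durations[order], observed[order]

    at_risk = len(durations)
    survival = 1.0
    for t, d in zip(durations, observed):
        if d:                                    # only observed events lower the curve
            survival *= (at_risk - 1) / at_risk
            print(f"t = {t:4.1f}  S(t) = {survival:.3f}")
        at_risk -= 1                             # censored cases simply leave the risk set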

  11. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    NASA Astrophysics Data System (ADS)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the effect of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths from natural causes were recorded, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association for children less than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and among elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had immediate and delayed effects on mortality.
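
    A minimal sketch of the time-stratified referent selection used in case-crossover designs is given below (an assumed workflow, not the authors' code): for each death date, the referents are the other days in the same month and year that fall on the same day of the week, and the haze indicator follows the paper's definition of daily PM10 above 100 μg/m3. The PM10 series is synthetic.

    import numpy as np
    import pandas as pd

    days = pd.date_range("2000-01-01", "2000-03-31", freq="D")
    pm10 = pd.Series(np.random.default_rng(0).gamma(5, 15, len(days)), index=days)
    haze = (pm10 > 100).astype(int)              # 1 = haze day

    def referents(case_day):
        """All other same-weekday days in the same month and year as the case day."""
        return [d for d in days
                if d.year == case_day.year and d.month == case_day.month
                and d.dayofweek == case_day.dayofweek and d != case_day]

    case = pd.Timestamp("2000-02-15")
    print("case exposure:", haze[case],
          "| referent exposures:", [haze[d] for d in referents(case)])
    # Case and referent exposures would then enter a conditional logistic regression.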

  12. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L. (Sandia National Labs., Albuquerque, NM)

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs.

  13. An event-related analysis of P300 by simultaneous EEG/fMRI

    NASA Astrophysics Data System (ADS)

    Wang, Li-qun; Wang, Mingshi; Mizuhara, Hiroaki

    2006-09-01

    In this study, the P300 induced by visual stimuli was examined with simultaneous EEG/fMRI. To combine the high temporal resolution of EEG with the high spatial resolution of fMRI in estimating brain function, event-related analysis was central to this methodological trial. A 64-channel MR-compatible EEG amplifier (BrainAmp, Brain Products GmbH, Germany) was used for recording simultaneously with fMRI scanning. The reference channel was located between Fz, Cz and Pz. The sampling rate of the raw EEG was 5 kHz, and MR noise reduction was performed. EEG recording was synchronized with the MRI scan by our original stimulus system, and an oddball paradigm (presentation of Landolt rings in four orientations) was performed in the standard manner. After P300 segmentation, the P300 timings were exported for event-related analysis of the fMRI data with SPM99 software. In a single-subject study, significant activations appeared in the left superior frontal region, Broca's area and the parietal lobule bilaterally when the P300 occurred. This suggests that the P300 may reflect an integration carried out by top-down signals from frontal to parietal regions, regulating an attention and logical-judgment process. Compared with other current methods, event-related analysis with simultaneous EEG/fMRI has the advantage that it can describe cognitive processes realistically by unifying temporal and spatial information. Further examination and demonstration of the obtained results are expected to promote this powerful method.
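
    As a hedged sketch of the event-related idea (not the study's SPM99 pipeline), detected P300 onset times can be turned into an fMRI regressor by placing a stick function at the onsets and convolving it with a canonical gamma-shaped hemodynamic response function; the onsets and scan parameters below are illustrative assumptions.

    import numpy as np
    from scipy.stats import gamma

    tr, n_scans = 2.0, 150                                 # repetition time (s), volumes
    event_times = np.array([12.0, 44.0, 90.0, 210.0])      # hypothetical P300 onsets (s)

    t = np.arange(n_scans) * tr
    sticks = np.zeros(n_scans)
    sticks[np.searchsorted(t, event_times)] = 1.0

    hrf_t = np.arange(0, 30, tr)
    hrf = gamma.pdf(hrf_t, a=6) - 0.35 * gamma.pdf(hrf_t, a=16)   # simple double-gamma HRF
    hrf /= hrf.max()

    regressor = np.convolve(sticks, hrf)[:n_scans]   # column for the GLM design matrix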

  14. Point pattern analysis applied to flood and landslide damage events in Switzerland (1972-2009)

    NASA Astrophysics Data System (ADS)

    Barbería, Laura; Schulte, Lothar; Carvalho, Filipe; Peña, Juan Carlos

    2017-04-01

    Damage caused by meteorological and hydrological extreme events depends on many factors, not only on hazard, but also on exposure and vulnerability. In order to reach a better understanding of the relations between these complex factors, their spatial patterns and underlying processes, the spatial dependency between values of damage recorded at sites separated by different distances can be investigated by point pattern analysis. For the Swiss flood and landslide damage database (1972-2009), first steps of point pattern analysis have been carried out. The most severe events have been selected (severe, very severe and catastrophic, according to the GEES classification; a total of 784 damage points), and Ripley's K-test and L-test, among others, have been performed. For this purpose, R's library spatstat has been used. The results confirm that the damage points present a statistically significant clustered pattern, which could be connected to the prevalence of damage near watercourses and also to the rainfall distribution of each event, together with other factors. On the other hand, bivariate analysis shows there is no segregated pattern depending on process type: flood/debris flow vs. landslide. This close relation points to a coupling between slope and fluvial processes, connectivity between small- and middle-size catchments, and the influence of the spatial distribution of precipitation, temperature (snowmelt and snow line) and other predisposing factors such as soil moisture, land cover and environmental conditions. Therefore, further studies will investigate the relationship between the spatial pattern and one or more covariates, such as elevation, distance from the watercourse or land use. The final goal will be to fit a regression model to the data, so that the adjusted model predicts the intensity of the point process as a function of the above-mentioned covariates.
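
    The sketch below is a minimal, hedged illustration of Ripley's K and L functions for a point pattern in a rectangular window, ignoring the edge corrections that spatstat's Kest applies; the coordinates are random illustrative points, not the Swiss damage database.

    import numpy as np

    rng = np.random.default_rng(1)
    pts = rng.uniform(0, 100, size=(200, 2))     # 200 points in a 100 x 100 km window
    area = 100.0 * 100.0
    n = len(pts)
    lam = n / area                               # intensity (points per unit area)

    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)                  # exclude self-pairs

    radii = np.linspace(1, 25, 25)
    K = np.array([(d < r).sum() / (n * lam) for r in radii])
    L = np.sqrt(K / np.pi) - radii               # L(r) - r > 0 suggests clustering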

  15. Pinda: a web service for detection and analysis of intraspecies gene duplication events.

    PubMed

    Kontopoulos, Dimitrios-Georgios; Glykos, Nicholas M

    2013-09-01

    We present Pinda, a Web service for the detection and analysis of possible duplications of a given protein or DNA sequence within a source species. Pinda fully automates the whole gene duplication detection procedure, from performing the initial similarity searches, to generating the multiple sequence alignments and the corresponding phylogenetic trees, to bootstrapping the trees and producing a Z-score-based list of duplication candidates for the input sequence. Pinda has been cross-validated using an extensive set of known and bibliographically characterized duplication events. The service facilitates the automatic and dependable identification of gene duplication events, using some of the most successful bioinformatics software to perform an extensive analysis protocol. Pinda will prove of use for the analysis of newly discovered genes and proteins, thus also assisting the study of recently sequenced genomes. The service's location is http://orion.mbg.duth.gr/Pinda. The source code is freely available via https://github.com/dgkontopoulos/Pinda/. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    NASA Astrophysics Data System (ADS)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when critical areas interact with a particle hit, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effect influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was found that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation and therefore the overall resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  17. Hydroacoustic monitoring of a salt cavity: an analysis of precursory events of the collapse

    NASA Astrophysics Data System (ADS)

    Lebert, F.; Bernardie, S.; Mainsant, G.

    2011-09-01

    One of the main features of "post mining" research relates to available methods for monitoring mine-degradation processes that could directly threaten surface infrastructures. In this respect, GISOS, a French scientific interest group, is investigating techniques for monitoring the eventual collapse of underground cavities. One of the methods under investigation was monitoring the stability of a salt cavity through recording microseismic-precursor signals that may indicate the onset of rock failure. The data were recorded in a salt mine in Lorraine (France) when monitoring the controlled collapse of 2 000 000 m3 of rocks surrounding a cavity at 130 m depth. The monitoring in the 30 Hz to 3 kHz frequency range highlights the occurrence of events with high energy during periods of macroscopic movement, once the layers had ruptured; they appear to be the consequence of the post-rupture rock movements related to the intense deformation of the cavity roof. Moreover, the analysis shows the presence of some interesting precursory signals before the cavity collapsed. They occurred a few hours before the failure phases, when the rocks were being weakened and damaged. They originated from the damaging and breaking process, when micro-cracks appear and then coalesce. From these results we expect that deeper signal analysis and statistical analysis of the complete event time distribution (several million files) will allow us to finalize a complete typology of the signal families and their relations to the evolution steps of the cavity over the five years of monitoring.

  18. Time-frequency analysis of event-related potentials: a brief tutorial.

    PubMed

    Herrmann, Christoph S; Rach, Stefan; Vosskuhl, Johannes; Strüber, Daniel

    2014-07-01

    Event-related potentials (ERPs) reflect cognitive processes and are usually analyzed in the so-called time domain. Additional information on cognitive functions can be assessed when analyzing ERPs in the frequency domain and treating them as event-related oscillations (EROs). This procedure results in frequency spectra but lacks information about the temporal dynamics of EROs. Here, we describe a method, called time-frequency analysis, that allows analyzing both the frequency of an ERO and its evolution over time. In a brief tutorial, the reader will learn how to use wavelet analysis in order to compute time-frequency transforms of ERP data. Basic steps as well as potential artifacts are described. Rather than in terms of formulas, descriptions are in textual form (written text) with numerous figures illustrating the topics. Recommendations on how to present frequency and time-frequency data in journal articles are provided. Finally, we briefly review studies that have applied time-frequency analysis to mismatch negativity paradigms. The deviant stimulus of such a paradigm evokes an ERO in the theta frequency band that is stronger than for the standard stimulus. Conversely, the standard stimulus evokes a stronger gamma-band response than does the deviant. This is interpreted in the context of the so-called match-and-utilization model.
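
    As a hedged complement to the tutorial's description (not its own code), the sketch below computes a time-frequency representation of a toy ERP by convolving it with complex Morlet wavelets and taking squared magnitudes; the sampling rate, frequency range and number of cycles are illustrative choices.

    import numpy as np

    fs = 500.0                                         # sampling rate (Hz), assumed
    t = np.arange(-0.2, 1.0, 1 / fs)
    erp = np.exp(-((t - 0.3) / 0.05) ** 2) * np.sin(2 * np.pi * 10 * t)   # toy ERP

    freqs = np.arange(4, 41, 2)
    n_cycles = 5
    power = np.zeros((len(freqs), len(t)))
    for i, f in enumerate(freqs):
        sd = n_cycles / (2 * np.pi * f)                # wavelet width in seconds
        wt = np.arange(-3 * sd, 3 * sd, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * wt) * np.exp(-wt**2 / (2 * sd**2))
        wavelet /= np.abs(wavelet).sum()
        power[i] = np.abs(np.convolve(erp, wavelet, mode="same")) ** 2
    # power[i, j]: time-frequency power at frequency freqs[i] and time t[j]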

  19. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    PubMed

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence that verifies head impact events recorded by wearable sensors. The purpose of this study was to utilize video analysis to verify head impact events recorded by wearable sensors and to describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664

  20. Rare event computation in deterministic chaotic systems using genealogical particle analysis

    NASA Astrophysics Data System (ADS)

    Wouters, J.; Bouchet, F.

    2016-09-01

    In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein-Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function.
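
    The following is a minimal, hedged sketch of a genealogical particle (selection/cloning) estimator for a small over-threshold probability, illustrated on an Ornstein-Uhlenbeck process as in the paper's second example; the tilting strength, threshold and discretization are illustrative assumptions, not the authors' settings.

    import numpy as np

    rng = np.random.default_rng(0)
    N, n_steps, dt = 2000, 200, 0.05
    theta, sigma, k, threshold = 1.0, 1.0, 3.0, 2.5
    resample_every = 20

    x = np.zeros(N)                      # OU state of each particle
    runmax = np.zeros(N)                 # running maximum, carried through cloning
    prev_score = runmax.copy()
    log_norm = 0.0                       # accumulates log normalizing constants

    for step in range(1, n_steps + 1):
        x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(N)
        runmax = np.maximum(runmax, x)
        if step % resample_every == 0:
            g = np.exp(k * (runmax - prev_score))        # selection weights
            log_norm += np.log(g.mean())
            idx = rng.choice(N, size=N, p=g / g.sum())   # multinomial cloning
            x, runmax = x[idx], runmax[idx]
            prev_score = runmax.copy()

    # Undo the cumulative tilting along each surviving genealogical line.
    estimate = np.exp(log_norm) * np.mean((runmax >= threshold) * np.exp(-k * runmax))
    print(f"P(max over [0, T] >= {threshold}) ~ {estimate:.2e}")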

  1. Scientific Content Analysis (SCAN) Cannot Distinguish Between Truthful and Fabricated Accounts of a Negative Event

    PubMed Central

    Bogaard, Glynis; Meijer, Ewout H.; Vrij, Aldert; Merckelbach, Harald

    2016-01-01

    The Scientific Content Analysis (SCAN) is a verbal veracity assessment method that is currently used worldwide by investigative authorities. Yet, research investigating the accuracy of SCAN is scarce. The present study tested whether SCAN was able to accurately discriminate between true and fabricated statements. To this end, 117 participants were asked to write down one true and one fabricated statement about a recent negative event that happened in their lives. All statements were analyzed using 11 criteria derived from SCAN. Results indicated that SCAN was not able to correctly classify true and fabricated statements. Lacking empirical support, the application of SCAN in its current form should be discouraged. PMID:26941694

  2. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    NASA Astrophysics Data System (ADS)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, as one possible originating cause of tsunamis, and propose to analyze Twitter messages to capture testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e., agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets and thus for observing correlations of events across languages. One way to overcome this deficit is to identify geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the forecast points of the tsunami scenario. We also intend to use Twitter analysis for situation picture

  3. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005; 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008; 87,652 deaths), and Haiti (magnitude 7.0, 2010; 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has seen the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami, which caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, making it the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia, volcanic eruption, which resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  4. Mechanical Engineering Safety Note: Analysis and Control of Hazards Associated with NIF Capacitor Module Events

    SciTech Connect

    Brereton, S

    2001-08-01

    the total free oil available in a capacitor (approximately 10,900 g), on the order of 5% or less. The estimates of module pressure were used to estimate the potential overpressure in the capacitor bays after an event. It was shown that the expected capacitor bay overpressure would be less than the structural tolerance of the walls. Thus, it does not appear necessary to provide any pressure relief for the capacitor bays. The ray tracing analysis showed the new module concept to be 100% effective at containing fragments generated during the events. The analysis demonstrated that all fragments would impact an energy absorbing surface on the way out of the module. Thus, there is high confidence that energetic fragments will not escape the module. However, since the module was not tested, it was recommended that a form of secondary containment on the walls of the capacitor bays (e.g., 1.0 inch of fire-retardant plywood) be provided. Any doors to the exterior of the capacitor bays should be of equivalent thickness of steel or suitably armed with a thickness of plywood. Penetrations in the ceiling of the interior bays (leading to the mechanical equipment room) do not require additional protection to form a secondary barrier. The mezzanine and the air handling units (penetrations lead directly to the air handling units) provide a sufficient second layer of protection.

  5. Classification of persistent heavy rainfall events over South China and associated moisture source analysis

    NASA Astrophysics Data System (ADS)

    Liu, Ruixin; Sun, Jianhua; Wei, Jie; Fu, Shenming

    2016-08-01

    Persistent heavy rainfall events (PHREs) over South China during 1981-2014 were selected and classified by an objective method, based on the daily precipitation data at 752 stations in China. The circulation characteristics, as well as the dry-cold air and moisture sources of each type of PHRE, were examined. The main results are as follows. A total of 32 non-typhoon-influenced PHREs in South China were identified over the study period. By correlation analysis, the PHREs are divided into three types: SC-A type, with its main rainbelt located in the coastal areas and the northeast of Guangdong Province; SC-B type, with its main rainbelt between Guangdong Province and Guangxi Region; and SC-C type, with its main rainbelt located in the north of Guangxi Region. For the SC-A events, dry, cold air flowed to South China under the steering effect of mid-tropospheric troughs originating from the Ural Mountains and the West Siberian Plain, whereas the SC-C events were not influenced by cold air from high latitudes. There were three water vapor pathways from low-latitude areas for both the SC-A and SC-C PHREs. The tropical Indian Ocean was the main water vapor source for these two PHRE types, while the South China Sea also contributed to the SC-C PHREs. In addition, the SC-A events were also influenced by moist and cold air originating from the Yellow Sea. Generally, the SC-C PHREs belonged to a warm-sector rainfall type, whose precipitation areas were dominated by southwesterly winds, with convergence in wind speed being the main cause of precipitation.

  6. Single event time-series analysis in a karst catchment evaluated using a groundwater model

    NASA Astrophysics Data System (ADS)

    Mayaud, Cyril; Wagner, Thomas; Birk, Steffen

    2013-04-01

    The Lurbach-Tanneben karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low-permeability schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 (Oswald et al., EGU2009-9255) demonstrate that an overflow from one of the sub-catchments to the other is activated if the spring discharge exceeds a threshold. Time-series analysis (e.g., auto-correlation, cross-correlation) was applied to examine how far the various available methods support the identification of the transient inter-catchment flow observed in this karst system. As inter-catchment flow is intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time-series analysis, a simplified groundwater flow model was built using MODFLOW based on the current conceptual understanding of the karst system. The groundwater model represents a synthetic karst aquifer to which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what was observed during the tracer experiment. Various options of recharge (e.g., allogenic versus autogenic) were used to generate synthetic discharge data for the time-series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of recharge and aquifer properties in the results from the time-series analysis. Comparing the results from the time-series analysis of the observed data with those of the synthetic data, a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. Thus, the heterogeneity of hydraulic aquifer
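
    As a hedged illustration of the kind of single-event cross-correlation analysis mentioned above (synthetic series, not the Lurbach-Tanneben data), the sketch below correlates a recharge input with a lagged spring discharge and reads the system response time off the lag of the maximum correlation.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 240                                       # hourly values for a single event
    recharge = np.convolve(rng.exponential(1, n), np.ones(5) / 5, mode="same")
    lag_true = 12
    discharge = np.roll(recharge, lag_true) + 0.1 * rng.standard_normal(n)

    x = (recharge - recharge.mean()) / recharge.std()
    y = (discharge - discharge.mean()) / discharge.std()
    lags = np.arange(0, 49)
    ccf = np.array([np.mean(x[: n - k] * y[k:]) for k in lags])
    print("estimated response lag (h):", lags[np.argmax(ccf)])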

  7. Parallel Factor Analysis as an exploratory tool for wavelet transformed event-related EEG.

    PubMed

    Mørup, Morten; Hansen, Lars Kai; Herrmann, Christoph S; Parnas, Josef; Arnfred, Sidse M

    2006-02-01

    In the decomposition of multi-channel EEG signals, principal component analysis (PCA) and independent component analysis (ICA) have widely been used. However, as both methods are based on handling two-way data, i.e. two-dimensional matrices, multi-way methods might improve the interpretation of frequency transformed multi-channel EEG of channel x frequency x time data. The multi-way decomposition method Parallel Factor (PARAFAC), also named Canonical Decomposition (CANDECOMP), was recently used to decompose the wavelet transformed ongoing EEG of channel x frequency x time (Miwakeichi, F., Martinez-Montes, E., Valdes-Sosa, P.A., Nishiyama, N., Mizuhara, H., Yamaguchi, Y., 2004. Decomposing EEG data into space-time-frequency components using parallel factor analysis. Neuroimage 22, 1035-1045). In this article, PARAFAC is used for the first time to decompose wavelet transformed event-related EEG given by the inter-trial phase coherence (ITPC) encompassing ANOVA analysis of differences between conditions and 5-way analysis of channel x frequency x time x subject x condition. A flow chart is presented on how to perform data exploration using the PARAFAC decomposition on multi-way arrays. This includes (A) channel x frequency x time 3-way arrays of F test values from a repeated measures analysis of variance (ANOVA) between two stimulus conditions; (B) subject-specific 3-way analyses; and (C) an overall 5-way analysis of channel x frequency x time x subject x condition. The PARAFAC decompositions were able to extract the expected features of a previously reported ERP paradigm: namely, a quantitative difference of coherent occipital gamma activity between conditions of a visual paradigm. Furthermore, the method revealed a qualitative difference which has not previously been reported. The PARAFAC decomposition of the 3-way array of ANOVA F test values clearly showed the difference of regions of interest across modalities, while the 5-way analysis enabled visualization of
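
    A minimal sketch of a PARAFAC/CANDECOMP decomposition of a 3-way channel x frequency x time array is shown below, using the tensorly library as an assumed tool rather than the study's own code; the array is random and merely stands in for an ITPC tensor.

    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    rng = np.random.default_rng(3)
    itpc = tl.tensor(rng.random((64, 30, 200)))    # channels x frequencies x time points

    weights, factors = parafac(itpc, rank=3)       # [channel, frequency, time] loadings
    for name, f in zip(["channel", "frequency", "time"], factors):
        print(name, "loading matrix shape:", f.shape)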

  8. Wood anatomical analysis of Alnus incana and Betula pendula injured by a debris-flow event.

    PubMed

    Arbellay, Estelle; Stoffel, Markus; Bollschweiler, Michelle

    2010-10-01

    Vessel chronologies in ring-porous species have been successfully employed in the past to extract the climate signal from tree rings. Environmental signals recorded in vessels of ring-porous species have also been used in previous studies to reconstruct discrete events of drought, flooding and insect defoliation. However, very little is known about the ability of diffuse-porous species to record environmental signals in their xylem cells. Moreover, time series of wood anatomical features have only rarely been used to reconstruct former geomorphic events. This study was therefore undertaken to characterize the wood anatomical response of diffuse-porous Alnus incana (L.) Moench and Betula pendula Roth to debris-flow-induced wounding. Tree microscopic response to wounding was assessed through the analysis of wood anatomical differences between injured rings formed in the debris-flow event year and uninjured rings formed in the previous year. The two ring types were examined close and opposite to the injury in order to determine whether wound effects on xylem cells decrease with increasing tangential distance from the injury. Image analysis was used to measure vessel parameters as well as fiber and parenchyma cell (FPC) parameters. The results of this study indicate that injured rings are characterized by smaller vessels as compared with uninjured rings. By contrast, FPC parameters were not found to significantly differ between injured and uninjured rings. Vessel and FPC parameters mainly remained constant with increasing tangential distance from the injury, except for a higher proportion of vessel lumen area opposite to the injury within A. incana. This study highlights the existence of anatomical tree-ring signatures, in the form of smaller vessels, related to past debris-flow activity and addresses a new methodological approach to date injuries inflicted on trees by geomorphic processes.

  9. Evaluation of automated streamwater sampling during storm events for total mercury analysis.

    PubMed

    Riscassi, Ami L; Converse, Amber D; Hokanson, Kelly J; Scanlon, Todd M

    2010-10-06

    Understanding the processes by which mercury is mobilized from soil to stream is currently limited by a lack of observations during high-flow events, when the majority of this transport occurs. An automated technique to collect stream water for unfiltered total mercury (HgT) analysis was systematically evaluated in a series of laboratory experiments. Potential sources of error investigated were 1) carry-over effects associated with sequential sampling, 2) deposition of HgT into empty bottles prior to sampling, and 3) deposition to or evasion from samples prior to retrieval. Contamination from carry-over effects was minimal (<2%) and HgT deposition to open bottles was negligible. Potentially greater errors are associated with evasive losses of HgT from uncapped samples, with higher temperatures leading to greater evasion. These evasive losses were found to take place primarily within the first eight hours. HgT associated with particulate material is much less prone to evasion than HgT in dissolved form. A field test conducted during a high-flow event confirmed unfiltered HgT concentrations sampled with an automated system were comparable to those taken manually, as the mean absolute difference between automated and manual samples (10%) was similar to the mean difference between duplicate grab samples (9%). Results from this study have demonstrated that a standard automated sampler, retrofitted with appropriately cleaned fluoropolymer tubing and glass bottles, can effectively be used for collection of streamwater during high-flow events for low-level mercury analysis.

  10. Developing indicators of inpatient adverse drug events through nonlinear analysis using administrative data.

    PubMed

    Nebeker, Jonathan R; Yarnold, Paul R; Soltysik, Robert C; Sauer, Brian C; Sims, Shannon A; Samore, Matthew H; Rupper, Randall W; Swanson, Kathleen M; Savitz, Lucy A; Shinogle, Judith; Xu, Wu

    2007-10-01

    Because of uniform availability, hospital administrative data are appealing for surveillance of adverse drug events (ADEs). Expert-generated surveillance rules that rely on the presence of International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) codes have limited accuracy. Rules based on nonlinear associations among all types of available administrative data may be more accurate. The objective was to derive and validate surveillance rules for bleeding/anticoagulation problems and delirium/psychosis by applying hierarchically optimal classification tree analysis (HOCTA) to administrative data. The design was a retrospective cohort drawing on a random sample of 3987 admissions from all 41 Utah acute-care hospitals in 2001 and 2003. Professional nurse reviewers identified ADEs using implicit chart review. Pharmacists assigned Medical Dictionary for Regulatory Activities codes to ADE descriptions for identification of clinical groups of events. Hospitals provided patient demographic, admission, and ICD-9-CM data. Incidence proportions were 0.8% for drug-induced bleeding/anticoagulation problems and 1.0% for drug-induced delirium/psychosis. The model for bleeding had very good discrimination (0.87) and sensitivity (86%) and fair positive predictive value (PPV, 12%). The model for delirium had excellent sensitivity (94%) and good discrimination (0.83), but low PPV (3%). Poisoning and adverse event codes designed for the targeted ADEs had low sensitivities and, when forced in, degraded model accuracy. Hierarchically optimal classification tree analysis is a promising method for rapidly developing clinically meaningful surveillance rules for administrative data. The resultant model for drug-induced bleeding and anticoagulation problems may be useful for retrospective ADE screening and rate estimation.

  11. Reconstructing the 1771 Great Yaeyama Tsunami Event by using Impact Intensity Analysis and Volume Flux Method

    NASA Astrophysics Data System (ADS)

    Wu, Han; Wu, Tso-Ren; Lee, Chun-Juei; Tsai, Yu-Lin; Li, Pei-Yu

    2017-04-01

    The 1771 Ishigaki earthquake in Japan induced a large tsunami with a recorded runup height of 80 meters. Several reef boulders transported by the huge tsunami waves were found along the coast, located at elevations of about 30 meters. Considering the limited distance between the Yaeyama Islands and Taiwan, this study aimed to understand the behavior of tsunami propagation and the potential hazard to Taiwan. Reconstructing the 1771 event and validating the result against the field survey is the first step. In order to analyze the hazard from potential tsunami sources around the event area, we adopted the Impact Intensity Analysis (IIA), which has been presented at EGU 2016 and many other international conferences. Beyond the IIA method, we further developed a new method called the Volume Flux Method (VFM). The VFM retains the accuracy of the IIA method while improving its efficiency significantly. The results showed that the source of the 1771 Great Yaeyama Tsunami was most likely located off the southern coast of Ishigaki Island. The wave height and inundation area matched the survey map (Geospatial Information Authority of Japan, 1994). The tsunami threat to Taiwan was also simulated. The simulations indicated that the tsunami height would not exceed 1 meter on the east coast of Taiwan if the tsunami source were located nearshore around Ishigaki Island. However, it is noteworthy that the northeast coast of Taiwan would be under tsunami threat if the sources were located offshore to the south, on the Ryukyu Trench. We will present the detailed results at EGU 2017.

  12. Analysis of the variation of the 0°C isothermal altitude during rainfall events

    NASA Astrophysics Data System (ADS)

    Zeimetz, Fränz; Garcìa, Javier; Schaefli, Bettina; Schleiss, Anton J.

    2016-04-01

    In numerous countries of the world (USA, Canada, Sweden, Switzerland, …), dam safety verifications for extreme floods are carried out by referring to the so-called Probable Maximum Flood (PMF). According to the World Meteorological Organization (WMO), this PMF is determined based on the PMP (Probable Maximum Precipitation). The PMF estimation is performed with a hydrological simulation model by routing the PMP. The PMP-PMF simulation is normally event based; therefore, if no further information is known, the simulation needs assumptions concerning the initial soil conditions such as saturation or snow cover. In addition, temperature series are of interest for the PMP-PMF simulations. Temperature values can be deduced not only from temperature measurements; using the temperature gradient method, the 0°C isothermal altitude can also provide temperature estimates at the ground. For practitioners, using the isothermal altitude to characterize temperature is convenient and simpler because a single value can give information over a large region under the assumption of a certain temperature gradient. The analysis of the evolution of the 0°C isothermal altitude during rainfall events presented here is based on meteorological soundings from the two sounding stations of Payerne (CH) and Milan (I). Furthermore, hourly rainfall and temperature data are available from 110 pluviometers spread over the Swiss territory. The analysis of the evolution of the 0°C isothermal altitude is undertaken for different precipitation durations based on the meteorological measurements mentioned above. The results show that, on average, the isothermal altitude tends to decrease during rainfall events and that a correlation exists between the duration of the altitude loss and the duration of the rainfall. A significant difference in altitude loss appears when the soundings from Payerne and Milan are compared.
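
    As a hedged illustration of the temperature-gradient idea described above, the small helper below estimates the ground temperature at a station from the 0°C isothermal altitude under an assumed constant lapse rate; the lapse rate and elevations are illustrative, not values from the study.

    def ground_temperature(iso_altitude_m, station_elevation_m, lapse_rate_c_per_km=6.5):
        """Temperature (deg C) at the station, assuming T = 0 deg C at the isothermal altitude."""
        return (iso_altitude_m - station_elevation_m) * lapse_rate_c_per_km / 1000.0

    print(ground_temperature(3200.0, 450.0))   # roughly 17.9 deg C at a 450 m station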

  13. Analysis of Loss-of-Offsite-Power Events 1998–2013

    SciTech Connect

    Schroeder, John Alton

    2015-02-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant's ability to achieve and maintain safe shutdown conditions. Risk analyses suggest that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP events and the subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience during calendar years 1997 through 2013. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator failure modes considered are failure to start, failure to load and run, and failure to run more than 1 hour. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates for reliability are provided for the entire active period. No statistically significant trends in LOOP frequencies over the 1997–2013 period are identified. There is a possibility that a significant trend in grid-related LOOP frequency exists that is not easily detected by a simple analysis. Statistically significant increases in recovery times after grid- and switchyard-related LOOPs are identified.

  14. Replica analysis of overfitting in regression models for time-to-event data

    NASA Astrophysics Data System (ADS)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  15. Meta-analysis of safety for low event-rate binomial trials.

    PubMed

    Shuster, Jonathan J; Guo, Jennifer D; Skyler, Jay S

    2012-03-01

    This article focuses on meta-analysis of low event-rate binomial trials. We introduce two forms of random effects: (1) 'studies at random' (SR), where we assume no more than independence between studies; and (2) 'effects at random' (ER), which forces the effect size distribution to be independent of the study design. On the basis of the summary estimates of proportions, we present both unweighted and study-size weighted methods, which, under SR, target different population parameters. We demonstrate mechanistically that the popular DerSimonian-Laird (DL) method, as DL actually warned in their paper, should never be used in this setting. We conducted a survey of the major cardiovascular literature on low event-rate studies and found DL using odds ratios or relative risks to be the clear method of choice. We looked at two high profile examples from diabetes and cancer, respectively, where the choice of weighted versus unweighted methods makes a large difference. A large simulation study supports the accuracy of the coverage of our approximate confidence intervals. We recommend that before looking at their data, users should prespecify which target parameter they intend to estimate (weighted vs. unweighted) but estimate the other as a secondary analysis. Copyright © 2012 John Wiley & Sons, Ltd.

  16. Meta-analysis of safety for low event-rate binomial trials

    PubMed Central

    Shuster, Jonathan J; Guo, Jennifer D.; Skyler, Jay S.

    2013-01-01

    SUMMARY This article focuses on meta-analysis of low event-rate binomial trials. We introduce two forms of random effects: (1) “Studies at Random” (SR), where we assume no more than independence between studies; and (2) “Effects at random” (ER), which forces the effect size distribution to be independent of the study design. Based upon summary estimates of proportions, we present both unweighted and study size weighted methods, which under SR target different population parameters. We demonstrate mechanistically that the popular DerSimonian-Laird (DL) method, as DL actually warned in their paper, should never be used in this setting. We conducted a survey of the major cardiovascular literature on low event rate studies and found that DL using odds ratios or relative risks to be the clear method of choice. We looked at two high profile examples from diabetes and cancer, respectively, where the choice of weighted vs. unweighted methods makes a large difference. A large simulation study supports the accuracy of coverage of our approximate confidence intervals. We recommend that before looking at their data, users should pre-specify which target parameter they intend to estimate (weighted vs. unweighted), but estimate the other as a secondary analysis. PMID:24339834

  17. Numerical analysis of seismic events distributions on the planetary scale and celestial bodies astrometrical parameters

    NASA Astrophysics Data System (ADS)

    Bulatova, Dr.

    2012-04-01

    Modern research in the Earth sciences is developing from descriptions of individual natural phenomena to systematic, complex research in interdisciplinary areas. For studies of this kind, in the form of numerical analysis of three-dimensional (3D) systems, the author proposes a space-time technology (STT), based on a Ptolemaic geocentric system, consisting of two modules, each with its own coordinate system: (1) a 3D model of the Earth, whose coordinates are provided by databases of the Earth's events (here, seismic), and (2) a compact model of the relative motion of celestial bodies in space-time on Earth, known as the "Method of a moving source" (MDS), which was developed (Bulatova, 1998-2000) for 3D space. Module (2) was developed as a continuation of the geocentric Ptolemaic system of the world, built on the astronomical parameters of heavenly bodies. Based on aggregated data from the space and Earth sciences, systematization, and cooperative analysis, this is an attempt to establish a cause-effect relationship between the positions of celestial bodies (Moon, Sun) and the Earth's seismic events.

  18. Considering historical flood events in flood frequency analysis: Is it worth the effort?

    NASA Astrophysics Data System (ADS)

    Schendel, Thomas; Thongwichian, Rossukon

    2017-07-01

    Information about historical floods can be useful in reducing uncertainties in flood frequency estimation. Since the start of the historical record is often defined by the first known flood, the length of the true historical period M remains unknown. We have expanded a previously published method of estimating M to the case of several known floods within the historical period. We performed a systematic evaluation of the usefulness of including historical flood events in flood frequency analysis for a wide range of return periods and studied bias as well as relative root mean square error (RRMSE). Since we used the generalized extreme value distribution (GEV) as parent distribution, we were able to investigate the impact of varying the skewness on RRMSE. We confirmed the usefulness of historical flood data regarding the reduction of RRMSE; however, we found that this reduction is less pronounced the more positively skewed the parent distribution was. Including historical flood information had an ambiguous effect on bias: depending on the length and number of known floods of the historical period, bias was reduced for large return periods but increased for smaller ones. Finally, we customized the test inversion bootstrap for estimating confidence intervals to the case in which historical flood events are taken into account in flood frequency analysis.
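
    For orientation only, the hedged sketch below fits a GEV distribution to a synthetic annual-maximum flood series with scipy and reads off a 100-year return level; the paper's actual contribution concerns how adding historical floods to such an analysis changes bias and RRMSE, which this toy example does not reproduce.

    from scipy.stats import genextreme

    # synthetic annual-maximum discharges (m3/s), illustrative only
    annual_max = genextreme.rvs(c=-0.1, loc=300, scale=80, size=60, random_state=4)

    shape, loc, scale = genextreme.fit(annual_max)
    T = 100                                               # return period in years
    q100 = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"estimated {T}-year flood: {q100:.0f} m3/s")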

  19. Short Term Forecasts of Volcanic Activity Using An Event Tree Analysis System and Logistic Regression

    NASA Astrophysics Data System (ADS)

    Junek, W. N.; Jones, W. L.; Woods, M. T.

    2011-12-01

    An automated event tree analysis system for estimating the probability of short term volcanic activity is presented. The algorithm is driven by a suite of empirical statistical models that are derived through logistic regression. Each model is constructed from a multidisciplinary dataset that was assembled from a collection of historic volcanic unrest episodes. The dataset consists of monitoring measurements (e.g. InSAR, seismic), source modeling results, and historic eruption activity. This provides a simple mechanism for simultaneously accounting for the geophysical changes occurring within the volcano and the historic behavior of analog volcanoes. The algorithm is extensible and can be easily recalibrated to include new or additional monitoring, modeling, or historic information. Standard cross validation techniques are employed to optimize its forecasting capabilities. Analysis results from several recent volcanic unrest episodes are presented.
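
    As a hedged sketch of the kind of empirical model described above (illustrative feature names and synthetic data, not the authors' dataset), the logistic regression below maps monitoring features for an unrest episode to the probability of one event-tree branch, e.g. eruptive activity within the forecast window.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)
    # columns: seismicity rate (events/day), deformation rate (cm/yr), prior eruptions
    X = np.column_stack([rng.poisson(20, 300), rng.normal(2, 1, 300), rng.integers(0, 5, 300)])
    y = (0.05 * X[:, 0] + 0.8 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 1, 300) > 3.5).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    new_episode = np.array([[35.0, 3.2, 2.0]])
    print("P(activity | monitoring data) =", model.predict_proba(new_episode)[0, 1])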

  20. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    NASA Astrophysics Data System (ADS)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

    Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and might aid in identifying hot-spot regions for potential climate-triggered violent social conflicts.

  1. Detecting event-related recurrences by symbolic analysis: applications to human language processing

    PubMed Central

    beim Graben, Peter; Hutt, Axel

    2015-01-01

    Quasi-stationarity is ubiquitous in complex dynamical systems. In brain dynamics, there is ample evidence that event-related potentials (ERPs) reflect such quasi-stationary states. In order to detect them from time series, several segmentation techniques have been proposed. In this study, we elaborate a recent approach for detecting quasi-stationary states as recurrence domains by means of recurrence analysis and subsequent symbolization methods. We address two pertinent problems of contemporary recurrence analysis: optimizing the size of recurrence neighbourhoods and identifying symbols from different realizations for sequence alignment. As possible solutions for these problems, we suggest a maximum entropy criterion and a Hausdorff clustering algorithm. The resulting recurrence domains for single-subject ERPs are obtained as partition cells reflecting quasi-stationary brain states. PMID:25548270
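
    The toy sketch below gives a hedged illustration of the recurrence-domain idea: a recurrence matrix is built with a distance threshold, and time points are then given crude symbols according to the first earlier state they recur with; it is not the authors' maximum-entropy / Hausdorff-clustering pipeline.

    import numpy as np

    rng = np.random.default_rng(6)
    # toy signal with two quasi-stationary levels, mimicking ERP-like states
    x = np.concatenate([rng.normal(0, 0.2, 100), rng.normal(1.5, 0.2, 100)])

    eps = 0.4                                        # recurrence neighbourhood size
    R = np.abs(x[:, None] - x[None, :]) < eps        # recurrence matrix

    # Crude symbolization: label each time point by the first point it recurs with,
    # so points sharing a recurrence domain tend to share a symbol.
    symbols = np.array([np.argmax(R[i]) for i in range(len(x))])
    print("distinct recurrence-domain symbols:", np.unique(symbols))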

  2. Detection and analysis of high-temperature events in the BIRD mission

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2005-01-01

    The primary mission objective of a new small Bi-spectral InfraRed Detection (BIRD) satellite is the detection and quantitative analysis of high-temperature events such as fires and volcanoes. An absence of saturation in the BIRD infrared channels makes it possible to improve false alarm rejection as well as to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area and radiative energy release. Examples are given of the detection and analysis of wildfires and coal seam fires, of volcanic activity, and of oil fires in Iraq. The smallest fires detected by BIRD, which were verified on the ground, had an area of 12 m2 in the daytime and 4 m2 at night.

  3. Spousal Communication and Contraceptive Use in Rural Nepal: An Event History Analysis

    PubMed Central

    Link, Cynthia F.

    2012-01-01

    This study analyzes longitudinal data from couples in rural Nepal to investigate the influence of spousal communication about family planning on their subsequent contraceptive use. The study expands current understanding of the communication–contraception link by (a) exploiting monthly panel data to conduct an event history analysis, (b) incorporating both wives’ and husbands’ perceptions of communication, and (c) distinguishing effects of spousal communication on the use of four contraceptive methods. The findings provide new evidence of a strong positive impact of spousal communication on contraceptive use, even when controlling for confounding variables. Wives’ reports of communication are substantial explanatory factors in couples’ initiation of all contraceptive methods examined. Husbands’ reports of communication predict couples’ subsequent use of male-controlled methods. This analysis advances our understanding of how marital dynamics—as well as husbands’ perceptions of these dynamics—influence fertility behavior, and should encourage policies to promote greater integration of men into family planning programs. PMID:21834410

  4. Spousal communication and contraceptive use in rural Nepal: an event history analysis.

    PubMed

    Link, Cynthia F

    2011-06-01

    This study analyzes longitudinal data from couples in rural Nepal to investigate the influence of spousal communication about family planning on their subsequent contraceptive use. The study expands current understanding of the communication-contraception link by (a) exploiting monthly panel data to conduct an event history analysis, (b) incorporating both wives' and husbands' perceptions of communication, and (c) distinguishing effects of spousal communication on the use of four contraceptive methods. The findings provide new evidence of a strong positive impact of spousal communication on contraceptive use, even when controlling for confounding variables. Wives' reports of communication are substantial explanatory factors in couples' initiation of all contraceptive methods examined. Husbands' reports of communication predict couples' subsequent use of male-controlled methods. This analysis advances our understanding of how marital dynamics, as well as husbands' perceptions of these dynamics, influence fertility behavior, and should encourage policies to promote greater integration of men into family planning programs.

  5. Modeling propensity to move after job change using event history analysis and temporal GIS

    NASA Astrophysics Data System (ADS)

    Vandersmissen, Marie-Hélène; Séguin, Anne-Marie; Thériault, Marius; Claramunt, Christophe

    2009-03-01

    The research presented in this paper analyzes the emergent residential behaviors of individual actors in a context of profound social changes in the work sphere. It incorporates a long-term view in the analysis of the relationships between social changes in the work sphere and these behaviors. The general hypothesis is that social changes produce complex changes in the long-term dynamics of residential location behavior. More precisely, the objective of this paper is to estimate the propensity for professional workers to move house after a change of workplace. Our analysis draws on data from a biographical survey using a retrospective questionnaire that enables a posteriori reconstitution of the familial, professional and residential lifelines of professional workers since their departure from their parents’ home. The survey was conducted in 1996 in the Quebec City Metropolitan Area, which, much like other Canadian cities, has experienced a substantial increase in “unstable” work, even for professionals. The approach is based on event history analysis, a Temporal Geographic Information System and exploratory spatial analysis of the model’s residuals. Results indicate that 48.9% of respondents moved after a job change and that the most important factors influencing the propensity to move house after a job change are home tenure (for lone adults as for couples) and number of children (for couples only). We also found that moving is associated with changing neighborhood for owners, while tenants or co-tenants tend to stay in the same neighborhood. The probability of moving 1 year after a job change is 0.10 for both lone adults and couples, while after 2 years the household structure seems to have an impact: the probability increased to 0.23 for lone adults and to 0.21 for couples. The outcome of this research contributes to furthering our understanding of a familial decision (to move) following a professional event (change of job), controlling for household structure

  6. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    NASA Astrophysics Data System (ADS)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research.
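
    A minimal sketch of the kind of fitting described, combining a Poisson maximum-likelihood objective for a decaying count rate with a metaheuristic global optimizer (here SciPy's differential evolution, standing in for whichever metaheuristic the author used); the decay model, bounds and simulated data are assumptions for illustration only.

      import numpy as np
      from scipy.optimize import differential_evolution

      rng = np.random.default_rng(0)

      # Simulated counts per time bin from one decaying activity plus a flat background.
      t = np.arange(0.0, 600.0, 10.0)                      # bin start times (s)
      true_A, true_halflife, true_bkg = 200.0, 120.0, 5.0
      expected = true_A * np.exp(-np.log(2) * t / true_halflife) + true_bkg
      counts = rng.poisson(expected)

      def neg_log_likelihood(params):
          """Poisson negative log-likelihood of the observed counts (constant term dropped)."""
          A, halflife, bkg = params
          mu = A * np.exp(-np.log(2) * t / halflife) + bkg
          return np.sum(mu - counts * np.log(mu))

      bounds = [(1.0, 1e4), (10.0, 1e3), (0.0, 50.0)]      # amplitude, half-life (s), background
      result = differential_evolution(neg_log_likelihood, bounds, seed=1)
      print("fitted amplitude, half-life, background:", result.x)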

  7. 76 FR 70768 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft Report for Comment; Correction AGENCY: Nuclear Regulatory Commission. ACTION: Draft NUREG; request...

  8. Frequency analysis of extreme events based on precipitation station data over southeastern Brazil

    NASA Astrophysics Data System (ADS)

    Zilli, M. T.; Carvalho, L. V.

    2013-12-01

    The southeastern coast (SE) of Brazil is among the most densely populated areas of eastern South America, with the population largely concentrated in urban centers. Due to complex terrain and chaotic urbanization, this region is subject to a variety of natural disasters, including but not limited to floods and landslides that frequently occur during the austral summer (September to March). The South American Monsoon System (SAMS) and the South Atlantic Convergence Zone (SACZ) are the most important climatic features that affect precipitation regimes in SE Brazil during the austral summer. Previous studies have shown an overall increasing trend in daily precipitation and consequently in extreme events in SE Brazil. Future scenarios of climate change seem to indicate that SAMS daily precipitation will likely continue to increase throughout the 21st century. However, the rainfall response to the predicted warming is heterogeneous, and there is large uncertainty in the projected rainfall change and the corresponding regional-to-local impacts. Some observational studies have demonstrated a positive trend in the frequency of extreme events in particular locations. Nevertheless, these analyses either focus on a single station or investigate relatively short periods. This study further investigates interannual to multiannual variations and changes in the frequency of daily extreme precipitation events in SE Brazil using long time series from a set of rain gauge stations. The analyzed rain gauge stations are located along the coastal area of SE Brazil (between 18°S and 25°S) and have at least 74 years of daily data, with less than 5% missing. The period of analysis varies slightly from station to station, but roughly all stations have data between 1930 and 2012. The analysis of the frequency of extreme events is based on the Peaks-over-Threshold (POT) approach, which follows a Generalized Pareto Distribution (GPD), under the independent identically distributed assumption
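
    A minimal sketch of the Peaks-over-Threshold step described above, fitting a Generalized Pareto Distribution to daily rainfall exceedances over a high threshold with SciPy; the synthetic data, threshold choice and record length are placeholders, and no declustering or independence check is shown.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      daily_rain = rng.gamma(shape=0.4, scale=12.0, size=74 * 365)   # synthetic daily totals (mm), 74 years

      threshold = np.quantile(daily_rain, 0.98)                      # high threshold, e.g. 98th percentile
      excesses = daily_rain[daily_rain > threshold] - threshold

      # Fit the GPD to the excesses (location fixed at 0).
      shape, loc, scale = stats.genpareto.fit(excesses, floc=0.0)

      # Return level for a T-year event under the POT/GPD model.
      T = 50.0                                                       # years
      lam = excesses.size / 74.0                                     # mean number of exceedances per year
      return_level = threshold + stats.genpareto.ppf(1.0 - 1.0 / (T * lam),
                                                     shape, loc=0.0, scale=scale)
      print(f"{T:.0f}-year daily rainfall estimate: {return_level:.1f} mm")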

  9. Multi-Sensory Aerosol Data and the NRL NAAPS model for Regulatory Exceptional Event Analysis

    NASA Astrophysics Data System (ADS)

    Husar, R. B.; Hoijarvi, K.; Westphal, D. L.; Haynes, J.; Omar, A. H.; Frank, N. H.

    2013-12-01

    Beyond scientific exploration and analysis, multi-sensory observations along with models are finding increasing application in operational air quality management. EPA's Exceptional Event (EE) Rule allows the exclusion of data strongly influenced by impacts from "exceptional events," such as smoke from wildfires or dust from abnormally high winds. The EE Rule encourages the use of satellite observations and other non-standard data, along with models, as evidence for the formal documentation of EE samples for exclusion. Thus, the implementation of the EE Rule is uniquely suited to the direct application of integrated multi-sensory observations and, indirectly, to their assimilation into an aerosol simulation model. Here we report the results of a project, NASA and NAAPS Products for Air Quality Decision Making, which uses observations from multiple satellite sensors, surface-based aerosol measurements and the NRL Aerosol Analysis and Prediction System (NAAPS) model that assimilates key satellite observations. The satellite sensor data for detecting and documenting smoke and dust events include MODIS AOD and images; the OMI Aerosol Index and tropospheric NO2; and AIRS CO. The surface observations include the EPA regulatory PM2.5 network; the IMPROVE/STN aerosol chemical network; the AIRNOW PM2.5 mass network; and surface meteorological data. Within this application, a crucial role is assigned to the NAAPS model for estimating the surface concentration of windblown dust and biomass smoke. The operational model assimilates quality-assured daily MODIS data, using 2DVAR to adjust the model concentrations and a CALIOP-based climatology to adjust the vertical profiles, at 6-hour intervals. The assimilation of data from multiple satellites contributes significantly to the usefulness of NAAPS for EE analysis. The NAAPS smoke and dust simulations were evaluated using the IMPROVE/STN chemical data. The multi-sensory observations along with the model simulations are integrated into a web

  10. BOLIVAR-tool for analysis and simulation of metocean extreme events

    NASA Astrophysics Data System (ADS)

    Lopatoukhin, Leonid; Boukhanovsky, Alexander

    2015-04-01

    Metocean extreme events are caused by the combination of multivariate and multiscale processes which depend on each other at different scales (due to short-term, synoptic, annual and year-to-year variability). There is no simple method for their estimation with controllable tolerance. Thus, in practice, extreme analysis is sometimes reduced to the exploration of various methods and models with respect to decreasing the uncertainty of the estimates. Therefore, a researcher needs multifaceted computational tools which cover the various branches of extreme analysis. BOLIVAR is multi-functional computational software for researchers and engineers who study extreme environmental conditions in order to design and build offshore structures and floating objects. It contains a set of computational modules implementing various methods of extreme analysis, and a set of modules for the stochastic and hydrodynamic simulation of metocean processes. In this sense BOLIVAR is a Problem Solving Environment (PSE). BOLIVAR is designed for extreme event analysis and contains computational modules for the IDM, AMS, POT, MENU, and SINTEF methods, as well as modules for the stochastic simulation of metocean processes at various scales. BOLIVAR is a tool that simplifies the resource-consuming computational experiments needed to explore metocean extremes in univariate and multivariate cases. There are field ARMA models for short-term variability, a spatial-temporal random pulse model for synoptic variability (the alternation of storms and calms), and a cyclostationary model of annual and year-to-year variability. The combination of the above-mentioned modules and data sources makes it possible to estimate: omnidirectional and directional extremes (with T-year return periods); multivariate extremes (sets of parameters) and their impacts on marine structures and floating objects; and extremes of spatial-temporal fields (including the trajectories of T-year storms). An employment of concurrent methods for
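
    As one illustration of the AMS-type modules mentioned, the sketch below fits a Generalized Extreme Value distribution to a synthetic annual-maxima series (e.g. of significant wave height) and derives T-year return levels; it is not based on BOLIVAR's actual implementation.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      annual_max_hs = rng.gumbel(loc=8.0, scale=1.5, size=40)     # synthetic annual maxima of Hs (m)

      # Fit the GEV distribution to the annual maxima series (AMS method).
      shape, loc, scale = stats.genextreme.fit(annual_max_hs)

      for T in (10, 50, 100):                                     # return periods in years
          level = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
          print(f"{T:>3d}-year return level: {level:.2f} m")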

  11. What can we learn from the deadly flash floods? Post Event Review Capability (PERC) analysis of the Bavaria and Baden-Wurttemberg flood events in Summer 2016

    NASA Astrophysics Data System (ADS)

    Szoenyi, Michael

    2017-04-01

    In May/June 2016, stationary low pressure systems brought intense rainfall with record-breaking intensities of well above 100 mm of rain in a few hours locally in the southern states of Baden-Wurttemberg and Bavaria, Germany. In steep terrain, small channels and creeks became devastating torrents impacting, among others, the villages of Simbach/Inn, Schwäbisch-Gmünd and Braunsbach. Just a few days prior, France had also seen devastating rainfall and flooding. Damage in Germany alone is estimated at 2.8 M USD, of which less than 50% is insured. The loss of life was significant, with 18 fatalities reported across the events. This new forensic event analysis, part of Zurich's Post Event Review Capability (PERC), investigates the flash flood events following these record rainfalls in Southern Germany and tries to answer the following questions holistically, across the five capitals (5C) and the full disaster risk management (DRM) cycle, which are key to understanding how to become more resilient to such flood events: - Why have these intense rainfall events led to such devastating consequences? The EU Floods Directive and its implementation in the various member states, as well as the 2002 and 2013 floods in Germany, have focused on larger rivers and the main asset concentrations. The pathway and mechanism of the 2016 floods are very different and need to be better understood. Flash floods and surface flooding may need to become the new focus and be much better communicated to people at risk, as awareness of such perils has been identified as low. - How can the prevalence of such flash floods be better identified and mapped? Research indicated that affected people and decision makers alike regard the occurrence of such flash floods as arbitrary, but we argue that hotspots can and must be identified based on an overlay of rainfall intensity maps, topography conducive to flash flood processes, and vulnerable assets. In Germany, there are currently no comprehensive hazard

  12. Analysis of syntactic and semantic features for fine-grained event-spatial understanding in outbreak news reports.

    PubMed

    Chanlekha, Hutchatai; Collier, Nigel

    2010-03-31

    Previous studies have suggested that epidemiological reasoning needs a fine-grained modelling of events, especially their spatial and temporal attributes. While the temporal analysis of events has been intensively studied, far less attention has been paid to their spatial analysis. This article aims at filling the gap concerning automatic event-spatial attribute analysis in order to support health surveillance and epidemiological reasoning. In this work, we propose a methodology that provides a detailed analysis on each event reported in news articles to recover the most specific locations where it occurs. Various features for recognizing spatial attributes of the events were studied and incorporated into the models which were trained by several machine learning techniques. The best performance for spatial attribute recognition is very promising; 85.9% F-score (86.75% precision/85.1% recall). We extended our work on event-spatial attribute recognition by focusing on machine learning techniques, which are CRF, SVM, and Decision tree. Our approach avoided the costly development of an external knowledge base by employing the feature sources that can be acquired locally from the analyzed document. The results showed that the CRF model performed the best. Our study indicated that the nearest location and previous event location are the most important features for the CRF and SVM model, while the location extracted from the verb's subject is the most important to the Decision tree model.
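
    A minimal sketch of the feature-based classification step described, using scikit-learn's SVM and decision tree classifiers on dictionaries of locally extracted features; the feature names, event locations and labels are hypothetical, and the CRF model reported as best in the article is not reproduced here.

      from sklearn.feature_extraction import DictVectorizer
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import LinearSVC
      from sklearn.tree import DecisionTreeClassifier

      # Hypothetical per-event feature dictionaries extracted locally from the analyzed document.
      X = [
          {"nearest_location": "Jakarta", "previous_event_location": "Jakarta", "subject_is_location": False},
          {"nearest_location": "Hanoi",   "previous_event_location": "Hue",     "subject_is_location": True},
          {"nearest_location": "Jakarta", "previous_event_location": "Bandung", "subject_is_location": False},
          {"nearest_location": "Hue",     "previous_event_location": "Hue",     "subject_is_location": True},
      ]
      y = ["Jakarta", "Hanoi", "Bandung", "Hue"]   # most specific location attributed to each event

      for clf in (LinearSVC(), DecisionTreeClassifier()):
          model = make_pipeline(DictVectorizer(sparse=False), clf)
          model.fit(X, y)
          print(type(clf).__name__, model.predict(X[:2]))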

  13. Reinvestigation and analysis a landslide dam event in 2012 using UAV

    NASA Astrophysics Data System (ADS)

    Wang, Kuo-Lung; Huang, Zji-Jie; Lin, Jun-Tin

    2015-04-01

    Taiwan's geology is highly fractured, and the island is located in the Pacific Rim seismic zone. Typhoons usually strike during summer, and the steep, highly weathered mountains are prone to landslides; such events have become more frequent in recent years as a result of climate change. Most landslides occur far from residential areas, and field investigation is time consuming, expensive, limited in the data it can collect, and dangerous. Investigation with satellite images has drawbacks such as poor resolution and an incomplete picture of the actual situation. This research therefore proposes and discusses the use of UAVs for slope investigation. UAVs have been adopted for hazard investigation and monitoring in recent years; they are light, compact, highly mobile, safe, easy to maintain and low cost, so investigations can be carried out in high-risk areas. Using mature aerial photogrammetry, aerial photos are combined with ground control points, and digital surface models (DSM) and orthophotos can be produced once the control points are aligned. The resolution can be better than 5 cm, so the products can also be used for temporal creep monitoring before a landslide occurs. A large landslide site at the 75 km mark of road No. 14 was investigated in this research. The landslide occurred in June 2012 during heavy rainfall, and a landslide dam formed quickly afterwards. The failure and its mechanism were analyzed in this research using DEMs produced from aerial photos taken before the event and from UAV imagery acquired after it. Residual slope stability analysis was then carried out using strength parameters obtained from the analysis described above, so that advice on potential subsequent landslide conditions could be provided.
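
    The DEM comparison described above amounts to differencing two co-registered elevation grids, one derived from pre-event aerial photos and one from post-event UAV photogrammetry. A minimal sketch with synthetic NumPy grids (assuming both surfaces share the same 5 m grid) is shown below.

      import numpy as np

      # Synthetic, co-registered 5 m grids standing in for the pre- and post-event surfaces.
      rng = np.random.default_rng(3)
      pre_event_dem = 1200.0 + rng.normal(0.0, 0.5, size=(200, 200))
      post_event_dsm = pre_event_dem.copy()
      post_event_dsm[80:120, 90:150] -= 15.0            # simulated landslide scar (material removed)
      post_event_dsm[150:180, 60:120] += 8.0            # simulated deposition / landslide dam

      dz = post_event_dsm - pre_event_dem               # elevation change (m)
      cell_area = 5.0 * 5.0                             # m^2 per cell

      erosion_volume = -dz[dz < -1.0].sum() * cell_area       # volume lost where lowering exceeds 1 m
      deposition_volume = dz[dz > 1.0].sum() * cell_area      # volume gained where raising exceeds 1 m
      print(f"eroded volume ~ {erosion_volume:,.0f} m3, deposited volume ~ {deposition_volume:,.0f} m3")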

  14. The Identification of Seismo and Volcanomagnetic Events Using Non-stationary Analysis of Geomagnetic Field Variations.

    NASA Astrophysics Data System (ADS)

    Fedi, M.; Gonçalves, P.; Johnston, M.; La Manna, M.

    Many studies have shown a clear correlation between volcanic and/or seismic activity and time variations of local geomagnetic fields, called seismomagnetic (SM) and/or volcanomagnetic (VM) effects. SM and VM effects can be generated by various physical processes, such as piezomagnetism, tectonomagnetism and electrokinetic effects. Relevant parameters are the event duration, the event magnitude and the magnetometer sample rate. Here, we present some results obtained from a non-stationary analysis of geomagnetic time series that focuses on automatic detection of possible SM and VM events. Several approaches are considered. The first one, based on the continuous wavelet transform, provides a multiresolution reading of the signal, expanded in time-scale space. The second uses a time-variant adaptive algorithm (RLS) that allows the detection of time intervals where important statistical variations of the signal occur. Finally, we investigate a third technique relying on multifractal analysis, which allows estimation of the local regularity of a time series path in order to detect unusual singularities. Different multifractal models, such as multifractional Brownian motions (mBm's), were used to test the methodology before applying it to synthetic simulations of geomagnetic signals. In our simulations, we took into account theoretical SM and/or VM effects deriving from fault rupture and overpressured magma chambers. We applied these methodologies to two real-world data sets, recorded on Mt Etna (volcanic area) during the volcanic activity of 1981 and in North Palm Springs (seismic area) during the earthquake of July 8th, 1986, respectively. In both cases, all techniques were effective in automatically identifying the geomagnetic time variations likely induced by volcanic and/or seismic activity, and the results are in good agreement with the indices provided by real volcanic and seismic measurements.
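
    A minimal sketch of the wavelet-based detection step described, using PyWavelets' continuous wavelet transform on a synthetic geomagnetic series and flagging times where the coefficient magnitude is anomalously large; the wavelet, scales and threshold are illustrative choices, not those of the authors.

      import numpy as np
      import pywt

      rng = np.random.default_rng(0)
      n = 4096
      signal = rng.normal(0.0, 1.0, n)                       # background geomagnetic variation (nT)
      signal[2000:2060] += 4.0 * np.hanning(60)              # synthetic SM/VM-like transient

      scales = np.arange(4, 128, 4)
      coeffs, freqs = pywt.cwt(signal, scales, "morl")       # continuous wavelet transform, Morlet wavelet

      energy = np.abs(coeffs).max(axis=0)                    # strongest response across scales at each time
      threshold = energy.mean() + 4.0 * energy.std()
      candidate_times = np.flatnonzero(energy > threshold)
      print("candidate event samples:", candidate_times[:10], "...")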

  15. The Identification of Seismo and Volcanomagnetic Events Using Non-stationary Analysis of Geomagnetic Field Variations.

    NASA Astrophysics Data System (ADS)

    Fedi, M.; Gonçalves, P.; Johnston, M.; La Manna, M.

    Many studies have shown a clear correlation between volcanic and/or seismic activity and time variations of local geomagnetic fields, called seismomagnetic (SM) and/or volcanomagnetic (VM) effects. SM and VM effects can be generated by various physical processes, such as piezomagnetism, tectonomagnetism and electrokinetic effects. Relevant parameters are the event duration, the event magnitude and the magnetometer sample rate. Here, we present some results obtained from a non-stationary analysis of geomagnetic time series that focuses on automatic detection of possible SM and VM events. Several approaches are considered. The first one, based on the continuous wavelet transform, provides a multiresolution reading of the signal, expanded in time-scale space. The second uses a time-variant adaptive algorithm (RLS) that allows the detection of time intervals where important statistical variations of the signal occur. Finally, we investigate a third technique relying on multifractal analysis, which allows estimation of the local regularity of a time series path in order to detect unusual singularities. Different multifractal models, such as multifractional Brownian motions (mBm's), were used to test the methodology before applying it to synthetic simulations of geomagnetic signals. In our simulations, we took into account theoretical SM and/or VM effects deriving from fault rupture and overpressured magma chambers. We applied these methodologies to two real-world data sets, recorded on Mt Etna (volcanic area) during the volcanic activity of 1981 and in North Palm Springs (seismic area) during the earthquake of July 8th, 1986, respectively. In both cases, all techniques were effective in automatically identifying the geomagnetic time variations likely induced by volcanic and/or seismic activity; in fact, we obtained results in good agreement with the indices provided by real volcanic and seismic measurements.

  16. Analysis of a snowfall event produced by mountains waves in Guadarrama Mountains (Spain)

    NASA Astrophysics Data System (ADS)

    Gascón, Estíbaliz; Sánchez, José Luis; Fernández-González, Sergio; Merino, Andrés; López, Laura; García-Ortega, Eduardo

    2014-05-01

    Heavy snowfall events are fairly uncommon precipitation processes in the Iberian Peninsula. When large amounts of snow accumulate in large cities whose populations are unaccustomed to or unprepared for heavy snow, these events have a major impact on daily activities. On 16 January 2013, an extreme snowstorm occurred in the Guadarrama Mountains (Madrid, Spain) during an experimental winter campaign carried out as part of the TECOAGUA Project. Strong northwesterly winds, high precipitation and temperatures close to 0°C were detected throughout the whole day. During this episode, it was possible to continuously measure the different variables involved in the development of the convection using a multichannel microwave radiometer (MMWR). The significant increase in cloud thickness observed vertically by the MMWR and the 43 mm of precipitation registered in 24 hours at the Navacerrada station (Madrid) led us to consider that we were facing an episode of strong winter convection. Images from the Meteosat Second Generation (MSG) satellite suggested that the main source of the convection was the formation of mountain waves on the southern face of the Guadarrama Mountains. The event was simulated in high resolution using the WRF mesoscale model, and the analysis is based on these simulations and the observational data. Finally, the continuous measurements obtained with the MMWR allowed us to monitor the vertical situation above the Guadarrama Mountains with a temporal resolution of 2 minutes. This instrument has a clear advantage for monitoring short-term episodes of this kind in comparison to radiosondes, which usually provide data only at 0000 and 1200 UTC. Acknowledgements This study was supported by the following grants: GRANIMETRO (CGL2010-15930); MICROMETEO (IPT-310000-2010-22). The authors would like to thank the Regional Government of Castile-León for its financial support through the project LE220A11-2.

  17. Social network changes and life events across the life span: a meta-analysis.

    PubMed

    Wrzus, Cornelia; Hänel, Martha; Wagner, Jenny; Neyer, Franz J

    2013-01-01

    For researchers and practitioners interested in social relationships, the question remains as to how large social networks typically are, and how their size and composition change across adulthood. On the basis of predictions of socioemotional selectivity theory and social convoy theory, we conducted a meta-analysis on age-related social network changes and the effects of life events on social networks using 277 studies with 177,635 participants from adolescence to old age. Cross-sectional as well as longitudinal studies consistently showed that (a) the global social network increased up until young adulthood and then decreased steadily, (b) both the personal network and the friendship network decreased throughout adulthood, (c) the family network was stable in size from adolescence to old age, and (d) other networks with coworkers or neighbors were important only in specific age ranges. Studies focusing on life events that occur at specific ages, such as transition to parenthood, job entry, or widowhood, demonstrated network changes similar to such age-related network changes. Moderator analyses detected that the type of network assessment affected the reported size of global, personal, and family networks. Period effects on network sizes occurred for personal and friendship networks, which have decreased in size over the last 35 years. Together the findings are consistent with the view that a portion of normative, age-related social network changes are due to normative, age-related life events. We discuss how these patterns of normative social network development inform research in social, evolutionary, cultural, and personality psychology.

  18. Multi-parameter Analysis and Visualization of Groundwater Quality during High River Discharge Events

    NASA Astrophysics Data System (ADS)

    Page, R. M.; Huggenberger, P.; Lischeid, G.

    2010-12-01

    The filter capacity of alluvial aquifers enables many groundwater extraction wells near rivers to provide high-quality drinking water during average flow and surface water quality conditions. However, during high river discharge events, the bacterial load of the groundwater is increased and the extracted water is no longer safe for the production of drinking water without treatment. Optimal management of production wells requires well-founded knowledge of the river - groundwater interaction and of the transport of microorganisms across this interface. Due to the spatial and temporal variability of river - groundwater interaction, monitoring individual parameters does not always correctly identify the actual potential risk of contamination of drinking water. Identifying situations where the quality is insufficient can be difficult in systems that are influenced by many factors including natural and artificial recharge, as well as extraction. As high-resolution sampling for waterborne pathogens during flood events is cost and time intensive, proxies are usually used in addition to short-term microbial monitoring studies. The resulting datasets are multi-dimensional and have variable temporal resolutions. For these reasons, it is necessary to apply procedures in which multivariate datasets can be considered simultaneously and inherent patterns visualized. These patterns are important for determining the governing processes and can be used to assess the actual potential risk of contamination due to infiltrating surface water. In this study, a multi-parameter dataset, including specific conductivity and faecal indicators (Escherichia coli, enterococci and aerobic mesophilic germs), was analyzed using a combination of the Self-Organizing Map (SOM) and Sammon's mapping techniques. The SOM analysis made it possible to differentiate between the effects of groundwater extraction and fluctuations of the river table on groundwater levels, electric conductivity and temperature in the well field
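
    A minimal sketch of the SOM step described, using the MiniSom package on standardized multi-parameter observations; the input columns, map size and training length are assumptions, and the subsequent Sammon's mapping of the SOM prototypes is not shown.

      import numpy as np
      from minisom import MiniSom

      rng = np.random.default_rng(1)
      # Hypothetical observation matrix: [specific conductivity, E. coli, enterococci, mesophilic germs]
      X = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 4))
      X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each parameter

      som = MiniSom(x=8, y=8, input_len=X.shape[1], sigma=1.2, learning_rate=0.5, random_seed=1)
      som.random_weights_init(X)
      som.train_random(X, num_iteration=5000)

      # Map each observation to its best-matching unit; groups of units can then be
      # inspected for hydraulic states prone to contamination.
      bmus = np.array([som.winner(row) for row in X])
      print("first best-matching units:", bmus[:5])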

  19. Creating an infrastructure for safety event reporting and analysis in a multicenter pediatric emergency department network.

    PubMed

    Chamberlain, James M; Shaw, Kathy N; Lillis, Kathleen A; Mahajan, Prashant V; Ruddy, Richard M; Lichenstein, Richard; Olsen, Cody S; Dean, J Michael

    2013-02-01

    Hospital incident reporting is widely used but has had limited effectiveness for improving patient safety nationally. We describe the process of establishing a multi-institutional safety event reporting system. A descriptive study in The Pediatric Emergency Care Applied Research Network of 22 hospital emergency departments was performed. An extensive legal analysis addressed investigators' concerns about sharing confidential incident reports (IRs): (1) the ability to identify sites and (2) potential loss of peer review statute protection. Of the 22 Pediatric Emergency Care Applied Research Network sites, 19 received institutional approval to submit deidentified IRs to the data center. Incident reports were randomly assigned to independent review; discordance was resolved by consensus. Incident reports were categorized by type, subtype, severity, staff involved, and contributing factors. A total of 3,106 IRs were submitted by 18 sites in the first year. Reporting rates ranged more than 50-fold from 0.12 to 6.13 per 1000 patients. Data were sufficient to determine type of error (90% of IRs), severity (79%), staff involved (82%), and contributing factors (82%). However, contributing factors were clearly identified in only 44% of IRs and required extrapolation by investigators in 38%. The most common incidents were related to laboratory specimens (25.5%), medication administration (19.3%), and process variance, such as delays in care (14.4%). Incident reporting provides qualitative data concerning safety events. Perceived legal barriers to sharing confidential data can be addressed. Large variability in reporting rates and low rates of providing contributing factors suggest a need for standardization and improvement of safety event reporting.

  20. Radar analysis of the life cycle of Mesoscale Convective Systems during the 10 June 2000 event

    NASA Astrophysics Data System (ADS)

    Rigo, T.; Llasat, M. C.

    2005-12-01

    The 10 June 2000 event was the largest flash flood event that occurred in the Northeast of Spain in the late 20th century, both as regards its meteorological features and its considerable social impact. This paper focuses on analysis of the structures that produced the heavy rainfalls, especially from the point of view of meteorological radar. Due to the fact that this case is a good example of a Mediterranean flash flood event, a final objective of this paper is to undertake a description of the evolution of the rainfall structure that would be sufficiently clear to be understood at an interdisciplinary forum. Then, it could be useful not only to improve conceptual meteorological models, but also for application in downscaling models. The main precipitation structure was a Mesoscale Convective System (MCS) that crossed the region and that developed as a consequence of the merging of two previous squall lines. The paper analyses the main meteorological features that led to the development and triggering of the heavy rainfalls, with special emphasis on the features of this MCS, its life cycle and its dynamic features. To this end, 2-D and 3-D algorithms were applied to the imagery recorded over the complete life cycle of the structures, which lasted approximately 18 h. Mesoscale and synoptic information were also considered. Results show that it was an NS-MCS, quasi-stationary during its stage of maturity as a consequence of the formation of a convective train, the different displacement directions of the 2-D structures and the 3-D structures, including the propagation of new cells, and the slow movement of the convergence line associated with the Mediterranean mesoscale low.

  1. Monitoring As A Helpful Means In Forensic Analysis Of Dams Static Instability Events

    NASA Astrophysics Data System (ADS)

    Solimene, Pellegrino

    2013-04-01

    Monitoring is a means of controlling the behavior of a structure which, during its operational life, is subject to external actions, both ordinary loading conditions and disturbing ones; these factors combine in the random manner described by the statistical parameter of the return period. The analysis of monitoring data is crucial for forming a reasoned opinion on the reliability of the structure and its components, and it also makes it possible to identify, within the overall operational scenario, the moment at which to prepare interventions aimed at maintaining optimum levels of functionality and safety. Monitoring as a form of prevention is coupled with the activity of the forensic engineer who, appointed by the judiciary after an accident has occurred, turns his experience - the "scientific knowledge" - to an "inverse analysis" in which he sums up the results of a survey that also draws on the data sets produced by the continuous control of causes and effects, so as to determine the correlations between these factors. His activity aims at contributing to the identification of the typicality of an event which, together with the "causal link" between conduct and consequences and their unlawfulness, is among the factors for judging whether a hypothesis of crime exists and is therefore punishable according to law. In Italy there are about 10,000 dams of varying sizes, but only a small portion of them are considered "large dams" and subjected to a rigorous program of regular inspections and monitoring in application of specific rules. The rest - "small" dams, conventionally defined as such by the standard but not by their impact on the territory - receive a heterogeneous response from the local authorities entrusted with this task: there is therefore a scenario of high potential risk, determined by the presence of structures that are not completely controlled and that in some cases stand over heavily populated areas. Risk can be brought back to acceptable levels if they were implemented with the

  2. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    NASA Astrophysics Data System (ADS)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during the previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by distances of, on average, 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to view a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera version 9.1, set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5 - 5.6 G in the stationary sensors, and model AF-S ED 24 mm - 1:1.4 in the mobile sensor). All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four days of manual RAMMER operation during the 2012 and 2013 campaigns. On Feb. 18th the data set is composed of 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes was registered by two cameras and 1 flash was registered by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can generate imprecision during the optical analysis; therefore this work aims to evaluate the effects of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  3. Robust analysis of event-related functional magnetic resonance imaging data using independent component analysis

    NASA Astrophysics Data System (ADS)

    Kadah, Yasser M.

    2002-04-01

    We propose a technique that enables robust use of blind source separation techniques in fMRI data analysis. The fMRI temporal signal is modeled as the summation of the true activation signal, a physiological baseline fluctuation component, and a random noise component. A preprocessing denoising is used to reduce the dimensionality of the random noise component in this mixture before applying the principal/independent component analysis (PCA/ICA) methods. The set of denoised time courses from a localized region are utilized to capture the region-specific activation patterns. We show a significant improvement in the convergence properties of the ICA iteration when the denoised time courses are used. We also demonstrate the advantage of using ICA over PCA to separate components due to physiological signals from those corresponding to actual activation. Moreover, we propose the use of ICA to analyze the magnitude of the Fourier domain of the time courses. This allows ICA to group signals with similar patterns and different delays together, which makes the iteration even more efficient. The proposed technique is verified using computer simulations as well as actual data from a healthy human volunteer. The results confirm the robustness of the new strategy and demonstrate its value for clinical use.
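
    A minimal sketch of the component-separation idea described, applying scikit-learn's FastICA to a set of (already denoised) voxel time courses and, separately, to the magnitudes of their Fourier transforms so that similarly shaped but delayed responses can group together; the data are synthetic and the paper's denoising stage is omitted.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      n_timepoints, n_voxels = 200, 50
      t = np.arange(n_timepoints)

      activation = (np.sin(2 * np.pi * t / 40.0) > 0).astype(float)       # boxcar-like task response
      baseline = np.sin(2 * np.pi * t / 90.0)                             # slow physiological fluctuation
      mixing = rng.normal(size=(2, n_voxels))
      X = np.outer(activation, mixing[0]) + np.outer(baseline, mixing[1]) \
          + 0.3 * rng.normal(size=(n_timepoints, n_voxels))

      # ICA on the time courses themselves.
      sources_time = FastICA(n_components=2, random_state=0).fit_transform(X)

      # ICA on Fourier magnitudes: delayed copies of the same response share a spectrum.
      X_mag = np.abs(np.fft.rfft(X, axis=0))
      sources_freq = FastICA(n_components=2, random_state=0).fit_transform(X_mag)
      print(sources_time.shape, sources_freq.shape)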

  4. Analysis of in vitro fertilization data with multiple outcomes using discrete time-to-event analysis

    PubMed Central

    Maity, Arnab; Williams, Paige; Ryan, Louise; Missmer, Stacey; Coull, Brent; Hauser, Russ

    2014-01-01

    In vitro fertilization (IVF) is an increasingly common method of assisted reproductive technology. Because of the careful observation and followup required as part of the procedure, IVF studies provide an ideal opportunity to identify and assess clinical and demographic factors, along with environmental exposures, that may impact successful reproduction. A major challenge in analyzing data from IVF studies is handling the complexity and multiplicity of outcomes, resulting from multiple opportunities for pregnancy loss within a single IVF cycle in addition to multiple IVF cycles. To date, most evaluations of IVF studies do not make use of the full data because of its complex structure. In this paper, we develop statistical methodology for the analysis of IVF data with multiple cycles and possibly multiple failure types observed for each individual. We develop a general analysis framework based on a generalized linear modeling formulation that allows implementation of various types of models, including shared frailty models, failure-specific frailty models, and transitional models, using standard software. We apply our methodology to data from an IVF study conducted at the Brigham and Women’s Hospital, Massachusetts. We also summarize the performance of our proposed methods based on a simulation study. PMID:24317880
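
    A minimal sketch of a discrete time-to-event formulation of the kind described, treating each at-risk stage of each IVF cycle as a binary failure/continue record and fitting a logistic GLM with statsmodels; the column names, stages and covariates are hypothetical, and the frailty (random-effect) terms discussed in the paper are not included.

      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Hypothetical expanded data: one row per woman, cycle and at-risk stage,
      # with 'failed' = 1 if the pregnancy attempt ended at that stage.
      records = pd.DataFrame({
          "failed": [0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0],
          "stage":  ["implantation", "implantation", "clinical", "implantation", "clinical",
                     "clinical", "implantation", "clinical", "implantation", "clinical",
                     "implantation", "clinical"],
          "cycle":  [1, 2, 1, 1, 1, 2, 1, 1, 2, 2, 1, 1],
          "age":    [34, 34, 34, 29, 29, 41, 41, 37, 37, 31, 31, 38],
      })

      # Discrete time-to-event model: stage-specific baseline hazards plus covariates.
      model = smf.glm("failed ~ C(stage) + cycle + age", data=records,
                      family=sm.families.Binomial()).fit()
      print(model.summary())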

  5. Time compression of soil erosion by the effect of largest daily event. A regional analysis of USLE database.

    NASA Astrophysics Data System (ADS)

    Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.

    2009-04-01

    When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedented erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focused on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes. It means that a large share of the total soil eroded is produced in very short time intervals, i.e. in a few events, mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, produce a large amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive continuing attention, heightened at present by global change scenarios. Nevertheless, the time compression of geomorphological processes can be studied not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a new, complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by rank, whatever their magnitude. In previous research on the time compression of soil erosion, using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total number of daily erosive events recorded per plot and the percentage contribution to total soil erosion of the n largest aggregated daily events. Now we offer a further refined analysis comparing different agricultural regions in the USA. To do that we have analyzed data from 594 erosion plots from USLE
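
    The rank-based view of time compression described above reduces to a simple computation: sort a plot's daily erosion events in decreasing order and report the share of total erosion contributed by the n largest events. A minimal sketch with made-up plot data follows.

      import numpy as np

      def largest_event_contribution(daily_erosion, n_largest=(1, 3, 5, 10)):
          """Percentage of total soil erosion contributed by the n largest daily events."""
          events = np.sort(np.asarray(daily_erosion, dtype=float))[::-1]
          total = events.sum()
          return {n: 100.0 * events[:n].sum() / total for n in n_largest}

      # Made-up record of daily erosive events for one plot (t/ha per event).
      plot_events = [0.1, 0.3, 7.5, 0.2, 1.1, 0.4, 12.9, 0.05, 2.2, 0.6]
      print(largest_event_contribution(plot_events))
      # Here the single largest event already accounts for roughly half of the total.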

  6. Analysis of Loss-of-Offsite-Power Events 1998–2012

    SciTech Connect

    T. E. Wierman

    2013-10-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. Risk analyses performed for nuclear power plants show that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP events and the subsequent restoration of offsite power are therefore important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience from fiscal year 1998 through 2012. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator (EDG) failure modes considered are failure to start, failure to load and run, and failure to run for more than 1 hour. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates of reliability are provided for the entire active period. A statistically significant increase in industry performance was identified for plant-centered and switchyard-centered LOOP frequencies. There is no statistically significant trend in LOOP durations.

  7. Kickoff to Conflict: A Sequence Analysis of Intra-State Conflict-Preceding Event Structures

    PubMed Central

    D'Orazio, Vito; Yonamine, James E.

    2015-01-01

    While many studies have suggested or assumed that the periods preceding the onset of intra-state conflict are similar across time and space, few have empirically tested this proposition. Using the Integrated Crisis Early Warning System's domestic event data in Asia from 1998–2010, we subject this proposition to empirical analysis. We code the similarity of government-rebel interactions in sequences preceding the onset of intra-state conflict to those preceding further periods of peace using three different metrics: Euclidean, Levenshtein, and mutual information. These scores are then used as predictors in a bivariate logistic regression to forecast whether we are likely to observe conflict in neither, one, or both of the states. We find that our model accurately classifies cases where both sequences precede peace, but struggles to distinguish between cases in which one sequence escalates to conflict and where both sequences escalate to conflict. These findings empirically suggest that generalizable patterns exist between event sequences that precede peace. PMID:25951105
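
    A minimal sketch of one of the three similarity metrics and of the forecasting step described: a standard Levenshtein (edit) distance between two coded event sequences used as a predictor in a logistic regression; the event codes, reference sequence and outcome labels are invented for illustration.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def levenshtein(a, b):
          """Edit distance between two sequences of event codes."""
          prev = list(range(len(b) + 1))
          for i, x in enumerate(a, start=1):
              curr = [i]
              for j, y in enumerate(b, start=1):
                  curr.append(min(prev[j] + 1,              # deletion
                                  curr[j - 1] + 1,          # insertion
                                  prev[j - 1] + (x != y)))  # substitution
              prev = curr
          return prev[-1]

      # Made-up event-code sequences and whether conflict followed (1) or peace continued (0).
      sequences = [["042", "036", "190"], ["042", "036", "040"], ["190", "195", "183"], ["036", "040", "042"]]
      reference_peaceful = ["042", "036", "040"]
      X = np.array([[levenshtein(s, reference_peaceful)] for s in sequences])
      y = np.array([1, 0, 1, 0])

      clf = LogisticRegression().fit(X, y)
      print("P(conflict) for each sequence:", clf.predict_proba(X)[:, 1].round(2))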

  8. OBSERVATIONS AND ANALYSIS OF MUTUAL EVENTS BETWEEN THE URANUS MAIN SATELLITES

    SciTech Connect

    Assafin, M.; Vieira-Martins, R.; Braga-Ribas, F.; Camargo, J. I. B.; Da Silva Neto, D. N.; Andrei, A. H.

    2009-04-15

    Every 42 years, the Earth and the Sun pass through the plane of the orbits of the main satellites of Uranus. On these occasions, mutual occultations and eclipses between these bodies can be seen from the Earth. The current Uranus equinox from 2007 to 2009 offers a precious opportunity to observe these events. Here, we present the analysis of five occultations and two eclipses observed from Brazil during 2007. For the reduction of the CCD images, we developed a digital coronagraphic method that removed the planet's scattered light around the satellites. A simple geometric model of the occultation/eclipse was used to fit the observed light curves. Dynamical quantities such as the impact parameter, the relative speed, and the central time of the event were then obtained with precisions of 7.6 km, 0.18 km s-1, and 2.9 s, respectively. These results can be further used to improve the parameters of the dynamical theories of the main Uranus satellites.

  9. Observations and Analysis of Mutual Events between the Uranus Main Satellites

    NASA Astrophysics Data System (ADS)

    Assafin, M.; Vieira-Martins, R.; Braga-Ribas, F.; Camargo, J. I. B.; da Silva Neto, D. N.; Andrei, A. H.

    2009-04-01

    Every 42 years, the Earth and the Sun pass through the plane of the orbits of the main satellites of Uranus. On these occasions, mutual occultations and eclipses between these bodies can be seen from the Earth. The current Uranus equinox from 2007 to 2009 offers a precious opportunity to observe these events. Here, we present the analysis of five occultations and two eclipses observed from Brazil during 2007. For the reduction of the CCD images, we developed a digital coronagraphic method that removed the planet's scattered light around the satellites. A simple geometric model of the occultation/eclipse was used to fit the observed light curves. Dynamical quantities such as the impact parameter, the relative speed, and the central time of the event were then obtained with precisions of 7.6 km, 0.18 km s-1, and 2.9 s, respectively. These results can be further used to improve the parameters of the dynamical theories of the main Uranus satellites. Based on observations made at Laboratório Nacional de Astrofísica (LNA), Itajubá-MG, Brazil.

  10. Single-event analysis of the packaging of bacteriophage T7 DNA concatemers in vitro.

    PubMed

    Sun, M; Louie, D; Serwer, P

    1999-09-01

    Bacteriophage T7 packages its double-stranded DNA genome in a preformed protein capsid (procapsid). The DNA substrate for packaging is a head-to-tail multimer (concatemer) of the mature 40-kilobase pair genome. Mature genomes are cleaved from the concatemer during packaging. In the present study, fluorescence microscopy is used to observe T7 concatemeric DNA packaging at the level of a single (microscopic) event. Metabolism-dependent cleavage to form several fragments is observed when T7 concatemers are incubated in an extract of T7-infected Escherichia coli (in vitro). The following observations indicate that the fragment-producing metabolic event is DNA packaging: 1) most fragments have the hydrodynamic radius (R(H)) of bacteriophage particles (+/-3%) when R(H) is determined by analysis of Brownian motion; 2) the fragments also have the fluorescence intensity (I) of bacteriophage particles (+/-6%); 3) as a fragment forms, a progressive decrease occurs in both R(H) and I. The decrease in I follows a pattern expected for intracapsid steric restriction of 4',6-diamidino-2-phenylindole (DAPI) binding to packaged DNA. The observed in vitro packaging of a concatemer's genomes always occurs in a synchronized cluster. Therefore, the following hypothesis is proposed: the observed packaging of concatemer-associated T7 genomes is cooperative.
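
    The hydrodynamic-radius determination mentioned in observation 1 rests on estimating a diffusion coefficient from Brownian motion and applying the Stokes-Einstein relation R_H = k_B*T / (6*pi*eta*D). A minimal sketch with a simulated particle track (not the authors' measurements) is given below.

      import numpy as np

      k_B = 1.380649e-23        # Boltzmann constant (J/K)
      T = 296.0                 # temperature (K)
      eta = 1.0e-3              # viscosity of the buffer (Pa*s), assumed close to water

      # Simulated 2-D particle track: positions (m) sampled every dt seconds.
      dt = 0.1
      rng = np.random.default_rng(0)
      D_true = 4.0e-12                                    # m^2/s, typical for a ~50 nm particle
      steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(1000, 2))
      track = np.cumsum(steps, axis=0)

      # Diffusion coefficient from the mean squared displacement of single steps:
      # for 2-D Brownian motion, <dr^2> = 4 * D * dt.
      msd_per_step = np.mean(np.sum(np.diff(track, axis=0) ** 2, axis=1))
      D_est = msd_per_step / (4 * dt)

      R_H = k_B * T / (6 * np.pi * eta * D_est)           # Stokes-Einstein hydrodynamic radius
      print(f"estimated D = {D_est:.2e} m^2/s, R_H = {R_H * 1e9:.1f} nm")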

  11. Defining adverse events in manual therapy: an exploratory qualitative analysis of the patient perspective.

    PubMed

    Carlesso, Lisa C; Cairney, John; Dolovich, Lisa; Hoogenes, Jennifer

    2011-10-01

    Both rare, serious adverse events (AE) and common, benign AE are associated with manual therapy (MT) techniques. A proposed standard for defining AE in MT practice has been published, but it did not include the patient perspective. Research comparing clinician and patient reporting of AE demonstrates that several differences exist; for example, the reporting of objective versus subjective events. The objective of this study was to describe how patients define AE associated with MT techniques. A descriptive qualitative design was employed. Semi-structured interviews were used with a purposive sample of patients (n = 13) receiving MT from physiotherapy, chiropractic and osteopathic practices in Ontario, Canada. The interview guide was informed by existing evidence and consultation with content and methodological experts. Interviews were audiotaped and transcribed verbatim. Data were analysed by two independent team members using thematic content analysis. A key finding was that patients defined mild, moderate and major AE by pain/symptom severity, functional impact, duration and by ruling out alternative causes. An overarching theme identified multiple factors that influence how an AE is perceived. These concepts differ from the previously proposed framework for defining AE, which did not include the patient perspective. Future processes to create standard definitions or measures should include the patient viewpoint to provide a broader, client-centred foundation.

  12. Forecasting and nowcasting process: A case study analysis of severe precipitation event in Athens, Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, Ioannis; Nastos, Panagiotis; Avgoustoglou, Euripides; Gofa, Flora; Pytharoulis, Ioannis; Kamberakis, Nikolaos

    2016-04-01

    An early warning process is the result of the interplay between forecasting and nowcasting. Therefore, (1) accurate measurement and prediction of the spatial and temporal distribution of rainfall over an area and (2) an efficient and appropriate description of the catchment properties are important issues for atmospheric hazards (severe precipitation, floods, flash floods, etc.). In this paper, a forecasting and nowcasting analysis is presented for a severe precipitation event that took place on September 21, 2015 in Athens, Greece. The severe precipitation caused a flash flood in the suburbs of Athens, with significant impacts on the local society. Quantitative precipitation forecasts from the European Centre for Medium-Range Weather Forecasts and from the COSMO.GR atmospheric model, including ensemble forecasts of precipitation and probabilistic approaches, are analyzed as tools in the forecasting process. Satellite remote sensing data from shortly before and up to six hours prior to the flash flood are presented, accompanied by radar products from the Hellenic National Meteorological Service, illustrating the ability to depict the convection process.

  13. Computer simulation and discrete-event models in the analysis of a mammography clinic patient flow.

    PubMed

    Coelli, Fernando C; Ferreira, Rodrigo B; Almeida, Renan Moritz V R; Pereira, Wagner Coelho A

    2007-09-01

    This work develops a discrete-event computer simulation model for the analysis of mammography clinic performance. Two mammography clinic computer simulation models were developed, based on an existing public sector clinic of the Brazilian Cancer Institute, located in the city of Rio de Janeiro, Brazil. Two clinics, in a total of seven configurations (number of equipment units and working personnel), were studied. The models simulated changes in patient arrival rates, number of equipment units, available personnel (technicians and physicians), equipment maintenance scheduling schemes and exam repeat rates. Model parameters were obtained by direct measurement and literature review. Commercially available simulation software was used for model building. The best patient scheduling (patient arrival rate) for the studied configurations had an average of 29 min for Clinic 1 (consisting of one mammography equipment unit, one to three technicians and one physician) and 21 min for Clinic 2 (two mammography equipment units, one to four technicians and one physician). The exam repeat rate and equipment maintenance scheduling simulations indicated that a large impact on patient waiting time would appear in the smaller-capacity configurations. Discrete-event simulation was a useful tool for defining optimal operating conditions for the studied clinics, indicating the most adequate capacity configurations and equipment maintenance schedules.
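
    A minimal sketch of a discrete-event clinic model of this kind using the SimPy library: patients arrive at a fixed scheduling interval, queue for a single mammography unit, and occasionally repeat the exam; the arrival interval, exam time and repeat rate are placeholder values, not the clinic parameters measured in the study.

      import random
      import simpy

      SCHED_INTERVAL = 29      # minutes between scheduled arrivals (placeholder)
      EXAM_TIME = 15           # mean minutes per exam (placeholder)
      REPEAT_RATE = 0.05       # probability an exam must be repeated (placeholder)
      waits = []

      def patient(env, unit):
          arrival = env.now
          with unit.request() as req:
              yield req                                  # wait for the mammography unit
              waits.append(env.now - arrival)
              yield env.timeout(random.expovariate(1.0 / EXAM_TIME))
              if random.random() < REPEAT_RATE:          # repeat the exam if needed
                  yield env.timeout(random.expovariate(1.0 / EXAM_TIME))

      def arrivals(env, unit):
          while True:
              env.process(patient(env, unit))
              yield env.timeout(SCHED_INTERVAL)

      random.seed(1)
      env = simpy.Environment()
      mammography_unit = simpy.Resource(env, capacity=1)
      env.process(arrivals(env, mammography_unit))
      env.run(until=8 * 60)                              # one 8-hour working day

      print(f"patients seen: {len(waits)}, mean wait: {sum(waits) / len(waits):.1f} min")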

  14. Development of single-event-effects analysis system at the IMP microbeam facility

    NASA Astrophysics Data System (ADS)

    Guo, Jinlong; Du, Guanghua; Bi, Jinshun; Liu, Wenjing; Wu, Ruqun; Chen, Hao; Wei, Junze; Li, Yaning; Sheng, Lina; Liu, Xiaojun; Ma, Shuyi

    2017-08-01

    Single-event effects (SEEs) in integrated circuits (ICs) caused by galactic single ions are a major cause of anomalies for spacecraft. The main strategies for reducing radiation failures in spacecraft are to use devices that are less sensitive to SEEs and to design radiation-hardened ICs. A high-energy ion microbeam is one of the most powerful tools for obtaining spatial information on SEEs in ICs and for guiding radiation-hardening design. The microbeam facility at the Institute of Modern Physics (IMP), Chinese Academy of Sciences (CAS), can meet both the linear energy transfer (LET) and ion range requirements for ground-based SEE simulation experiments. In order to study the SEE characteristics of ICs at this microbeam platform, an SEE analysis system was developed. This system can target and irradiate ICs with single ions with micrometer-scale accuracy while acquiring multi-channel SEE signals and mapping the SEE-sensitive regions online. A 4-Mbit NOR Flash memory was tested with this system using 2.2 GeV Kr ions, and the radiation-sensitive peripheral circuit regions for 1-to-0 and 0-to-1 upsets, multiple-bit upsets and single-event latchup were obtained.

  15. Error Analysis of Satellite Precipitation-Driven Modeling of Complex Terrain Flood Events

    NASA Astrophysics Data System (ADS)

    Mei, Y.; Nikolopoulos, E. I.; Anagnostou, E. N.; Zoccatelli, D.; Borga, M., Sr.

    2015-12-01

    The error characteristics of satellite-precipitation-driven flood event simulations over mountainous basins are evaluated in this study for eight different global satellite products. A methodology is devised to match the observed records of the flood events with the corresponding satellite and reference rainfall and runoff simulations. The flood events are sorted according to flood type (i.e. rain flood and flash flood) and the basin's antecedent conditions, represented by the event's runoff-to-precipitation ratio. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e. cumulative volume, timing and shape). Overall, satellite-driven event runoff exhibits better error metrics than the satellite precipitation. Better error metrics are also shown for the rain flood events relative to the flash flood events. The event timing and shape from satellite-derived precipitation agree well with the reference; the cumulative volume is mostly underestimated. In terms of error propagation, the study shows a dampening effect in both the systematic and random error components of the satellite-driven runoff time series relative to the satellite-retrieved event precipitation. This error dampening effect is less pronounced for the flash flood events and the rain flood events with high runoff coefficients. This study provides, for the first time, flood event characteristics of satellite precipitation error propagation in flood modeling, which has implications for the Global Precipitation Measurement application in mountain flood hydrology.

  16. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    events dating back several decades is analysed. Precipitation thresholds varying in space and time are established using highly resolved INCA data of the Austrian weather service. Parameters possibly controlling the basic susceptibility of catchments are evaluated in a regional GIS analysis (vegetation, geology, topography, stream network, proxies for sediment availability). Similarity measures are then used to group catchments into sensitivity classes. Applying different climate scenarios, the spatiotemporal distribution of catchments sensitive towards heavier and more frequent precipitation can be determined giving valuable advice for planning and managing mountain protection zones.

  17. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. The findings described above could also be confirmed for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle) showing that strong influence of atmospheric

  18. Localization of the event-related potential novelty response as defined by principal components analysis.

    PubMed

    Dien, Joseph; Spencer, Kevin M; Donchin, Emanuel

    2003-10-01

    Recent research indicates that novel stimuli elicit at least two distinct components, the Novelty P3 and the P300. The P300 is thought to be elicited when a context updating mechanism is activated by a wide class of deviant events. The functional significance of the Novelty P3 is uncertain. Identification of the generator sources of the two components could provide additional information about their functional significance. Previous localization efforts have yielded conflicting results. The present report demonstrates that the use of principal components analysis (PCA) results in better convergence with knowledge about functional neuroanatomy than did previous localization efforts. The results are also more convincing than those obtained by two alternative methods, MUSIC-RAP and the Minimum Norm. Source modeling on 129-channel data with BESA and BrainVoyager suggests that the P300 has sources in the temporal-parietal junction whereas the Novelty P3 has sources in the anterior cingulate.

  19. Migration experience and premarital sexual initiation in urban Kenya: an event history analysis.

    PubMed

    Luke, Nancy; Xu, Hongwei; Mberu, Blessing U; Goldberg, Rachel E

    2012-06-01

    Migration during the formative adolescent years can affect important life-course transitions, including the initiation of sexual activity. In this study, we use life history calendar data to investigate the relationship between changes in residence and timing of premarital sexual debut among young people in urban Kenya. By age 18, 64 percent of respondents had initiated premarital sex, and 45 percent had moved at least once between the ages of 12 and 18. Results of the event history analysis show that girls and boys who move during early adolescence experience the earliest onset of sexual activity. For adolescent girls, however, other dimensions of migration provide protective effects, with greater numbers of residential changes and residential changes in the last one to three months associated with later sexual initiation. To support young people's ability to navigate the social, economic, and sexual environments that accompany residential change, researchers and policymakers should consider how various dimensions of migration affect sexual activity.

  20. System-level analysis of single event upset susceptibility in RRAM architectures

    NASA Astrophysics Data System (ADS)

    Liu, Rui; Barnaby, Hugh J.; Yu, Shimeng

    2016-12-01

    In this work, the single event upset susceptibility of a resistive random access memory (RRAM) system with 1-transistor-1-resistor (1T1R) and crossbar architectures to heavy ion strikes is investigated from the circuit-level to the system-level. From a circuit-level perspective, the 1T1R is only susceptible to single-bit-upset (SBU) due to the isolation of cells, while in the crossbar, multiple-bit-upsets may occur because ion-induced voltage spikes generated on drivers may propagate along rows or columns. Three factors are considered to evaluate system-level susceptibility: the upset rate, the sensitive area, and the vulnerable time window. Our analysis indicates that the crossbar architecture has a smaller maximum bit-error-rate per day as compared to the 1T1R architecture for a given sub-array size, I/O width and susceptible time window.

  1. Analysis of core-concrete interaction event with flooding for the Advanced Neutron Source reactor

    SciTech Connect

    Kim, S.H.; Taleyarkhan, R.P.; Georgevich, V.; Navarro-Valenti, S.

    1993-11-01

    This paper discusses salient aspects of the methodology, assumptions, and modeling of various features related to estimation of source terms from an accident involving a molten core-concrete interaction event (with and without flooding) in the Advanced Neutron Source (ANS) reactor at the Oak Ridge National Laboratory. Various containment configurations are considered for this postulated severe accident. Several design features (such as rupture disks) are examined to study containment response during this severe accident. Also, thermal-hydraulic response of the containment and radionuclide transport and retention in the containment are studied. The results are described as transient variations of source terms, which are then used for studying off-site radiological consequences and health effects for the support of the Conceptual Safety Analysis Report for ANS. The results are also to be used to examine the effectiveness of subpile room flooding during this type of severe accident.

  2. Dynamics of the 1054 UT March 22, 1979, substorm event - CDAW 6. [Coordinated Data Analysis Workshop

    NASA Technical Reports Server (NTRS)

    Mcpherron, R. L.; Manka, R. H.

    1985-01-01

    The Coordinated Data Analysis Workshop (CDAW 6) has the primary objective of tracing the flow of energy from the solar wind through the magnetosphere to its ultimate dissipation in the ionosphere. An essential role in this energy transfer is played by magnetospheric substorms; however, the details are not yet completely understood. The International Magnetospheric Study (IMS) has provided an ideal database for the study conducted by CDAW 6. The present investigation is concerned with the 1054 UT March 22, 1979, substorm event, which had been selected for detailed examination in connection with the studies performed by the CDAW 6. The observations of this substorm are discussed, taking into account solar wind conditions, ground magnetic activity on March 22, 1979, observations at synchronous orbit, observations in the near geomagnetic tail, and the onset of the 1054 UT expansion phase. Substorm development and magnetospheric dynamics are discussed on the basis of a synthesis of the observations.

  3. Efficacy of forensic statement analysis in distinguishing truthful from deceptive eyewitness accounts of highly stressful events.

    PubMed

    Morgan, Charles A; Colwell, Kevin; Hazlett, Gary A

    2011-09-01

    Laboratory-based deception detection research suggests that truthful statements differ from deceptive ones. This nonlaboratory study tested whether forensic statement analysis (FSA) methods would distinguish genuine from false eyewitness accounts about exposure to a highly stressful event. A total of 35 military participants were assigned to truthful or deceptive eyewitness conditions. Genuine eyewitnesses reported truthfully about exposure to interrogation stress. Deceptive eyewitnesses studied transcripts of genuine eyewitnesses for 24 h and falsely claimed they had been interrogated. Cognitive Interviews were recorded, transcribed, and assessed by FSA raters blind to the status of participants. Genuine accounts contained more unique words, external and contextual referents, and a greater total word count than did deceptive statements. The type-token ratio was lower in genuine statements. The classification accuracy using FSA techniques was 82%. FSA methods may be effective in real-world circumstances and have relevance to professionals in law enforcement, security, and criminal justice.
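
    A tiny sketch of two of the surface features mentioned above (total word count and type-token ratio), computed on a toy statement. Actual FSA scoring relies on trained raters and a richer feature set; the helper below is purely illustrative.

```python
def statement_features(text: str) -> dict:
    """Compute two simple FSA-style surface features: word count and type-token ratio."""
    words = text.lower().split()
    unique = set(words)
    return {
        "word_count": len(words),
        "type_token_ratio": len(unique) / len(words) if words else 0.0,
    }

sample = "We were taken to the room and the interrogator asked the same question twice"
print(statement_features(sample))
```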

  4. Retrospective Analysis of Communication Events - Understanding the Dynamics of Collaborative Multi-Party Discourse

    SciTech Connect

    Cowell, Andrew J.; Haack, Jereme N.; McColgin, Dave W.

    2006-06-08

    This research is aimed at understanding the dynamics of collaborative multi-party discourse across multiple communication modalities. Before we can truly make significant strides in devising collaborative communication systems, there is a need to understand how typical users utilize computationally supported communications mechanisms such as email, instant messaging, video conferencing, chat rooms, etc., both singularly and in conjunction with traditional means of communication such as face-to-face meetings, telephone calls and postal mail. Attempting to understand an individual’s communications profile with access to only a single modality is challenging at best and often futile. Here, we discuss the development of RACE – Retrospective Analysis of Communications Events – a test-bed prototype to investigate issues relating to multi-modal multi-party discourse.

  5. Event-related EEG time-frequency analysis and the Orienting Reflex to auditory stimuli.

    PubMed

    Barry, Robert J; Steiner, Genevieve Z; De Blasio, Frances M

    2012-06-01

    Sokolov's classic works discussed electroencephalogram (EEG) alpha desynchronization as a measure of the Orienting Reflex (OR). Early studies confirmed that this reduced with repeated auditory stimulation, but without reliable stimulus-significance effects. We presented an auditory habituation series with counterbalanced indifferent and significant (counting) instructions. Time-frequency analysis of electrooculogram (EOG)-corrected EEG was used to explore prestimulus levels and the timing and amplitude of event-related increases and decreases in 4 classic EEG bands. Decrement over trials and response recovery were substantial for the transient increase (in delta, theta, and alpha) and subsequent desynchronization (in theta, alpha, and beta). There was little evidence of dishabituation and few effects of counting. Expected effects in stimulus-induced alpha desynchronization were confirmed. Two EEG response patterns over trials and conditions, distinct from the full OR pattern, warrant further research.

  6. Time-series analysis for rapid event-related skin conductance responses

    PubMed Central

    Bach, Dominik R.; Flandin, Guillaume; Friston, Karl J.; Dolan, Raymond J.

    2009-01-01

    Event-related skin conductance responses (SCRs) are traditionally analysed by comparing the amplitude of individual peaks against a pre-stimulus baseline. Many experimental manipulations in cognitive neuroscience dictate paradigms with short inter trial intervals, precluding accurate baseline estimation for SCR measurements. Here, we present a novel and general approach to SCR analysis, derived from methods used in neuroimaging that estimate responses using a linear convolution model. In effect, the method obviates peak-scoring and makes use of the full SCR. We demonstrate, across three experiments, that the method has face validity in analysing reactions to a loud white noise and emotional pictures, can be generalised to paradigms where the shape of the response function is unknown and can account for parametric trial-by-trial effects. We suggest our approach provides greater flexibility in analysing SCRs than existing methods. PMID:19686778
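
    The sketch below illustrates the linear convolution (GLM) idea described in the abstract: event onsets are convolved with an assumed canonical response shape, and the resulting regressor is fit to the skin conductance trace by least squares. The sampling rate, response shape, and onsets are assumptions, and the code is not the authors' implementation.

```python
import numpy as np

fs = 10.0                                   # sampling rate in Hz (assumption)
n = 3000                                    # 300 s recording
t = np.arange(n) / fs

def canonical_scr(t, tau1=0.75, tau2=2.0):
    """Simple two-exponential response shape standing in for a canonical SCR function."""
    h = np.exp(-t / tau2) - np.exp(-t / tau1)
    return h / h.max()

onsets = np.array([20.0, 55.0, 90.0, 130.0])          # event onset times in seconds (assumption)
sticks = np.zeros(n)
sticks[(onsets * fs).astype(int)] = 1.0

regressor = np.convolve(sticks, canonical_scr(np.arange(0, 30, 1 / fs)))[:n]
X = np.column_stack([regressor, np.ones(n)])           # amplitude regressor plus baseline term

scr = 0.8 * regressor + 0.05 * np.random.default_rng(1).standard_normal(n)  # toy skin conductance trace
beta, *_ = np.linalg.lstsq(X, scr, rcond=None)
print(f"estimated response amplitude: {beta[0]:.2f} (simulated true value 0.80)")
```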

  7. Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer

    SciTech Connect

    Anderson, Johan; Halpern, Federico D.; Ricci, Paolo; Furno, Ivo; Xanthopoulos, Pavlos

    2014-12-15

    The turbulence observed in the scrape-off-layer of a tokamak is often characterized by intermittent events of bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It appears thus necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of tails of the probability distribution functions. The method followed here is to generate statistical information from time-traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.

  8. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    PubMed Central

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
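
    A rough sketch of the intersection-matrix idea at the core of ASSET (not the published implementation): entry (i, j) counts how many neurons are active in both time bin i and time bin j, so repeated synchronous sequences show up as diagonal structures of high overlap. The spike data here are random toy data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_neurons, n_bins = 50, 200
binned = (rng.random((n_neurons, n_bins)) < 0.05).astype(int)   # toy binned spike trains (neuron x bin)

intersection = binned.T @ binned      # (n_bins x n_bins): number of shared active neurons per bin pair
print("matrix shape:", intersection.shape, "max overlap:", intersection.max())
```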

  9. Uncertainty Analysis for a De-pressurised Loss of Forced Cooling Event of the PBMR Reactor

    SciTech Connect

    Jansen van Rensburg, Pieter A.; Sage, Martin G.

    2006-07-01

    This paper presents an uncertainty analysis for a De-pressurised Loss of Forced Cooling (DLOFC) event that was performed with the systems CFD (Computational Fluid Dynamics) code Flownex for the PBMR reactor. An uncertainty analysis was performed to determine the variation in maximum fuel, core barrel and reactor pressure vessel (RPV) temperature due to variations in model input parameters. Some of the input parameters that were varied are: thermo-physical properties of helium and the various solid materials, decay heat, neutron and gamma heating, pebble bed pressure loss, pebble bed Nusselt number and pebble bed bypass flows. The Flownex model of the PBMR reactor is a 2-dimensional axisymmetrical model. It is simplified in terms of geometry and some other input values. However, it is believed that the model adequately indicates the effect of changes in certain input parameters on the fuel temperature and other components during a DLOFC event. Firstly, a sensitivity study was performed where input variables were varied individually according to predefined uncertainty ranges and the results were sorted according to the effect on maximum fuel temperature. In the sensitivity study, only seven variables had a significant effect on the maximum fuel temperature (greater than 5 deg. C). The most significant are power distribution profile, decay heat, reflector properties and effective pebble bed conductivity. Secondly, Monte Carlo analyses were performed in which twenty variables were varied simultaneously within predefined uncertainty ranges. For a one-tailed 95% confidence level, the conservatism that should be added to the best estimate calculation of the maximum fuel temperature for a DLOFC was determined as 53 deg. C. This value will probably increase after some model refinements in the future. Flownex was found to be a valuable tool for uncertainty analyses, facilitating both sensitivity studies and Monte Carlo analyses. (authors)
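
    A hedged Monte Carlo sketch of the uncertainty-propagation step: sample assumed input multipliers within uncertainty ranges, evaluate a stand-in response function, and read off the one-tailed 95% bound relative to the best estimate. The surrogate model and ranges below are invented for illustration and do not represent the Flownex model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_runs = 5000

# Hypothetical normalized uncertainty ranges for a few influential inputs
decay_heat = rng.uniform(0.95, 1.05, n_runs)
conductivity = rng.uniform(0.90, 1.10, n_runs)
power_profile = rng.uniform(0.97, 1.03, n_runs)

def max_fuel_temp(dh, k, pp, base=1500.0):
    """Toy surrogate: temperature rises with decay heat and power peaking, falls with conductivity."""
    return base * dh * pp / k

temps = max_fuel_temp(decay_heat, conductivity, power_profile)
best_estimate = max_fuel_temp(1.0, 1.0, 1.0)
conservatism = np.percentile(temps, 95) - best_estimate
print(f"best estimate {best_estimate:.0f} C, one-tailed 95% conservatism {conservatism:.0f} C")
```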

  10. Assessment of (sub-) seasonal prediction skill using a canonical event analysis

    NASA Astrophysics Data System (ADS)

    Wanders, N.; Wood, E. F.

    2015-12-01

    Hydrological extremes regularly occur in all regions of the world and as such are globally relevant phenomena with large impacts on society. Seasonal and sub-seasonal predictions could increase preparedness for these extreme events. We investigated the skill of five seasonal forecast models from the NMME-II ensemble for the period 1982-2012 at a range of temporal and spatial scales. A canonical event analysis is used to enable a model validation beyond the "single" temporal and spatial scale. The model predictions are compared to two reference datasets on the seasonal and sub-seasonal scale. We evaluate their capability to reproduce observed daily precipitation and temperature. It is shown that the skill of the models is largely dependent on the temporal aggregation and the lead time. Longer temporal aggregation increases the forecast skill of both precipitation and temperature. Seasonal precipitation forecasts show no skill beyond a lead time of 6 months, while seasonal temperature forecast skill does extend beyond 6 months. Overall, the highest skill can be found over South-America and Australia, whereas the skill over Europe and North-America is relatively low for both variables. On the sub-seasonal scale (two-week aggregation) we find a strong decrease in prediction skill after the first 2 weeks of initialization. However, the models retain skill up to 1-2 months for precipitation and 3-4 months for temperature. Their skill is highest in South-America, Asia and Oceania at the sub-seasonal level. The skill amongst models differs greatly for both the sub-seasonal and seasonal forecasts, indicating that a (weighted) multi-model ensemble is preferred over single model forecasts. This work shows that an analysis at multiple temporal and spatial scales can enhance our understanding of the added value of (sub-) seasonal forecast models and their applicability, which is important when these models are applied to forecasting of (hydrological) extremes.

  11. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    NASA Astrophysics Data System (ADS)

    Trigo, Ricardo; Varino, Filipa; Ramos, Alexandre; Valente, Maria; Zêzere, José; Vaquero, José; Gouveia, Célia; Russo, Ana

    2014-04-01

    The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies for all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the values of precipitation registered between 28 November and 7 December were so remarkable that the episode of 1876 still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of that time, meteorological data recently digitised from several stations in Portugal and Spain and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper air fields in order to provide some insight into the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the sustained precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over the central Atlantic Ocean.

  12. Analysis of factors associated with hiccups based on the Japanese Adverse Drug Event Report database

    PubMed Central

    Hosoya, Ryuichiro; Ishii-Nozawa, Reiko; Kagaya, Hajime

    2017-01-01

    Hiccups are occasionally experienced by most individuals. Although hiccups are not life-threatening, they may lead to a decline in quality of life. Previous studies showed that hiccups may occur as an adverse effect of certain medicines during chemotherapy. Furthermore, a male dominance in hiccups has been reported. However, due to the limited number of studies conducted on this phenomenon, debate still surrounds the few factors influencing hiccups. The present study aimed to investigate the influence of medicines and patient characteristics on hiccups using a large-sized adverse drug event report database and, specifically, the Japanese Adverse Drug Event Report (JADER) database. Cases of adverse effects associated with medications were extracted from JADER, and Fisher’s exact test was performed to assess the presence or absence of hiccups for each medication. In a multivariate analysis, we conducted a multiple logistic regression analysis using medication and patient characteristic variables exhibiting significance. We also examined the role of dexamethasone in inducing hiccups during chemotherapy. Medicines associated with hiccups included dexamethasone, levofolinate, fluorouracil, oxaliplatin, carboplatin, and irinotecan. Patient characteristics associated with hiccups included a male gender and greater height. The combination of anti-cancer agent and dexamethasone use was noted in more than 95% of patients in the dexamethasone-use group. Hiccups also occurred in patients in the anti-cancer agent-use group who did not use dexamethasone. Most of the medications that induce hiccups are used in chemotherapy. The results of the present study suggest that it is possible to predict a high risk of hiccups using patient characteristics. We confirmed that dexamethasone was the drug that has the strongest influence on the induction of hiccups. However, the influence of anti-cancer agents on the induction of hiccups cannot be denied. We consider the results of the
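
    A small sketch of the per-drug screening step described above: a 2 x 2 contingency table of hiccup reports versus use of a given drug, tested with Fisher's exact test. The counts are hypothetical and are not taken from JADER.

```python
from scipy.stats import fisher_exact

#                 hiccups   no hiccups
table = [[120,   4_880],    # reports mentioning dexamethasone (hypothetical counts)
         [300,  94_700]]    # all other reports (hypothetical counts)

odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"reporting odds ratio {odds_ratio:.1f}, one-sided p = {p_value:.2e}")
```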

  13. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains.

    PubMed

    Torre, Emiliano; Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz; Grün, Sonja

    2016-07-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity.

  14. Analysis and Prediction of West African Moist Events during the Boreal Spring of 2009

    NASA Astrophysics Data System (ADS)

    Mera, Roberto Javier

    Weather and climate in Sahelian West Africa are dominated by two major wind systems, the southwesterly West African Monsoon (WAM) and the northeasterly (Harmattan) trade winds. In addition to the agricultural benefit of the WAM, the public health sector is affected given the relationship between the onset of moisture and end of meningitis outbreaks. Knowledge and prediction of moisture distribution during the boreal spring is vital to the mitigation of meningitis by providing guidance for vaccine dissemination. The goal of the present study is to (a) develop a climatology and conceptual model of the moisture regime during the boreal spring, (b) investigate the role of extra-tropical and Convectively-coupled Equatorial Waves (CCEWs) on the modulation of westward moving synoptic waves and (c) determine the efficacy of a regional model as a tool for predicting moisture variability. Medical reports during 2009, along with continuous meteorological observations at Kano, Nigeria, showed that the advent of high humidity correlated with cessation of the disease. Further analysis of the 2009 boreal spring elucidated the presence of short-term moist events that modulated surface moisture on temporal scales relevant to the health sector. The May moist event (MME) provided insight into interplays among climate anomalies, extra-tropical systems, equatorially trapped waves and westward-propagating synoptic disturbances. The synoptic disturbance initiated 7 May and traveled westward to the coast by 12 May. There was a marked, semi-stationary moist anomaly in the precipitable water field (kg m-2) east of 10°E through late April and early May, that moved westward at the time of the MME. Further inspection revealed a mid-latitude system may have played a role in increasing the latitudinal amplitude of the MME. CCEWs were also found to have an impact on the MME. A coherent Kelvin wave propagated through the region, providing increased monsoonal flow and heightened convection. A

  15. Top-down and bottom-up definitions of human failure events in human reliability analysis

    SciTech Connect

    Boring, Ronald Laurids

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down—defined as a subset of the PRA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  16. Potential of Breastmilk Analysis to Inform Early Events in Breast Carcinogenesis: Rationale and Considerations

    PubMed Central

    Murphy, Jeanne; Sherman, Mark E.; Browne, Eva P.; Caballero, Ana I.; Punska, Elizabeth C.; Pfeiffer, Ruth M.; Yang, Hannah P.; Lee, Maxwell; Yang, Howard; Gierach, Gretchen L.; Arcaro, Kathleen F.

    2016-01-01

    This review summarizes methods related to the study of human breastmilk in etiologic and biomarkers research. Despite the importance of reproductive factors in breast carcinogenesis, factors that act early in life are difficult to study because young women rarely require breast imaging or biopsy, and analysis of critical circulating factors (e.g. hormones) is often complicated by the requirement to accurately account for menstrual cycle date. Accordingly, novel approaches are needed to understand how events such as pregnancy, breastfeeding, weaning, and post-weaning breast remodeling influence breast cancer risk. Analysis of breastmilk offers opportunities to understand mechanisms related to carcinogenesis in the breast, and to identify risk markers that may inform efforts to identify high-risk women early in the carcinogenic process. In addition, analysis of breastmilk could have value in early detection or diagnosis of breast cancer. In this article we describe the potential for using breastmilk to characterize the microenvironment of the lactating breast with the goal of advancing research on risk assessment, prevention, and detection of breast cancer. PMID:27107568

  17. Potential of breastmilk analysis to inform early events in breast carcinogenesis: rationale and considerations.

    PubMed

    Murphy, Jeanne; Sherman, Mark E; Browne, Eva P; Caballero, Ana I; Punska, Elizabeth C; Pfeiffer, Ruth M; Yang, Hannah P; Lee, Maxwell; Yang, Howard; Gierach, Gretchen L; Arcaro, Kathleen F

    2016-05-01

    This review summarizes methods related to the study of human breastmilk in etiologic and biomarkers research. Despite the importance of reproductive factors in breast carcinogenesis, factors that act early in life are difficult to study because young women rarely require breast imaging or biopsy, and analysis of critical circulating factors (e.g., hormones) is often complicated by the requirement to accurately account for menstrual cycle date. Accordingly, novel approaches are needed to understand how events such as pregnancy, breastfeeding, weaning, and post-weaning breast remodeling influence breast cancer risk. Analysis of breastmilk offers opportunities to understand mechanisms related to carcinogenesis in the breast, and to identify risk markers that may inform efforts to identify high-risk women early in the carcinogenic process. In addition, analysis of breastmilk could have value in early detection or diagnosis of breast cancer. In this article, we describe the potential for using breastmilk to characterize the microenvironment of the lactating breast with the goal of advancing research on risk assessment, prevention, and detection of breast cancer.

  18. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.; Sandia National Labs., Albuquerque, NM )

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6 with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs.

  19. Frequency analysis and its spatiotemporal characteristics of precipitation extreme events in China during 1951-2010

    NASA Astrophysics Data System (ADS)

    Shao, Yuehong; Wu, Junmei; Ye, Jinyin; Liu, Yonghe

    2015-08-01

    This study investigates frequency analysis and its spatiotemporal characteristics of precipitation extremes based on annual maximum of daily precipitation (AMP) data of 753 observation stations in China during the period 1951-2010. Several statistical methods including L-moments, Mann-Kendall test (MK test), Student's t test ( t test) and analysis of variance ( F-test) are used to study different statistical properties related to frequency and spatiotemporal characteristics of precipitation extremes. The results indicate that the AMP series of most sites have no linear trends at 90 % confidence level, but there is a distinctive decrease trend in Beijing-Tianjin-Tangshan region. The analysis of abrupt changes shows that there are no significant changes in most sites, and no distinctive regional patterns within the mutation sites either. An important innovation different from the previous studies is the shift in the mean and the variance which are also studied in this paper in order to further analyze the changes of strong and weak precipitation extreme events. The shift analysis shows that we should pay more attention to the drought in North China and to the flood control and drought in South China, especially to those regions that have no clear trend and have a significant shift in the variance. More important, this study conducts the comprehensive analysis of a complete set of quantile estimates and its spatiotemporal characteristic in China. Spatial distribution of quantile estimation based on the AMP series demonstrated that the values gradually increased from the Northwest to the Southeast with the increment of duration and return period, while the increasing rate of estimation is smooth in the arid and semiarid region and is rapid in humid region. Frequency estimates of 50-year return period are in agreement with the maximum observations of AMP series in the most stations, which can provide more quantitative and scientific basis for decision making.
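
    A hedged sketch of the frequency-analysis step: fit a GEV distribution to an annual-maximum daily precipitation (AMP) series and read off return-period quantiles. The study fits distributions by L-moments; for a self-contained example this sketch uses scipy's maximum-likelihood fit on synthetic data instead.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Toy 60-year AMP series in mm (parameters chosen arbitrarily for the example)
amp = stats.genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=60, random_state=rng)

c, loc, scale = stats.genextreme.fit(amp)
for T in (10, 50, 100):                                  # return periods in years
    q = stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year daily precipitation quantile: {q:.1f} mm")
```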

  20. A Behavior Genetic Analysis of Pleasant Events, Depressive Symptoms, and Their Covariation

    PubMed Central

    Whisman, Mark A.; Johnson, Daniel P.; Rhee, Soo Hyun

    2014-01-01

    Although pleasant events figure prominently in behavioral models of depression, little is known regarding characteristics that may predispose people to engage in pleasant events and derive pleasure from these events. The present study was conducted to evaluate genetic and environmental influences on the experience of pleasant events, depressive symptoms, and their covariation in a sample of 148 twin pairs. A multivariate twin modeling approach was used to examine the genetic and environmental covariance of pleasant events and depressive symptoms. Results indicated that the experience of pleasant events was moderately heritable and that the same genetic factors influence both the experience of pleasant events and depressive symptoms. These findings suggest that genetic factors may give rise to dispositional tendencies to experience both pleasant events and depression. PMID:25506045

  1. Climate change impacts on extreme events in the United States: an uncertainty analysis

    EPA Science Inventory

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  2. Climate change impacts on extreme events in the United States: an uncertainty analysis

    EPA Science Inventory

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  3. Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance

    DTIC Science & Technology

    2012-03-01

    The report discusses maintaining situational awareness of the status of infrastructure elements (e.g., routers, intrusion detection systems, intrusion prevention systems). Several types of cyber event logs are considered, generated by computers, servers, routers, firewalls, and network security devices such as intrusion detection systems; logged entries include allow and deny decisions, audit, error and informational records, protocol usage, and traffic logs.

  4. Economic impact and market analysis of a special event: The Great New England Air Show

    Treesearch

    Rodney B. Warnick; David C. Bojanic; Atul Sheel; Apurv Mather; Deepak. Ninan

    2010-01-01

    We conducted a post-event evaluation for the Great New England Air Show to assess its general economic impact and to refine economic estimates where possible. In addition to the standard economic impact variables, we examined travel distance, purchase decision involvement, event satisfaction, and frequency of attendance. Graphic mapping of event visitors' home ZIP...

  5. Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor

    SciTech Connect

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L. ); Baxter, J.T. ); Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P. ); Brosseau, D.A. )

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-5 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  6. The use of geoinformatic data and spatial analysis to predict faecal pollution during extreme precipitation events

    NASA Astrophysics Data System (ADS)

    Ward, Ray; Purnell, Sarah; Ebdon, James; Nnane, Daniel; Taylor, Huw

    2013-04-01

    be a major factor contributing to increased levels of FIO. This study identifies areas within the catchment that are likely to demonstrate elevated erosion rates during extreme precipitation events, which are likely to result in raised levels of FIO. The results also demonstrate that increases in the human faecal marker were associated with the discharge points of wastewater treatment works, and that levels of the marker increased whenever the works discharged untreated wastewaters during extreme precipitation. Spatial analysis also highlighted locations where human faecal pollution was present in areas away from wastewater treatment plants, highlighting the potential significance of inputs from septic tanks and other un-sewered domestic wastewater systems. Increases in the frequency of extreme precipitation events in many parts of Europe are likely to result in increased levels of water pollution from both point- and diffuse-sources, increasing the input of pathogens into surface waters, and elevating the health risks to downstream consumers of abstracted drinking water. This study suggests an approach that integrates water microbiology and geoinformatic data to support a 'prediction and prevention' approach, in place of the traditional focus on water quality monitoring. This work may therefore make a significant contribution to future European water resource management and health protection.

  7. Retrospective Analysis of Recent Flood Events With Persistent High Surface Runoff From Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Joshi, S.; Hakeem, K. Abdul; Raju, P. V.; Rao, V. V.; Yadav, A.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    /locations with probable flooding conditions. These thresholds were refined through an iterative process by comparing them with satellite-derived flood maps of the 2013 and 2014 monsoon seasons over India. India encountered many cyclonic flood events during Oct-Dec 2013, among which Phailin, Lehar, and Madi were rated as very severe cyclonic storms. The path and intensity of these cyclonic events were very well captured by the model and areas were marked with persistent coverage of high runoff risk/flooded area. These thresholds were used to monitor floods in Jammu and Kashmir during 4-5 Sep and Odisha during 8-9 Aug, 2014. The analysis indicated the need to vary the thresholds across space considering the terrain and geographical conditions. To address this, a sub-basin-wise study was made based on terrain characteristics (slope, elevation) using Aster DEM. It was found that basins with higher elevation represent higher thresholds as compared to basins with lower elevation. The results show very promising correlation with the satellite derived flood maps. Further refinement and optimization of thresholds, varying them spatially accounting for topographic/terrain conditions, would lead to estimation of high runoff/flood risk areas for both riverine and drainage congested areas. Use of weather forecast data (NCMWRF, GEFS/R), etc. would enhance the scope to develop early warning systems.

  8. Rain-on-snow Events in Southwestern British Columbia: A Long-term Analysis of Meteorological Conditions and Snowpack Response

    NASA Astrophysics Data System (ADS)

    Trubilowicz, J. W.; Moore, D.

    2015-12-01

    Snowpack dynamics and runoff generation in coastal mountain regions are complicated by rain-on-snow (ROS) events. During major ROS events associated with warm, moist air and strong winds, turbulent heat fluxes can produce substantial melt to supplement rainfall, but previous studies suggest this may not be true for smaller, more frequent events. The internal temperature and water content of the snowpack are also expected to influence runoff generation during ROS events: a cold snowpack with no liquid water content will have the ability to store significant amounts of rainfall, whereas a 'ripe' snowpack may begin to melt and generate outflow with little rain input. However, it is not well understood how antecedent snowpack conditions and energy fluxes differ between ROS events that cause large runoff events and those that do not, in large part because major flood-producing ROS events occur infrequently, and thus are often not sampled during short-term research projects. To generate greater understanding of runoff generation over the spectrum of ROS magnitudes and frequencies, we analyzed data from Automated Snow Pillow (ASP) sites, which record hourly air temperature, precipitation and snowpack water equivalent and offer up to several decades of data at each site. We supplemented the ASP data with output from the North American Regional Reanalysis (NARR) product to support point scale snow modeling for 335 ROS event records from six ASP sites in southwestern BC from 2003 to 2013. Our analysis reconstructed the weather conditions, surface energy exchanges, internal mass and energy states of the snowpack, and generation of snow melt and water available for runoff (WAR) for each ROS event. Results indicate that WAR generation during large events is largely independent of the snowpack conditions, but for smaller events, the antecedent snow conditions play a significant role in either damping or enhancing WAR generation.

  9. Analysis of the Lehman Brothers collapse and the Flash Crash event by applying wavelets methodologies

    NASA Astrophysics Data System (ADS)

    Beccar-Varela, Maria P.; Mariani, Maria C.; Tweneboah, Osei K.; Florescu, Ionut

    2017-05-01

    In this study, we apply a wavelet methodology initially developed for geophysical data to financial data. Specifically, the method distinguishes between natural tectonic earthquakes and man made explosions. We exemplify using time series data from two financial events: the Lehman Brothers collapse and the Flash Crash event. We conclude that the Lehman Brothers collapse behaves like a natural earthquake while the Flash Crash event behaves like a human made explosion. This study may imply that the Lehman Brothers type events may be predicted, while sudden Flash Crash type events are not predictable.
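
    An illustrative sketch of the general wavelet-decomposition idea (not the authors' specific methodology or data): decompose log-returns of a toy price path with a discrete wavelet transform and compare the energy carried at each scale. Requires the PyWavelets package; the wavelet family and decomposition level are assumptions.

```python
import numpy as np
import pywt

rng = np.random.default_rng(5)
prices = 100 * np.exp(np.cumsum(0.001 * rng.standard_normal(4096)))   # toy price path
returns = np.diff(np.log(prices))

coeffs = pywt.wavedec(returns, 'db4', level=5)           # [approx, detail L5, ..., detail L1]
energy = [float(np.sum(c ** 2)) for c in coeffs]
labels = ['approx'] + [f'detail L{l}' for l in range(5, 0, -1)]
for name, e in zip(labels, energy):
    print(f"{name:>9}: {e:.4f}")
```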

  10. Performance of the Pierre Auger Fluorescence Detector and Analysis of Well Reconstructed Events

    NASA Astrophysics Data System (ADS)

    Argiro, Stefano; Pierre Auger Collaboration

    2003-07-01

    The Pierre Auger Observatory is designed to elucidate the origin and nature of Ultra High Energy Cosmic Rays using a hybrid detection technique. A first run of data taking with a prototype version of both detectors (the so-called Engineering Array) took place in 2001-2002, allowing the Collaboration to evaluate the performance of the two detector systems and to approach an analysis strategy. In this contribution, after a brief description of the system, we will report some results on the behavior of the Fluorescence Detector (FD) Prototype. Performance studies, such as measurements of noise, sensitivity and duty cycle, will be presented. We will illustrate a preliminary analysis of selected air showers. This analysis is performed using exclusively the information from the FD, and includes reconstruction of the shower geometry and of the longitudinal profile. Introduction The Pierre Auger Cosmic Ray observatory will be the largest cosmic ray detector ever built. Two sites of approximately 3000 km2, one in each hemisphere, will be instrumented with a surface detector and a set of fluorescence detectors. Two fluorescence telescope units were operated from December 2001 to March 2002 in conjunction with 32 surface detectors, the so-called Engineering Array. This phase of the project was aimed at proving the validity of the design and probing the potential of the system. In the following we will show an analysis of the performance of the FD during this run and demonstrate, by investigating selected events, the ability to reconstruct geometry and the longitudinal profile of Extensive Air Showers. System Overview Figure 1 shows a schematic view of a fluorescence telescope unit. An array of 20×22 hexagonal photomultiplier tubes (the camera) is mounted on a quasispherical support located at the focal surface of a segmented mirror [1]. Each PMT overlooks a region of the sky of 1.5 deg in diameter. The telescope aperture

  11. Evaluation of Visual Field Progression in Glaucoma: Quasar Regression Program and Event Analysis.

    PubMed

    Díaz-Alemán, Valentín T; González-Hernández, Marta; Perera-Sanz, Daniel; Armas-Domínguez, Karintia

    2016-01-01

    To determine the sensitivity, specificity and agreement between the Quasar program, glaucoma progression analysis (GPA II) event analysis and expert opinion in the detection of glaucomatous progression. The Quasar program is based on linear regression analysis of both mean defect (MD) and pattern standard deviation (PSD). Each series of visual fields was evaluated by three methods; Quasar, GPA II and four experts. The sensitivity, specificity and agreement (kappa) for each method was calculated, using expert opinion as the reference standard. The study included 439 SITA Standard visual fields of 56 eyes of 42 patients, with a mean of 7.8 ± 0.8 visual fields per eye. When suspected cases of progression were considered stable, sensitivity and specificity of Quasar, GPA II and the experts were 86.6% and 70.7%, 26.6% and 95.1%, and 86.6% and 92.6% respectively. When suspected cases of progression were considered as progressing, sensitivity and specificity of Quasar, GPA II and the experts were 79.1% and 81.2%, 45.8% and 90.6%, and 85.4% and 90.6% respectively. The agreement between Quasar and GPA II when suspected cases were considered stable or progressing was 0.03 and 0.28 respectively. The degree of agreement between Quasar and the experts when suspected cases were considered stable or progressing was 0.472 and 0.507. The degree of agreement between GPA II and the experts when suspected cases were considered stable or progressing was 0.262 and 0.342. The combination of MD and PSD regression analysis in the Quasar program showed better agreement with the experts and higher sensitivity than GPA II.
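
    A hedged sketch of the trend-based progression idea underlying programs such as Quasar: fit a linear regression to the mean defect (MD) series of successive visual fields and flag progression when the slope is significantly negative. The exam times, MD values, and significance criterion are assumptions; the actual program also regresses PSD and applies its own criteria.

```python
import numpy as np
from scipy import stats

years = np.array([0.0, 0.5, 1.1, 1.6, 2.2, 2.8, 3.4, 4.0])       # exam times in years (toy)
md = np.array([-2.1, -2.3, -2.6, -2.5, -3.0, -3.2, -3.6, -3.9])  # mean defect in dB (toy)

slope, intercept, r, p_two_sided, stderr = stats.linregress(years, md)
p_one_sided = p_two_sided / 2 if slope < 0 else 1 - p_two_sided / 2
progressing = slope < 0 and p_one_sided < 0.05
print(f"MD slope {slope:.2f} dB/yr, one-sided p = {p_one_sided:.3f}, progressing: {progressing}")
```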

  12. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    SciTech Connect

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  13. Relationship Between Gastrointestinal Events and Compliance With Osteoporosis Therapy: An Administrative Claims Analysis of the US Managed Care Population.

    PubMed

    Modi, Ankita; Sajjan, Shiva; Michael Lewiecki, E; Harris, Steven T; Papadopoulos Weaver, Jessica

    2016-05-01

    A large proportion of women with osteoporosis do not comply with current osteoporosis therapies, resulting in diminished therapeutic effect. Noncompliance may be due to the occurrence of gastrointestinal (GI) events during the course of therapy. The objective of this study was to estimate the rate of GI events among women taking oral bisphosphonates and to determine the association between GI events and compliance with bisphosphonate therapy. This was a retrospective analysis of data from a US Medicare claims database (HUMANA). The study period was from January 2007 to June 2013. The index date was the date of the first oral bisphosphonate prescription (alendronate, ibandronate, or risedronate) occurring between January 2008 and June 2012. The pre- and postindex periods were the 1-year periods before and after the index date, respectively. The analysis included women 65 years of age and older who were naïve to all osteoporosis treatments before the index date. GI events included nausea/vomiting; dysphagia; esophagitis; esophageal reflux; esophageal, gastric, duodenal, and peptic ulcer; stricture, perforation, or hemorrhage of the esophagus; acute gastritis; and GI hemorrhage. GI events were assessed during the preindex period and at 3, 6, and 12 months in the postindex period. Compliance was defined as a medication possession ratio of ≥80%. The medication possession ratio was calculated as the total days' supply of bisphosphonate in the postindex period divided by 365 days. The association of postindex GI events with compliance was assessed using multivariate logistic regression. The analysis included 37,886 women initiating oral bisphosphonate therapy. In the preindex year, 37.5% of the women experienced a GI event, and in the postindex year, 38.9% had a GI event. Patients with preindex GI events had numerically higher rates of postindex GI events than patients without preindex GI events (61.8% vs 25.1% at 12 months postindex). Patients who experienced
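
    A minimal sketch of the compliance definition used above: the medication possession ratio (MPR) over the one-year post-index period, with compliance defined as MPR of at least 80%. The fill records are hypothetical.

```python
def medication_possession_ratio(days_supplied, observation_days=365):
    """MPR = total days' supply dispensed during follow-up divided by the follow-up length (capped at 1)."""
    return min(sum(days_supplied) / observation_days, 1.0)

fills = [90, 90, 90, 30]                 # days' supply of each bisphosphonate fill (hypothetical)
mpr = medication_possession_ratio(fills)
print(f"MPR = {mpr:.2f}, compliant: {mpr >= 0.80}")
```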

  14. Exact meta-analysis approach for discrete data and its application to 2 × 2 tables with rare events

    PubMed Central

    Liu, Dungang; Liu, Regina Y.

    2014-01-01

    This paper proposes a general exact meta-analysis approach for synthesizing inferences from multiple studies of discrete data. The approach combines the p-value functions (also known as significance functions) associated with the exact tests from individual studies. It encompasses a broad class of exact meta-analysis methods, as it permits broad choices for the combining elements, such as tests used in individual studies, and any parameter of interest. The approach yields statements that explicitly account for the impact of individual studies on the overall inference, in terms of efficiency/power and the type I error rate. Those statements also give rise to empirical methods for further enhancing the combined inference. Although the proposed approach is for general discrete settings, for convenience, it is illustrated throughout using the setting of meta-analysis of multiple 2 × 2 tables. In the context of rare events data, such as observing few, zero or zero total (i.e., zero events in both arms) outcomes in binomial trials or 2 × 2 tables, most existing meta-analysis methods rely on large-sample approximations, which may yield invalid inference. The commonly used corrections to zero outcomes in rare events data, aiming to improve numerical performance, can also incur undesirable consequences. The proposed approach applies readily to any rare event setting, including even the zero total event studies without any artificial correction. While debates continue on whether or how zero total event studies should be incorporated in meta-analysis, the proposed approach has the advantage of automatically including those studies and thus making use of all available data. Through numerical studies in rare events settings, the proposed exact approach is shown to be efficient and, generally, to outperform commonly used meta-analysis methods, including the Mantel-Haenszel and Peto methods. PMID:25620825
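
    A simplified illustration only: the paper combines entire p-value functions across studies, whereas the sketch below merely combines per-study exact one-sided p-values with Stouffer's method, to show how rare-event 2 x 2 tables (including a zero-event arm) can still contribute evidence without artificial continuity corrections. All counts are hypothetical.

```python
import numpy as np
from scipy.stats import fisher_exact, norm

# Hypothetical rare-event studies: [[events_trt, nonevents_trt], [events_ctl, nonevents_ctl]]
studies = [
    [[1, 199], [4, 196]],
    [[0, 150], [2, 148]],     # zero events in the treatment arm
    [[2, 300], [5, 295]],
]

p_values = [fisher_exact(tab, alternative="less")[1] for tab in studies]
z_scores = norm.isf(p_values)                       # convert each one-sided p-value to a z-score
combined_z = np.sum(z_scores) / np.sqrt(len(z_scores))
combined_p = norm.sf(combined_z)
print(f"per-study p-values {np.round(p_values, 3)}, combined one-sided p = {combined_p:.3f}")
```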

  15. Diagnostic evaluation of distributed physically based model at the REW scale (THREW) using rainfall-runoff event analysis

    NASA Astrophysics Data System (ADS)

    Tian, F.; Sivapalan, M.; Li, H.; Hu, H.

    2007-12-01

    The importance of diagnostic analysis of hydrological models is increasingly recognized by the scientific community (M. Sivapalan, et al., 2003; H. V. Gupta, et al., 2007). Model diagnosis refers to model structures and parameters being identified not only by statistical comparison of system state variables and outputs but also by process understanding in a specific watershed. Process understanding can be gained by the analysis of observational data and model results at the specific watershed as well as through regionalization. Although remote sensing technology can provide valuable data about the inputs, state variables, and outputs of the hydrological system, observational rainfall-runoff data still constitute the most accurate, reliable, direct, and thus a basic component of hydrology related database. One critical question in model diagnostic analysis is, therefore, what signature characteristic can we extract from rainfall and runoff data. To this date only a few studies have focused on this question, such as Merz et al. (2006) and Lana-Renault et al. (2007), still none of these studies related event analysis with model diagnosis in an explicit, rigorous, and systematic manner. Our work focuses on the identification of the dominant runoff generation mechanisms from event analysis of rainfall-runoff data, including correlation analysis and analysis of timing pattern. The correlation analysis involves the identification of the complex relationship among rainfall depth, intensity, runoff coefficient, and antecedent conditions, and the timing pattern analysis aims to identify the clustering pattern of runoff events in relation to the patterns of rainfall events. Our diagnostic analysis illustrates the changing pattern of runoff generation mechanisms in the DMIP2 test watersheds located in Oklahoma region, which is also well recognized by numerical simulations based on TsingHua Representative Elementary Watershed (THREW) model. The result suggests the usefulness of

  16. SYSTEMS SAFETY ANALYSIS FOR FIRE EVENTS ASSOCIATED WITH THE ECRB CROSS DRIFT

    SciTech Connect

    R. J. Garrett

    2001-12-12

    The purpose of this analysis is to systematically identify and evaluate fire hazards related to the Yucca Mountain Site Characterization Project (YMP) Enhanced Characterization of the Repository Block (ECRB) East-West Cross Drift (commonly referred to as the ECRB Cross-Drift). This analysis builds upon prior Exploratory Studies Facility (ESF) System Safety Analyses and incorporates Topopah Springs (TS) Main Drift fire scenarios and ECRB Cross-Drift fire scenarios. Accident scenarios involving the fires in the Main Drift and the ECRB Cross-Drift were previously evaluated in ''Topopah Springs Main Drift System Safety Analysis'' (CRWMS M&O 1995) and the ''Yucca Mountain Site Characterization Project East-West Drift System Safety Analysis'' (CRWMS M&O 1998). In addition to listing required mitigation/control features, this analysis identifies the potential need for procedures and training as part of defense-in-depth mitigation/control features. The inclusion of this information in the System Safety Analysis (SSA) is intended to assist the organization(s) (e.g., Construction, Environmental Safety and Health, Design) responsible for these aspects of the ECRB Cross-Drift in developing mitigation/control features for fire events, including Emergency Refuge Station(s). This SSA was prepared, in part, in response to Condition/Issue Identification and Reporting/Resolution System (CIRS) item 1966. The SSA is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach is used which incorporates operating experiences and recommendations from vendors, the constructor and the operating contractor. The risk assessment in this analysis characterizes the scenarios associated with fires in terms of relative risk and includes recommendations for mitigating all identified hazards. The priority for recommending and implementing mitigation control features is: (1) Incorporate measures

  17. Causal effects of body mass index on cardiometabolic traits and events: a Mendelian randomization analysis.

    PubMed

    Holmes, Michael V; Lange, Leslie A; Palmer, Tom; Lanktree, Matthew B; North, Kari E; Almoguera, Berta; Buxbaum, Sarah; Chandrupatla, Hareesh R; Elbers, Clara C; Guo, Yiran; Hoogeveen, Ron C; Li, Jin; Li, Yun R; Swerdlow, Daniel I; Cushman, Mary; Price, Tom S; Curtis, Sean P; Fornage, Myriam; Hakonarson, Hakon; Patel, Sanjay R; Redline, Susan; Siscovick, David S; Tsai, Michael Y; Wilson, James G; van der Schouw, Yvonne T; FitzGerald, Garret A; Hingorani, Aroon D; Casas, Juan P; de Bakker, Paul I W; Rich, Stephen S; Schadt, Eric E; Asselbergs, Folkert W; Reiner, Alex P; Keating, Brendan J

    2014-02-06

    Elevated body mass index (BMI) associates with cardiometabolic traits on observational analysis, yet the underlying causal relationships remain unclear. We conducted Mendelian randomization analyses by using a genetic score (GS) comprising 14 BMI-associated SNPs from a recent discovery analysis to investigate the causal role of BMI in cardiometabolic traits and events. We used eight population-based cohorts, including 34,538 European-descent individuals (4,407 type 2 diabetes (T2D), 6,073 coronary heart disease (CHD), and 3,813 stroke cases). A 1 kg/m(2) genetically elevated BMI increased fasting glucose (0.18 mmol/l; 95% confidence interval (CI) = 0.12-0.24), fasting insulin (8.5%; 95% CI = 5.9-11.1), interleukin-6 (7.0%; 95% CI = 4.0-10.1), and systolic blood pressure (0.70 mmHg; 95% CI = 0.24-1.16) and reduced high-density lipoprotein cholesterol (-0.02 mmol/l; 95% CI = -0.03 to -0.01) and low-density lipoprotein cholesterol (LDL-C; -0.04 mmol/l; 95% CI = -0.07 to -0.01). Observational and causal estimates were directionally concordant, except for LDL-C. A 1 kg/m(2) genetically elevated BMI increased the odds of T2D (odds ratio [OR] = 1.27; 95% CI = 1.18-1.36) but did not alter risk of CHD (OR 1.01; 95% CI = 0.94-1.08) or stroke (OR = 1.03; 95% CI = 0.95-1.12). A meta-analysis incorporating published studies reporting 27,465 CHD events in 219,423 individuals yielded a pooled OR of 1.04 (95% CI = 0.97-1.12) per 1 kg/m(2) increase in BMI. In conclusion, we identified causal effects of BMI on several cardiometabolic traits; however, whether BMI causally impacts CHD risk requires further evidence. Copyright © 2014 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  18. Causal Effects of Body Mass Index on Cardiometabolic Traits and Events: A Mendelian Randomization Analysis

    PubMed Central

    Holmes, Michael V.; Lange, Leslie A.; Palmer, Tom; Lanktree, Matthew B.; North, Kari E.; Almoguera, Berta; Buxbaum, Sarah; Chandrupatla, Hareesh R.; Elbers, Clara C.; Guo, Yiran; Hoogeveen, Ron C.; Li, Jin; Li, Yun R.; Swerdlow, Daniel I.; Cushman, Mary; Price, Tom S.; Curtis, Sean P.; Fornage, Myriam; Hakonarson, Hakon; Patel, Sanjay R.; Redline, Susan; Siscovick, David S.; Tsai, Michael Y.; Wilson, James G.; van der Schouw, Yvonne T.; FitzGerald, Garret A.; Hingorani, Aroon D.; Casas, Juan P.; de Bakker, Paul I.W.; Rich, Stephen S.; Schadt, Eric E.; Asselbergs, Folkert W.; Reiner, Alex P.; Keating, Brendan J.

    2014-01-01

    Elevated body mass index (BMI) associates with cardiometabolic traits on observational analysis, yet the underlying causal relationships remain unclear. We conducted Mendelian randomization analyses by using a genetic score (GS) comprising 14 BMI-associated SNPs from a recent discovery analysis to investigate the causal role of BMI in cardiometabolic traits and events. We used eight population-based cohorts, including 34,538 European-descent individuals (4,407 type 2 diabetes (T2D), 6,073 coronary heart disease (CHD), and 3,813 stroke cases). A 1 kg/m2 genetically elevated BMI increased fasting glucose (0.18 mmol/l; 95% confidence interval (CI) = 0.12–0.24), fasting insulin (8.5%; 95% CI = 5.9–11.1), interleukin-6 (7.0%; 95% CI = 4.0–10.1), and systolic blood pressure (0.70 mmHg; 95% CI = 0.24–1.16) and reduced high-density lipoprotein cholesterol (−0.02 mmol/l; 95% CI = −0.03 to −0.01) and low-density lipoprotein cholesterol (LDL-C; −0.04 mmol/l; 95% CI = −0.07 to −0.01). Observational and causal estimates were directionally concordant, except for LDL-C. A 1 kg/m2 genetically elevated BMI increased the odds of T2D (odds ratio [OR] = 1.27; 95% CI = 1.18–1.36) but did not alter risk of CHD (OR 1.01; 95% CI = 0.94–1.08) or stroke (OR = 1.03; 95% CI = 0.95–1.12). A meta-analysis incorporating published studies reporting 27,465 CHD events in 219,423 individuals yielded a pooled OR of 1.04 (95% CI = 0.97–1.12) per 1 kg/m2 increase in BMI. In conclusion, we identified causal effects of BMI on several cardiometabolic traits; however, whether BMI causally impacts CHD risk requires further evidence. PMID:24462370
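
    A minimal sketch of the Wald (ratio) instrumental-variable estimate that underlies genetic-score Mendelian randomization, with a first-order delta-method standard error. The input associations below are placeholders, not estimates from this study; only NumPy is assumed.

```python
import numpy as np

def wald_ratio(beta_gx, se_gx, beta_gy, se_gy):
    """Ratio (Wald) IV estimate of the effect of exposure X on outcome Y
    from instrument-exposure and instrument-outcome associations."""
    est = beta_gy / beta_gx
    # first-order delta-method standard error (ignores the covariance term)
    se = np.sqrt(se_gy**2 / beta_gx**2 + beta_gy**2 * se_gx**2 / beta_gx**4)
    return est, se

# hypothetical genetic-score associations with BMI and with log-odds of disease
est, se = wald_ratio(beta_gx=0.30, se_gx=0.02, beta_gy=0.050, se_gy=0.012)
or_per_unit = np.exp(est)                         # odds ratio per 1 kg/m^2 higher BMI
ci = np.exp([est - 1.96 * se, est + 1.96 * se])
print(f"OR per kg/m^2: {or_per_unit:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```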

  19. Multivariate statistical modelling of compound events via pair-copula constructions: analysis of floods in Ravenna (Italy)

    NASA Astrophysics Data System (ADS)

    Bevacqua, Emanuele; Maraun, Douglas; Hobæk Haff, Ingrid; Widmann, Martin; Vrac, Mathieu

    2017-06-01

    Compound events (CEs) are multivariate extreme events in which the individual contributing variables may not be extreme themselves, but their joint - dependent - occurrence causes an extreme impact. Conventional univariate statistical analysis cannot give accurate information regarding the multivariate nature of these events. We develop a conceptual model, implemented via pair-copula constructions, which allows for the quantification of the risk associated with compound events in present-day and future climate, as well as the uncertainty estimates around such risk. The model includes predictors, which could represent for instance meteorological processes that provide insight into both the involved physical mechanisms and the temporal variability of compound events. Moreover, this model enables multivariate statistical downscaling of compound events. Downscaling is required to extend the compound events' risk assessment to the past or future climate, where climate models either do not simulate realistic values of the local variables driving the events or do not simulate them at all. Based on the developed model, we study compound floods, i.e. joint storm surge and high river runoff, in Ravenna (Italy). To explicitly quantify the risk, we define the impact of compound floods as a function of sea and river levels. We use meteorological predictors to extend the analysis to the past, and get a more robust risk analysis. We quantify the uncertainties of the risk analysis, observing that they are very large due to the shortness of the available data, though this may also be the case in other studies where they have not been estimated. Ignoring the dependence between sea and river levels would result in an underestimation of risk; in particular, the expected return period of the highest compound flood observed increases from about 20 to 32 years when switching from the dependent to the independent case.
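
    The pair-copula construction itself is not reproduced here; the toy Monte Carlo below uses a simple Gaussian copula with assumed Gumbel marginals to show how ignoring the sea-river dependence changes the return period of a joint exceedance, which is the qualitative effect reported above. All distributions, thresholds and the correlation value are invented; SciPy is assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200_000          # simulated "years"
rho = 0.5            # assumed dependence between sea level and river level

# Gaussian copula: correlated uniforms
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)

# assumed annual-maximum marginals (Gumbel), purely illustrative
sea = stats.gumbel_r.ppf(u[:, 0], loc=0.6, scale=0.15)    # metres
river = stats.gumbel_r.ppf(u[:, 1], loc=2.0, scale=0.60)  # metres

# impact threshold: both drivers high in the same year
p_dep = ((sea > 0.9) & (river > 3.0)).mean()

# independence assumption: product of marginal exceedance probabilities
p_ind = (sea > 0.9).mean() * (river > 3.0).mean()

print(f"return period, dependent case: {1 / p_dep:.1f} years")
print(f"return period, independence:   {1 / p_ind:.1f} years")
```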

  20. Antigen-antibody biorecognition events as discriminated by noise analysis of force spectroscopy curves.

    PubMed

    Bizzarri, Anna Rita; Cannistraro, Salvatore

    2014-08-22

    Atomic force spectroscopy is able to extract kinetic and thermodynamic parameters of biomolecular complexes provided that the registered unbinding force curves could be reliably attributed to the rupture of the specific complex interactions. To this aim, a commonly used strategy is based on the analysis of the stretching features of polymeric linkers which are suitably introduced in the biomolecule-substrate immobilization procedure. Alternatively, we present a method to select force curves corresponding to specific biorecognition events, which relies on a careful analysis of the force fluctuations of the biomolecule-functionalized cantilever tip during its approach to the partner molecules immobilized on a substrate. In the low frequency region, a characteristic 1/f^α noise with α equal to one (flickering noise) is found to replace white noise in the cantilever fluctuation power spectrum when, and only when, a specific biorecognition process between the partners occurs. The method, which has been validated on a well-characterized antigen-antibody complex, represents a fast, yet reliable alternative to the use of linkers which may involve additional surface chemistry and reproducibility concerns.
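
    A minimal sketch of the selection criterion described above: estimate the low-frequency power spectrum of the approach-phase fluctuations and check whether its log-log slope is near -1 (1/f^α with α close to one) rather than flat (white noise). The signals are synthetic surrogates, not AFM data; NumPy and SciPy are assumed.

```python
import numpy as np
from scipy import signal

def one_over_f(n, rng):
    """Shape white noise in Fourier space so its power spectrum falls off as 1/f."""
    spec = np.fft.rfft(rng.normal(size=n))
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                        # avoid dividing by zero at DC
    return np.fft.irfft(spec / np.sqrt(f), n)

def low_freq_exponent(x, fs, fmax=50.0):
    """Fit P(f) ~ 1/f**alpha in the low-frequency band and return alpha."""
    f, pxx = signal.welch(x, fs=fs, nperseg=4096)
    band = (f > 0) & (f < fmax)
    slope, _ = np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)
    return -slope

rng = np.random.default_rng(1)
fs, n = 10_000.0, 2**17                          # sampling rate (Hz), number of samples
white = rng.normal(size=n)                       # non-specific approach: white noise
flicker = one_over_f(n, rng)                     # specific binding: flicker-like noise

print(f"alpha (no binding):       {low_freq_exponent(white, fs):.2f}")    # ~0
print(f"alpha (specific binding): {low_freq_exponent(flicker, fs):.2f}")  # ~1
```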

  1. Full genomic analysis of new variant rabbit hemorrhagic disease virus revealed multiple recombination events.

    PubMed

    Lopes, Ana M; Dalton, Kevin P; Magalhães, Maria J; Parra, Francisco; Esteves, Pedro J; Holmes, Edward C; Abrantes, Joana

    2015-06-01

    Rabbit hemorrhagic disease virus (RHDV), a Lagovirus of the family Caliciviridae, causes rabbit hemorrhagic disease (RHD) in the European rabbit (Oryctolagus cuniculus). The disease was first documented in 1984 in China and rapidly spread worldwide. In 2010, a new RHDV variant emerged, tentatively classified as 'RHDVb'. RHDVb is characterized by affecting vaccinated rabbits and those <2 months old, and is genetically distinct (~20 %) from older strains. To determine the evolution of RHDV, including the new variant, we generated 28 full-genome sequences from samples collected between 1994 and 2014. Phylogenetic analysis of the gene encoding the major capsid protein, VP60, indicated that all viruses sampled from 2012 to 2014 were RHDVb. Multiple recombination events were detected in the more recent RHDVb genomes, with a single major breakpoint located in the 5' region of VP60. This breakpoint divides the genome into two regions: one that encodes the non-structural proteins and another that encodes the major and minor structural proteins, VP60 and VP10, respectively. Additional phylogenetic analysis of each region revealed two types of recombinants with distinct genomic backgrounds. Recombinants always include the structural proteins of RHDVb, with non-structural proteins from non-pathogenic lagoviruses or from pathogenic genogroup 1 strains. Our results show that in contrast to the evolutionary history of older RHDV strains, recombination plays an important role in generating diversity in the newly emerged RHDVb. © 2015 The Authors.

  2. BIRD detection and analysis of high-temperature events: first results

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2003-03-01

    The primary mission objective of a new small Bi-spectral InfraRed Detection (BIRD) satellite, which was put in a 570 km circular sun-synchronous orbit on 22 October 2001, is detection and quantitative analysis of high-temperature events (HTE) like fires and volcanoes. A unique feature of the BIRD mid- and thermal infrared channels is a real-time adjustment of their integration time that allows a HTE observation without sensor saturation, preserving a good radiometric resolution of 0.1-0.2 K for pixels at normal temperatures. This makes it possible: (a) to improve false alarm rejection capability and (b) to estimate HTE temperature, area and radiative energy release. Due to a higher spatial resolution, BIRD can detect an order of magnitude smaller HTE than AVHRR and MODIS. The smallest verified fire that was detected in the BIRD data had an area of ~12 m2. The first BIRD HTE detection and analysis results are presented including bush fires in Australia, forest fires in Russia, coal seam fires in China, and a time-varying thermal activity at Etna.

  3. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    SciTech Connect

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.

  4. Transcriptome analysis of alternative splicing events regulated by SRSF10 reveals position-dependent splicing modulation.

    PubMed

    Zhou, Xuexia; Wu, Wenwu; Li, Huang; Cheng, Yuanming; Wei, Ning; Zong, Jie; Feng, Xiaoyan; Xie, Zhiqin; Chen, Dai; Manley, James L; Wang, Hui; Feng, Ying

    2014-04-01

    Splicing factor SRSF10 is known to function as a sequence-specific splicing activator. Here, we used RNA-seq coupled with bioinformatics analysis to identify the extensive splicing network regulated by SRSF10 in chicken cells. We found that SRSF10 promoted both exon inclusion and exclusion. Motif analysis revealed that SRSF10 binding to cassette exons was associated with exon inclusion, whereas the binding of SRSF10 within downstream constitutive exons was associated with exon exclusion. This positional effect was further demonstrated by the mutagenesis of potential SRSF10 binding motifs in two minigene constructs. Functionally, many of SRSF10-verified alternative exons are linked to pathways of stress and apoptosis. Consistent with this observation, cells depleted of SRSF10 expression were far more susceptible to endoplasmic reticulum stress-induced apoptosis than control cells. Importantly, reconstituted SRSF10 in knockout cells recovered wild-type splicing patterns and considerably rescued the stress-related defects. Together, our results provide mechanistic insight into SRSF10-regulated alternative splicing events in vivo and demonstrate that SRSF10 plays a crucial role in cell survival under stress conditions.

  5. 2005 Caribbean mass coral bleaching event: A sea surface temperature empirical orthogonal teleconnection analysis

    NASA Astrophysics Data System (ADS)

    Simonti, Alicia L.; Eastman, J. Ronald

    2010-11-01

    This study examined the effects of climate teleconnections on the massive Caribbean coral bleaching and mortality event of 2005. A relatively new analytical procedure known as empirical orthogonal teleconnection (EOT) analysis, based on a 26 year monthly time series of observed sea surface temperature (SST), was employed. Multiple regression analysis was then utilized to determine the relative teleconnection contributions to SST variability in the southern Caribbean. The results indicate that three independent climate teleconnections had significant impact on southern Caribbean anomalies in SST and that their interaction was a major contributor to the anomalously high temperatures in 2005. The primary and approximately equal contributors were EOT-5 and EOT-2, which correlate most strongly with the tropical North Atlantic (TNA) and Atlantic multidecadal oscillation (AMO) climate indices, respectively. The third, EOT-9, was most strongly related to the Atlantic meridional mode. However, although statistically significant, the magnitude of its contribution to southern Caribbean variability was small. While there is debate over the degree to which the recent AMO pattern represents natural variability or global ocean warming, the results presented here indicate that natural variability played a strong role in the 2005 coral bleaching conditions. They also argue for a redefinition of the geography of TNA variability.

  6. Antigen-antibody biorecognition events as discriminated by noise analysis of force spectroscopy curves

    NASA Astrophysics Data System (ADS)

    Bizzarri, Anna Rita; Cannistraro, Salvatore

    2014-08-01

    Atomic force spectroscopy is able to extract kinetic and thermodynamic parameters of biomolecular complexes provided that the registered unbinding force curves could be reliably attributed to the rupture of the specific complex interactions. To this aim, a commonly used strategy is based on the analysis of the stretching features of polymeric linkers which are suitably introduced in the biomolecule-substrate immobilization procedure. Alternatively, we present a method to select force curves corresponding to specific biorecognition events, which relies on a careful analysis of the force fluctuations of the biomolecule-functionalized cantilever tip during its approach to the partner molecules immobilized on a substrate. In the low frequency region, a characteristic 1/f^α noise with α equal to one (flickering noise) is found to replace white noise in the cantilever fluctuation power spectrum when, and only when, a specific biorecognition process between the partners occurs. The method, which has been validated on a well-characterized antigen-antibody complex, represents a fast, yet reliable alternative to the use of linkers which may involve additional surface chemistry and reproducibility concerns.

  7. Motion based markerless gait analysis using standard events of gait and ensemble Kalman filtering.

    PubMed

    Vishnoi, Nalini; Mitra, Anish; Duric, Zoran; Gerber, Naomi Lynn

    2014-01-01

    We present a novel approach to gait analysis using ensemble Kalman filtering, which permits markerless determination of segmental movement. We use image flow analysis to reliably compute temporal and kinematic measures including the translational velocity of the torso and rotational velocities of the lower leg segments. Detecting the instances where velocity changes direction also determines the standard events of a gait cycle (double-support, toe-off, mid-swing and heel-strike). In order to determine the kinematics of lower limbs, we model the synergies between the lower limb motions (thigh-shank, shank-foot) by building a nonlinear dynamical system using CMU's 3D motion capture database. This information is fed into the ensemble Kalman Filter framework to estimate the unobserved limb (upper leg and foot) motion from the measured lower leg rotational velocity. Our approach does not require calibrated cameras or special markers to capture movement. We have tested our method on different gait sequences collected in the sagittal plane and presented the estimated kinematics overlaid on the original image frames. We have also validated our approach by manually labeling the videos and comparing our results against them.
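
    A generic perturbed-observation ensemble Kalman filter analysis step, sketched to show how an observed lower-leg velocity can update an ensemble that also carries an unobserved limb state through prior correlations. It is not the authors' gait model; the state layout, observation operator and numbers are assumptions, and only NumPy is used.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_std, H, rng):
    """Perturbed-observation ensemble Kalman filter analysis step.

    ensemble    : (n_members, n_state) prior ensemble
    obs         : (n_obs,) observation vector (e.g. measured shank angular velocity)
    obs_err_std : observation error standard deviation
    H           : (n_obs, n_state) linear observation operator
    """
    n = ensemble.shape[0]
    A = ensemble - ensemble.mean(axis=0)               # state anomalies
    Y = A @ H.T                                        # observation-space anomalies
    Pyy = Y.T @ Y / (n - 1) + np.eye(len(obs)) * obs_err_std**2
    Pxy = A.T @ Y / (n - 1)
    K = Pxy @ np.linalg.inv(Pyy)                       # Kalman gain
    obs_pert = obs + rng.normal(0.0, obs_err_std, size=(n, len(obs)))
    return ensemble + (obs_pert - ensemble @ H.T) @ K.T

# toy state: [shank angular velocity, thigh angular velocity]; only the shank is observed,
# so the thigh estimate is corrected through the prior correlation between the two.
rng = np.random.default_rng(1)
prior = rng.multivariate_normal([1.0, 0.5], [[0.09, 0.05], [0.05, 0.09]], size=200)
H = np.array([[1.0, 0.0]])
posterior = enkf_update(prior, obs=np.array([1.4]), obs_err_std=0.1, H=H, rng=rng)
print(prior.mean(axis=0), posterior.mean(axis=0))
```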

  8. Fractal analysis of GPS time series for early detection of disastrous seismic events

    NASA Astrophysics Data System (ADS)

    Filatov, Denis M.; Lyubushin, Alexey A.

    2017-03-01

    A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start by analysing both methods from the physical point of view and demonstrate the difference between them, which results in a higher accuracy of the new method compared to the conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system - the Earth's crust, we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism has already been known due to numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate the efficiency of the developed method in identifying critical states of the Earth's crust. Finally, we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events and provide a detailed discussion of the numerical results, which are shown to be consistent with the outcomes of other research on the topic.
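
    The paper's modified method is not specified in the abstract, so the sketch below implements only the conventional first-order DFA baseline it starts from: integrate the series, detrend in windows of size s, and read the scaling exponent from the slope of log F(s) versus log s. The data are synthetic; NumPy is assumed.

```python
import numpy as np

def dfa(x, scales):
    """Standard first-order detrended fluctuation analysis.

    Returns the fluctuation function F(s) for each window size s; the slope of
    log F(s) vs log s is the scaling exponent (~0.5 white noise, ~1.0 flicker noise).
    """
    y = np.cumsum(x - np.mean(x))          # integrated (profile) series
    F = []
    for s in scales:
        n_win = len(y) // s
        rms = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.asarray(F)

rng = np.random.default_rng(0)
x = rng.normal(size=20_000)                        # e.g. a GPS coordinate increment series
scales = np.unique(np.logspace(1.2, 3, 12).astype(int))
F = dfa(x, scales)
alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
print(f"DFA scaling exponent: {alpha:.2f}")        # ~0.5 for uncorrelated noise
```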

  9. Spatio-Temporal Information Analysis of Event-Related BOLD Responses

    PubMed Central

    Alpert, Galit Fuhrmann; Handwerker, Dan; Sun, Felice T.; D’Esposito, Mark; Knight, Robert T.

    2009-01-01

    A new approach for analysis of event related fMRI (BOLD) signals is proposed. The technique is based on measures from information theory and is used both for spatial localization of task related activity, as well as for extracting temporal information regarding the task dependent propagation of activation across different brain regions. This approach enables whole brain visualization of voxels (areas) most involved in coding of a specific task condition, the time at which they are most informative about the condition, as well as their average amplitude at that preferred time. The approach does not require prior assumptions about the shape of the hemodynamic response function (HRF), nor about linear relations between BOLD response and presented stimuli (or task conditions). We show that relative delays between different brain regions can also be computed without prior knowledge of the experimental design, suggesting a general method that could be applied for analysis of differential time delays that occur during natural, uncontrolled conditions. Here we analyze BOLD signals recorded during performance of a motor learning task. We show that during motor learning, the BOLD response of unimodal motor cortical areas precedes the response in higher-order multimodal association areas, including posterior parietal cortex. Brain areas found to be associated with reduced activity during motor learning, predominantly in prefrontal brain regions, are informative about the task typically at significantly later times. PMID:17188515

  10. ANTARES: The Arizona-NOAO Temporal Analysis and Response to Events System

    NASA Astrophysics Data System (ADS)

    Matheson, T.; Saha, A.; Snodgrass, R.; Kececioglu, J.

    The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. The goal is to build the software infrastructure necessary to process and filter alerts produced by time-domain surveys, with the ultimate source of such alerts being the Large Synoptic Survey Telescope (LSST). ANTARES will add value to alerts by annotating them with information from external sources such as previous surveys from across the electromagnetic spectrum. In addition, the temporal history of annotated alerts will provide further annotation for analysis. These alerts will go through a cascade of filters to select interesting candidates. For the prototype, 'interesting' is defined as the rarest or most unusual alert, but future systems will accommodate multiple filtering goals. The system is designed to be flexible, allowing users to access the stream at multiple points throughout the process, and to insert custom filters where necessary. We will describe the basic architecture of ANTARES and the principles that will guide development and implementation.

  11. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

    Geostationary Coastal and Air pollution Events (GEO-CAPE) is a NASA decadal survey mission to be designed to provide surface reflectance at high spectral, spatial, and temporal resolutions from a geostationary orbit necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to the global and regional NOx emissions and improve the science impact of GEO-CAPE with respect to the global air quality. The GEO-CAPE OSSE team at Jet propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from the GEO-CAPE OSSE framework for seven observation scenarios and three instrument systems.

  12. Risk of venous and arterial thromboembolic events associated with VEGFR-TKIs: a meta-analysis.

    PubMed

    Liu, Bo; Ding, Fengxia; Zhang, Deying; Wei, Guang-Hui

    2017-07-10

    The reported incidence of arterial and venous thromboembolic events varies markedly between VEGFR-TKI-related clinical trials. Here, we performed a meta-analysis to determine the incidence and the relative risk (RR) of venous thromboembolism events (VTEs) and arterial thromboembolic events (ATEs) associated with these agents. Databases (PubMed, Web of Science) were searched for relevant studies. Statistical analyses were conducted to calculate the summary incidences, RRs and 95% confidence intervals (CIs) using either random-effects or fixed-effects models according to the heterogeneity of the included studies. A total of 24,855 patients from 48 studies were included. The overall incidence of all-grade and high-grade VTEs associated with VEGFR-TKIs was 3.6% (95% CI 2.3-5.2%) and 1.6% (95% CI 1.0-2.4%), respectively. The use of VEGFR-TKIs did not significantly increase the risk of developing all-grade (RR 0.91; 95% CI 0.68-1.22; P = 0.558) and high-grade (RR 1.05; 95% CI 0.84-1.31; P = 0.769) VTEs. The overall incidence of all-grade and high-grade ATEs associated with VEGFR-TKIs was 2.7% (95% CI 1.7-3.6%) and 0.6% (95% CI 0.2-1.2%), respectively. The use of VEGFR-TKIs significantly increased the risk of developing all-grade (RR 3.09; 95% CI 1.41-6.76; P = 0.033) ATEs, and a tendency to increase the risk of high-grade (RR 1.49; 95% CI 0.99-2.24; P = 0.101) ATEs was also detected. Patients with cancer who receive VEGFR-TKIs are at high risk of developing ATEs. Physicians should be aware of these adverse effects and should monitor cancer patients receiving VEGFR-TKIs.

  13. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
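
    A small sketch of the piece-wise constant hazard building block used above: how bin-wise rates translate into cumulative hazard and survival, and how observed, right-censored and interval-censored records would enter a likelihood. The cut points and rates are hypothetical, and this is not the authors' integrated Bayesian model; NumPy is assumed.

```python
import numpy as np

# hypothetical age bins (days) and mortality hazard rates per day for chicks
cuts = np.array([0.0, 5.0, 15.0, 30.0])      # bin edges
rates = np.array([0.08, 0.03, 0.01])         # hazard within each bin

def cumulative_hazard(t, cuts, rates):
    """Integrated hazard H(t) for a piece-wise constant hazard function."""
    exposure = np.clip(np.minimum(t, cuts[1:]) - cuts[:-1], 0.0, None)
    return np.sum(rates * exposure)

def survival(t, cuts, rates):
    return np.exp(-cumulative_hazard(t, cuts, rates))

# survival probability to 10 and 30 days of age
print(survival(10.0, cuts, rates), survival(30.0, cuts, rates))

# log-likelihood contributions:
#   death observed at time t    -> log h(t) - H(t)
#   right-censored at time t    -> -H(t)
#   death known only in (t1,t2] -> log(S(t1) - S(t2))   (interval-censored record)
t = 12.0
h_t = rates[np.searchsorted(cuts, t, side="right") - 1]
loglik_death = np.log(h_t) - cumulative_hazard(t, cuts, rates)
print(loglik_death)
```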

  14. Analysis of an extremely dense regional fog event in Eastern China using a mesoscale model

    NASA Astrophysics Data System (ADS)

    Shi, Chune; Yang, Jun; Qiu, Mingyan; Zhang, Hao; Zhang, Su; Li, Zihua

    2010-03-01

    An unusually dense regional advection-radiation fog event over Anhui and the surrounding provinces in eastern China during Dec. 25-27, 2006, was investigated. At its mature stage, the fog covered most of Anhui and parts of the surrounding provinces, reducing visibility to 100 m or less. It lasted more than 36 consecutive hours in some places. A mesoscale meteorological model (MM5), together with back-trajectory analysis, was used to investigate this fog event. The observations from a field station as well as hundreds of routine stations, along with two sets of visibility computing methods, were used to quantitatively and objectively validate the MM5 simulated liquid water content (LWC) and visibility. The verifications demonstrate that MM5 has better fog predictability for the first-day forecast than for the second-day forecast, and better predictability for fog than for dense fog, with regard to the probability of detection (POD) and the threat score (TS). The new visibility algorithm, which uses both LWC and the number density of fog droplets, significantly outperforms the conventional LWC-only one in fog prediction in terms of the POD score, especially for dense fog prediction. The objective verification in this work is the first conducted for MM5 fog prediction, and it allows a better understanding of the performance of the simulated temporal and spatial fog coverage. The back-trajectory and sensitivity experiments confirm that subsidence and the steady warm and moist advections from the southeast and southwest maintained the dense fog, while the northwesterly dry wind resulted in its dissipation.
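
    Two commonly cited fog-visibility parameterizations of the kind contrasted above: an LWC-only relation (Kunkel-type) and one that uses both LWC and droplet number concentration (Gultepe-type). The coefficients vary between sources and are given here only for illustration, not as the values used in this study.

```python
def vis_lwc_only(lwc_gm3):
    """Visibility (km) from liquid water content only (Kunkel-type relation).

    Assumes extinction beta = 144.7 * LWC**0.88 [1/km] and a 2% contrast
    threshold, visibility = 3.912 / beta; coefficients are illustrative.
    """
    beta = 144.7 * lwc_gm3**0.88
    return 3.912 / beta

def vis_lwc_nd(lwc_gm3, nd_cm3):
    """Visibility (km) using both LWC and droplet number concentration
    (Gultepe-type relation); coefficients again only indicative."""
    return 1.002 / (lwc_gm3 * nd_cm3) ** 0.6473

lwc = 0.3                              # g m-3, a dense-fog value
print(vis_lwc_only(lwc))               # LWC-only estimate
print(vis_lwc_nd(lwc, nd_cm3=100.0))   # estimate accounting for droplet number
```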

  15. Integrated survival analysis using an event-time approach in a Bayesian framework

    PubMed Central

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  16. Rapid depressurization event analysis in BWR/6 using RELAP5 and CONTAIN

    SciTech Connect

    Mueftueoglu, A.K.; Feltus, M.A.

    1995-09-01

    Noncondensable gases may become dissolved in Boiling Water Reactor (BWR) water level instrumentation during normal operations. Any dissolved noncondensable gases inside these water columns may come out of solution during rapid depressurization events, and displace water from the reference leg piping, resulting in a false high level. These water level errors may cause a delay or failure in actuation, or premature shutdown, of the Emergency Core Cooling System (ECCS). If a rapid depressurization causes an erroneously high water level, preventing automatic ECCS actuation, it becomes important to determine if there would be other adequate indications for operator response and other signals for automatic actuation such as high drywell pressure. It is also important to determine the effect of the level signal on ECCS operation after it has been actuated. The objective of this study is to determine the detailed coupled containment/NSSS response during these rapid depressurization events in a BWR/6. The selected scenarios involve: (a) inadvertent opening of all ADS valves, (b) design basis (DB) large break loss of coolant accident (LOCA), and (c) main steam line break (MSLB). The transient behaviors are evaluated in terms of: (a) vessel pressure and collapsed water level response, (b) specific transient boundary conditions (e.g., scram, MSIV closure timing, feedwater flow, and break blowdown rates), (c) ECCS initiation timing, (d) impact of operator actions, (e) whether indications besides low-low water level were available. The results of the analysis showed that there would be signals to actuate the ECCS other than low reactor level, such as high drywell pressure, low vessel pressure, and high suppression pool temperature, and that the plant operators would have significant indications to actuate the ECCS.

  17. Competing events and costs of clinical trials: Analysis of a randomized trial in prostate cancer.

    PubMed

    Zakeri, Kaveh; Rose, Brent S; D'Amico, Anthony V; Jeong, Jong-Hyeon; Mell, Loren K

    2015-04-01

    Clinical trial costs may be reduced by identifying enriched subpopulations of patients with favorable risk profiles for the events of interest. However, increased selectivity affects accrual rates, with uncertain impact on clinical trial cost. We conducted a secondary analysis of Southwest Oncology Group (SWOG) 8794 randomized trial of adjuvant radiotherapy for high-risk prostate cancer. The primary endpoint was metastasis-free survival (MFS), defined as time to metastasis or death from any cause (competing mortality). We used competing risks regression models to identify an enriched subgroup at high risk for metastasis and low risk for competing mortality. We applied a cost model to estimate the impact of enrichment on trial cost and duration. The treatment effect on metastasis was similar in the enriched subgroup (HR, 0.42; 95% CI, 0.23-0.76) compared to the whole cohort (HR, 0.50; 95% CI, 0.30-0.81) while the effect on competing mortality was not significant in the subgroup or the whole cohort (HR 0.70; 95% CI 0.39-1.23, vs. HR 0.94; 95% CI, 0.68-1.31). Due to the higher incidence of metastasis relative to competing mortality in the enriched subgroup, the treatment effect on MFS was greater in the subgroup compared to the whole cohort (HR 0.55; 95% CI 0.36-0.82, vs. HR 0.77; 95% CI, 0.58-1.01). Trial cost was 75% less in the subgroup compared to the whole cohort ($1.7 million vs. $6.8 million), and the trial duration was 30% shorter (8.4 vs. 12.0 years). Competing event enrichment can reduce clinical trial cost and duration, without sacrificing generalizability. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
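
    A hedged sketch of the nonparametric cumulative incidence (Aalen-Johansen type) estimator that underlies competing risks analysis of metastasis in the presence of competing mortality. The follow-up data below are toy values, not SWOG 8794 data, and tied event times are ignored; NumPy is assumed.

```python
import numpy as np

def cumulative_incidence(times, events, cause):
    """Nonparametric cumulative incidence function for one cause.

    times  : event/censoring times
    events : 0 = censored, 1 = metastasis, 2 = competing mortality
    cause  : which event code to compute the CIF for
    """
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    surv = 1.0          # all-cause Kaplan-Meier survival just before the current time
    cif = 0.0
    out = []
    for t, e in zip(times, events):
        if e == cause:
            cif += surv * (1.0 / n_at_risk)
        if e != 0:
            surv *= 1.0 - 1.0 / n_at_risk
        n_at_risk -= 1
        out.append((t, cif))
    return np.array(out)

# toy follow-up (years): 0 = censored, 1 = metastasis, 2 = death without metastasis
t = np.array([1.2, 2.5, 3.0, 4.1, 4.8, 5.5, 6.0, 7.2, 8.0, 9.5])
e = np.array([1,   0,   2,   1,   0,   2,   1,   0,   1,   0])
print(cumulative_incidence(t, e, cause=1)[-1])   # CIF of metastasis at the last time
```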

  18. Analysis and high-resolution modeling of a dense sea fog event over the Yellow Sea

    NASA Astrophysics Data System (ADS)

    Fu, Gang; Guo, Jingtian; Xie, Shang-Ping; Duan, Yihong; Zhang, Meigen

    2006-10-01

    A ubiquitous feature of the Yellow Sea (YS) is the frequent occurrence of sea fog in the spring and summer seasons. An extremely dense sea fog event was observed around the Shandong Peninsula on the morning of 11 April 2004. This fog patch, which had a spatial scale of several hundred kilometers and lasted about 20 h, reduced the horizontal visibility to less than 20 m in some locations, and caused a series of traffic collisions and 12 injuries on the coastal stretch of a major highway. In this paper, almost all available observational data, including Geostationary Operational Environmental Satellite (GOES)-9 visible satellite imagery, objectively reanalyzed data of the final run analysis (FNL) issued by the National Centers for Environmental Prediction (NCEP) and the sounding data of Qingdao and Dalian, as well as the latest version (4.4) of the Regional Atmospheric Modeling System (RAMS) model, were employed to investigate this sea fog case. Its evolutionary process and the environmental conditions that led to the fog formation were examined by using GOES-9 visible satellite imagery and sounding observations. In order to better understand the fog formation mechanism, a high-resolution RAMS simulation with 4 km × 4 km grid spacing was designed. The simulation was initialized and validated with FNL data. A 30-h simulation starting from 18 UTC 10 April 2004 reproduced the main characteristics of this fog event. The simulated area of low horizontal visibility agreed reasonably well with the sea fog region identified from the satellite imagery. The advection cooling effect seemed to play a significant role in the fog formation.

  19. Effects of Tube Rupture Modeling and Parameters on Analysis of MSGTR Event Progression in PWR

    SciTech Connect

    Jeong, Ji Hwan; Choi, Ki Yong; Chang, Keun Sun; Kweon, Young Chel

    2002-07-01

    A multiple steam generator tube rupture (MSGTR) event in APR1400 has been investigated using the best estimate thermal hydraulic system code, MARS1.4. The effects of parameters such as the number of ruptured tubes, the rupture location, and the affected steam generator on the analysis of the MSGTR event in APR1400 are examined. In particular, tube rupture modeling methods, single tube modeling (STM) and double tube modeling (DTM), are compared. When five tubes are ruptured, the STM predicts an operator response time of 2085 seconds before main steam safety valves (MSSVs) are lifted. The effect of rupture location on the MSSV lift time is not significant in the case of STM, but the MSSV lift time for a tube-top rupture is found to be 25.3% larger than that for a rupture at the hot-leg side tube sheet in the case of DTM. The MSSV lift times for the cases in which both steam generators are affected (4C5x, 4C23x) are found to be larger than those of the single steam generator cases (4A5x, 4B5x) due to a bifurcation of the primary leak flow. The discharge coefficient Cd is found to affect the MSSV lift time only for the smaller value of 0.5. It is found that the dominant parameter governing the MSSV lift time is the leak flow rate. Whichever modeling method is used, it gives a similar MSSV lift time if the leak flow rate is similar, except for the case in which both steam generators are affected. Therefore, the system performance and the MSSV lift time of the APR1400 are strongly dependent on the break flow model used in the best estimate system code. (authors)

  20. 'HESPERIA' HORIZON 2020 project: High Energy Solar Particle Events foRecastIng and Analysis

    NASA Astrophysics Data System (ADS)

    Malandraki, Olga; Klein, Karl-Ludwig; Vainio, Rami; Agueda, Neus; Nunez, Marlon; Heber, Bernd; Buetikofer, Rolf; Sarlanis, Christos; Crosby, Norma; Bindi, Veronica; Murphy, Ronald; Tyka, Allan J.; Rodriguez, Juan

    2016-04-01

    Solar energetic particles (SEPs) are of prime interest for fundamental astrophysics. However, due to their high energies they are a space weather concern for technology in space as well as for human space exploration, calling for reliable tools with predictive capabilities. The two-year EU HORIZON 2020 project HESPERIA (High Energy Solar Particle Events foRecastIng and Analysis, http://www.hesperia-space.eu/) will produce two novel operational SEP forecasting tools based upon proven concepts (UMASEP, REleASE). At the same time the project will advance our understanding of the physical mechanisms that result in high-energy SEP events through the systematic exploitation of the high-energy gamma-ray observations of the FERMI mission and other novel published datasets (PAMELA, AMS), together with in situ SEP measurements near 1 AU. By using multi-frequency observations and performing simulations, the project will address the chain of processes from particle acceleration in the corona, through particle transport in the magnetically complex corona and interplanetary space, to their detection near 1 AU. Furthermore, HESPERIA will explore the possibility of incorporating the derived results into future innovative space weather services. Publicly available software to invert neutron monitor observations of relativistic SEPs to physical parameters, giving information on the high-energy processes occurring at or near the Sun during solar eruptions, will be provided for the first time. The results of this inversion software will complement the space-borne measurements at adjacent higher energies. In order to achieve these goals, HESPERIA will exploit already existing large datasets that are stored in databases built under the EU FP7 projects NMDB and SEPServer. The structure of the HESPERIA project, its main objectives and forecasting operational tools, as well as the added value to SEP research will be presented and discussed. Acknowledgement: This project has received funding from the

  1. Metamizole-Associated Adverse Events: A Systematic Review and Meta-Analysis

    PubMed Central

    Fässler, Margrit; Blozik, Eva; Linde, Klaus; Jüni, Peter; Reichenbach, Stephan; Scherer, Martin

    2015-01-01

    Background Metamizole is used to treat pain in many parts of the world. Information on the safety profile of metamizole is scarce; no conclusive summary of the literature exists. Objective To determine whether metamizole is clinically safe compared to placebo and other analgesics. Methods We searched CENTRAL, MEDLINE, EMBASE, CINAHL, and several clinical trial registries. We screened the reference lists of included trials and previous systematic reviews. We included randomized controlled trials that compared the effects of metamizole, administered to adults in any form and for any indication, to other analgesics or to placebo. Two authors extracted data regarding trial design and size, indications for pain medication, patient characteristics, treatment regimens, and methodological characteristics. Adverse events (AEs), serious adverse events (SAEs), and dropouts were assessed. We conducted separate meta-analyses for each metamizole comparator, using standard inverse-variance random effects meta-analysis to pool the estimates across trials, reported as risk ratios (RRs). We calculated the DerSimonian and Laird variance estimate τ² to measure heterogeneity between trials. The pre-specified primary end point was any AE during the trial period. Results Of the 696 potentially eligible trials, 79 trials including almost 4000 patients with short-term metamizole use of less than two weeks met our inclusion criteria. Fewer AEs were reported for metamizole compared to opioids, RR = 0.79 (confidence interval 0.79 to 0.96). We found no differences between metamizole and placebo, paracetamol and NSAIDs. Only a few SAEs were reported, with no difference between metamizole and other analgesics. No agranulocytosis or deaths were reported. Our results were limited by the mediocre overall quality of the reports. Conclusion For short-term use in the hospital setting, metamizole seems to be a safe choice when compared to other widely used analgesics. High-quality, adequately sized
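
    A minimal sketch of the inverse-variance random-effects pooling with the DerSimonian-Laird between-trial variance τ² mentioned above, applied to risk ratios from toy 2 × 2 counts (not the review's data); NumPy is assumed.

```python
import numpy as np

def dersimonian_laird(log_rr, var):
    """Pool log risk ratios with DerSimonian-Laird random effects."""
    w = 1.0 / var                                 # fixed-effect weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)         # Cochran's Q
    df = len(log_rr) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-trial variance
    w_re = 1.0 / (var + tau2)
    pooled = np.sum(w_re * log_rr) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se), tau2

# toy trials: (events_metamizole, n_metamizole, events_comparator, n_comparator)
trials = np.array([[12, 100, 18, 100], [5, 60, 9, 58], [20, 150, 24, 145]])
a, n1, c, n2 = trials.T
log_rr = np.log((a / n1) / (c / n2))
var = 1.0 / a - 1.0 / n1 + 1.0 / c - 1.0 / n2     # variance of each log risk ratio
print(dersimonian_laird(log_rr, var))             # pooled RR, 95% CI, tau^2
```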

  2. Momentum Budget Analysis of Westerly Wind Events Associated with the Madden-Julian Oscillation during DYNAMO

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Oh, J. H.; Waliser, D. E.; Moncrieff, M. W.; Johnson, R. H.; Ciesielski, P. E.

    2015-12-01

    The Dynamics of the Madden-Julian Oscillation (DYNAMO) field campaign was conducted over the Indian Ocean (IO) from October 2011 to February 2012 to investigate the initiation of the Madden-Julian Oscillation (MJO). Three MJOs accompanying one or more westerly wind events (WWEs) occurred in late October, late November, and late December 2011, respectively. Momentum budget analysis is conducted in this study to understand the contributions of the dynamical processes involved in the wind evolution associated with the MJO active phases over the IO during DYNAMO using European Centre for Medium-Range Weather Forecasts (ECMWF) analysis. This analysis shows that westerly acceleration at lower levels associated with the MJO active phase generally appears to be maintained by the pressure gradient force (PGF), which is partly canceled by meridional advection of the zonal wind. Westerly acceleration in the mid-troposphere is mostly attributable to vertical advection. In addition, the MJO in late November (MJO2), accompanied by two different WWEs (WWE1, WWE2) spaced a few days apart, is further diagnosed. Unlike other WWEs during DYNAMO, horizontal advection is more responsible for the westerly acceleration in the lower troposphere for the WWE2 than the PGF. Interactions between the MJO2 convective envelope and convectively coupled waves (CCWs) have been further analyzed to illuminate the dynamical contribution of these synoptic scale equatorial waves to the WWEs during MJO2. We suggest that differences in the developing processes among WWEs can be attributed to the different types of CCWs.
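
    A minimal sketch of how the zonal momentum budget terms discussed above (pressure gradient force, horizontal and vertical advection) can be estimated by finite differences on gridded analysis fields. The grid, fields and constant density are invented placeholders, not the ECMWF analysis workflow; NumPy is assumed.

```python
import numpy as np

def zonal_momentum_terms(u, v, w, p, rho, dx, dy, dz):
    """Estimate zonal momentum budget terms on a (z, y, x) grid.

    du/dt = -u du/dx - v du/dy - w du/dz - (1/rho) dp/dx + residual
    """
    du_dz, du_dy, du_dx = np.gradient(u, dz, dy, dx)
    dp_dx = np.gradient(p, dx, axis=2)
    adv_h = -(u * du_dx + v * du_dy)      # horizontal advection
    adv_v = -w * du_dz                    # vertical advection
    pgf = -dp_dx / rho                    # pressure gradient force
    return adv_h, adv_v, pgf

# toy fields on a small (z, y, x) grid, SI units
shape = (10, 20, 30)
rng = np.random.default_rng(0)
u = rng.normal(5.0, 2.0, shape)
v = rng.normal(0.0, 2.0, shape)
w = rng.normal(0.0, 0.02, shape)
p = rng.normal(9.0e4, 50.0, shape)
adv_h, adv_v, pgf = zonal_momentum_terms(u, v, w, p, rho=1.0, dx=25e3, dy=25e3, dz=500.0)
print(pgf.mean(), adv_h.mean())
```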

  3. Retrospective analysis of never events in panniculectomy and abdominoplasty patients and their financial implications.

    PubMed

    Champaneria, Manish C; Workman, Adrienne D; Pham, Anh Tuan; Adetayo, Oluwaseun A; Gupta, Subhas C

    2014-10-01

    In 2008, the Centers for Medicare and Medicaid Service adapted a list from the National Quality Forum consisting of 10 hospital-acquired conditions, also known as never events. Deeming such events as preventable in a safe-hospital setting, reimbursement is no longer provided for treatments arising secondary to these events. A retrospective chart review identified 90 panniculectomy and abdominoplasty patients. The hospital-acquired conditions examined include surgical-site infections (SSI), vascular-catheter associated infections, deep venous thrombosis/pulmonary embolism, retained foreign body, catheter-related urinary tract infection, manifestations of poor glycemic control, falls and trauma, air embolism, pressure ulcers (stages III and IV), and blood incompatibility. Information regarding age, American Society of Anesthesiologists (ASA) classification, body mass index, smoking, and chemotherapy were collected. Patients were divided into 2 groups, namely, those who developed never events and those with no events. Of the 90 patients, 14 (15.5%) developed never events because of SSI. No events occurred in the remaining 9 categories. Statistically significant risk factors included American Society of Anesthesiologists classification, age, and diabetes mellitus. The most common never event was SSI. In light of the obvious prevalence of the risk factors in patients who develop these events, the question of whether never events are truly unavoidable arises. Despite this, awareness of the impact on patient care, health care and hospital reimbursement is vital to understanding the new paradigm of the "one size fits all."

  4. Conceptualizing the impact of special events on community health service levels: an operational analysis.

    PubMed

    Lund, Adam; Turris, Sheila A; Bowles, Ron

    2014-10-01

    Mass gatherings (MG) impact their host and surrounding communities and with inadequate planning, may impair baseline emergency health services. Mass gatherings do not occur in a vacuum; they have both consumptive and disruptive effects that extend beyond the event itself. Mass gatherings occur in real geographic locations that include not only the event site, but also the surrounding neighborhoods and communities. In addition, the impact of small, medium, or large special events may be felt for days, or even months, prior to and following the actual events. Current MG reports tend to focus on the events themselves during published event dates and may underestimate the full impact of a given MG on its host community. In order to account for, and mitigate, the full effects of MGs on community health services, researchers would benefit from a common model of community impact. Using an operations lens, two concepts are presented, the "vortex" and the "ripple," as metaphors and a theoretical model for exploring the broader impact of MGs on host communities. Special events and MGs impact host communities by drawing upon resources (vortex) and by disrupting normal, baseline services (ripple). These effects are felt with diminishing impact as one moves geographically further from the event center, and can be felt before, during, and after the event dates. Well executed medical and safety plans for events with appropriate, comprehensive risk assessments and stakeholder engagement have the best chance of ameliorating the potential negative impact of MGs on communities.

  5. Analysis of the observed and forecast rainfall intensity structure in a precipitation event

    NASA Astrophysics Data System (ADS)

    Bech, Joan; Molinié, Gilles; Karakasidis, Theodoros; Anquentin, Sandrine; Creutin, Jean Dominique; Pinty, Jean-Pierre; Escobar, Juan

    2014-05-01

    During the last decades, a number of studies have been devoted to examining the temporal and spatial structure of the precipitation field, given the fact that rainfall exhibits large variability at all scales (see for example Ceresetti et al. 2011, 2012). The objective of this study is to examine the rainfall field structure at high temporal (15 minute) and spatial (1 km) resolution. We focus on rainfall properties such as intermittency, using the auto-correlation of precipitation time series to assess whether it can be modelled assuming fractal behaviour across different scales. Building on the results and methodology of previous studies applied to observational precipitation data such as raingauge, weather radar and disdrometer observations (see for example Molinié et al., 2011, 2013), in this case we employ high-resolution numerical forecast data. In particular, our approach uses a transitive covariogram, given the limited number of samples available in single precipitation events. Precipitation forecasts are derived at 15 minute intervals from 1-km grid length nested simulations of the non-hydrostatic mesoscale atmospheric model of the French research community Meso-NH, using AROME-WestMed model data as initial and boundary conditions. The analysis also considers existing data available in the HyMeX (HYdrological cycle in the Mediterranean EXperiment) database. Results are presented of a precipitation event that took place in the Rhône Valley (France) in November 2011. This case allows us to study, with the proposed methodology, the effect of a number of factors (different orography along the Rhône Valley, turbulence, microphysical processes, etc.) on the observed and simulated precipitation fields. References: Ceresetti D., E. Ursu, J. Carreau, S. Anquetin, J. D. Creutin, L. Gardes, S. Girard, and G. Molinié, 2012: Evaluation of classical spatial-analysis schemes of extreme rainfall. Natural Hazards and Earth System Sciences, 12, 3229-3240, http
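
    A small sketch of the empirical autocorrelation estimator that the intermittency/scaling analysis described above relies on, applied to a synthetic intermittent 15-minute rainfall series; it is not the transitive-covariogram analysis of the study, and NumPy is assumed.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Biased empirical autocorrelation of a 1-D series up to max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

# synthetic intermittent 15-min rainfall intensity series (mm/h)
rng = np.random.default_rng(2)
wet = rng.random(4000) < 0.15                      # intermittency: ~15% wet steps
rain = np.where(wet, rng.gamma(2.0, 2.0, 4000), 0.0)

acf = autocorrelation(rain, max_lag=96)            # lags up to one day (96 x 15 min)
lags_h = np.arange(97) * 0.25                      # lags in hours
print(lags_h[:5], acf[:5].round(3))
# a log-log fit of acf versus lag over its decaying range would test for
# power-law (fractal-like) behaviour at the scales of interest
```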

  6. Teleradiology system analysis using a discrete event-driven block-oriented network simulator

    NASA Astrophysics Data System (ADS)

    Stewart, Brent K.; Dwyer, Samuel J., III

    1992-07-01

    Performance evaluation and trade-off analysis are the central issues in the design of communication networks. Simulation plays an important role in computer-aided design and analysis of communication networks and related systems, allowing testing of numerous architectural configurations and fault scenarios. We are using the Block Oriented Network Simulator (BONeS, Comdisco, Foster City, CA) software package to perform discrete, event-driven Monte Carlo simulations in capacity planning, tradeoff analysis and evaluation of alternate architectures for a high-speed, high-resolution teleradiology project. A queuing network model of the teleradiology system has been devised, simulations executed, and the results analyzed. The wide area network link uses a switched, dial-up N X 56 kbps inverting multiplexer where the number of digital voice-grade lines (N) can vary from one (DS-0) through 24 (DS-1). The proposed goal of such a system is 200 films (2048 X 2048 X 12-bit) transferred between a remote and local site in an eight-hour period with a mean delay time of less than five minutes. It is found that: (1) the DS-1 service limit is around 100 films per eight-hour period with a mean delay time of 412 +/- 39 seconds, short of the goal stipulated above; (2) compressed video teleconferencing can be run simultaneously with image data transfer over the DS-1 wide area network link without impacting the performance of the described teleradiology system; (3) there is little sense in upgrading to a higher bandwidth WAN link like DS-2 or DS-3 for the current system; and (4) the goal of transmitting 200 films in an eight-hour period with a mean delay time of less than five minutes can be achieved simply if the laser printer interface is updated from the current DR-11W interface to a much faster SCSI interface.
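
    The BONeS model is not reproduced here; the sketch below is a plain event-driven simulation of the WAN link alone as a single FIFO server, estimating the mean transfer delay of 2048 X 2048 X 12-bit films over an N x 56 kbps link. The Poisson arrival pattern, the overhead factor and the film count are assumptions, so the numbers will not match the study's 412 +/- 39 s figure.

```python
import random

def simulate(films=200, hours=8.0, n_lines=24, overhead=1.0):
    """Event-driven simulation of film transfers over an N x 56 kbps FIFO link."""
    random.seed(0)
    bits_per_film = 2048 * 2048 * 12 * overhead    # overhead: assumed protocol factor
    rate_bps = n_lines * 56_000
    service = bits_per_film / rate_bps             # transmission time per film (s)
    mean_interarrival = hours * 3600.0 / films

    # Poisson arrivals over the working period (an assumption, not from the study)
    arrivals, t = [], 0.0
    for _ in range(films):
        t += random.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    delays, free_at = [], 0.0                      # free_at: time the link becomes idle
    for arr in arrivals:                           # single FIFO server
        start = max(arr, free_at)
        free_at = start + service
        delays.append(free_at - arr)               # queueing + transmission delay
    return sum(delays) / len(delays)

print(f"mean delay, DS-1 (24 x 56 kbps): {simulate(n_lines=24):.0f} s")
print(f"mean delay, 12 lines:            {simulate(n_lines=12):.0f} s")
```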

  7. Adverse Events After Radiofrequency Ablation in Patients With Barrett's Esophagus: A Systematic Review and Meta-analysis.

    PubMed

    Qumseya, Bashar J; Wani, Sachin; Desai, Madhav; Qumseya, Amira; Bain, Paul; Sharma, Prateek; Wolfsen, Herbert

    2016-08-01

    Radiofrequency ablation (RFA) with or without endoscopic mucosal resection (EMR) is routinely used for treatment of Barrett's esophagus with dysplasia. Despite the relative safety of this method, there have been imprecise estimates of the rate of adverse events. We performed a systematic review and meta-analysis to assess the rate of adverse events associated with RFA with and without EMR. We searched MEDLINE, Embase, Web of Science, and Cochrane Central through October 22, 2014. The primary outcome of interest was the overall rate of adverse events after RFA with or without EMR. We used forest plots to contrast effect sizes among studies. Of 1521 articles assessed, 37 met our inclusion criteria (comprising 9200 patients). The pooled rate of all adverse events from RFA with or without EMR was 8.8% (95% confidence interval [CI], 6.5%-11.9%); 5.6% of patients developed strictures (95% CI, 4.2%-7.4%), 1% had bleeding (95% CI, 0.8%-1.3%), and 0.6% developed a perforation (95% CI, 0.4%-0.9%). In studies that compared RFA with vs without EMR, the relative risk for adverse events was significantly higher for RFA with EMR (4.4) (P = .015). There was a trend toward higher proportions of adverse events in prospective studies compared with retrospective studies (11.3% vs 7.8%, P = .20). Other factors associated with adverse events included Barrett's esophagus length and baseline histology. In a systematic review and meta-analysis, we found the relative risk for adverse events from RFA to be about 4-fold higher with EMR than without; we identified factors associated with these events. Endoscopists should discuss these risks with patients before endoscopic eradication therapy. Copyright © 2016. Published by Elsevier Inc.
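
    For readers unfamiliar with how such pooled rates and confidence intervals are obtained, the sketch below shows one generic recipe: random-effects (DerSimonian-Laird) pooling of study-level proportions on the logit scale. It illustrates the general technique, not the authors' exact model, and the study counts are invented.

      # Hedged sketch: pooled adverse-event proportion with a 95% CI from per-study
      # counts, using DerSimonian-Laird random-effects pooling on the logit scale.
      import numpy as np

      def pooled_proportion(events, totals):
          events, totals = np.asarray(events, float), np.asarray(totals, float)
          # Logit transform with a 0.5 continuity correction for zero cells.
          p = (events + 0.5) / (totals + 1.0)
          y = np.log(p / (1 - p))
          v = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)

          w = 1.0 / v                                   # fixed-effect weights
          q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
          tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

          w_re = 1.0 / (v + tau2)                       # random-effects weights
          mu = np.sum(w_re * y) / np.sum(w_re)
          se = np.sqrt(1.0 / np.sum(w_re))
          expit = lambda x: 1.0 / (1.0 + np.exp(-x))
          return expit(mu), expit(mu - 1.96 * se), expit(mu + 1.96 * se)

      est, lo, hi = pooled_proportion([12, 5, 20, 8], [150, 90, 260, 120])  # invented counts
      print(f"pooled rate {est:.1%} (95% CI {lo:.1%}-{hi:.1%})")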

  8. An analysis of high-impact, low-predictive skill severe weather events in the northeast U.S.

    NASA Astrophysics Data System (ADS)

    Vaughan, Matthew T.

    An objective evaluation of Storm Prediction Center slight risk convective outlooks, as well as a method to identify high-impact severe weather events with poor-predictive skill are presented in this study. The objectives are to assess severe weather forecast skill over the northeast U.S. relative to the continental U.S., build a climatology of high-impact, low-predictive skill events between 1980 and 2013, and investigate the dynamic and thermodynamic differences between severe weather events with low-predictive skill and high-predictive skill over the northeast U.S. Severe storm reports of hail, wind, and tornadoes are used to calculate skill scores including probability of detection (POD), false alarm ratio (FAR) and threat scores (TS) for each convective outlook. Low-predictive skill events are binned into low POD (type 1) and high FAR (type 2) categories to assess temporal variability of low-predictive skill events. Type 1 events were found to occur in every year of the dataset with an average of 6 events per year. Type 2 events occur less frequently and are more common in the earlier half of the study period. An event-centered composite analysis is performed on the low-predictive skill database using the National Centers for Environmental Prediction Climate Forecast System Reanalysis 0.5° gridded dataset to analyze the dynamic and thermodynamic conditions prior to high-impact severe weather events with varying predictive skill. Deep-layer vertical shear between 1000 and 500 hPa is found to be a significant discriminator in slight risk forecast skill, where high-impact events with less than 31-kt shear have lower threat scores than high-impact events with higher shear values. Case study analysis of type 1 events suggests the environment over which severe weather occurs is characterized by high downdraft convective available potential energy, steep low-level lapse rates, and high lifting condensation level heights that contribute to an elevated risk of severe wind.
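
    The skill scores named here are simple functions of a 2x2 forecast/observation contingency table; a minimal sketch of their computation is given below, with invented counts purely for illustration.

      # Minimal sketch: POD, FAR, and threat score from contingency-table counts.
      def skill_scores(hits, false_alarms, misses):
          pod = hits / (hits + misses)                   # probability of detection
          far = false_alarms / (hits + false_alarms)     # false alarm ratio
          ts = hits / (hits + false_alarms + misses)     # threat score (critical success index)
          return pod, far, ts

      pod, far, ts = skill_scores(hits=42, false_alarms=28, misses=15)  # invented counts
      print(f"POD={pod:.2f}  FAR={far:.2f}  TS={ts:.2f}")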

  9. The ERP PCA Toolkit: an open source program for advanced statistical analysis of event-related potential data.

    PubMed

    Dien, Joseph

    2010-03-15

    This article presents an open source Matlab program, the ERP PCA (EP) Toolkit, for facilitating the multivariate decomposition and analysis of event-related potential data. This program is intended to supplement existing ERP analysis programs by providing functions for conducting artifact correction, robust averaging, referencing and baseline correction, data editing and visualization, principal components analysis, and robust inferential statistical analysis. This program subserves three major goals: (1) optimizing analysis of noisy data, such as clinical or developmental; (2) facilitating the multivariate decomposition of ERP data into its constituent components; (3) increasing the transparency of analysis operations by providing direct visualization of the corresponding waveforms.
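
    The EP Toolkit itself is a Matlab program; purely to illustrate the core decomposition step it automates, the Python sketch below runs a temporal PCA (via SVD) on synthetic averaged ERP waveforms. The data dimensions, the injected component, and the number of retained components are assumptions, and rotation, artifact correction, and robust statistics are not shown.

      # Schematic sketch: temporal PCA of averaged ERP waveforms
      # (observations = channel/subject averages, variables = time points).
      import numpy as np

      rng = np.random.default_rng(1)
      n_obs, n_time = 64, 300                       # assumed dimensions
      erp = rng.normal(0, 1, (n_obs, n_time))
      erp += np.outer(rng.normal(1, 0.2, n_obs),    # inject a shared "component"
                      np.exp(-0.5 * ((np.arange(n_time) - 150) / 20.0) ** 2))

      x = erp - erp.mean(axis=0)                    # column-centre before PCA
      u, s, vt = np.linalg.svd(x, full_matrices=False)
      explained = s**2 / np.sum(s**2)

      n_keep = 3                                    # retained components (assumption)
      loadings = vt[:n_keep]                        # temporal loadings (component waveforms)
      scores = u[:, :n_keep] * s[:n_keep]           # per-observation component amplitudes
      print("variance explained by first 3 components:", np.round(explained[:3], 3))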

  10. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

    Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted at the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems that cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem to be considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second of each other, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. The importance of resolving the first case lies in correct explosion yield estimation. The second case is a well-known scenario for conducting clandestine nuclear tests. While the first case can be approached to some extent with cepstral methods, the second can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is the blind source separation method we apply to the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European Platform, as well as with signals from strong teleseismic events (Sumatra, April 2012, Mw=8.6, and Tohoku, March 2011, Mw=9.0 earthquakes). The data was recorded by seismic arrays of the
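
    A minimal sketch of the separation idea is given below, using scikit-learn's FastICA on two synthetic overlapping arrivals mixed onto two traces. The waveforms and mixing matrix are invented; real array processing, pre-filtering, and slowness/azimuth handling are outside the scope of the sketch.

      # Minimal sketch: FastICA unmixing of two synthetic overlapping "arrivals".
      import numpy as np
      from sklearn.decomposition import FastICA

      t = np.linspace(0.0, 20.0, 4000)
      s1 = np.exp(-0.3 * np.clip(t - 5.0, 0, None)) * np.sin(2 * np.pi * 2.0 * t) * (t > 5.0)
      s2 = np.exp(-0.2 * np.clip(t - 5.4, 0, None)) * np.sin(2 * np.pi * 3.5 * t) * (t > 5.4)
      sources = np.c_[s1, s2]

      mixing = np.array([[1.0, 0.6],
                         [0.5, 1.0]])                # unknown in practice (invented here)
      traces = sources @ mixing.T                    # two "recorded" traces

      ica = FastICA(n_components=2, random_state=0)
      recovered = ica.fit_transform(traces)          # estimated independent components
      print("estimated mixing matrix:\n", ica.mixing_)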

  11. The analysis of the events of stellar visibility in Pliny's "Natural History"

    NASA Astrophysics Data System (ADS)

    Nickiforov, M. G.

    2016-07-01

    Book XVIII of Pliny's "Natural History" contains about a hundred descriptions of events of stellar visibility, which were used for the needs of the agricultural calendar. The comparison between the calculated date of each event and the date given by Pliny shows that the actual events of stellar visibility occurred systematically about 10 days later than the specified time. This discrepancy cannot be explained by errors of the calendar.

  12. An Internal Evaluation of the National FFA Agricultural Mechanics Career Development Event through Analysis of Individual and Team Scores from 1996-2006

    ERIC Educational Resources Information Center

    Franklin, Edward A.; Armbruster, James

    2012-01-01

    The purpose of this study was to conduct an internal evaluation of the National FFA Agricultural Mechanics Career Development Event (CDE) through analysis of individual and team scores from 1996-2006. Data were analyzed by overall and sub-event area scores for individual contestants and team events. To facilitate the analysis process, scores were…

  13. Prediction of clinical risks by analysis of preclinical and clinical adverse events.

    PubMed

    Clark, Matthew

    2015-04-01

    This study examines the ability of nonclinical adverse event observations to predict the human clinical adverse events observed in drug development programs. In addition, it examines the relationship of nonclinical and clinical adverse event observations to drug withdrawal and proposes a model to predict drug withdrawal based on these observations. These analyses provide risk assessments useful for planning patient safety programs, as well as a statistical framework for assessing the future success of drug programs based on nonclinical and clinical observations. Bayesian analyses were undertaken to investigate the connection between nonclinical adverse event observations and observations of the same event in clinical trials for a large set of approved drugs. We employed the same statistical methods used to evaluate the efficacy of diagnostic tests to evaluate the ability of nonclinical studies to predict adverse events in clinical studies, and of adverse events in both to predict drug withdrawal. We find that some nonclinical observations suggest higher risk for observing the same adverse event in clinical studies, particularly arrhythmias, QT prolongation, and abnormal hepatic function. However, the absence of these events in nonclinical studies is not a good predictor of safety in humans. Some nonclinical and clinical observations appear to be associated with a high risk of drug withdrawal from the market, especially arrhythmia and hepatic necrosis. We use the method to estimate the overall risk of drug withdrawal from the market, using the product of the risks from each nonclinical and clinical observation to create a risk profile. Copyright © 2015 Elsevier Inc. All rights reserved.
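
    The diagnostic-test framing described above can be sketched generically: treat a nonclinical finding as a "test" for the corresponding clinical adverse event, derive likelihood ratios from a 2x2 table, and update a baseline risk with Bayes' rule. The sketch below uses invented counts and an assumed base rate; it illustrates the general approach, not the study's fitted model.

      # Hedged sketch: likelihood ratios from a 2x2 table and Bayesian risk update.
      def likelihood_ratios(tp, fp, fn, tn):
          sens = tp / (tp + fn)                         # sensitivity of the "test"
          spec = tn / (tn + fp)                         # specificity of the "test"
          return sens / (1 - spec), (1 - sens) / spec   # LR+, LR-

      def posterior_risk(prior, lr):
          odds = prior / (1 - prior) * lr               # prior odds x likelihood ratio
          return odds / (1 + odds)

      lr_pos, lr_neg = likelihood_ratios(tp=30, fp=40, fn=20, tn=310)  # invented counts
      baseline = 0.05                                   # assumed base rate of the event
      print(f"risk if seen preclinically:   {posterior_risk(baseline, lr_pos):.1%}")
      print(f"risk if absent preclinically: {posterior_risk(baseline, lr_neg):.1%}")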

  14. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    NASA Astrophysics Data System (ADS)

    Lilly, Jonathan M.

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized `events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's `region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
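
    A simplified, self-contained sketch of the transform step behind element analysis is given below: a generalized Morse wavelet (parameters beta, gamma) applied in the frequency domain, followed by detection of the largest transform-magnitude maximum. The signal, parameter choices, and normalisation are assumptions; the significance test against power-law noise and the region-of-influence criterion described above are not reproduced.

      # Simplified sketch: generalized Morse wavelet transform and maxima detection.
      import numpy as np

      def morse_wavelet(omega, beta=2.0, gamma=3.0):
          # Frequency-domain generalized Morse wavelet; peak value normalised to 2.
          om = np.maximum(omega, 0.0)                   # analytic wavelet: zero for omega <= 0
          a = 2.0 * (np.e * gamma / beta) ** (beta / gamma)
          return a * om ** beta * np.exp(-om ** gamma)

      def morse_transform(x, scales, beta=2.0, gamma=3.0):
          n = len(x)
          omega = 2 * np.pi * np.fft.fftfreq(n)         # radian frequency per sample
          xf = np.fft.fft(x)
          return np.array([np.fft.ifft(xf * np.conj(morse_wavelet(s * omega, beta, gamma)))
                           for s in scales])

      # Synthetic signal: two time-localized "events" in noise (illustrative only).
      rng = np.random.default_rng(2)
      t = np.arange(2048)
      x = rng.normal(0, 0.3, t.size)
      for c, w in ((500, 20.0), (1400, 60.0)):
          x += np.exp(-0.5 * ((t - c) / w) ** 2) * np.cos(2 * np.pi * (t - c) / (4 * w))

      scales = np.geomspace(5, 200, 40)
      w_mag = np.abs(morse_transform(x, scales))
      k, i = np.unravel_index(np.argmax(w_mag), w_mag.shape)
      print(f"largest transform maximum at t={t[i]}, scale={scales[k]:.1f}")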

  15. Analysis of post-blasting source mechanisms of mining-induced seismic events in Rudna copper mine, Poland.

    NASA Astrophysics Data System (ADS)

    Caputa, Alicja; Rudzinski, Lukasz; Talaga, Adam

    2016-04-01

    Copper ore exploitation in the Lower Silesian Copper District, Poland (LSCD), is connected with many specific hazards. The most hazardous are induced seismicity and the rockbursts that follow strong mining seismic events. One of the most effective methods of reducing seismic activity is blasting in potentially hazardous mining panels. In this way, small to moderate tremors are provoked and stress accumulation is substantially reduced. This work presents an analysis of post-blasting events using Full Moment Tensor (MT) inversion at the Rudna mine, Poland, using a signal dataset recorded on