Science.gov

Sample records for event analysis atheana

  1. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    SciTech Connect

    Forester, John A.; Bley, Dennis C.; Cooper, Susan E.; Kolaczkowski, Alan M.; Thompson, Catherine; Ramey-Smith, Ann; Wreathall, John

    2000-07-18

    This paper describes the most recent version of a human reliability analysis (HRA) method called "A Technique for Human Event Analysis" (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided and critical and unique aspects of the revised method are discussed.

  2. Human Events Reference for ATHEANA (HERA) Database Description and Preliminary User's Manual

    SciTech Connect

    Auflick, J.L.

    1999-08-12

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database of analyzed operational events, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.
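
    A minimal sketch of how a HERA-style event record might be structured, tying an error-forcing context (plant conditions plus performance shaping factors) to the unsafe actions observed in an event. The field names and example values are illustrative assumptions, not the actual HERA schema.

      # Hypothetical sketch of a HERA-style event record (not the actual schema).
      from dataclasses import dataclass

      @dataclass
      class ErrorForcingContext:
          plant_conditions: list    # equipment/material conditions, e.g. a biased sensor
          shaping_factors: list     # performance shaping factors (PSFs), e.g. time pressure

      @dataclass
      class OperationalEvent:
          event_id: str
          description: str
          unsafe_actions: list      # unsafe actions (UAs) observed in the event
          efc: ErrorForcingContext

      event = OperationalEvent(
          event_id="HERA-0001",
          description="Feedwater transient with misleading level indication",
          unsafe_actions=["Operator throttled auxiliary feedwater prematurely"],
          efc=ErrorForcingContext(
              plant_conditions=["Steam generator level instrument biased high"],
              shaping_factors=["High workload", "Procedure ambiguity"],
          ),
      )
      print(event.efc.shaping_factors)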

  3. Human events reference for ATHEANA (HERA) database description and preliminary user's manual

    SciTech Connect

    Auflick, J.L.; Hahn, H.A.; Pond, D.J.

    1998-05-27

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  4. Discussion of Comments from a Peer Review of A Technique for Human Event Analysis (ATHEANA)

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Wreathall, J.

    1999-01-28

    In May of 1998, a technical basis and implementation guidelines document for A Technique for Human Event Analysis (ATHEANA) was issued as a draft report for public comment (NUREG-1624). In conjunction with the release of draft NUREG-1624, a peer review of the new human reliability analysis (HRA) method, its documentation, and the results of an initial test of the method was held over a two-day period in June 1998 in Seattle, Washington. Four internationally known and respected experts in HRA or probabilistic risk assessment were selected to serve as the peer reviewers. In addition, approximately 20 other individuals with an interest in HRA and ATHEANA also attended the peer review and were invited to provide comments. The peer review team was asked to comment on any aspect of the method or the report in which improvements could be made and to discuss its strengths and weaknesses. They were asked to focus on two major aspects: Are the basic premises of ATHEANA on solid ground and is the conceptual basis adequate? Is the ATHEANA implementation process adequate given the description of the intended users in the documentation? The four peer reviewers asked questions and provided oral comments during the peer review meeting and provided written comments approximately two weeks after the completion of the meeting. This paper discusses their major comments.

  5. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-10-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  6. Results of a nuclear power plant application of A New Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Whitehead, D.W.; Forester, J.A.; Bley, D.C.

    1998-03-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the success of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator on shift until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project.

  7. Philosophy of ATHEANA

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Thompson, C.M.; Whitehead, D.W.; Wreathall, J.

    1999-03-24

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  8. A technique for human error analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge-base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  9. A process for application of ATHEANA - a new HRA method

    SciTech Connect

    Parry, G.W.; Bley, D.C.; Cooper, S.E.

    1996-10-01

    This paper describes the analytical process for the application of ATHEANA, a new approach to the performance of human reliability analysis as part of a PRA. This new method, unlike existing methods, is based upon an understanding of the reasons why people make errors, and was developed primarily to address the analysis of errors of commission.

  10. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    SciTech Connect

    Taylor, J.H.; Luckas, W.J.; Wreathall, J.

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA) is a critical element of PRA, but limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  11. ATHEANA: "a technique for human error analysis" entering the implementation phase

    SciTech Connect

    Taylor, J.; O'Hara, J.; Luckas, W.

    1997-02-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA) is a critical element of PRA, but limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled "Improved HRA Method Based on Operating Experience", is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97 ongoing).

  12. EVENT PLANNING USING FUNCTION ANALYSIS

    SciTech Connect

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  13. Concepts of event-by-event analysis

    SciTech Connect

    Stroebele, H.

    1995-07-15

    The particles observed in the final state of nuclear collisions can be divided into two classes: those which are susceptible to strong interactions and those which are not, like leptons and the photon. The bulk properties of the "matter" in the reaction zone may be read off the kinematical characteristics of the particles observable in the final state. These characteristics are strongly dependent on the last interaction these particles have undergone. In a densely populated reaction zone, strongly interacting particles will experience many collisions after they have been formed and before they emerge into the asymptotic final state. For the particles which are not sensitive to strong interactions, their formation is also their last interaction. Thus photons and leptons probe the period during which they are produced, whereas hadrons reflect the so-called freeze-out processes, which occur during the late stage in the evolution of the reaction when the population density becomes small and the mean free paths long. The disadvantage of the leptons and photons is their small production cross section; they cannot be used in an analysis of the characteristics of individual collision events, because the number of particles produced per event is too small. The hadrons, on the other hand, stem from the freeze-out period. Information from earlier periods requires multiparticle observables in the most general sense. It is one of the challenges of present-day high energy nuclear physics to establish and understand global observables which differentiate between mere hadronic scenarios, i.e., superposition of hadronic interactions, and the formation of a partonic (short duration) steady state which can be considered a new state of matter, the Quark-Gluon Plasma.

  14. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  15. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
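
    To make the BUS reinterpretation concrete, the sketch below treats acceptance in rejection sampling as a "rare event" whose probability can be estimated with standard reliability methods (plain Monte Carlo here for brevity). The Gaussian prior, the single observation, and the constant c are toy assumptions, not the paper's examples.

      # Minimal sketch of the BUS idea: posterior sampling as a rare-event problem.
      import numpy as np

      rng = np.random.default_rng(0)
      obs = 2.5                                  # a single observed data point (assumed)
      sigma = 0.5                                # observation noise (assumed)

      def likelihood(theta):
          return np.exp(-0.5 * ((obs - theta) / sigma) ** 2)

      c = 1.0                                    # constant with c >= max likelihood (max is 1 here)

      n = 200_000
      theta = rng.normal(0.0, 1.0, n)            # samples from the prior
      u = rng.uniform(0.0, 1.0, n)               # auxiliary uniform variable
      accept = u <= likelihood(theta) / c        # the "failure event" of BUS

      p_accept = accept.mean()                   # rare-event probability = evidence / c
      posterior = theta[accept]                  # accepted samples follow the posterior
      print(f"acceptance probability {p_accept:.4f}")
      print(f"posterior mean {posterior.mean():.3f} (analytic: {obs / (1 + sigma**2):.3f})")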

  16. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  17. Negated bio-events: analysis and identification

    PubMed Central

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems.

  18. Top Event Matrix Analysis Code System.

    Energy Science and Technology Software Center (ESTSC)

    2000-06-19

    Version 00 TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates.
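
    To illustrate the kind of computation described above, the sketch below quantifies a toy top event from its minimal cut sets using the rare-event approximation and estimates per-event sensitivities by finite differences. The cut sets and probabilities are invented, and the real SETS/TEMAC input formats are not reproduced.

      # Toy top-event quantification from minimal cut sets (rare-event approximation).
      # Cut sets and basic-event probabilities are illustrative, not from SETS/TEMAC.

      p = {"A": 1e-3, "B": 5e-4, "C": 2e-3, "D": 1e-4}     # basic event probabilities
      cut_sets = [("A", "B"), ("A", "C"), ("D",)]          # top event = AB + AC + D

      def top_probability(probs):
          # Rare-event approximation: sum of cut-set probabilities.
          total = 0.0
          for cs in cut_sets:
              prob = 1.0
              for e in cs:
                  prob *= probs[e]
              total += prob
          return total

      base = top_probability(p)
      print(f"top event probability ~ {base:.3e}")

      # Sensitivity of the top probability to each basic event, by a small finite
      # difference (a crude stand-in for Birnbaum-style importance measures).
      for e in p:
          perturbed = dict(p)
          perturbed[e] = p[e] * 1.001
          sens = (top_probability(perturbed) - base) / (p[e] * 0.001)
          print(f"d(top)/d({e}) ~ {sens:.3e}")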

  19. Cluster Analysis for CTBT Seismic Event Monitoring

    SciTech Connect

    Carr, Dorthe B.; Young, Chris J.; Aster, Richard C.; Zhang, Xiaobing

    1999-08-03

    The clustering techniques prove to be much more effective for the New Mexico data than the Wyoming data, apparently because the New Mexico mines are closer and consequently the signal-to-noise ratios (SNRs) for those events are higher. To verify this hypothesis we experiment with adding Gaussian noise to the New Mexico data to simulate data from more distant sites. Our results suggest that clustering techniques can be very useful for identifying small anomalous events if at least one good recording is available, and that the only reliable way to improve clustering results is to process the waveforms to improve SNR. For events with good SNR that do have strong grouping, cluster analysis will reveal the inherent groupings regardless of the choice of clustering method.
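
    A minimal sketch of waveform clustering in the spirit of this study: build a dissimilarity matrix from maximum cross-correlation and apply agglomerative clustering. The synthetic waveforms and the average-linkage choice are assumptions for illustration.

      # Hierarchical clustering of waveforms by cross-correlation dissimilarity.
      # Synthetic data stands in for real recordings; linkage choice is illustrative.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      rng = np.random.default_rng(1)
      t = np.linspace(0, 1, 200)
      template1 = np.sin(2 * np.pi * 8 * t) * np.exp(-4 * t)    # "mine 1" signature
      template2 = np.sin(2 * np.pi * 13 * t) * np.exp(-4 * t)   # "mine 2" signature
      waveforms = [template1 + 0.2 * rng.normal(size=t.size) for _ in range(5)] + \
                  [template2 + 0.2 * rng.normal(size=t.size) for _ in range(5)]

      def max_xcorr(a, b):
          a = (a - a.mean()) / a.std()
          b = (b - b.mean()) / b.std()
          return np.max(np.correlate(a, b, mode="full")) / a.size

      n = len(waveforms)
      dist = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              d = 1.0 - max_xcorr(waveforms[i], waveforms[j])   # dissimilarity
              dist[i, j] = dist[j, i] = d

      labels = fcluster(linkage(squareform(dist), method="average"),
                        t=2, criterion="maxclust")
      print(labels)   # events from the same "mine" should share a label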

  20. Dynamic Event Tree Analysis Through RAVEN

    SciTech Connect

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics is not explicitly accounted for in the analysis. In order to overcome these limitations, several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) as a control logic driver for the new thermal-hydraulic code RELAP-7 and 2) as a post-processing tool. In the first task, RAVEN acts as a deterministic controller in which the set of control logic laws (user defined) monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, the Dynamic PRA analysis, using Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.
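
    A minimal dynamic event tree sketch on a toy heat-up model: the simulation branches at a fixed demand time on component success or failure, and each branch carries its conditional probability. The physics, branch point, and probabilities are invented; this is not RAVEN or RELAP-7 code.

      # Minimal dynamic event tree (DET) sketch on a toy heat-up model.
      # Branch point, probabilities, and "physics" are invented for illustration.

      DT, T_FAIL = 1.0, 120.0        # time step [s]; failure temperature [C] (assumed)

      def step(temp, cooling_on):
          heat = 0.5                                 # decay heat term (assumed)
          cool = 0.8 if cooling_on else 0.0          # cooling term (assumed)
          return temp + DT * (heat - cool)

      def simulate(temp, t, t_end, prob, cooling_on, branched, results):
          while t < t_end:
              if not branched and t >= 30.0:         # cooling demanded at t = 30 s
                  # Branch: pump starts (p = 0.95) or fails to start (p = 0.05).
                  simulate(temp, t, t_end, prob * 0.95, True, True, results)
                  simulate(temp, t, t_end, prob * 0.05, False, True, results)
                  return
              temp = step(temp, cooling_on)
              t += DT
              if temp >= T_FAIL:
                  results.append(("core damage", prob, t))
                  return
          results.append(("ok", prob, t))

      results = []
      simulate(60.0, 0.0, 300.0, 1.0, False, False, results)
      for outcome, prob, t in results:
          print(f"{outcome:12s} p={prob:.3f} t_end={t:.0f}s")
      print("P(core damage) =", sum(p for o, p, _ in results if o == "core damage"))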

  1. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper by using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models can be simplified to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets, which makes discrete event systems a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels and its sections are analysed successively. The proposed simulation of the robotic system using timed Petri nets offers the opportunity to view the timing behaviour of the robot: from spot measurements of robot transport and transmission times, graphics are obtained showing the average time for transport activity for individual parameter sets of finished products.
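
    As a minimal illustration of the Petri-net formalism the paper builds on, the sketch below fires the transitions of a tiny net (places, transitions, marking). The two-transition robot example is invented and no Visual Object Net++ features are reproduced.

      # Tiny Petri net interpreter: a transition fires when all its input places
      # hold at least one token. The robot pick/place net is an invented example.

      places = {"part_waiting": 1, "robot_idle": 1, "robot_busy": 0, "part_done": 0}
      transitions = {
          "pick":  {"in": ["part_waiting", "robot_idle"], "out": ["robot_busy"]},
          "place": {"in": ["robot_busy"], "out": ["robot_idle", "part_done"]},
      }

      def enabled(t):
          return all(places[p] >= 1 for p in transitions[t]["in"])

      def fire(t):
          for p in transitions[t]["in"]:
              places[p] -= 1
          for p in transitions[t]["out"]:
              places[p] += 1

      for name in ["pick", "place"]:
          if enabled(name):
              fire(name)
              print(f"fired {name}: {places}")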

  2. Automated analysis of failure event data

    SciTech Connect

    Hennessy, Corey; Freerks, Fred; Campbell, James E.; Thompson, Bruce M.

    2000-03-27

    This paper focuses on fully automated analysis of failure event data in the concept and early development stage of a semiconductor-manufacturing tool. In addition to presenting a wide range of statistical and machine-specific performance information, algorithms have been developed to examine reliability growth and to identify major contributors to unreliability. These capabilities are being implemented in a new software package called Reliadigm. When coupled with additional input regarding repair times and parts availability, the analysis software also provides spare parts inventory optimization based on genetic optimization methods. The type of question to be answered is: If this tool were placed with a customer for beta testing, what would be the optimal spares kit to meet equipment reliability goals for the lowest cost? The new algorithms are implemented in Windows® software and are easy to apply. This paper presents a preliminary analysis of failure event data from three IDEA machines currently in development. The paper also includes an optimal spare parts kit analysis.

  3. Human Reliability Analysis in the U.S. Nuclear Power Industry: A Comparison of Atomistic and Holistic Methods

    SciTech Connect

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-09-01

    A variety of methods have been developed to generate human error probabilities for use in the US nuclear power industry. When actual operations data are not available, it is necessary for an analyst to estimate these probabilities. Most approaches, including THERP, ASEP, SLIM-MAUD, and SPAR-H, feature an atomistic approach to characterizing and estimating error. The atomistic approach is based on the notion that events and their causes can be decomposed and individually quantified. In contrast, in the holistic approach, such as found in ATHEANA, the analysis centers on the entire event, which is typically quantified as an indivisible whole. The distinction between atomistic and holistic approaches is important in understanding the nature of human reliability analysis quantification and the utility and shortcomings associated with each approach.
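
    A worked sketch of an atomistic quantification in the SPAR-H style: a nominal human error probability is scaled by PSF multipliers, with the standard adjustment that keeps the result bounded when three or more negative PSFs compound. The multiplier values are illustrative and do not come from the published SPAR-H tables.

      # SPAR-H-style atomistic HEP calculation (illustrative multipliers, not the
      # published tables). HEP = NHEP * PSF_composite, with an adjustment when
      # three or more negative PSFs apply, to keep the probability bounded by 1.

      def adjusted_hep(nhep, psf_multipliers):
          composite = 1.0
          negative = 0
          for m in psf_multipliers:
              composite *= m
              if m > 1.0:
                  negative += 1
          if negative >= 3:
              # SPAR-H adjustment factor for compounding negative PSFs.
              return nhep * composite / (nhep * (composite - 1.0) + 1.0)
          return min(1.0, nhep * composite)

      nominal = 0.001                 # nominal HEP for an action task (assumed)
      psfs = [10.0, 2.0, 5.0]         # e.g. extreme stress, poor HMI, time pressure
      print(f"adjusted HEP = {adjusted_hep(nominal, psfs):.4f}")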

  4. Analysis of a Limb Eruptive Event

    NASA Astrophysics Data System (ADS)

    Kotrč, P.; Kupryakov, Yu. A.; Bárta, M.; Kashapova, L. K.; Liu, W.

    2016-04-01

    We present the analysis of an eruptive event that took place on the eastern limb on April 21, 2015, which was observed by the Ondřejov horizontal telescope and spectrograph. The eruption of the highly twisted prominence was followed by the onset of soft X-ray sources. We identified the structures observed in Hα spectra with the details on the Hα filtergrams and analyzed the evolution of Doppler component velocities. The timing and observed characteristics of the eruption were compared with the prediction of the model based on the twisting of the flux ropes and the kink/torus instability.

  5. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
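
    A toy version of the cascading-outage loop described above, under strong simplifying assumptions: lost flow is redistributed equally among surviving lines (a crude stand-in for a DC power flow), and any line loaded beyond its rating trips in the next tier. All network data are invented.

      # Toy cascading-outage simulation: trip a line, redistribute its flow equally
      # among in-service lines, and trip any line that exceeds its rating.

      lines = {   # line: (current flow [MW], rating [MW]) -- invented data
          "L1": (80.0, 100.0), "L2": (90.0, 100.0), "L3": (60.0, 100.0),
          "L4": (95.0, 100.0), "L5": (50.0, 100.0),
      }

      def cascade(initial_outage):
          flows = {k: f for k, (f, _) in lines.items()}
          ratings = {k: r for k, (_, r) in lines.items()}
          tripped, queue = set(), [initial_outage]
          tier = 0
          while queue:
              tier += 1
              shed = sum(flows[k] for k in queue)
              tripped.update(queue)
              queue = []
              survivors = [k for k in flows if k not in tripped]
              if not survivors:
                  break
              for k in survivors:                  # redistribute the lost flow
                  flows[k] += shed / len(survivors)
              for k in survivors:                  # find the next cascading tier
                  if flows[k] > ratings[k]:
                      queue.append(k)
              print(f"tier {tier}: tripped={sorted(tripped)}")
          return tripped

      print("final outages:", sorted(cascade("L1")))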

  6. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. A. Wasiolek

    2003-07-21

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (the eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository, with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere.

  7. Analysis of a California Catalina eddy event

    NASA Technical Reports Server (NTRS)

    Bosart, L. F.

    1983-01-01

    During the period 26-29 May 1968 a shallow cyclonic circulation, known locally as a Catalina eddy, developed in the offshore waters of southern California. A synoptic and mesoscale analysis of the event establishes the following: (1) the incipient circulation forms on the coast near Santa Barbara downwind of the coastal mountains, (2) cyclonic shear vorticity appears offshore in response to lee troughing downstream of the coastal mountains between Vandenberg and Pt. Mugu, California, (3) mountain wave activity may be aiding incipient eddy formation in association with synoptic-scale subsidence and the generation of a stable layer near the crest of the coastal mountains, (4) a southeastward displacement and offshore expansion of the circulation occurs following the passage of the synoptic-scale ridge line, and (5) dissipation of the eddy occurs with the onset of a broad onshore flow.

  8. Sisyphus - An Event Log Analysis Toolset

    SciTech Connect

    Jon Stearley; Glenn Laguna

    2004-09-01

    Event logs are a ubiquitous source of system feedback from computer systems, but they have widely ranging formats and can be extremely numerous, particularly from systems with many logging components. Inspection of these logs is fundamental to system debugging; increased capability to quickly extract meaningful information will impact MTTR (mean time to repair) and may impact MTBF (mean time between failure). Sisyphus is a machine-learning analysis system whose goal is to enable content-novice analysts to efficiently understand evolving trends, identify anomalies, and investigate cause-effect hypotheses in large multiple-source log sets. The toolkit comprises a framework for utilizing the third-party frequent-itemset data mining tools Teiresias and SLCT, software to cluster messages according to time statistics, and an interactive results viewer.
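
    In the spirit of the time-statistics clustering mentioned above (not the Sisyphus, Teiresias, or SLCT algorithms themselves), the sketch below reduces messages to templates by masking digits, builds per-template hourly counts, and pairs templates whose activity is correlated in time. The log lines and threshold are invented.

      # Group log-message templates by correlated activity over time.
      import re
      from collections import defaultdict

      import numpy as np

      logs = [  # (hour, message) -- invented sample
          (0, "disk 3 error on node 12"), (0, "link down on port 7"),
          (1, "disk 5 error on node 2"),  (1, "link down on port 3"),
          (2, "heartbeat ok"),            (3, "heartbeat ok"),
      ]

      def template(msg):
          return re.sub(r"\d+", "<N>", msg)   # mask numbers to form a template

      hours = max(h for h, _ in logs) + 1
      counts = defaultdict(lambda: np.zeros(hours))
      for h, msg in logs:
          counts[template(msg)][h] += 1

      templates = list(counts)
      for i in range(len(templates)):
          for j in range(i + 1, len(templates)):
              r = np.corrcoef(counts[templates[i]], counts[templates[j]])[0, 1]
              if r > 0.9:                     # threshold is an arbitrary assumption
                  print(f"correlated: {templates[i]!r} ~ {templates[j]!r} (r={r:.2f})")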

  9. Sisyphus - An Event Log Analysis Toolset

    Energy Science and Technology Software Center (ESTSC)

    2004-09-01

    Event logs are a ubiquitous source of system feedback from computer systems, but they have widely ranging formats and can be extremely numerous, particularly from systems with many logging components. Inspection of these logs is fundamental to system debugging; increased capability to quickly extract meaningful information will impact MTTR (mean time to repair) and may impact MTBF (mean time between failure). Sisyphus is a machine-learning analysis system whose goal is to enable content-novice analysts to efficiently understand evolving trends, identify anomalies, and investigate cause-effect hypotheses in large multiple-source log sets. The toolkit comprises a framework for utilizing the third-party frequent-itemset data mining tools Teiresias and SLCT, software to cluster messages according to time statistics, and an interactive results viewer.

  10. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis".

  11. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, the development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, and so is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  12. Analysis of seismic events in and near Kuwait

    SciTech Connect

    Harris, D.B.; Mayeda, K.M.; Rodgers, A.J.; Ruppert, S.D.

    1999-05-11

    Seismic data for events in and around Kuwait were collected and analyzed. The authors estimated event moment, focal mechanism and depth by waveform modeling. Results showed that reliable seismic source parameters for events in and near Kuwait can be estimated from a single broadband three-component seismic station. This analysis will advance understanding of earthquake hazard in Kuwait.

  13. Modelling recurrent events: a tutorial for analysis in epidemiology

    PubMed Central

    Amorim, Leila DAF; Cai, Jianwen

    2015-01-01

    In many biomedical studies, the event of interest can occur more than once in a participant. These events are termed recurrent events. However, the majority of analyses focus only on time to the first event, ignoring the subsequent events. Several statistical models have been proposed for analysing multiple events. In this paper we explore and illustrate several modelling techniques for analysis of recurrent time-to-event data, including conditional models for multivariate survival data (AG, PWP-TT and PWP-GT), marginal means/rates models, frailty and multi-state models. We also provide a tutorial for analysing such types of data, with three widely used statistical software programmes. Different approaches and software are illustrated using data from a bladder cancer project and from a study on lower respiratory tract infection in children in Brazil. Finally, we make recommendations for modelling strategy selection for analysis of recurrent event data. PMID:25501468
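
    To make the counting-process data layout concrete, the sketch below arranges recurrent events as (start, stop] at-risk intervals and fits an Andersen-Gill-type model. The toy data are invented, and the use of Python's lifelines package (rather than the three programmes illustrated in the paper) is an assumption.

      # Andersen-Gill-style layout: each row is one (start, stop] at-risk interval
      # per subject, with event=1 if an event ended the interval. Toy data.
      import pandas as pd
      from lifelines import CoxTimeVaryingFitter

      df = pd.DataFrame([
          # id, start, stop, event, treated
          (1, 0,  5, 1, 1), (1, 5, 12, 1, 1), (1, 12, 20, 0, 1),
          (2, 0,  8, 1, 0), (2, 8, 15, 1, 0), (2, 15, 18, 1, 0), (2, 18, 25, 0, 0),
          (3, 0, 10, 1, 1), (3, 10, 22, 0, 1),
          (4, 0,  4, 1, 0), (4, 4,  9, 1, 0), (4, 9, 16, 0, 0),
      ], columns=["id", "start", "stop", "event", "treated"])

      ctv = CoxTimeVaryingFitter()
      ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
      ctv.print_summary()   # hazard ratio for 'treated' under the AG counting process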

  14. Modelling recurrent events: a tutorial for analysis in epidemiology.

    PubMed

    Amorim, Leila D A F; Cai, Jianwen

    2015-02-01

    In many biomedical studies, the event of interest can occur more than once in a participant. These events are termed recurrent events. However, the majority of analyses focus only on time to the first event, ignoring the subsequent events. Several statistical models have been proposed for analysing multiple events. In this paper we explore and illustrate several modelling techniques for analysis of recurrent time-to-event data, including conditional models for multivariate survival data (AG, PWP-TT and PWP-GT), marginal means/rates models, frailty and multi-state models. We also provide a tutorial for analysing such types of data, with three widely used statistical software programmes. Different approaches and software are illustrated using data from a bladder cancer project and from a study on lower respiratory tract infection in children in Brazil. Finally, we make recommendations for modelling strategy selection for analysis of recurrent event data. PMID:25501468

  15. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    As terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse the spatio-temporal distribution characteristics of events, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results demonstrate the feasibility of the methods.

  16. The flood event explorer - a web based framework for rapid flood event analysis

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Lüdtke, Stefan; Kreibich, Heidi; Merz, Bruno

    2015-04-01

    Flood disaster management, recovery and reconstruction planning benefit from rapid evaluations of flood events and expected impacts. The near real-time in-depth analysis of flood causes and key drivers for flood impacts requires close monitoring and documentation of hydro-meteorological and socio-economic factors. Within CEDIM's Rapid Flood Event Analysis project, a flood event analysis system is being developed which enables the near real-time evaluation of large-scale floods in Germany. The analysis system includes functionalities to compile event-related hydro-meteorological data, to evaluate the current flood situation, to assess hazard intensity and to estimate flood damage to residential buildings. A German flood event database is under development, which contains various hydro-meteorological information - in the future also impact information - for all large-scale floods since 1950. This database comprises data on historic flood events which allow the classification of ongoing floods in terms of triggering processes and pre-conditions, critical controls and drivers for flood losses. The flood event analysis system has been implemented in a database system which automatically retrieves and stores data from more than 100 online discharge gauges on a daily basis. The current discharge observations are evaluated in a long-term context in terms of flood frequency analysis. The web-based frontend visualizes the current flood situation in comparison to any past flood from the flood catalogue. The regional flood database for Germany contains hydro-meteorological data and aggregated severity indices for a set of 76 historic large-scale flood events in Germany. This database has been used to evaluate the key drivers for the flood in June 2013.

  17. Multiscale analysis of a sustained precipitation event

    NASA Technical Reports Server (NTRS)

    Knupp, Kevin R.; Williams, Steven F.

    1987-01-01

    Mesoscale data collected during both the satellite precipitation and cloud experiment and the microburst and severe thunderstorm program are analyzed in order to describe features associated with two distinct mesoscale precipitation events that occurred about 10 hours apart over the region of northern Alabama to central Tennessee in June 1986. Data sets used include mesobeta-scale rawinsonde data, surface mesonet data, RADAP data, and GOES images. The present mesoscale environment involved the merger of Hurricane Bonnie remnants with a preexisting midlatitude short-wave trough.

  18. Statistical Analysis of Small Ellerman Bomb Events

    NASA Astrophysics Data System (ADS)

    Nelson, C. J.; Doyle, J. G.; Erdélyi, R.; Huang, Z.; Madjarska, M. S.; Mathioudakis, M.; Mumford, S. J.; Reardon, K.

    2013-04-01

    The properties of Ellerman bombs (EBs), small-scale brightenings in the Hα line wings, have proved difficult to establish because their size is close to the spatial resolution of even the most advanced telescopes. Here, we aim to infer the size and lifetime of EBs using high-resolution data of an emerging active region collected using the Interferometric BIdimensional Spectrometer (IBIS) and Rapid Oscillations of the Solar Atmosphere (ROSA) instruments as well as the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO). We develop an algorithm to track EBs through their evolution, finding that EBs can often be much smaller (around 0.3″) and shorter-lived (less than one minute) than previous estimates. A correlation between G-band magnetic bright points and EBs is also found. Combining SDO/HMI and G-band data gives a good proxy of the polarity for the vertical magnetic field. It is found that EBs often occur both over regions of opposite polarity flux and strong unipolar fields, possibly hinting at magnetic reconnection as a driver of these events. The energetics of EB events is found to follow a power-law distribution in the range of a nanoflare (10^22-10^25 ergs).

  19. [Analysis of Spontaneously Reported Adverse Events].

    PubMed

    Nakamura, Mitsuhiro

    2016-01-01

    Observational study is necessary for the evaluation of drug effectiveness in clinical practice. In recent years, the use of spontaneous reporting systems (SRS) for adverse drug reactions has increased and they have become an important resource for regulatory science. SRS, being among the largest and most well-known databases worldwide, are one of the primary tools used for postmarketing surveillance and pharmacovigilance. To analyze SRS, the US Food and Drug Administration Adverse Event Reporting System (FAERS) and the Japanese Adverse Drug Event Report Database (JADER) are reviewed. Authorized pharmacovigilance algorithms were used for signal detection, including the reporting odds ratio. An SRS is a passive reporting database and is therefore subject to numerous sources of selection bias, including overreporting, underreporting, and a lack of a denominator. Despite the inherent limitations of spontaneous reporting, SRS databases are a rich resource, and data mining of them provides powerful means of identifying potential associations between drugs and their adverse effects. Our results, which are based on the evaluation of SRS databases, provide essential knowledge that could improve our understanding of clinical issues. PMID:27040337
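
    A worked example of the reporting odds ratio mentioned above: from the 2x2 table of report counts (drug of interest vs. all other drugs, AE of interest vs. all other AEs), ROR = (a/b)/(c/d), with an approximate 95% confidence interval from the log-scale standard error. The counts are invented.

      # Reporting odds ratio (ROR) from a 2x2 spontaneous-report table.
      #                    AE of interest   other AEs
      # drug of interest        a=40          b=960
      # all other drugs         c=200         d=49800
      import math

      a, b, c, d = 40, 960, 200, 49800
      ror = (a / b) / (c / d)
      se = math.sqrt(1/a + 1/b + 1/c + 1/d)
      lower = math.exp(math.log(ror) - 1.96 * se)
      upper = math.exp(math.log(ror) + 1.96 * se)
      print(f"ROR = {ror:.2f}, 95% CI [{lower:.2f}, {upper:.2f}]")
      # A common signal criterion is a lower CI bound above 1 with at least 3 cases.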

  20. Dynamic Modelling and Statistical Analysis of Event Times

    PubMed Central

    Peña, Edsel A.

    2006-01-01

    This review article provides an overview of recent work in the modelling and analysis of recurrent events arising in engineering, reliability, public health, biomedical, and other areas. Recurrent event modelling possesses unique facets making it different and more difficult to handle than single event settings. For instance, the impact of an increasing number of event occurrences needs to be taken into account, the effects of covariates should be considered, potential association among the inter-event times within a unit cannot be ignored, and the effects of performed interventions after each event occurrence need to be factored in. A recent general class of models for recurrent events which simultaneously accommodates these aspects is described. Statistical inference methods for this class of models are presented and illustrated through applications to real data sets. Some existing open research problems are described. PMID:17906740

  1. Seismic Initiating Event Analysis For a PBMR Plant

    SciTech Connect

    Van Graan, Henriette; Serbanescu, Dan; Combrink, Yolanda; Coman, Ovidiu

    2004-07-01

    Seismic Initiating Event (IE) analysis is one of the most important tasks that control the level of effort and quality of the whole Seismic Probabilistic Safety Assessment (SPRA). The typical problems are related to the following aspects: how the internal PRA model and its complexity can be used, how to control the number of PRA components for which fragility evaluation should be performed, and finally how to obtain a manageable number of significant cut-sets for seismic risk quantification. The answers to these questions are highly dependent on the possibility to improve the interface between the internal events analysis and the external events analysis at the design stage. (authors)

  2. Peak event analysis: a novel empirical method for the evaluation of elevated particulate events

    PubMed Central

    2013-01-01

    Background We report on a novel approach to the analysis of suspended particulate data in a rural setting in southern Ontario. Analyses of suspended particulate matter and associated air quality standards have conventionally focussed on 24-hour mean levels of total suspended particulates (TSP) and particulate matter <10 microns, <2.5 microns and <1 micron in diameter (PM10, PM2.5, PM1, respectively). Less emphasis has been placed on brief peaks in suspended particulate levels, which may pose a substantial nuisance, irritant, or health hazard. These events may also represent a common cause of public complaint and concern regarding air quality. Methods Measurements of TSP, PM10, PM2.5, and PM1 levels were taken using an automated device following local complaints of dusty conditions in rural south-central Ontario, Canada. The data consisted of 126,051 per-minute TSP, PM10, PM2.5, and PM1 measurements between May and August 2012. Two analyses were performed and compared. First, conventional descriptive statistics were computed by month for TSP, PM10, PM2.5, and PM1, including mean values and percentiles (70th, 90th, and 95th). Second, a novel graphical analysis method, using density curves and line plots, was conducted to examine peak events occurring at or above the 99th percentile of per-minute TSP readings. We refer to this method as “peak event analysis”. Findings of the novel method were compared with findings from the conventional approach. Results Conventional analyses revealed that mean levels of all categories of suspended particulates and suspended particulate diameter ratios conformed to existing air quality standards. Our novel methodology revealed extreme outlier events above the 99th percentile of readings, with peak PM10 and TSP levels over 20 and 100 times higher than the respective mean values. Peak event analysis revealed and described rare and extreme peak dust events that would not have been detected using conventional descriptive statistics.
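
    A minimal sketch of the thresholding step behind peak event analysis as described: compute the 99th percentile of per-minute TSP readings and merge consecutive exceedances into discrete peak events. The synthetic series is an assumption.

      # Flag per-minute readings above the 99th percentile and merge consecutive
      # exceedances into discrete "peak events". Synthetic data for illustration.
      import numpy as np

      rng = np.random.default_rng(7)
      tsp = rng.lognormal(mean=3.0, sigma=0.4, size=10_000)   # per-minute TSP (synthetic)
      tsp[4000:4010] *= 25                                    # implant a dust event

      threshold = np.percentile(tsp, 99)
      above = tsp >= threshold

      events, start = [], None
      for i, flag in enumerate(above):
          if flag and start is None:
              start = i
          elif not flag and start is not None:
              events.append((start, i - 1, tsp[start:i].max()))
              start = None
      if start is not None:
          events.append((start, len(tsp) - 1, tsp[start:].max()))

      print(f"99th percentile = {threshold:.1f}")
      for s, e, peak in events[:5]:
          print(f"event: minutes {s}-{e}, peak {peak:.1f} ({peak / tsp.mean():.1f}x mean)")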

  3. Event/Time/Availability/Reliability-Analysis Program

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.; Hoffman, D. J.; Carr, Thomas

    1994-01-01

    ETARA is an interactive, menu-driven program that performs simulations for analysis of reliability, availability, and maintainability. It was written to evaluate the performance of the electrical power system of Space Station Freedom, but the methodology and software can be applied to any system represented by a block diagram. The program is written in IBM APL.

  4. Event-Synchronous Analysis for Connected-Speech Recognition.

    NASA Astrophysics Data System (ADS)

    Morgan, David Peter

    The motivation for event-synchronous speech analysis originates from linear system theory where the speech-source transfer function is excited by an impulse-like driving function. In speech processing, the impulse response obtained from this linear system contains both semantic information and the vocal tract transfer function. Typically, an estimate of the transfer function is obtained via the spectrum by assuming a short-time stationary signal within some analysis window. However, this spectrum is often distorted by the periodic effects which occur when multiple (pitch) impulses are included in the analysis window. One method to remove these effects would be to deconvolve the excitation function from the speech signal to obtain the transfer function. The more attractive approach is to locate and identify the excitation function and synchronize the analysis frame with it. Event-synchronous analysis differs from pitch-synchronous analysis in that there are many events useful for speech recognition which are not pitch excited. In addition, event-synchronous analysis locates the important boundaries between speech events, such as voiced to unvoiced and silence to burst transitions. In asynchronous processing, an analysis frame which contains portions of two adjacent but dissimilar speech events is often so ambiguous as to distort or mask the important "phonetic" features of both events. Thus event-synchronous processing is employed to obtain an accurate spectral estimate and in turn enhance the estimate of the vocal-tract transfer function. Among the issues which have been addressed in implementing an event-synchronous recognition system are those of developing robust event (pitch, burst, etc.) detectors, synchronous-analysis methodologies, more meaningful feature sets, and dynamic programming algorithms for nonlinear time alignment.

  5. Statistical issues in the analysis of adverse events in time-to-event data.

    PubMed

    Allignol, Arthur; Beyersmann, Jan; Schmoor, Claudia

    2016-07-01

    The aim of this work is to shed some light on common issues in the statistical analysis of adverse events (AEs) in clinical trials, when the main outcome is a time-to-event endpoint. To begin, we show that AEs are always subject to competing risks. That is, the occurrence of a certain AE may be precluded by occurrence of the main time-to-event outcome or by occurrence of another (fatal) AE. This has raised concerns about 'informative' censoring. We show that, in general, neither simple proportions nor Kaplan-Meier estimates of AE occurrence should be used; common survival techniques for hazards that censor the competing event remain valid, but they are incomplete analyses. They must be complemented by an analogous analysis of the competing event for inference on the cumulative AE probability. The commonly used incidence rate (or incidence density) is a valid estimator of the AE hazard assuming it to be time constant. An estimator of the cumulative AE probability can be derived if the incidence rate of the AE is combined with an estimator of the competing hazard. We discuss less restrictive analyses using non-parametric and semi-parametric approaches. We first consider time-to-first-AE analyses and then briefly discuss how they can be extended to the analysis of recurrent AEs. We give a practical presentation and illustrate the methods with a simple example. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26929180
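
    As a hedged illustration of the last point, under constant hazards the AE incidence rate and the competing-event rate combine into a cumulative AE probability in closed form; this is standard competing-risks arithmetic, with function name and toy numbers invented for the example:

        import numpy as np

        def cumulative_ae_probability(n_ae, n_comp, person_time, t):
            """Cumulative AE probability at time t under constant hazards.

            lam_ae : AE hazard (incidence rate of the AE)
            lam_c  : hazard of the competing event (e.g. death, main endpoint)
            P_AE(t) = lam_ae / (lam_ae + lam_c) * (1 - exp(-(lam_ae + lam_c) * t))
            """
            lam_ae = n_ae / person_time
            lam_c = n_comp / person_time
            lam_all = lam_ae + lam_c
            return lam_ae / lam_all * (1.0 - np.exp(-lam_all * t))

        # Toy numbers: 30 AEs and 50 competing events over 400 patient-years.
        print(cumulative_ae_probability(30, 50, 400.0, t=2.0))

    Note how the estimate uses both rates: ignoring the competing hazard (the "incomplete analysis" above) would overstate the cumulative AE probability.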

  6. Event-based Recession Analysis across Scales

    NASA Astrophysics Data System (ADS)

    Chen, B.; Krajewski, W. F.

    2012-12-01

    Hydrograph recessions have long been a window to investigate hydrological processes and their interactions. The authors conducted an exploratory analysis of about 1000 individual hydrograph recessions over a period of around 15 years (1995-2010) from time series of hourly discharge (USGS IDA stream flow data set) at 27 USGS gauges located in the Iowa and Cedar River basins, with drainage areas ranging from 6.7 to around 17000 km2. They calculated recession exponents with the same recession length but different time lags from the hydrograph peak, ranging from ~0 to 96 hours, and then plotted them against time lags to construct the evolution of the recession exponent. The result shows that, as recession continues, the recession exponent first increases quickly, then decreases quickly, and finally stays constant. Occasionally and for different reasons, the decreasing portion is missing due to negligible contribution from soil water storage. The increasing part of the evolution can be related to fast response to rainfall, including overland flow and quick subsurface flow through macropores (or tiles), and the decreasing portion can be connected to the delayed soil water response. Lastly, the constant segment can be attributed to the groundwater storage with the slowest response. The points where the recession exponent reaches its maximum and begins to plateau are the times at which the fast response and the soil water response end, respectively. The authors conducted further theoretical analysis, combining mathematical derivation and literature results, to explain the observed evolution path of the recession exponent. Their results have a direct application in hydrograph separation and important implications for dynamic basin storage-discharge relation analysis and hydrological process understanding across scales.
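
    The core computation lends itself to a short sketch: under the common storage model -dQ/dt = aQ^b, the recession exponent b is the slope of a log-log regression. A minimal Python illustration under that assumption (not the authors' code), using the hourly time step from the abstract:

        import numpy as np

        def recession_exponent(q):
            """Fit b in -dQ/dt = a * Q**b for one recession limb of hourly discharge q."""
            dq = np.diff(q)                 # dQ/dt with dt = 1 hour
            keep = dq < 0                   # recession only: discharge must decline
            x = np.log(q[:-1][keep])
            y = np.log(-dq[keep])
            b, log_a = np.polyfit(x, y, 1)  # slope of the log-log fit is b
            return b

    Sliding a fixed-length fitting window to increasing lags behind the hydrograph peak would reproduce the "evolution of the recession exponent" the authors describe.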

  7. Event shape analysis of deep inelastic scattering events with a large rapidity gap at HERA

    NASA Astrophysics Data System (ADS)

    ZEUS Collaboration; Breitweg, J.; Derrick, M.; Krakauer, D.; Magill, S.; Mikunas, D.; Musgrave, B.; Repond, J.; Stanek, R.; Talaga, R. L.; Yoshida, R.; Zhang, H.; Mattingly, M. C. K.; Anselmo, F.; Antonioli, P.; Bari, G.; Basile, M.; Bellagamba, L.; Boscherini, D.; Bruni, A.; Bruni, G.; Cara Romeo, G.; Castellini, G.; Cifarelli, L.; Cindolo, F.; Contin, A.; Corradi, M.; de Pasquale, S.; Gialas, I.; Giusti, P.; Iacobucci, G.; Laurenti, G.; Levi, G.; Margotti, A.; Massam, T.; Nania, R.; Palmonari, F.; Pesci, A.; Polini, A.; Ricci, F.; Sartorelli, G.; Zamora Garcia, Y.; Zichichi, A.; Amelung, C.; Bornheim, A.; Brock, I.; Coböken, K.; Crittenden, J.; Deffner, R.; Eckert, M.; Grothe, M.; Hartmann, H.; Heinloth, K.; Heinz, L.; Hilger, E.; Jakob, H.-P.; Katz, U. F.; Kerger, R.; Paul, E.; Pfeiffer, M.; Rembser, Ch.; Stamm, J.; Wedemeyer, R.; Wieber, H.; Bailey, D. S.; Campbell-Robson, S.; Cottingham, W. N.; Foster, B.; Hall-Wilton, R.; Hayes, M. E.; Heath, G. P.; Heath, H. F.; McFall, J. D.; Piccioni, D.; Roff, D. G.; Tapper, R. J.; Arneodo, M.; Ayad, R.; Capua, M.; Garfagnini, A.; Iannotti, L.; Schioppa, M.; Susinno, G.; Kim, J. Y.; Lee, J. H.; Lim, I. T.; Pac, M. Y.; Caldwell, A.; Cartiglia, N.; Jing, Z.; Liu, W.; Mellado, B.; Parsons, J. A.; Ritz, S.; Sampson, S.; Sciulli, F.; Straub, P. B.; Zhu, Q.; Borzemski, P.; Chwastowski, J.; Eskreys, A.; Figiel, J.; Klimek, K.; Przybycień, M. B.; Zawiejski, L.; Adamczyk, L.; Bednarek, B.; Bukowy, M.; Jeleń, K.; Kisielewska, D.; Kowalski, T.; Przybycień, M.; Rulikowska-Zarębska, E.; Suszycki, L.; Zając, J.; Duliński, Z.; Kotański, A.; Abbiendi, G.; Bauerdick, L. A. T.; Behrens, U.; Beier, H.; Bienlein, J. K.; Cases, G.; Deppe, O.; Desler, K.; Drews, G.; Fricke, U.; Gilkinson, D. J.; Glasman, C.; Göttlicher, P.; Haas, T.; Hain, W.; Hasell, D.; Johnson, K. F.; Kasemann, M.; Koch, W.; Kötz, U.; Kowalski, H.; Labs, J.; Lindemann, L.; Löhr, B.; Löwe, M.; Mańczak, O.; Milewski, J.; Monteiro, T.; Ng, J. S. T.; Notz, D.; Ohrenberg, K.; Park, I. H.; Pellegrino, A.; Pelucchi, F.; Piotrzkowski, K.; Roco, M.; Rohde, M.; Roldán, J.; Ryan, J. J.; Savin, A. A.; Schneekloth, U.; Selonke, F.; Surrow, B.; Tassi, E.; Voß, T.; Westphal, D.; Wolf, G.; Wollmer, U.; Youngman, C.; Żarnecki, A. F.; Zeuner, W.; Burow, B. D.; Grabosch, H. J.; Meyer, A.; Schlenstedt, S.; Barbagli, G.; Gallo, E.; Pelfer, P.; Maccarrone, G.; Votano, L.; Bamberger, A.; Eisenhardt, S.; Markun, P.; Trefzger, T.; Wölfle, S.; Bromley, J. T.; Brook, N. H.; Bussey, P. J.; Doyle, A. T.; MacDonald, N.; Saxon, D. H.; Sinclair, L. E.; Strickland, E.; Waugh, R.; Bohnet, I.; Gendner, N.; Holm, U.; Meyer-Larsen, A.; Salehi, H.; Wick, K.; Gladilin, L. K.; Horstmann, D.; Kçira, D.; Klanner, R.; Lohrmann, E.; Poelz, G.; Schott, W.; Zetsche, F.; Bacon, T. C.; Butterworth, I.; Cole, J. E.; Howell, G.; Hung, B. H. Y.; Lamberti, L.; Long, K. R.; Miller, D. B.; Pavel, N.; Prinias, A.; Sedgbeer, J. K.; Sideris, D.; Walker, R.; Mallik, U.; Wang, S. M.; Wu, J. T.; Cloth, P.; Filges, D.; Fleck, J. I.; Ishii, T.; Kuze, M.; Suzuki, I.; Tokushuku, K.; Yamada, S.; Yamauchi, K.; Yamazaki, Y.; Hong, S. J.; Lee, S. B.; Nam, S. W.; Park, S. K.; Barreiro, F.; Fernández, J. P.; García, G.; Graciani, R.; Hernández, J. M.; Hervás, L.; Labarga, L.; Martínez, M.; del Peso, J.; Puga, J.; Terrón, J.; de Trocóniz, J. F.; Corriveau, F.; Hanna, D. S.; Hartmann, J.; Hung, L. W.; Murray, W. N.; Ochs, A.; Riveline, M.; Stairs, D. G.; St-Laurent, M.; Ullmann, R.; Tsurugai, T.; Bashkirov, V.; Dolgoshein, B. A.; Stifutkin, A.; Bashindzhagyan, G. L.; Ermolov, P. F.; Golubkov, Yu. A.; Khein, L. A.; Korotkova, N. A.; Korzhavina, I. A.; Kuzmin, V. A.; Lukina, O. Yu.; Proskuryakov, A. S.; Shcheglova, L. M.; Solomin, A. N.; Zotkin, S. A.; Bokel, C.; Botje, M.; Brümmer, N.; Chlebana, F.; Engelen, J.; Koffeman, E.; Kooijman, P.; van Sighem, A.; Tiecke, H.; Tuning, N.; Verkerke, W.; Vossebeld, J.; Vreeswijk, M.; Wiggers, L.; de Wolf, E.; Acosta, D.; Bylsma, B.; Durkin, L. S.; Gilmore, J.; Ginsburg, C. M.; Kim, C. L.; Ling, T. Y.; Nylander, P.; Romanowski, T. A.; Blaikley, H. E.; Cashmore, R. J.; Cooper-Sarkar, A. M.; Devenish, R. C. E.; Edmonds, J. K.; Große-Knetter, J.; Harnew, N.; Nath, C.; Noyes, V. A.; Quadt, A.; Ruske, O.; Tickner, J. R.; Uijterwaal, H.; Walczak, R.; Waters, D. S.; Bertolin, A.; Brugnera, R.; Carlin, R.; dal Corso, F.; Dosselli, U.; Limentani, S.; Morandin, M.; Posocco, M.; Stanco, L.; Stroili, R.; Voci, C.; Bulmahn, J.; Oh, B. Y.; Okrasiński, J. R.; Toothacker, W. S.; Whitmore, J. J.; Iga, Y.; D'Agostini, G.; Marini, G.; Nigro, A.; Raso, M.; Hart, J. C.; McCubbin, N. A.; Shah, T. P.; Epperson, D.; Heusch, C.; Rahn, J. T.; Sadrozinski, H. F.-W.; Seiden, A.; Wichmann, R.; Williams, D. C.; Schwarzer, O.; Walenta, A. H.; Abramowicz, H.; Briskin, G.; Dagan, S.; Kananov, S.; Levy, A.; Abe, T.; Fusayasu, T.; Inuzuka, M.; Nagano, K.; Umemori, K.; Yamashita, T.; Hamatsu, R.; Hirose, T.; Homma, K.; Kitamura, S.; Matsushita, T.; Cirio, R.; Costa, M.; Ferrero, M. I.; Maselli, S.; Monaco, V.; Peroni, C.; Petrucci, M. C.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Dardo, M.; Bailey, D. C.; Fagerstroem, C.-P.; Galea, R.; Hartner, G. F.; Joo, K. K.; Levman, G. M.; Martin, J. F.; Orr, R. S.; Polenz, S.; Sabetfakhri, A.; Simmons, D.; Teuscher, R. J.; Butterworth, J. M.; Catterall, C. D.; Jones, T. W.; Lane, J. B.; Saunders, R. L.; Sutton, M. R.; Wing, M.; Ciborowski, J.; Grzelak, G.; Kasprzak, M.; Muchorowski, K.; Nowak, R. J.; Pawlak, J. M.; Pawlak, R.; Tymieniecka, T.; Wróblewski, A. K.; Zakrzewski, J. A.; Adamus, M.; Coldewey, C.; Eisenberg, Y.; Hochman, D.; Karshon, U.; Badgett, W. F.; Chapin, D.; Cross, R.; Dasu, S.; Foudas, C.; Loveless, R. J.; Mattingly, S.; Reeder, D. D.; Smith, W. H.; Vaiciulis, A.; Wodarczyk, M.; Deshpande, A.; Dhawan, S.; Hughes, V. W.; Bhadra, S.; Frisken, W. R.; Khakzad, M.; Schmidke, W. B.

    1998-03-01

    A global event shape analysis of the multihadronic final states observed in neutral current deep inelastic scattering events with a large rapidity gap with respect to the proton direction is presented. The analysis is performed in the range 5 ≤ Q² ≤ 185 GeV² and 160 ≤ W ≤ 250 GeV, where Q² is the virtuality of the photon and W is the virtual-photon proton centre of mass energy. Particular emphasis is placed on the dependence of the shape variables, measured in the γ*-pomeron rest frame, on the mass of the hadronic final state, M_X. With increasing M_X the multihadronic final state becomes more collimated and planar. The experimental results are compared with several models which attempt to describe diffractive events. The broadening effects exhibited by the data require in these models a significant gluon component of the pomeron.

  8. Glaciological parameters of disruptive event analysis

    SciTech Connect

    Bull, C.

    1980-04-01

    The possibility of complete glaciation of the earth is small and probably need not be considered in the consequence analysis by the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program. However, within a few thousand years an ice sheet may well cover proposed waste disposal sites in Michigan. Those in the Gulf Coast region and New Mexico are unlikely to be ice covered. The probability of ice cover at Hanford in the next million years is finite, perhaps about 0.5. Sea level will fluctuate as a result of climatic changes. As ice sheets grow, sea level will fall. Melting of ice sheets will be accompanied by a rise in sea level. Within the present interglacial period there is a definite chance that the West Antarctic ice sheet will melt. Ice sheets are agents of erosion, and some estimates of the amount of material they erode have been made. As an average over the area glaciated by late Quaternary ice sheets, only a few tens of meters of erosion is indicated. There were perhaps 3 meters of erosion per glaciation cycle. Under glacial conditions the surface boundary conditions for ground water recharge will be appreciably changed. In future glaciations melt-water rivers generally will follow pre-existing river courses. Some salt dome sites in the Gulf Coast region could be susceptible to changes in the course of the Mississippi River. The New Mexico site, which is on a high plateau, seems to be immune from this type of problem. The Hanford Site is only a few miles from the Columbia River, and in the future, lateral erosion by the Columbia River could cause changes in its course. A prudent assumption in the AEGIS study is that the present interglacial will continue for only a limited period and that subsequently an ice sheet will form over North America. Other factors being equal, it seems unwise to site a nuclear waste repository (even at great depth) in an area likely to be glaciated.

  9. A reference manual for the Event Progression Analysis Code (EVNTRE)

    SciTech Connect

    Griesmeyer, J.M.; Smith, L.N.

    1989-09-01

    This document is a reference guide for the Event Progression Analysis (EVNTRE) code developed at Sandia National Laboratories. EVNTRE is designed to process the large accident progression event trees and associated files used in probabilistic risk analyses for nuclear power plants. However, the general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. The EVNTRE code efficiently processes large, complex event trees. It has the capability to assign probabilities to event tree branch points in several different ways, to classify pathways or outcomes into user-specified groupings, and to sample input distributions of probabilities and parameters.

  10. Human performance analysis of industrial radiography radiation exposure events

    SciTech Connect

    Reece, W.J.; Hill, S.G.

    1995-12-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  11. Nonlinear Analysis for Event Forewarning (NLAfEW)

    Energy Science and Technology Software Center (ESTSC)

    2013-05-23

    The NLAfEW computer code analyzes noisy experimental data to forewarn of adverse events. The functionality of the analysis is as follows: it removes artifacts from the data, converts the continuous data values to discrete values, constructs time-delay embedding vectors, compares the unique nodes and links in one graph to those in another, and determines event forewarning on the basis of several successive occurrences of one (or more) of the dissimilarity measures above a threshold.
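
    A minimal Python sketch of the embedding-and-graph step in this kind of analysis; the parameters (`n_symbols`, `dim`, `delay`) and the dissimilarity count are illustrative assumptions, not NLAfEW's actual defaults:

        import numpy as np

        def embed_graph(x, n_symbols=10, dim=3, delay=2):
            """Symbolize x, build time-delay vectors, and collect graph nodes/links."""
            # Discretize continuous values into (roughly) equiprobable symbols.
            edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
            s = np.digitize(x, edges)
            # Time-delay embedding vectors become graph nodes; transitions are links.
            n = len(s) - (dim - 1) * delay
            vecs = [tuple(s[i:i + (dim - 1) * delay + 1:delay]) for i in range(n)]
            nodes = set(vecs)
            links = set(zip(vecs[:-1], vecs[1:]))
            return nodes, links

        def dissimilarity(g1, g2):
            """Count nodes/links unique to one graph, as a forewarning measure."""
            return len(g1[0] ^ g2[0]) + len(g1[1] ^ g2[1])

    Forewarning would then follow the rule quoted above: flag an event when the dissimilarity between a baseline graph and successive test-window graphs exceeds a threshold several times in a row.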

  12. Life Events and Psychosis: A Review and Meta-analysis

    PubMed Central

    Beards, Stephanie; Fisher, Helen L.; Morgan, Craig

    2013-01-01

    Introduction: Recent models of psychosis implicate stressful events in its etiology. However, while evidence has accumulated for childhood trauma, the role of adult life events has received less attention. Therefore, a review of the existing literature on the relationship between life events and onset of psychotic disorder/experiences is timely. Methods: A search was conducted using PsychInfo, Medline, Embase, and Web of Science to identify studies of life events and the onset of psychosis or psychotic experiences within the general population. Given previous methodological concerns, this review included a novel quality assessment tool and focused on findings from the most robust studies. A meta-analysis was performed on a subgroup of 13 studies. Results: Sixteen studies published between 1968 and 2012 were included. Of these, 14 reported positive associations between exposure to adult life events and subsequent onset of psychotic disorder/experiences. The meta-analysis yielded an overall weighted OR of 3.19 (95% CI 2.15–4.75). However, many studies were limited by small sample sizes and the use of checklist measures of life events, with no consideration of contextual influences on the meaning and interpretation of events. Conclusions: Few studies have assessed the role of adult life events in the onset of psychosis. There was some evidence that reported exposure to adult life events was associated with increased risk of psychotic disorder and subclinical psychotic experiences. However, the methodological quality of the majority of studies was low, which urges caution in interpreting the results and points toward a need for more methodologically robust studies. PMID:23671196

  13. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    NASA Astrophysics Data System (ADS)

    Bouilloud, Ludovic; Delrieu, Guy; Boudevillain, Brice; Kirstetter, Pierre-Emmanuel

    2010-11-01

    A method to estimate rainfall from radar data for post-event analysis of flash-flood events has been developed within the EC-funded HYDRATE project. It follows a pragmatic approach including careful analysis of the observation conditions for the radar system(s) available for the considered case. Clutter and beam blockage are characterised by dry-weather observations and simulations based on a digital terrain model of the region of interest. The vertical profile of reflectivity (VPR) is either inferred from radar data if volume scanning data are available or simply defined using basic meteorological parameters (idealised VPR). Such information is then used to produce correction factor maps for each elevation angle to correct for range-dependent errors. In a second step, an effective Z-R relationship is optimised to remove the bias over the hit region. Due to limited data availability, the optimisation is carried out with reference to raingauge rain amounts measured at the event time scale. Sensitivity tests performed with two well-documented rain events show that a number of Z = aR^b relationships, organised along hyperbolic curves in the (a, b) parameter space, lead to optimum assessment results in terms of the Nash coefficient between the radar and raingauge estimates. A refined analysis of these equifinality patterns shows that the "total additive conditional bias" can be used to discriminate between the Nash-coefficient-equifinal solutions. We observe that the optimisation results are sensitive to the VPR description and also that the Z-R optimisation procedure can largely compensate for range-dependent errors, although this shifts the optimal coefficients in the parameter space. The time-scale dependency of the equifinality patterns is significant; however, near-optimal Z-R relationships can be obtained at all time scales from the event time step optimisation.
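
    A hedged Python sketch of the optimization step: grid-search the (a, b) parameters of Z = aR^b to maximize the Nash coefficient between radar-derived and raingauge event amounts. The conversion, scoring, and 5-minute scan interval are simplified assumptions, not the project code:

        import numpy as np

        def nash(sim, obs):
            """Nash-Sutcliffe efficiency between simulated and observed amounts."""
            sim, obs = np.asarray(sim, float), np.asarray(obs, float)
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def optimize_zr(z_per_gauge, gauge_mm, a_grid, b_grid, dt_hours=1.0 / 12.0):
            """Grid-search (a, b) in Z = a * R**b against gauge event amounts.

            z_per_gauge : list of arrays of linear reflectivity over each gauge
            gauge_mm    : event rain amounts at the gauges (mm)
            dt_hours    : radar scan interval (5 min assumed here)
            """
            best_score, best_ab = -np.inf, None
            for a in a_grid:
                for b in b_grid:
                    # Invert Z = a R^b to rain rate (mm/h), integrate to amounts (mm).
                    amounts = [np.sum((z / a) ** (1.0 / b)) * dt_hours
                               for z in z_per_gauge]
                    score = nash(amounts, gauge_mm)
                    if score > best_score:
                        best_score, best_ab = score, (a, b)
            return best_ab, best_score

    The equifinality described above would show up here as near-identical Nash scores along hyperbolic bands of (a, b) pairs, which is why the authors need a second criterion to discriminate among them.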

  14. Case-cohort analysis of clusters of recurrent events.

    PubMed

    Chen, Feng; Chen, Kani

    2014-01-01

    The case-cohort sampling, first proposed in Prentice (Biometrika 73:1-11, 1986), is one of the most effective cohort designs for analysis of event occurrence, with the regression model being the typical Cox proportional hazards model. This paper extends consideration to the case-cohort design for recurrent events with a certain specific clustering feature, which is captured by a properly modified Cox-type self-exciting intensity model. We discuss the advantage of using this model and validate the pseudo-likelihood method. Simulation studies are presented in support of the theory. Application is illustrated with an analysis of bladder cancer data. PMID:23832308

  15. External events analysis for the Savannah River Site K reactor

    SciTech Connect

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10⁻⁴ per year, of which seismic events are the major contributor (1.2 × 10⁻⁴ per year). Fire-initiated events contribute 1.4 × 10⁻⁷ per year, tornados 5.8 × 10⁻⁷ per year, dam failures 1.5 × 10⁻⁶ per year and the crane failure scenario less than 10⁻⁴ per year to the core melt frequency. 8 refs., 3 figs., 5 tabs.

  16. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    SciTech Connect

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. (c) 1999 The American Astronomical Society.

  17. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
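
    A minimal Python sketch of the core coincidence count, assuming two arrays of event times; the tolerance `delta_t` and lag `tau` are the method's two parameters, while the implementation details are assumptions rather than the authors' reference code:

        import numpy as np

        def precursor_coincidence_rate(a_times, b_times, delta_t, tau=0.0):
            """Fraction of events in A preceded by >= 1 event of B within
            the window [t_A - tau - delta_t, t_A - tau]."""
            a = np.asarray(a_times, dtype=float)
            b = np.sort(np.asarray(b_times, dtype=float))
            hits = 0
            for t in a:
                lo = np.searchsorted(b, t - tau - delta_t, side="left")
                hi = np.searchsorted(b, t - tau, side="right")
                hits += hi > lo
            return hits / len(a)

    Significance can then be assessed against the null hypotheses mentioned above, e.g. by redrawing the B series as a homogeneous Poisson process with the same rate and recomputing the rate many times.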

  18. Heterogeneity and event dependence in the analysis of sickness absence

    PubMed Central

    2013-01-01

    Background: Sickness absence (SA) is an important social, economic and public health issue. Identifying and understanding the determinants, whether biological, regulatory or health services-related, of variability in SA duration is essential for better management of SA. The conditional frailty model (CFM) is useful when repeated SA events occur within the same individual, as it allows simultaneous analysis of event dependence and heterogeneity due to unknown, unmeasured, or unmeasurable factors. However, its use may encounter computational limitations when applied to very large data sets, as may frequently occur in the analysis of SA duration. Methods: To overcome the computational issue, we propose a Poisson-based conditional frailty model (CFPM) for repeated SA events that accounts for both event dependence and heterogeneity. To demonstrate the usefulness of the model proposed in the SA duration context, we used data from all non-work-related SA episodes that occurred in Catalonia (Spain) in 2007, initiated by either a diagnosis of neoplasm or mental and behavioral disorders. Results: As expected, the CFPM results were very similar to those of the CFM for both diagnosis groups. The CPU time for the CFPM was substantially shorter than for the CFM. Conclusions: The CFPM is a suitable alternative to the CFM in survival analysis with recurrent events, especially with large databases. PMID:24040880

  19. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    NASA Astrophysics Data System (ADS)

    Delrieu, G.; Bouilloud, L.; Boudevillain, B.; Kirstetter, P.-E.; Borga, M.

    2009-09-01

    This communication presents a methodology for radar rainfall estimation in the context of post-event analysis of flash-flood events, developed within the HYDRATE project. For such extreme events, some raingauge observations (operational, amateur) are available at the event time scale, while few raingauge time series are generally available at the hydrologic time steps. Radar data is therefore the only way to access the rainfall space-time organization, but the quality of the radar data may be highly variable as a function of (1) the relative locations of the event and the radar(s) and (2) the radar operating protocol(s) and maintenance. A positive point: heavy rainfall is associated with convection, implying better visibility and lesser bright band contamination compared with more common situations. In parallel with the development of a regionalized and adaptive radar data processing system (TRADHy; Delrieu et al. 2009), a pragmatic approach is proposed here to make best use of the available radar and raingauge data for a given flash-flood event by: (1) identifying and removing residual ground clutter, (2) applying the "hydrologic visibility" concept (Pellarin et al. 2002) to correct for range-dependent errors (screening and VPR effects) for non-attenuating wavelengths, and (3) estimating an effective Z-R relationship through a radar-raingauge optimization approach to remove the mean field bias (Dinku et al. 2002). A sensitivity study, based on the high-quality volume radar datasets collected during two intense rainfall events of the Bollène 2002 experiment (Delrieu et al. 2009), is first proposed. Then the method is implemented for two other historical events that occurred in France (Avène 1997 and Aude 1999) with datasets of lesser quality. References: Delrieu, G., B. Boudevillain, J. Nicol, B. Chapon, P.-E. Kirstetter, H. Andrieu, and D. Faure, 2009: Bollène 2002 experiment: radar rainfall estimation in the Cévennes-Vivarais region, France. Journal of Applied

  20. Severe accident analysis using dynamic accident progression event trees

    NASA Astrophysics Data System (ADS)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce, and potentially phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APET developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. Also, the work is focused on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a

  1. Offense Specialization of Arrestees: An Event History Analysis

    ERIC Educational Resources Information Center

    Lo, Celia C.; Kim, Young S.; Cheng, Tyrone C.

    2008-01-01

    The data set employed in the present study came from interviews with arrestees conducted between 1999 and 2001 as well as from their official arrest records obtained from jail administrators. A total of 238 arrestees ages 18 to 25 constituted the final sample. Event history analysis examined each arrestee's movement from periods of no arrests to…

  2. Using the DOE Knowledge Base for Special Event Analysis

    SciTech Connect

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanos), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  3. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    SciTech Connect

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-11-22

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility.

  4. Root Cause Analysis: Learning from Adverse Safety Events.

    PubMed

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. PMID:26466177

  5. Analysis of recurrent event data with incomplete observation gaps.

    PubMed

    Kim, Yang-Jin; Jhun, Myoungshic

    2008-03-30

    In the analysis of recurrent event data, recurrent events are not completely experienced when the terminating event occurs before the end of a study. To make valid inferences about recurrent events, several methods have been suggested for accommodating the terminating event (Statist. Med. 1997; 16:911-924; Biometrics 2000; 56:554-562). In this paper, our interest is in a particular situation where intermittent dropouts result in observation gaps during which no recurrent events are observed. In this situation, risk status varies over time and the usual definition of the risk variable is not applicable. In particular, we consider the case when information on the observation gap is incomplete, that is, the starting time of an intermittent dropout is known but the terminating time is not available. This incomplete information is modeled in terms of an interval-censored mechanism. Our proposed method is applied to the study of the effect of the Young Traffic Offenders Program on conviction rates, wherein a certain proportion of subjects experienced suspensions with intermittent dropouts during the study. PMID:17611955

  6. Analysis of Suprathermal Events Observed by STEREO/PLASTIC

    NASA Astrophysics Data System (ADS)

    Barry, J. A.; Galvin, A. B.; Farrugia, C. J.; Popecki, M.; Klecker, B.; Ellis, L.; Lee, M. A.; Kistler, L. M.; Luhmann, J. G.; Russell, C. T.; Simunac, K.; Kucharek, H.; Blush, L.; Bochsler, P.; Möbius, E.; Thompson, B. J.; Wimmer-Schweingruber, R.; Wurz, P.

    2008-12-01

    Since the late 1960s, suprathermal and energetic ion events with energies ranging from just above solar wind energies up to 2 MeV, and lasting for several minutes to hours, have been detected upstream of the Earth. Possible sources of these ions include magnetospheric ions, solar wind ions accelerated between the Earth's bow shock and hydromagnetic waves to energies just above the solar wind energies, and remnant ions from heliospheric processes (such as Solar Energetic Particle (SEP) events or Corotating Interaction Regions (CIRs)). The unique orbits of the two STEREO spacecraft, STEREO-A (STA) drifting ahead of the Earth in its orbit and STEREO-B (STB) lagging behind, allow for analysis of upstream events in these previously unexamined regions. Using both the PLASTIC and IMPACT instruments on board STA/B we can examine protons in the energy range from solar wind energies up to 80 keV, determine their spatial distribution, and determine whether the spacecraft is magnetically connected to the Earth's bow shock. Suprathermal events observed by STEREO/PLASTIC during solar minimum conditions are examined for possible upstream events using anisotropy measurements, velocity dispersion, magnetic connection to the bow shock, and frequency of events as a function of time and distance.

  7. Time-quefrency analysis of overlapping similar microseismic events

    NASA Astrophysics Data System (ADS)

    Nagano, Koji

    2016-05-01

    In this paper, I describe a new technique to determine the interval between P-waves in similar, overlapping microseismic events. The similar microseismic events that occur with overlapping waveforms are called `proximate microseismic doublets' herein. Proximate microseismic doublets had been discarded in previous studies because we had not noticed their usefulness. Analysis of similar events can show relative locations of sources between them. Analysis of proximate microseismic doublets can provide more precise relative source locations because variation in the velocity structure has little influence on their relative travel times. It is necessary to measure the interval between the P-waves in the proximate microseismic doublets to determine their relative source locations. A `proximate microseismic doublet' is a pair of microseismic events in which the second event arrives before the attenuation of the first event. Cepstrum analysis can provide the interval even though the second event overlaps the first event. However, a cepstrum of a proximate microseismic doublet generally has two peaks, one representing the interval between the arrivals of the two P-waves, and the other representing the interval between the arrivals of the two S-waves. It is therefore difficult to determine the peak that represents the P-wave interval from the cepstrum alone. I used window functions in cepstrum analysis to isolate the first and second P-waves and to suppress the second S-wave. I change the length of the window function and calculate the cepstrum for each window length. The result is represented in a three-dimensional contour plot of length-quefrency-cepstrum data. The contour plot allows me to identify the cepstrum peak that represents the P-wave interval. The precise quefrency can be determined from a two-dimensional quefrency-cepstrum graph, provided that the length of the window is appropriately chosen. I have used both synthetic and field data to demonstrate that this
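
    A minimal Python sketch of the cepstrum computation at the heart of this technique, assuming a windowed seismogram in a NumPy array; the window choice and peak picking are simplified relative to the paper's length-quefrency scan:

        import numpy as np

        def cepstrum(x, fs):
            """Real cepstrum of x; an echo delayed by T adds a peak near quefrency T."""
            spec = np.fft.rfft(x * np.hanning(len(x)))
            ceps = np.fft.irfft(np.log(np.abs(spec) + 1e-12))
            quefrency = np.arange(len(ceps)) / fs   # in seconds
            return quefrency, ceps

        def pwave_interval(x, fs, qmin, qmax):
            """Pick the cepstral peak in a quefrency band bracketing the P-P delay."""
            q, c = cepstrum(x, fs)
            band = (q >= qmin) & (q <= qmax)
            return q[band][np.argmax(c[band])]

    Repeating the computation while the window length grows, as the paper does, is what separates the P-P peak from the S-S peak: the P-P peak stabilizes once the window isolates both P arrivals while suppressing the second S arrival.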

  8. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009

  9. Using discriminant analysis as a nucleation event classification method

    NASA Astrophysics Data System (ADS)

    Mikkonen, S.; Lehtinen, K. E. J.; Hamed, A.; Joutsensaari, J.; Facchini, M. C.; Laaksonen, A.

    2006-09-01

    More than three years of measurements of aerosol size distributions and various gas and meteorological parameters made in the Po Valley, Italy, were analysed for this study to examine which of the meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, as part of a non-parametric density estimation method. The best classification result in our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation, whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events, but because of the great number of missing observations, we had to exclude them from the final analysis.

  10. Using discriminant analysis as a nucleation event classification method

    NASA Astrophysics Data System (ADS)

    Mikkonen, S.; Lehtinen, K. E. J.; Hamed, A.; Joutsensaari, J.; Facchini, M. C.; Laaksonen, A.

    2006-12-01

    More than three years of measurements of aerosol size distributions and various gas and meteorological parameters made in the Po Valley, Italy, were analysed for this study to examine which of the meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, as part of a non-parametric density estimation method. The best classification result in our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation, whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events, but because of the great number of missing observations, we had to exclude them from the final analysis.
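
    A hedged Python sketch of this kind of kernel discriminant analysis (the same method underlies both records above): class-conditional densities are estimated with an Epanechnikov product kernel, and a day is assigned to the class with the higher prior-weighted density. The feature layout and bandwidths are illustrative assumptions, not the authors' configuration:

        import numpy as np

        def epanechnikov_density(x, sample, h):
            """Product-kernel KDE with K(u) = 0.75 * (1 - u**2) for |u| <= 1.

            x : (d,) query point; sample : (n, d) training data; h : (d,) bandwidths.
            """
            u = (sample - x) / h
            k = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)
            return np.mean(np.prod(k, axis=1)) / np.prod(h)

        def classify(x, event_days, nonevent_days, h):
            """Assign x to the class with the larger prior-weighted density."""
            n_e, n_n = len(event_days), len(nonevent_days)
            p_e = n_e / (n_e + n_n) * epanechnikov_density(x, event_days, h)
            p_n = n_n / (n_e + n_n) * epanechnikov_density(x, nonevent_days, h)
            return "event" if p_e > p_n else "nonevent"

    Per the abstract, a feature vector for one day might be x = np.array([rh, o3, rad, rad**2, rad**3]), reflecting the third-degree polynomial of radiation.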

  11. The new event analysis of the Fermi large area telescope

    NASA Astrophysics Data System (ADS)

    Sgrò, Carmelo

    2014-07-01

    Since its launch on June 11, 2008 the Fermi Large Area Telescope (LAT) has been exploring the gamma-ray sky at energies from 20 MeV to over 300 GeV. Five years of nearly flawless operation have allowed steady improvement in knowledge of the detector and, as a consequence, continuous updates of the event selection and the corresponding instrument response parametrization. The final product of this effort is a radical revision of the entire event-level analysis, from the event reconstruction algorithms in each subsystem to the background rejection strategy. The potential improvements include a larger acceptance coupled with a significant reduction in background contamination, better angular and energy resolution, and an extension of the energy reach below 100 MeV and into the TeV range. In this paper I describe the new reconstruction and the event-level analysis, show the expected instrument performance and discuss future prospects for astro-particle physics with the LAT.

  12. An analysis of selected atmospheric icing events on test cables

    SciTech Connect

    Druez, J.; McComber, P.; Laflamme, J.

    1996-12-01

    In cold countries, the design of transmission lines and communication networks requires knowledge of ice loads on conductors. Atmospheric icing is a stochastic phenomenon, and therefore probabilistic design is used more and more for structure icing analysis. For strength and reliability assessments, a database on atmospheric icing is needed to characterize the distributions of ice load and the corresponding meteorological parameters. A test site where icing is frequent is used to obtain field data on atmospheric icing. This test site is located on Mt. Valin, near Chicoutimi, Quebec, Canada. The experimental installation is mainly composed of various instrumented but non-energized test cables, meteorological instruments, a data acquisition system, and a video recorder. Several types of icing events can produce large ice accretions dangerous for land-based structures. They are rime due to in-cloud icing, glaze caused by freezing rain, wet snow, and mixtures of these types of ice. These icing events have very different characteristics and must be distinguished, before statistical analysis, in a database on atmospheric icing. This is done by comparison of data from a precipitation gauge, an icing rate meter and a temperature sensor. An analysis of selected icing periods recorded on the cables of two perpendicular test lines during the 1992-1993 winter season is presented. Only significant icing events have been considered. A comparative analysis of the ice load on the four test cables is drawn from the data, and typical accretion and shedding parameters are calculated separately for icing events related to in-cloud icing and precipitation icing.

  13. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  14. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  15. Topological Analysis of Emerging Bipole Clusters Producing Violent Solar Events

    NASA Astrophysics Data System (ADS)

    Mandrini, C. H.; Schmieder, B.; Démoulin, P.; Guo, Y.; Cristiani, G. D.

    2014-06-01

    During the rising phase of Solar Cycle 24 tremendous activity occurred on the Sun with rapid and compact emergence of magnetic flux leading to bursts of flares (C to M and even X-class). We investigate the violent events occurring in the cluster of two active regions (ARs), NOAA numbers 11121 and 11123, observed in November 2010 with instruments onboard the Solar Dynamics Observatory and from Earth. Within one day the total magnetic flux increased by 70% with the emergence of new groups of bipoles in AR 11123. From all the events on 11 November, we study, in particular, the ones starting at around 07:16 UT in GOES soft X-ray data and the brightenings preceding them. A magnetic-field topological analysis indicates the presence of null points, associated separatrices, and quasi-separatrix layers (QSLs) where magnetic reconnection is prone to occur. The presence of null points is confirmed by a linear and a non-linear force-free magnetic-field model. Their locations and general characteristics are similar in both modelling approaches, which supports their robustness. However, in order to explain the full extension of the analysed event brightenings, which are not restricted to the photospheric traces of the null separatrices, we compute the locations of QSLs. Based on this more complete topological analysis, we propose a scenario to explain the origin of a low-energy event preceding a filament eruption, which is accompanied by a two-ribbon flare, and a consecutive confined flare in AR 11123. The results of our topology computation can also explain the locations of flare ribbons in two other events, one preceding and one following the ones at 07:16 UT. Finally, this study provides further examples where flare-ribbon locations can be explained when compared to QSLs and only partially when using separatrices.

  16. Analysis of large Danube flood events at Vienna since 1700

    NASA Astrophysics Data System (ADS)

    Kiss, Andrea; Blöschl, Günter; Hohensinner, Severin; Perdigao, Rui

    2014-05-01

    Whereas Danube water level measurements are available in Vienna from 1820 onwards, documentary evidence plays a significant role in the long-term understanding of Danube hydrological processes. Based on contemporary documentary evidence and early instrumental measurements, in the present paper we aim to provide an overview and a hydrological analysis of major Danube flood events, and of the changes that occurred in flood behaviour in Vienna in the last 300 years. Historical flood events are discussed and analysed according to type, seasonality, frequency and magnitude. For historical flood events we apply a five-level index classification that considers height, magnitude, length and impacts. The rich data coverage in Vienna, both in terms of documentary evidence and early instrumental measurements, allows us to create a relatively long overlap between documentary evidence and instrumental measurements. This makes it possible to evaluate and, to some extent, improve the index reconstruction. In detecting causes of changes in the flood regime, we provide an overview of the atmospheric background through some characteristic examples of selected great flood events (e.g. 1787). Moreover, we also ask in what way early (pre-instrumental period) human impacts such as river regulation and urban development changed flood behaviour in the town, and how much they might have affected the flood classification.

  17. Mixed-effects Poisson regression analysis of adverse event reports

    PubMed Central

    Gibbons, Robert D.; Segawa, Eisuke; Karabatsos, George; Amatya, Anup K.; Bhaumik, Dulal K.; Brown, C. Hendricks; Kapur, Kush; Marcus, Sue M.; Hur, Kwan; Mann, J. John

    2008-01-01

    A new statistical methodology is developed for the analysis of spontaneous adverse event (AE) reports from post-marketing drug surveillance data. The method involves both empirical Bayes (EB) and fully Bayes estimation of rate multipliers for each drug within a class of drugs, for a particular AE, based on a mixed-effects Poisson regression model. Both parametric and semiparametric models for the random-effect distribution are examined. The method is applied to data from Food and Drug Administration (FDA)’s Adverse Event Reporting System (AERS) on the relationship between antidepressants and suicide. We obtain point estimates and 95 per cent confidence (posterior) intervals for the rate multiplier for each drug (e.g. antidepressants), which can be used to determine whether a particular drug has an increased risk of association with a particular AE (e.g. suicide). Confidence (posterior) intervals that do not include 1.0 provide evidence for either significant protective or harmful associations of the drug and the adverse effect. We also examine EB, parametric Bayes, and semiparametric Bayes estimators of the rate multipliers and associated confidence (posterior) intervals. Results of our analysis of the FDA AERS data revealed that newer antidepressants are associated with lower rates of suicide adverse event reports compared with older antidepressants. We recommend improvements to the existing AERS system, which are likely to improve its public health value as an early warning system. PMID:18404622
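
    A hedged Python sketch of the empirical-Bayes flavor of such rate-multiplier estimation, using a simple gamma-Poisson model in place of the paper's full mixed-effects Poisson regression; drug counts and exposures are invented:

        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize

        # x[j] = AE reports for drug j; e[j] = expected reports given its exposure.
        x = np.array([12, 30, 4, 55])
        e = np.array([10.0, 20.0, 9.0, 40.0])

        def neg_marginal_loglik(params):
            """Marginal likelihood of counts with lambda_j ~ Gamma(alpha, beta)."""
            alpha, beta = np.exp(params)        # keep both parameters positive
            # x_j | lambda_j ~ Poisson(lambda_j * e_j) marginalizes to neg. binomial.
            p = beta / (beta + e)
            return -np.sum(stats.nbinom.logpmf(x, alpha, p))

        res = minimize(neg_marginal_loglik, x0=np.log([1.0, 1.0]))
        alpha, beta = np.exp(res.x)
        # Posterior-mean rate multipliers, shrunk toward the class-wide mean.
        multipliers = (alpha + x) / (beta + e)

    Posterior intervals follow from the Gamma(alpha + x_j, beta + e_j) posterior; intervals excluding 1.0 would flag protective or harmful associations, mirroring the criterion described above.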

  18. Empirical Green's function analysis of recent moderate events in California

    USGS Publications Warehouse

    Hough, S.E.

    2001-01-01

    I use seismic data from portable digital stations and the broadband Terrascope network in southern California to investigate radiated earthquake source spectra and discuss the results in light of previous studies on both static stress drop and apparent stress. Applying the empirical Green's function (EGF) method to two sets of M 4-6.1 events, I obtain deconvolved source-spectra estimates and corner frequencies. The results are consistent with an ω⁻² source model and constant Brune stress drop. However, consideration of the raw spectral shapes of the largest events provides evidence for a high-frequency decay more shallow than ω⁻². The intermediate (∼f⁻¹) slope cannot be explained plausibly with attenuation or site effects and is qualitatively consistent with a model incorporating directivity effects and a fractional stress-drop rupture process, as suggested by Haddon (1996). However, the results obtained in this study are not consistent with the model of Haddon (1996) in that the intermediate slope is not revealed with EGF analysis. This could reflect either bandwidth limitations inherent in EGF analysis or perhaps a rupture process that is not self-similar. I show that a model with an intermediate spectral decay can also reconcile the apparent discrepancy between the scaling of static stress drop and that of apparent stress drop for moderate-to-large events.

  19. A Dendrochronological Analysis of Mississippi River Flood Events

    NASA Astrophysics Data System (ADS)

    Therrell, M. D.; Bialecki, M. B.; Peters, C.

    2012-12-01

    We used a novel tree-ring record of anatomically anomalous "flood rings" preserved in Oak (Quercus sp.) trees growing downstream of the Mississippi and Ohio River confluence to identify spring (MAM) flood events on the lower Mississippi River from C.E. 1694-2009. Our chronology includes virtually all of the observed high-magnitude spring floods of the 20th century as well as similar flood events in prior centuries occurring on the Mississippi River adjacent to the Birds Point-New Madrid Floodway. A response index analysis indicates that over half of the floods identified caused anatomical injury to well over 50% of the sampled trees, and many of the greatest flood events are recorded by more than 80% of the trees at the site, including 100% of the trees in the great flood of 1927. Twenty-five of the 40 floods identified as flood rings in the tree-ring record occur during the instrumental observation period at New Madrid, Missouri (1879-2009), and comparison of the response index with average daily river stage height values indicates that the flood ring record can explain significant portions of the variance in both stage height (30%) and number of days in flood (40%) during spring flood events. The flood ring record also suggests that high-magnitude spring flooding is episodic and linked to basin-scale pluvial events driven by decadal-scale variability of the Pacific/North American pattern (PNA). This relationship suggests that the tree-ring record of flooding may also be used as a proxy record of atmospheric variability related to the PNA and related large-scale forcing.

  20. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    Dana L. Kelly; Dale M. Rasmuson

    2008-09-01

    This paper describes the approach taken by the U.S. Nuclear Regulatory Commission to the treatment of common-cause failure in probabilistic risk assessment of operational events. The approach is based upon the Basic Parameter Model for common-cause failure, and examples are illustrated using the alpha-factor parameterization, the approach adopted by the NRC in its Standardized Plant Analysis Risk (SPAR) models. The cases of a failed component (with and without shared common-cause failure potential) and of a component unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g., failure to start and failure to run) is a new feature of this paper. These methods are being applied by the NRC in assessing the risk significance of operational events for the Significance Determination Process (SDP) and the Accident Sequence Precursor (ASP) program.
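
    A sketch of the alpha-factor parameterization mentioned above, using what I believe is the common non-staggered-testing form of the Basic Parameter Model; the alpha values and total failure probability are hypothetical.

    ```python
    # Alpha-factor form of the Basic Parameter Model (non-staggered testing):
    # Q_k = k / C(m-1, k-1) * (alpha_k / alpha_t) * Qt, alpha_t = sum(k * alpha_k).
    from math import comb

    def ccf_probabilities(alphas, Qt):
        """Q_k for a common-cause group of size m = len(alphas)."""
        m = len(alphas)
        alpha_t = sum(k * a for k, a in enumerate(alphas, start=1))
        return {k: k / comb(m - 1, k - 1) * alphas[k - 1] / alpha_t * Qt
                for k in range(1, m + 1)}

    # Hypothetical three-component group: alpha_1..alpha_3 and total Qt
    for k, q in ccf_probabilities([0.95, 0.03, 0.02], Qt=1e-3).items():
        print(f"Q_{k} = {q:.3e}")  # basic event failing exactly k components
    ```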

  1. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of optical flow orientation descriptor and a classification method. The details of the histogram of optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or of the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
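
    A condensed sketch of the descriptor-plus-one-class-classifier pipeline, assuming OpenCV and scikit-learn are available; the parameter choices and synthetic frames are illustrative, not those of the paper.

    ```python
    # Histogram of optical-flow orientations per frame pair, then kernel PCA
    # and a one-class SVM trained on "normal" frames.
    import numpy as np
    import cv2
    from sklearn.decomposition import KernelPCA
    from sklearn.svm import OneClassSVM

    def hofo(prev_gray, gray, bins=8):
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
        return hist / (hist.sum() + 1e-9)   # magnitude-weighted, normalized

    def train(frames):                      # frames: grayscale normal-period images
        X = np.array([hofo(a, b) for a, b in zip(frames, frames[1:])])
        kpca = KernelPCA(n_components=4, kernel="rbf").fit(X)
        ocsvm = OneClassSVM(nu=0.05, gamma="scale").fit(kpca.transform(X))
        return kpca, ocsvm                  # abnormal frame: predict(...) == -1

    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 255, (120, 160), dtype="uint8") for _ in range(12)]
    kpca, ocsvm = train(frames)
    ```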

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
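
    A toy discrete event simulation in the same spirit (a request queue served by a fixed server pool) could be written as below; the paper's demand distributions and resource constraints are considerably richer, and all rates here are invented.

    ```python
    # Heapq-based discrete-event simulation of service requests.
    import heapq, random

    def simulate(n_servers=4, arrival_rate=3.0, service_rate=1.0, horizon=10_000.0):
        random.seed(1)
        events = [(random.expovariate(arrival_rate), "arrival")]
        busy, fifo, waits = 0, [], []
        while events:
            t, kind = heapq.heappop(events)
            if t > horizon:
                break
            if kind == "arrival":
                heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
                if busy < n_servers:            # a server is free: start immediately
                    busy += 1
                    waits.append(0.0)
                    heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
                else:                           # all servers busy: queue the request
                    fifo.append(t)
            else:                               # a departure frees a server
                if fifo:
                    waits.append(t - fifo.pop(0))
                    heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
                else:
                    busy -= 1
        return sum(waits) / len(waits)

    print(f"mean request wait: {simulate():.3f} time units")
    ```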

  3. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    SciTech Connect

    Lambert, H; Johnson, G

    2001-07-20

    The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in Table 4-2 of the HEAF SAR (Ref. 1). Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree; bold downward branches indicate paths leading to the accident. The basic causes include conditions, failures of administrative controls (procedural or human error events), or failures of engineered controls (hardware, software or equipment failure) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events required to cause an accident. Event trees can address statistical dependency of events, such as a sequence of human error events performed by the same operator; in this case, dependent probabilities are used. Probabilities/frequencies are assigned to each branch. Another example of dependency is when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception is when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
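
    The quantification rule in the last sentence reduces to multiplying the initiating-event frequency by the branch probabilities along a path; a trivial sketch with hypothetical numbers:

    ```python
    # End-state frequency = initiating-event frequency * branch probabilities.
    import math

    path = {
        "operation performed (initiating event, per year)": 50.0,
        "administrative control fails (human error)":       1e-2,
        "engineered control fails (hardware)":              1e-3,
    }
    frequency = math.prod(path.values())
    print(f"end-state (accident) frequency: {frequency:.1e} per year")  # 5.0e-04
    ```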

  4. Bisphosphonates and Risk of Cardiovascular Events: A Meta-Analysis

    PubMed Central

    Kim, Dae Hyun; Rogers, James R.; Fulchino, Lisa A.; Kim, Caroline A.; Solomon, Daniel H.; Kim, Seoyoung C.

    2015-01-01

    Background and Objectives Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. Methods A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials with longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Results Absolute risks over 25–36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84–1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92–1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69–1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82–1.19]; I2 = 5.8%), and CV death (14 trials; 0.88 [0.72–1.07]; I2 = 0.0%) with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96–1.61]; I2 = 0.0%), not for oral bisphosphonates (26 trials; 1.02 [0.83–1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Conclusions Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large
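
    For reference, the Mantel-Haenszel fixed-effects pooling used in such meta-analyses can be computed as below; the 2x2 trial counts are hypothetical, not the study's data.

    ```python
    # Mantel-Haenszel fixed-effects pooled odds ratio across trials
    # (each row: events/treated and events/control; counts are hypothetical).
    import numpy as np

    trials = np.array([[12, 400, 10, 395],
                       [ 5, 210,  7, 205],
                       [20, 800, 18, 790]], dtype=float)

    a, n1, c, n0 = trials.T                 # events and group sizes
    b, d = n1 - a, n0 - c                   # non-events
    N = n1 + n0
    or_mh = np.sum(a * d / N) / np.sum(b * c / N)
    print(f"Mantel-Haenszel OR = {or_mh:.3f}")
    ```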

  5. An analysis of three nuclear events in P-Tunnel

    SciTech Connect

    Fourney, W.L.; Dick, R.D.; Taylor, S.R.; Weaver, T.A.

    1994-05-03

    This report examines experimental results obtained from three P Tunnel events -- Mission Cyber, Disko Elm, and Distant Zenith. The objective of the study was to determine if there were any differences in the explosive source coupling for the three events. It was felt that Mission Cyber might not have coupled well because the ground motions recorded for that event were much lower than expected based on experience from N Tunnel. Detailed examination of the physical and chemical properties of the tuff in the vicinity of each explosion indicated only minor differences. In general, the core samples are strong and competent out to at least 60 m from each working point. Qualitative measures of core sample strength indicate that the strength of the tuff near Mission Cyber may be greater than indicated by results of static testing. Slight differences in mineralogic content and saturation of the Mission Cyber tuff were noted relative to the other two tests, but probably would not result in large differences in ground motions. Examination of scaled free-field stress and acceleration records collected by Sandia National Laboratory (SNL) indicated that Disko Elm showed the least scatter and Distant Zenith the most scatter. Mission Cyber measurements tend to lie slightly below those of Distant Zenith, but still within two standard deviations. Analysis of regional seismic data from networks operated by Lawrence Livermore National Laboratory (LLNL) and SNL also show no evidence of Mission Cyber coupling low relative to the other two events. The overall conclusion drawn from the study is that there were no basic differences in the way that the explosions coupled to the rock.

  6. Multivariate cluster analysis of forest fire events in Portugal

    NASA Astrophysics Data System (ADS)

    Tonini, Marj; Pereira, Mario; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

    Portugal is one of the most fire-prone European countries, mainly due to its favourable climatic, topographic and vegetation conditions. Compared to the other Mediterranean countries, the number of events registered here from 1980 up to the present is the highest; likewise, with respect to burnt area, Portugal is the third most affected country. Portuguese mapped burnt areas are available from the website of the Institute for the Conservation of Nature and Forests (ICNF). This official geodatabase is the result of satellite measurements starting from the year 1990. The spatial information, delivered in shapefile format, provides a detailed description of the shape and size of the area burnt by each fire, while the date/time information relating to fire ignition is restricted to the year of occurrence. In statistical terms, wildfires can be treated as a stochastic point process, where events are analysed as a set of geographical coordinates corresponding, for example, to the centroid of each burnt area. Analysis of the spatio-temporal pattern of stochastic point processes, including cluster analysis, is a basic procedure for discovering predisposing factors as well as for prevention and forecasting purposes. Studies of this kind primarily investigate the spatial cluster behaviour of environmental data sequences and/or map their distribution at different times. To include both dimensions (space and time), a comprehensive spatio-temporal analysis is needed. In the present study the authors attempt to verify whether, in the case of wildfires in Portugal, space and time act independently or whether, conversely, neighbouring events are also closer in time. We present an application of the spatio-temporal K-function to a long dataset (1990-2012) of mapped burnt areas. Moreover, the multivariate K-function allowed checking for a possibly different distribution between small and large fires. The final objective is to elaborate a 3D
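
    A naive space-time K-function estimator (without the edge corrections used in practice, and omitting the multivariate small/large-fire split) can be sketched as follows, with simulated event centroids.

    ```python
    # Naive space-time K-function on simulated fire centroids.
    import numpy as np

    def st_k(xy, t, r, tau, area, period):
        """Scaled count of event pairs within distance r and time lag tau."""
        n = t.size
        d_s = np.hypot(*(xy[:, None, :] - xy[None, :, :]).transpose(2, 0, 1))
        d_t = np.abs(t[:, None] - t[None, :])
        close = (d_s <= r) & (d_t <= tau)
        np.fill_diagonal(close, False)
        lam = n / (area * period)               # space-time intensity
        return close.sum() / (n * lam)

    rng = np.random.default_rng(0)
    xy = rng.uniform(0.0, 100.0, (500, 2))      # hypothetical burnt-area centroids
    t = rng.uniform(0.0, 23.0, 500)             # hypothetical years (1990-2012)
    k = st_k(xy, t, r=10.0, tau=2.0, area=100.0 ** 2, period=23.0)
    print(f"K(10 km, 2 yr) = {k:.1f}")
    # Space-time interaction is suggested when K(r, tau) exceeds the product
    # of the purely spatial and purely temporal K-functions.
    ```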

  7. Mining for adverse drug events with formal concept analysis.

    PubMed

    Estacio-Moreno, Alexander; Toussaint, Yannick; Bousquet, Cédric

    2008-01-01

    The pharmacovigilance databases consist of several case reports involving drugs and adverse events (AEs). Some methods are applied consistently to highlight all signals, i.e. all statistically significant associations between a drug and an AE. These methods are appropriate for the verification of more complex relationships involving one or several drug(s) and AE(s) (e.g. syndromes or interactions), but do not address their identification. We propose a method for the extraction of these relationships based on Formal Concept Analysis (FCA) associated with disproportionality measures. This method identifies all sets of drugs and AEs which are potential signals, syndromes or interactions. Compared to a previous experience of disproportionality analysis without FCA, the addition of FCA was more efficient at identifying false positives related to concomitant drugs. PMID:18487830

  8. Analysis of warm convective rain events in Catalonia

    NASA Astrophysics Data System (ADS)

    Ballart, D.; Figuerola, F.; Aran, M.; Rigo, T.

    2009-09-01

    Between the end of September and November, events with high amounts of rainfall are quite common in Catalonia. The high sea surface temperature of the Mediterranean Sea near the Catalan coast is one of the most important factors contributing to the development of this type of storm. Some of these events have particular characteristics: elevated rain rates during short time periods, not very deep convection, and low lightning activity. Consequently, remote sensing tools are of limited use for surveillance. The high rain efficiency is caused by internal mechanisms of the clouds, and also by the air mass in which the precipitation structure develops. As mentioned, the contribution of the sea to the air mass is very relevant, not only through the increase of large condensation nuclei, but also through the high temperature of the low layers of the atmosphere, which allows clouds with 5 or 6 km of particles in the liquid phase; in fact, liquid-phase particles can be present in these clouds at temperatures down to about -15°C. Due to these characteristics, this type of rainy structure can produce high quantities of rainfall in a relatively brief period of time and, in the case of quasi-stationary systems, precipitation values at the surface can be very important. From the point of view of remote sensing tools, the nature of these clouds implies that the tools and methodologies commonly used for the analysis of heavy rain events are not useful, for the following reasons: lightning is rarely observed, cloud-top temperatures are not cold enough to be enhanced in satellite imagery, and radar reflectivity values are lower than in other heavy rain cases. The third point to take into account is the vulnerability of the affected areas. A high percentage of the Catalan population lives in the coastal region. In the central coast of Catalonia, the urban areas are surrounded by a not very high mountain range with small basins and

  9. CDAW 9 analysis of magnetospheric events on May 3, 1986: Event C

    SciTech Connect

    Baker, D.N.; Pulkkinen, T.I. (Finnish Meteorological Inst., Helsinki); McPherron, R.L. (Univ. of California, Los Angeles); Craven, J.D.; Frank, L.A.; Elphinstone, R.D.; Murphree, J.S.; Fennell, J.F.; Lopez, R.E.; Nagai, T.

    1993-03-01

    The ninth Coordinated Data Analysis Workshop focused upon several intervals within the PROMIS period. Event interval C comprised the period 0000-1200 UT on May 3, 1986, which was a highly disturbed time near the end of a geomagnetic storm interval. A very large substorm early in the period commenced at 0111 UT and had a peak AE index value of approximately 1500 nT. Subsequent activity was lower, but at least three other substorms occurred at 2-3 hour intervals. The substorms on May 3 were well observed by a variety of satellites including ISEE 1, 2 and IMP 8 in the magnetotail plus SCATHA, GOES, GMS, and LANL spacecraft at or near geostationary orbit. A particularly important feature of the 0111 UT substorm was the simultaneous imaging of the southern auroral oval by DE 1 and of the northern auroral oval by Viking. The excellent constellation of spacecraft near local midnight in the radial range 5-9 RE made it possible to study the strong cross-tail current development during the expansion phase. A clear latitudinal separation (≥10°) of the initial region of auroral brightening and the region of intense westward electrojet current was identified. The combined ground, near-tail and imaging data for this event provided an unprecedented opportunity to investigate tail current development, field line mapping, and substorm onset mechanisms. Evidence is presented for strong current diversion within the near-tail plasma sheet during the late growth phase and strong current disruption and field-aligned current formation from deeper in the tail at substorm onset. The authors conclude that these results are consistent with a model of magnetic neutral line formation in the late growth phase which causes plasma sheet current diversion before the substorm onset. The expansion phase onset occurs later due to reconnection of lobelike magnetic field lines and roughly concurrent cross-tail current disruption in the inner plasma sheet region. 52 refs., 14 figs., 1 tab.

  10. Collective analysis of ORPS-reportable electrical events (June, 2005-August 2009)

    SciTech Connect

    Henins, Rita J; Hakonson - Hayes, Audrey C

    2010-01-01

    The analysis of LANL electrical events between June 30, 2005 and August 31, 2009 provides data that indicate some potential trends regarding ISM failure modes, activity types associated with reportable electrical events, and ORPS causal codes. This report discusses the identified potential trends for Shock events and compares attributes of the Shock events against Other Electrical events and overall ORPS-reportable events during the same time frame.

  11. Bayesian analysis for extreme climatic events: A review

    NASA Astrophysics Data System (ADS)

    Chu, Pao-Shin; Zhao, Xin

    2011-11-01

    This article reviews Bayesian analysis methods applied to extreme climatic data. We particularly focus on applications to three problems related to extreme climatic events: detection of abrupt regime shifts, clustering of tropical cyclone tracks, and statistical forecasting of seasonal tropical cyclone activity. For identifying potential change points in an extreme-event count series, a hierarchical Bayesian framework involving three layers - data, parameter, and hypothesis - is formulated to estimate the posterior probability of shifts over time. For the data layer, a Poisson process with a gamma-distributed rate is assumed. For the hypothesis layer, multiple candidate hypotheses with different change points are considered. To calculate the posterior probability for each hypothesis and its associated parameters we developed an exact analytical formula, a Markov chain Monte Carlo (MCMC) algorithm, and a more sophisticated reversible-jump Markov chain Monte Carlo (RJMCMC) algorithm. The algorithms are applied to several rare-event series: the annual tropical cyclone or typhoon counts over the central, eastern, and western North Pacific; the annual extremely heavy rainfall event counts at Manoa, Hawaii; and the annual heat wave frequency in France. Using an Expectation-Maximization (EM) algorithm, a Bayesian clustering method built on a Gaussian mixture model is applied to objectively classify historical, spaghetti-like tropical cyclone tracks (1945-2007) over the western North Pacific and the South China Sea into eight distinct track types. A regression-based approach to forecasting seasonal tropical cyclone frequency in a region is developed. Specifically, using large-scale environmental conditions prior to the tropical cyclone season, a Poisson regression model is built for predicting seasonal tropical cyclone counts, and a probit regression model is alternatively developed for a binary classification problem. With a non
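
    The exact-formula variant for a single change point can be sketched as below: with a gamma prior on the Poisson rate, each hypothesis has a closed-form marginal likelihood. The counts and priors here are hypothetical, and equal prior odds are assumed.

    ```python
    # Exact single-change-point comparison for Poisson counts with a
    # Gamma(a, b) prior on the rate.
    import numpy as np
    from scipy.special import gammaln

    def log_marginal(counts, a=1.0, b=1.0):
        """log p(counts) when counts ~ iid Poisson(lam), lam ~ Gamma(a, b)."""
        c = np.asarray(counts)
        return (a * np.log(b) - gammaln(a) + gammaln(a + c.sum())
                - (a + c.sum()) * np.log(b + c.size) - gammaln(c + 1).sum())

    counts = [2, 3, 1, 2, 2, 6, 7, 5, 8, 6]  # hypothetical annual event counts
    hyps = {k: log_marginal(counts[:k]) + log_marginal(counts[k:])
            for k in range(2, len(counts) - 1)}
    hyps[0] = log_marginal(counts)           # the "no change" hypothesis
    logp = np.array(list(hyps.values()))
    post = np.exp(logp - logp.max())
    post /= post.sum()
    for k, pr in zip(hyps, post):
        print(f"{'no change' if k == 0 else f'change at index {k}'}: {pr:.3f}")
    ```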

  12. Systematic Analysis of Adverse Event Reports for Sex Differences in Adverse Drug Events

    PubMed Central

    Yu, Yue; Chen, Jun; Li, Dingcheng; Wang, Liwei; Wang, Wei; Liu, Hongfang

    2016-01-01

    Increasing evidence has shown that sex differences exist in adverse drug events (ADEs). Identifying those sex differences in ADEs could reduce the experience of ADEs for patients and could be conducive to the development of personalized medicine. In this study, we analyzed a normalized US Food and Drug Administration Adverse Event Reporting System (FAERS). A chi-squared test was conducted to discover which treatment regimens or drugs had sex differences in adverse events. Moreover, the reporting odds ratio (ROR) and P value were calculated to quantify the signals of sex differences for specific drug-event combinations. Logistic regression was applied to remove the confounding effect of the baseline sex difference of the events. We found that, among 668 drugs of the 20 most frequent treatment regimens in the United States, 307 drugs have sex differences in ADEs. In addition, we identified 736 unique drug-event combinations with significant sex differences. After removing the confounding effect of the baseline sex difference of the events, 266 combinations remained. Drug labels or previous studies verified some of them, while others warrant further investigation. PMID:27102014
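
    A reporting odds ratio with its 95% confidence interval, the core disproportionality computation mentioned above, can be obtained from a 2x2 table of report counts; the numbers below are invented.

    ```python
    # ROR and 95% CI for one drug-event pair, comparing female vs male reports.
    import numpy as np

    a, b = 120.0, 880.0   # female reports: with the event, without it
    c, d = 60.0, 940.0    # male reports:   with the event, without it

    ror = (a / b) / (c / d)
    se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)             # SE of log(ROR)
    lo, hi = np.exp(np.log(ror) + np.array([-1.96, 1.96]) * se)
    print(f"ROR = {ror:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")  # CI excluding 1 flags a signal
    ```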

  13. Event-by-Event pseudorapidity fluctuation analysis: An outlook to multiplicity and phase space dependence

    NASA Astrophysics Data System (ADS)

    Bhoumik, Gopa; Bhattacharyya, Swarnapratim; Deb, Argha; Ghosh, Dipak

    2016-07-01

    A detailed study of event-by-event pseudorapidity fluctuations of the pions produced in 16O-AgBr interactions at 60A GeV and 32S-AgBr interactions at 200A GeV has been carried out in terms of φ, a variable defined as a measure of fluctuation. Non-zero φ values indicate the presence of strong correlation among the pions for both interactions. The multiplicity and rapidity dependence of the event-by-event pseudorapidity fluctuation has been investigated. A decrease of φ with average multiplicity and an increase of the same variable with pseudorapidity width are observed. The decrease of φ with average multiplicity is attributed to particle emission by several independent sources in higher-multiplicity events. The increase in φ values with pseudorapidity width, taken around central rapidity, might hint at the presence of long-range correlations and their dominance over short-range ones. We have compared our experimental results with a Monte Carlo simulation generated assuming independent particle emission. The comparison shows that the source of correlation and fluctuation is the dynamics of the pion production process. We have also compared our results with events generated by the FRITIOF code. Such events also show the presence of fluctuation and correlation; however, they fail to replicate the experimental findings.

  14. An event history analysis of union joining and leaving.

    PubMed

    Buttigieg, Donna M; Deery, Stephen J; Iverson, Roderick D

    2007-05-01

    This article examines parallel models of union joining and leaving using individual-level longitudinal panel data collected over a 5-year period. The authors utilized objective measures of joining and leaving collected from union and organizational records and took into account time by using event history analysis. The results indicated that union joining was negatively related to procedural justice and higher performance appraisals and positively related to partner socialization and extrinsic union instrumentality. Conversely, members were most likely to leave the union when they perceived lower procedural justice, where there was no union representative present in the workplace, and where they had individualistic orientations. The authors discuss the implications of these findings for theory and practice for trade unions. PMID:17484562

  15. CDAW-9 analysis of magnetospheric events on 3 May 1986: Event C. Technical report

    SciTech Connect

    Baker, D.N.; Pulkkinen, T.I.; McPherron, R.L.; Craven, J.D.; Frank, L.A.

    1993-10-01

    The ninth Coordinated Data Analysis Workshop (CDAW-9) focussed upon several intervals within the PROMIS period (March-June 1986). Event interval C comprised the period 0000-1200 UT on 3 May 1986 which was a highly disturbed time near the end of a geomagnetic storm interval. A very large substorm early in the period commenced at 0111 UT and had a peak AE index value of approx. 1500 nT. Subsequent activity was lower, but at least three other substorms occurred at 2-3 hour intervals. The substorms on 3 May were well observed by a variety of satellites, including ISEE-1, -2, and IMP-8 in the magnetotail plus SCATHA, GOES, GMS, and LANL spacecraft at or near geostationary orbit. A particularly important feature of the 0111 UT substorm was the simultaneous imaging of the southern auroral oval by DE-1 and of the northern oval by Viking. The excellent constellation of spacecraft near local midnight in the radial range 5-9 RE made it possible to study the strong cross-tail current development during the substorm growth phase and the current disruption and current wedge development during the expansion phase. The authors use a time-evolving magnetic field model to map observed auroral features out into the magnetospheric equatorial plane. There was both a dominant eastward and a weaker westward progression of activity following the expansion phase. A clear latitudinal separation of the initial region of auroral brightening and the region of intense westward electrojet current was identified.

  16. Bootstrap analysis of the single subject with event related potentials.

    PubMed

    Oruç, Ipek; Krigolson, Olav; Dalrymple, Kirsten; Nagamatsu, Lindsay S; Handy, Todd C; Barton, Jason J S

    2011-07-01

    Neural correlates of cognitive states in event-related potentials (ERPs) serve as markers for related cerebral processes. Although these are usually evaluated in subject groups, the ability to evaluate such markers statistically in single subjects is essential for case studies in neuropsychology. Here we investigated the use of a simple test based on nonparametric bootstrap confidence intervals for this purpose, by evaluating three different ERP phenomena: the face-selectivity of the N170, error-related negativity, and the P3 component in a Posner cueing paradigm. In each case, we compare single-subject analysis with statistical significance determined using bootstrap to conventional group analysis using analysis of variance (ANOVA). We found that the proportion of subjects who show a significant effect at the individual level based on bootstrap varied, being greatest for the N170 and least for the P3. Furthermore, it correlated with significance at the group level. We conclude that the bootstrap methodology can be a viable option for interpreting single-case ERP amplitude effects in the right setting, probably with well-defined stereotyped peaks that show robust differences at the group level, which may be more characteristic of early sensory components than late cognitive effects. PMID:22292858
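
    The bootstrap test has a compact form: resample trials with replacement and ask whether the percentile interval of the condition difference excludes zero. A sketch with synthetic single-subject amplitudes (microvolt scale assumed):

    ```python
    # Percentile-bootstrap CI for one subject's condition difference in mean
    # ERP amplitude.
    import numpy as np

    rng = np.random.default_rng(42)
    cond_a = rng.normal(-6.0, 3.0, 80)   # e.g. face trials (hypothetical)
    cond_b = rng.normal(-4.0, 3.0, 80)   # e.g. object trials (hypothetical)

    boots = [rng.choice(cond_a, cond_a.size).mean()
             - rng.choice(cond_b, cond_b.size).mean()
             for _ in range(5000)]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    significant = lo > 0 or hi < 0       # CI excludes zero
    print(f"95% CI ({lo:.2f}, {hi:.2f}); single-subject effect: {significant}")
    ```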

  17. Analysis of marine stratocumulus clearing events during FIRE

    NASA Technical Reports Server (NTRS)

    Kloesel, Kevin A.

    1990-01-01

    During FIRE, three major stratocumulus clearing events took place over the project region. These clearing events are analyzed using synoptic variables to determine if these clearing events can be predicted by current modeling techniques. A preliminary statistical evaluation of the correlation between satellite cloud brightness parameters and NMC global model parameters is available in Wylie, et al., 1989.

  18. Analysis of Events Associated with First Charge of Desicooler Material

    SciTech Connect

    Alexander, D.E.

    2003-09-15

    HB-Line's mission included dissolution of uranium-aluminum scrap left over from a U3O8 scrap recovery program begun in 1972 with material returned from Rocky Flats and Oak Ridge. This material has been stored in desicooler containers, and is commonly referred to as the Desicoolers. The Scrap Recovery process includes the dissolution of scrap material and transfer of the resulting solution to H-Canyon for further disposition. During the first charge of this material into the HB-Line dissolvers, the solution heated to boiling without external heat being added. Yellow-colored fumes, which dissipated rapidly, were noted in the glovebox by operators, and a small amount of liquid was noted in the glovebox by operations after dissolver cooldown. This technical report documents analysis of the data from the event with respect to potential Safety Basis violation and the Integrated Safety Management System process. Based on the analysis presented, the safety basis has shown its ability to protect the worker, the facility and the public.

  19. Cluster analysis of intermediate deep events in the southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2015-04-01

    The Hellenic subduction zone (HSZ) is the most seismically active region in Europe, where the oceanic African lithosphere is subducting beneath the continental Aegean plate. Although there are numerous studies of seismicity in the HSZ, very few focus on the eastern HSZ and the Wadati-Benioff zone of the subducting slab in that part of the HSZ. In order to gain a better understanding of the geodynamic processes in the region, a dense local seismic network is required. From September 2005 to March 2007, the temporary seismic network EGELADOS was deployed covering the entire HSZ. It consisted of 56 onshore and 23 offshore broadband stations, with 19 stations from GEOFON, NOA and MedNet added to complete the network. Here, we focus on a cluster of intermediate-depth seismicity recorded by the EGELADOS network within the subducting African slab in the region of the Nisyros volcano. The cluster consists of 159 events at 80 to 190 km depth with magnitudes between 0.2 and 4.1 that were located using the nonlinear location tool NonLinLoc. A double-difference earthquake relocation using the HypoDD software was performed with both manual readings of onset times and differential traveltimes obtained by separate cross-correlation of P- and S-waveforms. Single event locations are compared to relative relocations. The event hypocenters fall into a thin zone close to the top of the slab, defining its geometry with an accuracy of a few kilometers. At intermediate depth the slab dips towards the NW at an angle of about 30°, i.e. more steeply than in the western part of the HSZ. The edge of the slab is clearly defined by an abrupt disappearance of intermediate-depth seismicity towards the NE, found approximately beneath the Turkish coastline. Furthermore, results of a cluster analysis based on the cross-correlation of three-component waveforms are shown as a function of frequency, and the spatio-temporal migration of the seismic activity is analysed.

  20. Event Detection and Spatial Analysis for Characterizing Extreme Precipitation

    NASA Astrophysics Data System (ADS)

    Jeon, S.; Prabhat, M.; Byna, S.; Collins, W.; Wehner, M. F.

    2013-12-01

    Atmospheric rivers (ARs) are large, spatially coherent weather systems with high concentrations of elevated water vapor that often cause severe downpours and flooding over the western coastal United States. With more atmospheric moisture available in the future under global warming, we expect ARs to play an important role as a potential cause of extreme precipitation. We have recently developed the TECA software for automatically identifying and tracking features in climate datasets. In particular, we are able to identify ARs that make landfall on the western coast of North America. This detection tool thresholds the integrated water vapor field and performs geometric analysis on the regions that exceed the threshold. Based on the detection procedure, we investigate the impacts of ARs by exploring the spatial extent of AR precipitation in CMIP5 simulations, and characterize the spatial pattern of dependence for future projections under climate change within the framework of extreme value theory. The results show that AR events in the RCP8.5 scenario (2076-2100) tend to produce heavier rainfall with higher frequency and longer duration than the events from the historical run (1981-2005). The range of spatial dependence between extreme precipitation events is concentrated on a smaller, more localized area in California under the highest emission scenario than at present day. Preliminary results are illustrated in Figures 1 and 2. Fig. 1: Boxplots of annual maximum precipitation (left two) and maximum AR precipitation (right two) from GFDL-ESM2M during a 25-year period, by station in California, US. Fig. 2: Spatial dependence of maximum AR precipitation calculated from Station 4 (triangle) for the historical run (left) and for future projections under RCP8.5 (right) from GFDL-ESM2M; green and orange represent complete dependence and independence between two stations, respectively.
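
    A toy version of threshold-plus-geometry detection on an integrated water vapor field might look like this; TECA's actual criteria are more elaborate, and the field, threshold, and shape rules below are synthetic.

    ```python
    # Threshold an IWV field, label contiguous regions, keep long narrow ones.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(3)
    iwv = rng.gamma(2.0, 8.0, size=(90, 180))   # synthetic IWV field (kg m^-2)
    iwv[40:44, 20:120] += 45.0                  # inject an AR-like filament

    labels, _ = ndimage.label(iwv > 40.0)       # contiguous cells above threshold
    for region in ndimage.find_objects(labels):
        dy, dx = (s.stop - s.start for s in region)
        long_side, short_side = max(dy, dx), max(min(dy, dx), 1)
        if long_side >= 20 and long_side / short_side >= 2:
            print("candidate AR (long, narrow region):", region)
    ```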

  1. The Tunguska event and Cheko lake origin: dendrochronological analysis

    NASA Astrophysics Data System (ADS)

    Rosanna, Fantucci; Romano, Serra; Gunther, Kletetschka; Mario, Di Martino

    2015-07-01

    Dendrochronological research was carried out on 23 tree samples (Larix sibirica and Picea obovata) collected during the 1999 expedition at two locations, one close to the epicentre zone and one near Cheko lake (N 60°57', E 101°51'). Basal area increment (BAI) analysis has shown a general long growth suppression before 1908, the year of the Tunguska event (TE), followed by a sudden growth increase due to the diminished competition from trees that died in the event. In one group of trees, we detected a growth decrease for several years (due to damage to the trunk, branches and crown), followed by a growth increase during the following 4-14 years. We show that trees that germinated after the TE and grew in close proximity to Cheko lake (Cheko lake trees) behaved differently from trees growing further from Cheko lake, inside the forest (forest trees). Cheko lake trees have shown a vigorous, continuous growth increase. Forest trees have shown vigorous growth during the first 10-30 years of age, followed by a period of suppressed growth, which we interpret as re-established competition with the surrounding trees. The Cheko lake pattern, however, is consistent with the formation of the lake at the time of the TE. This observation supports the hypothesis that Cheko lake was formed by a fragment produced during the TE, creating a small impact crater in the permafrost and soft alluvial deposits of the Kimku River plain. This is further supported by the fact that Cheko lake has an elliptical shape elongated towards the epicentre of the TE.

  2. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    SciTech Connect

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with the physical basis. An improved standard error agrees better with the physical basis: it correctly partitions the error to include model error as a component of variance, and it correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
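
    Fisher's and Tippett's combinations of k single-phenomenology p-values take standard forms; a sketch with hypothetical p-values:

    ```python
    # Standard p-value combination tests.
    import numpy as np
    from scipy import stats

    p = np.array([0.08, 0.20])            # e.g. Ms:mb screening and depth tests

    fisher_stat = -2.0 * np.sum(np.log(p))           # ~ chi2 with 2k dof under H0
    p_fisher = stats.chi2.sf(fisher_stat, df=2 * p.size)

    alpha = 0.05                                     # Tippett: reject on min p
    tippett_reject = p.min() < 1.0 - (1.0 - alpha) ** (1.0 / p.size)

    print(f"Fisher combined p = {p_fisher:.3f}; Tippett rejects H0: {tippett_reject}")
    ```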

  3. Photographic Analysis Technique for Assessing External Tank Foam Loss Events

    NASA Technical Reports Server (NTRS)

    Rieckhoff, T. J.; Covan, M.; OFarrell, J. M.

    2001-01-01

    A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.

  4. Teleseismic Events Analysis with AQDB and ITAB stations, Brazil

    NASA Astrophysics Data System (ADS)

    Felício, L. D.; Vasconcello, E.; Assumpção, M.; Rodrigues, F.; Facincani, E.; Dias, F.

    2013-05-01

    This work surveys seismic activity originating in the Andean region at distances over 1500 km, recorded by the Brazilian seismographic stations AQDB and ITAB in 2012. The stations are located in the cities of Aquidauana and Itajai, both in the central-west region of Brazil, with coordinates -20°48'S;-55°70'W and -27°24'S;-52°13'W, respectively. We determined the magnitudes mb and Ms, the epicentral distances, and the experimental and theoretical P-wave arrival times (using the IASP91 model). With the programs SAC (Seismic Analysis Code), TauP and SeisGram (Seismogram Viewer), it was possible to determine the mentioned magnitudes. We identified around twenty events for each station and correlated the calculated magnitudes (AQDB and ITAB) with the magnitude data published in the bulletin of the National Earthquake Information Center (NEIC). Linear regression shows that the mb and Ms magnitudes at the two stations are close to the values reported by the NEIC (correlations of 97.1% for mb and 96.5% for Ms). The P-wave arrival times at stations ITAB and AQDB show average deviations of 2.2 and 2.7 seconds, respectively; in other words, the differences between experimental and theoretical P-wave times may be related to the position of each station and to the heterogeneity of the structure and composition of the rock mass in each region.
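
    Theoretical IASP91 arrival times of the kind compared here can be computed, for example, with ObsPy's TauP implementation (assuming ObsPy is installed; the depth and distance values are placeholders):

    ```python
    # Theoretical P arrival time with the IASP91 model via ObsPy's TauP port.
    from obspy.taup import TauPyModel

    model = TauPyModel(model="iasp91")
    arrivals = model.get_travel_times(source_depth_in_km=100.0,
                                      distance_in_degree=25.0,
                                      phase_list=["P"])
    print(arrivals[0].time)  # seconds after origin; compare with the picked onset
    ```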

  5. Time to tenure in Spanish universities: an event history analysis.

    PubMed

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  6. Time to Tenure in Spanish Universities: An Event History Analysis

    PubMed Central

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199
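
    Event history analyses of this kind are commonly fit as proportional hazards models; a sketch using the lifelines package on a hypothetical miniature data set (not the study's survey data):

    ```python
    # Cox proportional hazards sketch of time to tenure.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "years_to_tenure": [4.0, 7.5, 6.0, 9.0, 5.5, 12.0, 8.0, 10.0],
        "tenured":         [1,   1,   1,   0,   1,   0,    1,   1],  # 0 = censored
        "publications":    [14,  6,   10,  9,   12,  2,    5,   8],
        "mobility":        [0,   1,   0,   1,   0,   1,    1,   0],  # changed institution
    })
    cph = CoxPHFitter().fit(df, duration_col="years_to_tenure", event_col="tenured")
    cph.print_summary()  # hazard ratio > 1 speeds up, < 1 slows down, tenure
    ```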

  7. Regional Frequency Analysis of extreme rainfall events, Tuscany (Italy)

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Chiarello, V.; Rossi, G.

    2014-12-01

    The assessment of extreme hydrological events at sites characterized by short time series, or where no data record exists, has mainly been addressed with regional models. A regional frequency analysis based on the index-variable procedure is implemented here to describe the annual maxima of rainfall depth of short durations in the Tuscany region. The Two-Component Extreme Value (TCEV) probability distribution is used within the procedure, with parameter estimation based on a three-level hierarchical approach. The methodology deals with the delineation of homogeneous regions, the identification of a robust regional frequency distribution, and the assessment of the scale factor, i.e. the index rainfall. The data set includes the annual maxima of daily rainfall from 351 gauge stations with at least 30 years of records in the period 1916-2012, and the extreme rainfalls of short duration: 1 hour and 3, 6, 12, 24 hours. Different subdivision hypotheses have been verified. A four-region subdivision, coincident with four subregions, which takes into account the orography and the geomorphological and climatic peculiarities of the Tuscany region, has been adopted. In particular, for testing regional homogeneity, the cumulative frequency distributions of the observed skewness and variation coefficients of the recorded time series are compared with the theoretical frequency distribution obtained through a Monte Carlo technique. The related L-skewness and L-variation coefficients are also examined. The Student t-test and the Wilcoxon test for the mean, as well as the χ² test, were also applied. Further tests of the subdivision hypotheses have been made through the application of the discordancy D and heterogeneity H tests and the analysis of the observed and theoretical TCEV model growth curves. For each region the daily rainfall growth curve has been estimated. The growth curves for the hourly durations have been estimated when the daily rainfall growth curve

  8. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  9. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    SciTech Connect

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention on ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, synonymously termed extreme events. The implemented simulation method is currently confined to simulating the steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper demonstrates a systematic approach to analyzing extreme events and identifying vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical event sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.

  10. Fault Tree Analysis: An Operations Research Tool for Identifying and Reducing Undesired Events in Training.

    ERIC Educational Resources Information Center

    Barker, Bruce O.; Petersen, Paul D.

    This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault tree analysis investigates potentially undesirable events and then looks for failures in sequence that would lead to their occurrence. Relationships among these events are symbolized by AND or OR logic gates, with AND used when single events must coexist to…
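
    For independent basic events, AND gates multiply probabilities and OR gates combine complements; a minimal sketch with hypothetical probabilities:

    ```python
    # Minimal fault-tree evaluation for independent basic events.
    def AND(*p):  # all inputs must coexist: multiply probabilities
        out = 1.0
        for q in p:
            out *= q
        return out

    def OR(*p):  # any input suffices: complement of nothing occurring
        out = 1.0
        for q in p:
            out *= 1.0 - q
        return 1.0 - out

    pump_fails, valve_stuck, operator_error = 1e-3, 5e-4, 1e-2
    top = OR(AND(pump_fails, operator_error), valve_stuck)
    print(f"top-event probability: {top:.2e}")  # ~5.1e-04
    ```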

  11. An integrated system for hydrological analysis of flood events

    NASA Astrophysics Data System (ADS)

    Katsafados, Petros; Chalkias, Christos; Karymbalis, Efthymios; Gaki-Papanastassiou, Kalliopi; Mavromatidis, Elias; Papadopoulos, Anastasios

    2010-05-01

    The significant increase of extreme flood events during recent decades has led to an urgent social and economic demand for improved prediction and sustainable prevention. Remedial actions require accurate estimation of the spatiotemporal variability of runoff volume and local peaks, which can be analyzed through integrated simulation tools. Besides allowing investigation of the dynamics controlling the behavior of these complex processes, such advanced modeling systems can also be used as early warning systems. Moreover, simulation is assumed to be the appropriate method for deriving quantitative estimates of various atmospheric and hydrologic parameters, especially in the absence of reliable and accurate measurements of precipitation and flow rates. Such sophisticated techniques enable flood risk assessment and improve decision-making support for protection actions. This study presents an integrated system for the simulation of the essential atmospheric and soil parameters in the context of hydrological flood modeling. The system consists of two main cores: a numerical weather prediction model coupled with a geographical information system for the accurate simulation of groundwater advection and rainfall-runoff estimation. Synoptic and mesoscale atmospheric motions are simulated with a non-hydrostatic limited-area model on a very high resolution domain of integration. The model includes advanced schemes for the microphysics and the surface-layer physics as well as for the longwave and shortwave radiation budget estimation. It is also fully coupled with a land-surface model in order to resolve the surface heat fluxes and to simulate the air-land energy exchange processes. Detailed atmospheric and soil parameters derived from the atmospheric model are used as input data for the GIS-based runoff modeling. Geographical information system (GIS) technology is used for further hydrological analysis and estimation of direct

  12. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
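
    A minimal sketch of the LDA idea applied to tokenized network-event logs, using scikit-learn; the log lines, topic count, and anomaly rule are all illustrative, not the paper's implementation.

    ```python
    # Topic modeling over network-event "documents" with LDA.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    logs = [
        "dns query internal host external resolver",
        "https upload large payload external host",
        "ssh login internal host admin",
        "https upload large payload external host night",
        "dns query internal host external resolver",
    ]
    X = CountVectorizer().fit_transform(logs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
    theta = lda.transform(X)   # per-event topic mixtures
    print(theta.round(2))      # events far from the bulk of traffic are candidates
    ```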

  13. [MedDRA and its applications in statistical analysis of adverse events].

    PubMed

    Lu, Meng-jie; Liu, Yu-xiu

    2015-11-01

    Safety assessment in clinical trials depends to a great extent on an in-depth analysis of the adverse events. However, there are difficulties in the summary classification, data management and statistical analysis of adverse events because of the different expressions for the same adverse event caused by regional, linguistic, ethnic, cultural and other differences. In order to ensure normative expressions, it is necessary to standardize the terms in which adverse events are recorded. MedDRA (Medical Dictionary for Regulatory Activities) has been widely recommended and applied worldwide as a powerful support for adverse event reporting in clinical trials. In this paper, the development history, applicable scope, hierarchical structure, encoding term selection and standardized query strategies of MedDRA are introduced. Furthermore, the practical process of adverse event encoding with MedDRA is proposed. Finally, the framework of statistical analysis of adverse events is discussed. PMID:26911031

  14. Analysis of Cumulus Solar Irradiance Reflectance (CSIR) Events

    NASA Technical Reports Server (NTRS)

    Laird, John L.; Harshvardhan

    1996-01-01

    Clouds are extremely important with regard to the transfer of solar radiation at the earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using Yankee Environmental Systems UVA-1 and UVB-1 pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 Wm^-2 and 0.069 Wm^-2 were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed.
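
    The clear-sky baseline and the MAC/PAC definitions above translate directly into code; the sketch below uses a synthetic diurnal trace with an injected spike, not the study's pyranometer data.

    ```python
    # Third-degree polynomial clear-sky fit, then MAC and PAC.
    import numpy as np

    hours = np.linspace(6, 18, 241)                   # local time of day
    ideal = 60.0 * np.sin(np.pi * (hours - 6) / 12)   # idealized diurnal UV trace
    rng = np.random.default_rng(7)
    observed = ideal + rng.normal(0.0, 0.5, hours.size)
    observed[118:124] += 8.0                          # inject a CSIR-like spike

    coeffs = np.polyfit(hours, observed, deg=3)       # clear-sky curve
    baseline = np.polyval(coeffs, hours)

    mac = observed - baseline                         # magnitude above clear-sky
    event = mac > 3.0                                 # simple event criterion
    pac = 100.0 * mac[event] / baseline[event]        # percentage above clear-sky
    print(f"max MAC {mac[event].max():.2f} W/m^2, max PAC {pac.max():.1f}%")
    ```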

  15. Analysis of cumulus solar irradiance reflectance (CSIR) events

    NASA Astrophysics Data System (ADS)

    Laird, John L.; Harshvardhan

    Clouds are extremely important with regard to the transfer of solar radiation at Earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When Sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using UVA and UVB pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 Wm^-2 and 0.0169 Wm^-2 were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of Sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed. © 1997 Elsevier Science B.V.

  16. CDAW 9 analysis of magnetospheric events on May 3, 1986 - Event C

    NASA Technical Reports Server (NTRS)

    Baker, D. N.; Pulkkinen, T. I.; Mcpherron, R. L.; Craven, J. D.; Frank, L. A.; Elphinstone, R. D.; Murphree, J. S.; Fennell, J. F.; Lopez, R. E.; Nagai, T.

    1993-01-01

    An intense geomagnetic substorm event on May 3, 1986, occurring toward the end of a strong storm period, is studied. The auroral electrojet indices and global imaging data from both the Northern and Southern Hemispheres clearly revealed the growth phase and expansion phase development for a substorm with an onset at 0111 UT. An ideally located constellation of four spacecraft allowed detailed observation of the substorm growth phase in the near-tail region. A realistic time-evolving magnetic field model provided a global representation of the field configuration throughout the growth and early expansion phase of the substorm. Evidence of a narrowly localized substorm onset region in the near-earth tail is found. This region spread rapidly eastward and poleward after the 0111 UT onset. The results are consistent with a model of late growth phase formation of a magnetic neutral line. This reconnection region caused plasma sheet current diversion before the substorm onset and eventually led to cross-tail current disruption at the time of the substorm onset.

  17. Application of Key Events Analysis to Chemical Carcinogens and Noncarcinogens

    PubMed Central

    BOOBIS, ALAN R.; DASTON, GEORGE P.; PRESTON, R. JULIAN; OLIN, STEPHEN S.

    2009-01-01

    The existence of thresholds for toxicants is a matter of debate in chemical risk assessment and regulation. Current risk assessment methods are based on the assumption that, in the absence of sufficient data, carcinogenesis does not have a threshold, while noncarcinogenic endpoints are assumed to be thresholded. Advances in our fundamental understanding of the events that underlie toxicity are providing opportunities to address these assumptions about thresholds. A key events dose-response analytic framework was used to evaluate three aspects of toxicity. The first section illustrates how a fundamental understanding of the mode of action for the hepatic toxicity and the hepatocarcinogenicity of chloroform in rodents can replace the assumption of low-dose linearity. The second section describes how advances in our understanding of the molecular aspects of carcinogenesis allow us to consider the critical steps in genotoxic carcinogenesis in a key events framework. The third section deals with the case of endocrine disrupters, where the most significant question regarding thresholds is the possible additivity to an endogenous background of hormonal activity. Each of the examples suggests that current assumptions about thresholds can be refined. Understanding inter-individual variability in the events involved in toxicological effects may enable a true population threshold(s) to be identified. PMID:19690995

  18. Further Evaluation of Antecedent Social Events during Functional Analysis

    ERIC Educational Resources Information Center

    Kuhn, David E.; Hardesty, Samantha L.; Luczynski, Kevin

    2009-01-01

    The value of a reinforcer may change based on antecedent events, specifically the behavior of others (Bruzek & Thompson, 2007). In the current study, we examined the effects of manipulating the behavior of the therapist on problem behavior while all dimensions of reinforcement were held constant. Both participants' levels of problem behaviors…

  19. Wheels within Wheels: The Analysis of a Cultural Event.

    ERIC Educational Resources Information Center

    Court, Deborah

    2001-01-01

    A qualitative research methods course, offered for faculty at an Israeli school of education, was a significant "cultural event" promoting inclusion of qualitative methods in a strongly positivist setting. A profound change in faculty attitudes was eased by administrative support and by participants' ability to find entry points to ideas through…

  20. Event-based prediction of stream turbidity using a combined cluster analysis and classification tree approach

    NASA Astrophysics Data System (ADS)

    Mather, Amanda L.; Johnson, Richard L.

    2015-11-01

    Stream turbidity typically increases during streamflow events; however, similar event hydrographs can produce markedly different event turbidity behaviors because many factors influence turbidity in addition to streamflow, including antecedent moisture conditions, season, and supply of turbidity-causing materials. Modeling of sub-hourly turbidity as a function of streamflow shows that event model parameters vary on an event-by-event basis. Here we examine the extent to which stream turbidity can be predicted through the prediction of event model parameters. Using three mid-sized streams from the Mid-Atlantic region of the U.S., we show the model parameter set for each event can be predicted based on the event characteristics (e.g., hydrologic, meteorologic and antecedent moisture conditions) using a combined cluster analysis and classification tree approach. The results suggest that the ratio of beginning event discharge to peak event discharge (an estimate of the event baseflow index), as well as catchment antecedent moisture, are important factors in the prediction of event turbidity. Indicators of antecedent moisture, particularly those derived from antecedent discharge, account for the majority of the splitting nodes in the classification trees for all three streams. For this study, prediction of turbidity during streamflow events is based upon observed data (e.g., measured streamflow, precipitation and air temperature). However, the results also suggest that the methods presented here can, in future work, be used in conjunction with forecasts of streamflow, precipitation and air temperature to forecast stream turbidity.
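
    The two-step scheme described above (cluster the fitted event-model parameters, then grow a classification tree that predicts the cluster from observable event characteristics) can be sketched with scikit-learn. Everything here is hypothetical: the synthetic feature columns, the two-parameter turbidity model, and the cluster count merely illustrate the combined approach, not the paper's actual data or settings.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical per-event characteristics (columns: Qbegin/Qpeak ratio,
        # antecedent discharge, season index); values are synthetic.
        rng = np.random.default_rng(0)
        X_events = rng.random((200, 3))
        # Hypothetical fitted turbidity-model parameters, two per event.
        params = np.column_stack([2 * X_events[:, 0] + rng.normal(0, .1, 200),
                                  X_events[:, 1] + rng.normal(0, .1, 200)])

        # Step 1: cluster the fitted model parameters into a few classes.
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(params)

        # Step 2: grow a classification tree predicting the parameter class
        # from event characteristics, as in the combined approach.
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_events, labels)

        # New event characteristics -> predicted class -> cluster-mean parameters.
        new_event = np.array([[0.3, 0.7, 0.5]])
        cls = tree.predict(new_event)[0]
        print("class:", cls, "-> parameters:", params[labels == cls].mean(axis=0))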

  1. Early events in cell spreading as a model for quantitative analysis of biomechanical events.

    PubMed

    Wolfenson, Haguy; Iskratsch, Thomas; Sheetz, Michael P

    2014-12-01

    In this review, we focus on the early events in the process of fibroblast spreading on fibronectin matrices of different rigidities. We present a focused position piece that illustrates the many different tests that a cell makes of its environment before it establishes mature matrix adhesions. When a fibroblast is placed on fibronectin-coated glass surfaces at 37°C, it typically spreads and polarizes within 20-40 min primarily through αvβ3 integrin binding to fibronectin. In that short period, the cell goes through three major phases that involve binding, integrin activation, spreading, and mechanical testing of the surface. The advantage of using the model system of cell spreading from the unattached state is that it is highly reproducible and the stages that the cell undergoes can thus be studied in a highly quantitative manner, in both space and time. The mechanical and biochemical parameters that matter in this example are often surprising because of both the large number of tests that occur and the precision of the tests. We discuss our current understanding of those tests, the decision tree that is involved in this process, and an extension to the behavior of the cells at longer time periods when mature adhesions develop. Because many other matrices and integrins are involved in cell-matrix adhesion, this model system gives us a limited view of a subset of cellular behaviors that can occur. However, by defining one cellular process at a molecular level, we know more of what to expect when defining other processes. Because each cellular process will involve some different proteins, a molecular understanding of multiple functions operating within a given cell can lead to strategies to selectively block a function. PMID:25468330

  2. Links between Characteristics of Collaborative Peer Video Analysis Events and Literacy Teachers' Outcomes

    ERIC Educational Resources Information Center

    Arya, Poonam; Christ, Tanya; Chiu, Ming

    2015-01-01

    This study examined how characteristics of Collaborative Peer Video Analysis (CPVA) events are related to teachers' pedagogical outcomes. Data included 39 transcribed literacy video events, in which 14 in-service teachers engaged in discussions of their video clips. Emergent coding and Statistical Discourse Analysis were used to analyze the data.…

  3. Application of a Temporal Reasoning Framework Tool in Analysis of Medical Device Adverse Events

    PubMed Central

    Clark, Kimberly K.; Sharma, Deepak K.; Chute, Christopher G.; Tao, Cui

    2011-01-01

    The Clinical Narrative Temporal Relation Ontology (CNTRO) project offers a semantic-web based reasoning framework, which represents temporal events and relationships within clinical narrative texts and infers new knowledge over them. In this paper, the CNTRO reasoning framework is applied to temporal analysis of medical device adverse event files. One specific adverse event was used as a test case: late stent thrombosis. Adverse event narratives were obtained from the Food and Drug Administration’s (FDA) Manufacturing and User Facility Device Experience (MAUDE) database. 15 adverse event files in which late stent thrombosis was confirmed were randomly selected across multiple drug eluting stent devices. From these files, 81 events and 72 temporal relations were annotated. 73 temporal questions were generated, of which 65 were correctly answered by the CNTRO system. This results in an overall accuracy of 89%. This system should be pursued further to continue assessing its potential benefits in temporal analysis of medical device adverse events. PMID:22195199

  4. Analysis and RHBD technique of single event transients in PLLs

    NASA Astrophysics Data System (ADS)

    Zhiwei, Han; Liang, Wang; Suge, Yue; Bing, Han; Shougang, Du

    2015-11-01

    Single-event transient susceptibility of phase-locked loops has been investigated. The charge pump is the most sensitive component of the PLL to SET, and it is hard to mitigate this effect at the transistor level. A test circuit was designed on a 65 nm process using a new system-level radiation-hardening-by-design technique. Heavy-ion testing was used to evaluate the radiation hardness. Analyses and discussion of the feasibility of this method are also presented.

  5. Events in time: Basic analysis of Poisson data

    SciTech Connect

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for the case in which the rate of occurrence varies randomly. Examples and SAS programs are given.
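
    As a concrete instance of the report's point estimates and confidence intervals, the sketch below computes the maximum-likelihood rate and the standard exact (chi-square based) two-sided interval for a Poisson count; the counts and exposure time are illustrative, and SciPy is assumed in place of the report's SAS programs.

        from scipy import stats

        # x events observed over exposure time t (illustrative numbers).
        x, t = 7, 100.0
        rate = x / t                    # maximum-likelihood point estimate

        # Exact two-sided 90% interval via the chi-square relationship.
        alpha = 0.10
        lower = stats.chi2.ppf(alpha / 2, 2 * x) / (2 * t)
        upper = stats.chi2.ppf(1 - alpha / 2, 2 * (x + 1)) / (2 * t)
        print(f"rate = {rate:.3f}, 90% CI = ({lower:.3f}, {upper:.3f}) per unit time")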

  6. Observation and Analysis of Jovian and Saturnian Satellite Mutual Events

    NASA Technical Reports Server (NTRS)

    Tholen, David J.

    2001-01-01

    The main goal of this research was to acquire high time resolution photometry of satellite-satellite mutual events during the equatorial plane crossing for Saturn in 1995 and Jupiter in 1997. The data would be used to improve the orbits of the Saturnian satellites to support Cassini mission requirements, and also to monitor the secular acceleration of Io's orbit to compare with heat flow measurements.

  7. You Never Walk Alone: Recommending Academic Events Based on Social Network Analysis

    NASA Astrophysics Data System (ADS)

    Klamma, Ralf; Cuong, Pham Manh; Cao, Yiwei

    Combining Social Network Analysis (SNA) and recommender systems is a challenging research field. In scientific communities, recommender systems have been applied to provide useful tools for finding papers, books, and experts. However, academic events (conferences, workshops, international symposiums, etc.) are an important driving force for cooperation among research communities. We present an SNA-based approach to the academic event recommendation problem. Analysis and visualization of scientific communities are performed to provide insight into the communities behind event series. A prototype is implemented based on data from DBLP and EventSeer.net, and the results are examined in order to validate the approach.
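
    A minimal sketch of one way such an SNA-based recommendation could work, assuming NetworkX: build a researcher-researcher graph weighted by co-attended event series, then score unseen events by the strength of the neighbours who attend them. The attendance data and scoring rule are invented for illustration and are not the prototype's actual algorithm.

        import networkx as nx

        # Hypothetical researcher -> attended event series (real data would
        # come from sources such as DBLP and EventSeer.net).
        attendance = {"alice": {"ICWE", "WWW"}, "bob": {"WWW", "ISWC"},
                      "carol": {"ISWC"}, "dave": {"ICWE", "WWW", "ISWC"}}

        # Researcher-researcher graph weighted by number of shared events.
        G = nx.Graph()
        people = list(attendance)
        for i, p in enumerate(people):
            for q in people[i + 1:]:
                shared = attendance[p] & attendance[q]
                if shared:
                    G.add_edge(p, q, weight=len(shared))

        # Recommend events that a target's strongest neighbours attend
        # but the target has not yet visited.
        target = "carol"
        scores = {}
        for nbr, edge in G[target].items():
            for ev in attendance[nbr] - attendance[target]:
                scores[ev] = scores.get(ev, 0) + edge["weight"]
        print(sorted(scores.items(), key=lambda kv: -kv[1]))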

  8. Analysis of Adverse Events in Identifying GPS Human Factors Issues

    NASA Technical Reports Server (NTRS)

    Adams, Catherine A.; Hwoschinsky, Peter V.; Adams, Richard J.

    2004-01-01

    The purpose of this study was to analyze GPS-related adverse events such as accidents and incidents (A/I), Aviation Safety Reporting System (ASRS) reports, and Pilot Deviations (PDs) to create a framework for developing a human factors risk awareness program. Although the occurrence of directly related GPS accidents is small, the frequency of PDs and ASRS reports indicates a growing problem with situational awareness in terminal airspace related to different types of GPS operational issues. This paper addresses the findings of the preliminary research and briefly discusses some of the literature on related GPS and automation issues.

  9. Analysis of broadband seismograms from selected IASPEI events

    USGS Publications Warehouse

    Choy, G.L.; Engdahl, E.R.

    1987-01-01

    Broadband seismograms of body waves that are flat to displacement and velocity in the frequency range from 0.01 to 5.0 Hz can now be routinely obtained for most earthquakes of magnitude greater than about 5.5. These records are obtained either directly or through multichannel deconvolution of waveforms from digitally recording seismograph stations. In contrast to data from conventional narrowband seismographs, broadband records have sufficient frequency content to define the source-time functions of body waves, even for shallow events for which the source functions of direct and surface-reflected phases may overlap. Broadband seismograms for selected IASPEI events are systematically analysed to identify depth phases and the presence of subevents. The procedure results in improved estimates of focal depth, identification of subevents in complex earthquakes, and better resolution of focal mechanisms. We propose that it is now possible for reporting agencies, such as the National Earthquake Information Center, to use broadband digital waveforms routinely in the processing of earthquake data. © 1987.

  10. Analysis of Continuous Microseismic Recordings: Resonance Frequencies and Unconventional Events

    NASA Astrophysics Data System (ADS)

    Tary, J.; van der Baan, M.

    2012-12-01

    Hydrofracture experiments, where fluids and proppant are injected into reservoirs to create fractures and enhance oil recovery, are often monitored using microseismic recordings. The total stimulated volume is then estimated by the size of the cloud of induced micro-earthquakes. This implies that only brittle failure should occur inside reservoirs during the fracturing. Yet, this assumption may not be correct, as the total energy injected into the system is orders of magnitude larger than the total energy associated with brittle failure. Instead of using only triggered events, it has been shown recently that the frequency content of continuous recordings may also provide information on the deformations occurring inside reservoirs. Here, we use different kinds of time-frequency transforms to track the presence of resonance frequencies. We analyze different data sets using regular, long-period and broadband geophones. The resonance frequencies observed are mainly contained in the frequency band of 5-60 Hz. We systematically examine first the possible causes of resonance frequencies, dividing them into source, path and receiver effects. We then conclude that some of the observed frequency bands likely result from source effects. The resonance frequencies could be produced either by interconnected fluid-filled fractures on the order of tens of meters, or by small repetitive events occurring at a characteristic periodicity. Still, other mechanisms may occur or be predominant during reservoir fracturing, depending on the lithology as well as the pressure and temperature conditions at depth. During one experiment, regular micro-earthquakes, long-period long-duration events (LPLD) and resonance frequencies are all observed. The lower part of the frequency band of these resonance frequencies (5-30 Hz) overlaps with the anticipated frequencies of observed LPLDs in other experiments (<50 Hz). The exact origin of both resonance frequencies and LPLDs is still under debate.
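
    Tracking resonance frequencies in continuous recordings comes down to a time-frequency transform plus peak-picking in the band of interest. The SciPy sketch below uses a short-time Fourier spectrogram on a synthetic trace with an emerging 25 Hz line; the sampling rate, window length, and the 5-60 Hz band are inspired by the abstract, while the data are fabricated for illustration.

        import numpy as np
        from scipy import signal

        # Synthetic continuous geophone trace: a 25 Hz resonance emerges
        # halfway through the record (all values are illustrative).
        fs = 250.0
        t = np.arange(0, 600, 1 / fs)
        trace = np.random.default_rng(1).normal(0, 1, t.size)
        trace[t > 300] += 0.5 * np.sin(2 * np.pi * 25 * t[t > 300])

        # Short-time Fourier analysis, restricted to the 5-60 Hz band.
        f, tt, Sxx = signal.spectrogram(trace, fs=fs, nperseg=1024, noverlap=512)
        band = (f >= 5) & (f <= 60)
        dominant = f[band][Sxx[band].argmax(axis=0)]   # per-window peak frequency
        print("median dominant frequency after 300 s:",
              np.median(dominant[tt > 300]), "Hz")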

  11. Combined cardiotocographic and ST event analysis: A review.

    PubMed

    Amer-Wahlin, Isis; Kwee, Anneke

    2016-01-01

    ST-analysis of the fetal electrocardiogram (ECG) (STAN(®)) combined with cardiotocography (CTG) for intrapartum fetal monitoring has been developed following many years of animal research. Changes in the ST-segment of the fetal ECG correlated with fetal hypoxia occurring during labor. In 1993 the first randomized controlled trial (RCT), comparing CTG with CTG + ST-analysis, was published. STAN(®) was introduced into daily practice in 2000. To date, six RCTs have been performed, of which five have been published. Furthermore, there are six published meta-analyses. The meta-analyses showed that CTG + ST-analysis reduced the risks of vaginal operative delivery by about 10% and fetal blood sampling by 40%. There are conflicting results regarding the effect on metabolic acidosis, largely because of controversies about which RCTs should be included in a meta-analysis, and because of differences in the methodology, execution and quality of the meta-analyses. Several cohort studies have been published, some showing a significant decrease of metabolic acidosis after the introduction of ST-analysis. In this review, we discuss not only the scientific evidence from the RCTs and meta-analyses, but also the limitations of these studies. In conclusion, ST-analysis is effective in reducing operative vaginal deliveries and fetal blood sampling, but the effect on neonatal metabolic acidosis is still under debate. Further research is needed to determine the place of ST-analysis in the labor ward for daily practice. PMID:26206514

  12. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
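
    A rough sketch of the wavelet side of such a scheme, assuming PyWavelets: decompose a parameter trend into a compact set of coefficients (what would be stored in the relational tables) and flag windows whose coarse-scale detail energy is anomalous, a stand-in for the SQL-side event query. The trend, wavelet choice, and threshold are all illustrative.

        import numpy as np
        import pywt

        # Hypothetical mean-arterial-pressure trend, one sample per minute.
        rng = np.random.default_rng(2)
        trend = 90 + np.cumsum(rng.normal(0, 0.3, 1024))
        trend[600:660] -= 15            # inject a synthetic hypotensive event

        # Multilevel discrete wavelet transform: a compact, indexable descriptor.
        coeffs = pywt.wavedec(trend, "db4", level=5)
        approx, details = coeffs[0], coeffs[1:]
        print("samples:", trend.size, "-> stored coefficients:",
              approx.size + sum(d.size for d in details))

        # Crude event query: coarse-scale detail energy above 3 sigma, a
        # stand-in for the database-side comparison on stored coefficients.
        energy = details[0] ** 2
        hits = np.flatnonzero(energy > energy.mean() + 3 * energy.std())
        print("candidate event positions (coarse scale):", hits)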

  13. Data integration and analysis using the Heliophysics Event Knowledgebase

    NASA Astrophysics Data System (ADS)

    Hurlburt, Neal; Reardon, Kevin

    The Heliophysics Event Knowledgebase (HEK) system provides an integrated framework for automated data mining using a variety of feature-detection methods; high-performance data systems to cope with over 1TB/day of multi-mission data; and web services and clients for searching the resulting metadata, reviewing results, and efficiently accessing the data products. We have recently enhanced the capabilities of the HEK to support the complex datasets being produced by the Interface Region Imaging Spectrograph (IRIS). We are also developing the mechanisms to incorporate descriptions of coordinated observations from ground-based facilities, including the NSO's Dunn Solar Telescope (DST). We will discuss the system and its recent evolution and demonstrate its ability to support coordinated science investigations.

  14. Perceptually-driven signal analysis for acoustic event classification

    NASA Astrophysics Data System (ADS)

    Philips, Scott M.

    In many acoustic signal processing applications human listeners are able to outperform automated processing techniques, particularly in the identification and classification of acoustic events. The research discussed in this paper develops a framework for employing perceptual information from human listening experiments to improve automatic event classification. We focus on the identification of new signal attributes, or features, that are able to predict the human performance observed in formal listening experiments. Using this framework, our newly identified features have the ability to elevate automatic classification performance closer to the level of human listeners. We develop several new methods for learning a perceptual feature transform from human similarity measures. In addition to providing a more fundamental basis for uncovering perceptual features than previous approaches, these methods also lead to a greater insight into how humans perceive sounds in a dataset. We also develop a new approach for learning a perceptual distance metric. This metric is shown to be applicable to modern kernel-based techniques used in machine learning and provides a connection between the fields of psychoacoustics and machine learning. Our research demonstrates these new methods in the area of active sonar signal processing. There is anecdotal evidence within the sonar community that human operators are adept in the task of discriminating between active sonar target and clutter echoes. We confirm this ability in a series of formal listening experiments. With the results of these experiments, we then identify perceptual features and distance metrics using our novel methods. The results show better agreement with human performance than previous approaches. While this work demonstrates these methods using perceptual similarity measures from active sonar data, they are applicable to any similarity measure between signals.

  15. A regional analysis of event runoff coefficients with respect to climate and catchment characteristics in Austria

    NASA Astrophysics Data System (ADS)

    Merz, Ralf; Blöschl, Günter

    2009-01-01

    In this paper we analyze the controls on the spatiotemporal variability of event runoff coefficients. A total of about 64,000 events in 459 Austrian catchments ranging from 5 to 10000 km2 are analyzed. Event runoff coefficients vary in space, depending on the long-term controls such as climate and catchment formation. Event runoff coefficients also vary in time, depending on event characteristics such as antecedent soil moisture and event rainfall depth. Both types of controls are analyzed separately in the paper. The spatial variability is analyzed in terms of a correlation analysis of the statistical moments of the runoff coefficients and catchment attributes. Mean runoff coefficients are most strongly correlated to indicators representing climate such as mean annual precipitation and the long-term ratio of actual evaporation to precipitation through affecting long-term soil moisture. Land use, soil types, and geology do not seem to exert a major control on runoff coefficients of the catchments under study. The temporal variability is analyzed by comparing the deviation of the event runoff coefficients from their mean depending on event characteristics. The analysis indicates that antecedent soil moisture conditions control runoff coefficients to a higher degree than does event rainfall. The analysis also indicates that soil moisture derived from soil moisture accounting schemes has more predictive power for the temporal variability of runoff coefficients than antecedent rainfall.

  16. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    SciTech Connect

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data were also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information intensive tasks.
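
    Computing event dwell times from millisecond time stamps is a one-line difference operation; the sketch below applies the abstract's 2 s and 10 s interpretation thresholds to a hypothetical stamp sequence.

        import numpy as np

        # Hypothetical millisecond time stamps of workstation events (one task).
        stamps_ms = np.array([0, 800, 1500, 2100, 14500, 15200, 30900, 31500])

        # Event dwell time = gap between consecutive time-stamped events.
        dwell_s = np.diff(stamps_ms) / 1000.0

        # Interpretation used in the study: short dwells (< 2 s) read as
        # search/filtering, long dwells (> 10 s) as information evaluation.
        search = dwell_s[dwell_s < 2.0]
        evaluate = dwell_s[dwell_s > 10.0]
        print(f"search/filter events: {search.size}, evaluation events: {evaluate.size}")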

  17. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang

    2009-09-18

    Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.

  18. Predicting analysis time in events-driven clinical trials using accumulating time-to-event surrogate information.

    PubMed

    Wang, Jianming; Ke, Chunlei; Yu, Zhinuan; Fu, Lei; Dornseif, Bruce

    2016-05-01

    For clinical trials with time-to-event endpoints, predicting the accrual of the events of interest with precision is critical in determining the timing of interim and final analyses. For example, overall survival (OS) is often chosen as the primary efficacy endpoint in oncology studies, with planned interim and final analyses at a pre-specified number of deaths. Often, correlated surrogate information, such as time-to-progression (TTP) and progression-free survival, are also collected as secondary efficacy endpoints. It would be appealing to borrow strength from the surrogate information to improve the precision of the analysis time prediction. Currently available methods in the literature for predicting analysis timings do not consider utilizing the surrogate information. In this article, using OS and TTP as an example, a general parametric model for OS and TTP is proposed, with the assumption that disease progression could change the course of the overall survival. Progression-free survival, related both to OS and TTP, will be handled separately, as it can be derived from OS and TTP. The authors seek to develop a prediction procedure using a Bayesian method and provide detailed implementation strategies under certain assumptions. Simulations are performed to evaluate the performance of the proposed method. An application to a real study is also provided. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26689725

  19. Technical issues: flow cytometry and rare event analysis.

    PubMed

    Hedley, B D; Keeney, M

    2013-06-01

    Flow cytometry has become an essential tool for identification and characterization of hematological cancers and now, due to technological improvements, allows the identification and rapid enumeration of small tumor populations that may be present after induction therapy (minimal residual disease, MRD). The quantitation of MRD has been shown to correlate with relapse and survival rates in numerous diseases, and in certain cases evidence of MRD is used to alter treatment protocols. Recent improvements in hardware allow for high data rate collection. Improved fluorochromes take advantage of violet laser excitation and maximize the signal-to-noise ratio, allowing the population of interest to be isolated in multiparameter space. This isolation, together with a low background rate, permits detection of residual tumor populations in a background of normal cells. When counting such rare events, the distribution is governed by Poisson statistics, with precision increasing with higher numbers of cells collected. In several hematological malignancies, identification of populations at frequencies of 0.01% and lower has been attained. The choice of antibodies used in MRD detection facilitates the definition of a fingerprint to identify abnormal populations throughout treatment. Tumor populations can change phenotype, and an approach that relies on 'different from normal' has proven useful, particularly in the acute leukemias. Flow cytometry can be and is used for detection of MRD in many hematological diseases; however, standardized approaches for specific diseases must be developed to ensure precise identification and enumeration that may alter the course of patient treatment. PMID:23590661
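
    The Poisson precision argument above has a handy closed form: k detected events give a relative precision (coefficient of variation) of 1/sqrt(k), so resolving a population at frequency f with precision cv requires roughly 1/(f * cv^2) total cells. A small sketch:

        import math

        def cells_required(frequency: float, cv: float) -> int:
            """Total cells needed to enumerate a rare population at the given
            frequency with the given relative precision (Poisson counting)."""
            events_needed = 1.0 / cv**2      # k events gives CV = 1/sqrt(k)
            return math.ceil(events_needed / frequency)

        # Example: an MRD population at 0.01% counted to 10% relative precision.
        print(cells_required(frequency=1e-4, cv=0.10))   # -> 1,000,000 cells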

  20. Statistical analysis of geodetic networks for detecting regional events

    NASA Technical Reports Server (NTRS)

    Granat, Robert

    2004-01-01

    We present an application of hidden Markov models (HMMs) to analysis of geodetic time series in Southern California. Our model fitting method uses a regularized version of the deterministic annealing expectation-maximization algorithm to ensure that model solutions are both robust and of high quality.
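
    A minimal stand-in for this kind of HMM segmentation, assuming the third-party hmmlearn package is available: fit a two-state Gaussian HMM to a synthetic displacement series with a step offset and read regime changes off the decoded state sequence. Note that hmmlearn uses plain EM, not the regularized deterministic-annealing EM the abstract describes, and the data are fabricated.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        # Synthetic stand-in for a daily GPS displacement series with a
        # step (offset) event partway through; values are illustrative.
        rng = np.random.default_rng(3)
        series = np.concatenate([rng.normal(0.0, 1.0, 500),
                                 rng.normal(5.0, 1.0, 500)]).reshape(-1, 1)

        # Two-state Gaussian HMM fitted by EM; decode the state sequence.
        model = GaussianHMM(n_components=2, covariance_type="diag",
                            n_iter=100, random_state=0).fit(series)
        states = model.predict(series)
        change_points = np.flatnonzero(np.diff(states)) + 1
        print("detected regime changes at indices:", change_points)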

  1. Twelve Tips for Promoting Significant Event Analysis To Enhance Reflection in Undergraduate Medical Students.

    ERIC Educational Resources Information Center

    Henderson, Emma; Berlin, Anita; Freeman, George; Fuller, Jon

    2002-01-01

    Points out the importance of facilitating reflection and developing reflective abilities in professional development, and describes 12 tips to help undergraduate medical students improve their ability to write reflective and creative event analyses. (Author/YDS)

  2. An analysis of fog events at Belgrade International Airport

    NASA Astrophysics Data System (ADS)

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.
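
    The complex fog criteria quoted above translate directly into a rule check; the sketch below encodes the highest-probability class (relative humidity above 97 %, zero dew-point depression, cloud base below 50 m, calm or weak wind one hour before onset), with "calm or weak" mapped to an assumed 2 m/s cutoff that the abstract does not specify.

        def fog_likely(rel_humidity_pct: float, dewpoint_depression_c: float,
                       cloud_base_m: float, wind_speed_ms: float) -> bool:
            """True when all criteria of the study's highest-probability
            fog class (51.2 %) are met, one hour before suspected onset."""
            return (rel_humidity_pct > 97.0
                    and dewpoint_depression_c == 0.0
                    and cloud_base_m < 50.0
                    and wind_speed_ms <= 2.0)   # "calm or weak": assumed cutoff

        # Observations one hour before a suspected onset (illustrative values).
        print(fog_likely(98.0, 0.0, 40.0, 1.0))   # -> True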

  3. Analysis of 16 plasma vortex events in the geomagnetic tail

    NASA Technical Reports Server (NTRS)

    Birn, J.; Hones, E. W., Jr.; Bame, S. J.; Russell, C. T.

    1985-01-01

    The analysis of 16 plasma vortex occurrences in the magnetotail plasma sheet of Hones et al. (1983) is extended. Two- and three-dimensional plasma measurements and three-dimensional magnetic field measurements were used to study phase relations, energy propagation, and polarization properties. The results point toward an interpretation as a slow strongly damped MHD eigenmode which is generated by tailward traveling perturbations at the low-latitude interface between plasma sheet and magnetosheath.

  4. Subjective well-being and adaptation to life events: a meta-analysis.

    PubMed

    Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E

    2012-03-01

    Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on affective and cognitive well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to 4 family events (marriage, divorce, bereavement, childbirth) and 4 work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being and that for most events the effects of life events on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given. PMID:22059843

  5. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility at the Nevada Test Site for nuclear explosive operations called the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated into the understanding of those scenarios, and assuring that adequate positive measures are in place to control the risk associated with them.

  6. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-12-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.
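
    A toy version of the perceptron stage of such a detector, assuming scikit-learn: train a feedforward MLP on synthetic feature vectors standing in for radio spectrogram patches. MESCAL's self-organizing-map stage and its actual features are omitted; shapes and labels here are fabricated.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        # Synthetic features: rows are candidate detections, columns are
        # hypothetical power-band features from radio recordings.
        rng = np.random.default_rng(4)
        X = np.vstack([rng.normal(0, 1, (300, 16)),        # noise windows
                       rng.normal(1.5, 1, (300, 16))])     # meteor echoes
        y = np.r_[np.zeros(300), np.ones(300)]

        # Feedforward multi-layered perceptron, as in MESCAL's second stage.
        clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500,
                            random_state=0).fit(X, y)
        print(f"training accuracy: {clf.score(X, y):.2f}")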

  7. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-07-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper aims to present the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL allow for valuable detection results, such as high correct detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.

  8. Modeling and analysis of single-event transients in charge pumps

    NASA Astrophysics Data System (ADS)

    Zhenyu, Zhao; Junfeng, Li; Minxuan, Zhang; Shaoqing, Li

    2009-05-01

    It has been shown that charge pumps (CPs) dominate the single-event transient (SET) responses of phase-locked loops (PLLs). Using a pulse to represent a single-event hit on CPs, an SET analysis model is established and the characteristics of SET generation and propagation in PLLs are revealed. An analysis of single-event transients in PLLs demonstrates that the settling time of the voltage-controlled oscillator's (VCO's) control voltage after a single-event strike is strongly dependent on the peak control voltage deviation, the SET pulse width, and the settling time constant. The peak control voltage disturbance decreases with the SET strength or the filter resistance. Furthermore, the analysis in the proposed PLL model is confirmed by simulation results using MATLAB and HSPICE, respectively.

  9. Long-term Statistical Analysis of the Simultaneity of Forbush Decrease Events at Middle Latitudes

    NASA Astrophysics Data System (ADS)

    Lee, Seongsuk; Oh, Suyeon; Yi, Yu; Evenson, Paul; Jee, Geonhwa; Choi, Hwajin

    2015-03-01

    Forbush Decreases (FD) are transient, sudden reductions of cosmic ray (CR) intensity lasting a few days to a week. Such events are observed globally using ground neutron monitors (NMs). Most studies of FD events indicate that an FD event is observed simultaneously at NM stations located all over the Earth. However, using statistical analysis, previous researchers verified that while FD events could occur simultaneously, in some cases they could occur non-simultaneously. Previous studies confirmed the statistical reality of non-simultaneous FD events and the mechanism by which they occur, using data from high-latitude and middle-latitude NM stations. In this study, we used long-term data (1971-2006) from middle-latitude NM stations (Irkutsk, Climax, and Jungfraujoch) to enhance statistical reliability. According to the results of this analysis, the variation of cosmic ray intensity during the main phase is larger, to a statistically significant degree, for simultaneous FD events than for non-simultaneous ones. Moreover, the distribution of main-phase-onset times shows differences that are statistically significant. While the onset times of the simultaneous FDs are distributed evenly over 24-hour intervals (day and night), those of non-simultaneous FDs are mostly distributed over 12-hour intervals, in daytime. Thus, the existence of the two kinds of FD events, distinguished by differences in their statistical properties, was verified based on data from middle-latitude NM stations.

  10. Computational methods for analysis of dynamic events in cell migration.

    PubMed

    Castañeda, V; Cerda, M; Santibáñez, F; Jara, J; Pulgar, E; Palma, K; Lemus, C G; Osorio-Reich, M; Concha, M L; Härtel, S

    2014-02-01

    Cell migration is a complex biological process that involves changes in shape and organization at the sub-cellular, cellular, and supra-cellular levels. Individual and collective cell migration can be assessed in vitro and in vivo, from the flagellar-driven movement of single sperm cells or bacteria, bacterial gliding and swarming, and amoeboid movement to the orchestrated movement of collective cell migration. One key technology for accessing migration phenomena is the combination of optical microscopy with image processing algorithms. This approach resolves simple motion estimation (e.g. preferred direction of migrating cells or path characteristics), but can also reveal more complex descriptors (e.g. protrusions or cellular deformations). In order to ensure an accurate quantification, the phenomena under study, their complexity, and the required level of description need to be addressed by an adequate experimental setup and processing pipeline. Here, we review typical processing workflows, starting with image acquisition, restoration (noise and artifact removal, signal enhancement), registration, analysis (object detection, segmentation and characterization) and interpretation (high-level understanding). Image processing approaches for quantitative description of cell migration in 2- and 3-dimensional image series, including registration, segmentation, shape and topology description, tracking and motion fields, are presented. We discuss the advantages, limitations and suitability of different approaches and levels of description. PMID:24467201

  11. Constraining shallow seismic event depth via synthetic modeling for Expert Technical Analysis at the IDC

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.

    2015-12-01

    Depth of event is an important criterion for seismic event screening at the International Data Center, CTBTO. However, a thorough determination of event depth can mostly be conducted only through special analysis, because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have their depth constrained to the surface. When the true origin depth is greater than that reasonable for a nuclear test (3 km, based on existing observations), this may result in a heavier workload to manually distinguish between shallow and deep events. Also, the IDC depth criterion is not applicable to events with a small t(pP-P) travel-time difference, which is the case for a nuclear test. Since the shape of the first few seconds of signal from very shallow events is very sensitive to the presence of the depth phase, cross-correlation between observed and theoretical seismograms can provide an estimate of the depth of the event, and so extend the screening process. We exercised this approach mostly with events at teleseismic and, in part, regional distances. We found that this approach can be very efficient for the seismic event screening process, with certain caveats related mostly to poorly defined crustal models at source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with Robert Herrmann's hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified for complex source topography.
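
    The cross-correlation screening idea reduces to scoring trial-depth synthetics against the observed first seconds of signal. The sketch below does this with plain normalized dot products on toy P-plus-pP pulse shapes; the pulse model, depths, and lag values are invented for illustration and are not the hudson96 synthetics used in the study.

        import numpy as np

        def best_depth(observed, synthetics_by_depth):
            """Return the trial depth whose synthetic correlates best with
            the observed trace (normalized cross-correlation at zero lag;
            traces are assumed aligned and equally sampled)."""
            scores = {}
            for depth, syn in synthetics_by_depth.items():
                denom = np.linalg.norm(observed) * np.linalg.norm(syn)
                scores[depth] = np.dot(observed, syn) / denom
            return max(scores, key=scores.get), scores

        # Toy waveform: direct P pulse minus a weaker, delayed pP pulse;
        # the depth-to-lag mapping below is invented.
        t = np.linspace(0, 5, 500)
        def make(lag):
            p = np.exp(-((t - 1.0) ** 2) * 40)
            return p - 0.6 * np.exp(-((t - 1.0 - lag) ** 2) * 40)

        synthetics = {0.5: make(0.15), 1.0: make(0.30), 3.0: make(0.90)}
        obs = make(0.30) + np.random.default_rng(5).normal(0, 0.02, t.size)
        print("best-fitting depth (km):", best_depth(obs, synthetics)[0])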

  12. Extreme rainfall analysis based on precipitation events classification in Northern Italy

    NASA Astrophysics Data System (ADS)

    Campo, Lorenzo; Fiori, Elisabetta; Molini, Luca

    2016-04-01

    Extreme rainfall statistical analysis comprises a consolidated family of techniques for studying the frequency and the statistical properties of high-intensity meteorological events. This kind of technique is well established and includes standard approaches such as fitting the GEV (Generalized Extreme Value) or TCEV (Two Components Extreme Value) probability distribution to the data recorded by a given raingauge at a given location. Regionalization techniques, which aim to spatialize the analysis over medium-to-large regions, are also well established and operationally used. In this work a novel procedure is proposed in order to statistically characterize the rainfall extremes in a given region, based on an "event-based" approach. Given a temporal sequence of continuous rain maps, an "event" is defined as an aggregate, continuous in time and space, of cells whose rainfall height value is above a certain threshold. Based on this definition it is possible to classify, for a given region and period, a population of events and characterize them with a number of statistics, such as their total volume, maximum spatial extension, duration, average intensity, etc. The population of events so obtained constitutes the input to a novel extreme-value characterization technique: given a certain spatial scale, a mobile window analysis is performed and all the events that fall in the window are analysed from an extreme-value point of view. For each window, the extreme annual events are considered: maximum total volume, maximum spatial extension, maximum intensity and maximum duration are all considered for an extreme analysis, and the corresponding probability distributions are fitted. In this way the analysis statistically characterizes the most intense events and, at the same time, spatializes these rain characteristics, exploring their variability in space. This methodology was employed on rainfall fields obtained by interpolation of
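
    For the per-window extreme-value step, fitting a GEV to the annual maxima of an event statistic is standard; below is a SciPy sketch with fabricated annual maxima and a 100-year return-level query.

        import numpy as np
        from scipy.stats import genextreme

        # Hypothetical annual maxima of per-event total rainfall volume for
        # one mobile window (synthetic numbers, one value per year).
        annual_max = np.array([48., 61., 55., 90., 72., 66., 110., 58., 83., 95.,
                               70., 64., 102., 77., 88., 59., 93., 69., 81., 75.])

        # Fit a GEV distribution to the block maxima.
        shape, loc, scale = genextreme.fit(annual_max)

        # 100-year return level: the quantile exceeded with probability 1/100.
        rl_100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
        print(f"GEV shape={shape:.2f}, 100-year return level = {rl_100:.1f}")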

  13. Analysis of single events in ultrarelativistic nuclear collisions: A new method to search for critical fluctuations

    SciTech Connect

    Stock, R.

    1995-07-15

    The upcoming generation of experiments with ultrarelativistic heavy nuclear projectiles, at the CERN SPS and at RHIC and LHC, will confront researchers with several thousand identified hadrons per event, suitable detectors provided. An analysis of individual events becomes meaningful for a multitude of hadronic signals thought to reveal a transient deconfinement phase transition, or the related critical precursor fluctuations. Transverse momentum spectra, the kaon to pion ratio, and pionic Bose-Einstein correlations are examined, showing how to separate the extreme, probably rare candidate events from the bulk of average events. This type of observable can already be investigated with the Pb beam of the SPS. The author then discusses single-event signals that add to the above at RHIC and LHC energies: kaon interferometry, rapidity fluctuations, and jet and γ production.

  14. Human Reliability Analysis for Small Modular Reactors

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2012-06-01

    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized to SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provided preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  15. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    PubMed

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd. PMID:26928768
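
    The follow-up problem described above is easy to demonstrate numerically: with unequal observation periods, naive proportions and exposure-adjusted incidence rates can rank two treatment arms in opposite orders. All numbers below are invented for illustration.

        # Naive proportions vs exposure-adjusted incidence rates when
        # follow-up differs between arms (illustrative numbers only).
        events_a, n_a, years_a = 30, 200, 150.0   # arm A: early discontinuation
        events_b, n_b, years_b = 40, 200, 400.0   # arm B: much longer follow-up

        # Naive proportion suggests arm B is more harmful ...
        print(f"proportions: A {events_a / n_a:.2%}  B {events_b / n_b:.2%}")

        # ... while the rate per 100 patient-years reverses the picture,
        # which is why survival-time methods are needed for fair comparison.
        print(f"rates/100py: A {100 * events_a / years_a:.1f}  "
              f"B {100 * events_b / years_b:.1f}")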

  16. [Analysis of the impact of two typical air pollution events on the air quality of Nanjing].

    PubMed

    Wang, Fei; Zhu, Bin; Kang, Han-Qing; Gao, Jin-Hui; Wang, Yin; Jiang, Qi

    2012-10-01

    Nanjing and the surrounding area experienced two consecutive serious air pollution events from late October to early November in 2009. The first event was long-lasting haze pollution, and the second event resulted from the mixed impact of crop residue burning and local transportation. The effects of regional transport and local sources on the two events were discussed by cluster analysis, using surface meteorological observations, the air pollution index, satellite remote sensing of fire hot spots, and a back trajectory model. The results showed that the accumulation-mode aerosol number concentrations were higher than those of any other aerosol mode in the two pollution processes. The peak of the aerosol particle number concentration shifted to larger particle sizes compared with previous studies in this area. The ratio of SO4(2-)/NO3(-) was 1.30 and 0.99, indicating that stationary sources were more important than traffic sources in the first event and the reverse in the second event. Affected by local sources from the east and south, particle counts below 0.1 microm gradually accumulated in the first event. The second event was mainly affected by short-distance transport from the northeast and local sources from the southwest, and especially the south, where the concentration of aerosol particles was higher than in other directions, indicating that the sources of crop residue burning were mainly in this direction. PMID:23234001

  17. Seasonality analysis of hydrological characteristics and flash flood events in Greece

    NASA Astrophysics Data System (ADS)

    Koutroulis, A. G.; Tsanis, I. K.

    2009-04-01

    The seasonality of flash flood occurrence is strongly connected to the climate forcing mechanisms of each region. Hydrological characteristics such as precipitation and stream flow depict the regional climate mechanisms. Comparison of daily and mean monthly seasonality of selected precipitation and runoff characteristics reveals valuable information within the context of flood occurrence. This study presents preliminary findings of a seasonality analysis of flash flood events that occurred in Greece during the 1925-2007 period, in combination with a seasonality analysis of their hydrological characteristics. A two-level approach at the national (Greece) and regional (Crete Island) level was followed, using a total of 206 flood events. Twenty-two of these flood events enriched the European Flash Flood database, which is being developed in the HYDRATE project. The analysis of hydrological characteristics through seasonality indices was based on a dataset of 83 monthly and daily precipitation stations and, additionally, 22 monthly and 15 daily flow stations. The analysis concludes that on the island of Crete, the flood event-based seasonality coincides with the seasonality of the daily precipitation maxima during December and January. The seasonality of the 3 largest long-term daily precipitation maxima indicates that 50% of the maximum precipitation events occur during the November-December-January (NDJ) period. The event-based seasonality analysis for Greece indicated that 57% of the events occur during the NDJ period. The annual maximum daily precipitation lags the maximum annual stream flows for Crete by approximately one month. This is due to the snow melting process, the low soil percolation rates of the winter period and the high baseflow of the local karstic aquifers that contribute to the maximum flows. The results will be compared with six different hydrometeorological regions within Europe in the frame of the HYDRATE project, in order to

  18. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
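
    The core of matched filtering is a normalized cross-correlation of a parent template against the continuous record. Below is a single-channel toy sketch; the real MFA stacks multichannel correlations and adds the location and magnitude steps, and all data here are synthetic.

        import numpy as np

        def detect_children(continuous, template, threshold=0.7):
            """Slide a parent-event template along continuous data and flag
            'child' events where the normalized cross-correlation exceeds
            a threshold (single-channel sketch of the stacked MFA)."""
            m = template - template.mean()
            nt = m.size
            cc = np.empty(continuous.size - nt + 1)
            for i in range(cc.size):
                w = continuous[i:i + nt] - continuous[i:i + nt].mean()
                denom = np.linalg.norm(w) * np.linalg.norm(m)
                cc[i] = np.dot(w, m) / denom if denom > 0 else 0.0
            return np.flatnonzero(cc > threshold), cc

        # Toy data: a template wavelet buried twice in noise at known offsets.
        rng = np.random.default_rng(6)
        tmpl = np.sin(2 * np.pi * np.linspace(0, 3, 120)) * np.hanning(120)
        data = rng.normal(0, 0.3, 2000)
        data[400:520] += tmpl
        data[1500:1620] += 0.5 * tmpl       # weaker child event
        picks, cc = detect_children(data, tmpl, threshold=0.6)
        print("detections near samples:", picks[:3], "...")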

  19. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-05-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analyzing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multi-channel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multi-component waveforms into the ray-centered co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, i.e. microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavorable S/N conditions. A real-data example using microseismic monitoring data from 4 stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than four-fold) increase in the number of located events compared with the original catalog. Moreover, analysis of the new MFA catalog suggests that this approach leads to more robust interpretation of the

  20. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides and volcanic eruptions. It lies in the intertropical convergence zone (ITCZ) and presents a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and precipitation is observed on 70% of the days of the year. This rain, favored by the formation of large masses of clouds and by macroclimatic phenomena such as the El Niño-Southern Oscillation, has historically caused great impacts in the region (Vélez et al, 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data on up to eight climate variables. Some of these stations keep 10 years of historical data. However, until now this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical rain events in an urban area of Manizales and investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. In particular, it helped to identify the influence of different meteorological variables in triggering rainfall events in hazardous areas such as the city of Manizales.
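
    A minimal sketch of the event-classification step follows. It assumes one common formulation of the Monjo (2009) n index, in which the maximum average intensity over a duration t scales as I_max(t) ~ t^(-n), so that n can be estimated by log-log regression; the time step, units, and the assumption of a nonzero rainfall series are illustrative.

      import numpy as np

      def n_index(rain, dt_minutes=5.0):
          """Estimate the heavy-rainfall 'n' index of one event by fitting
          I_max(t) ~ t^(-n), where I_max(t) is the maximum average intensity
          over any window of duration t within the event.

          rain : 1-D array of rainfall depths per time step (mm), not all zero
          """
          durations, intensities = [], []
          for w in range(1, len(rain) + 1):
              sums = np.convolve(rain, np.ones(w), mode="valid")  # moving sums
              t_hours = w * dt_minutes / 60.0
              durations.append(t_hours)
              intensities.append(sums.max() / t_hours)            # mm/h
          # slope of log I_max versus log t is -n
          slope, _ = np.polyfit(np.log(durations), np.log(intensities), 1)
          return -slope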

  1. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2016-04-01

    In order to gain a better understanding of geodynamic processes in the Hellenic subduction zone (HSZ), in particular in its eastern part, we analyze a cluster of intermediate-depth events in the region of the Nisyros volcano. The cluster, recorded during the deployment of the temporary seismic network EGELADOS, consists of 159 events at 80 to 200 km depth with local magnitudes ranging from 0.2 to 4.1. The network itself consisted of 56 onshore and 23 offshore broadband stations, complemented by 19 permanent stations from NOA, GEOFON and MedNet. It was deployed from September 2005 to March 2007 and covered the entire HSZ. Here, both spatial and temporal clustering of the recorded events is studied using three-component similarity analysis. Waveform cross-correlation was performed for all event combinations using data recorded at 45 onshore stations. The results are shown as a function of frequency for individual stations and as averaged values over the network. The cross-correlation coefficients at single stations show decreasing similarity with increasing epicentral distance, as well as the effect of local heterogeneities at particular stations, which cause noticeable differences in waveform similarity. Event relocation was performed using the double-difference earthquake relocation software HypoDD, and the results are compared with previously obtained single-event locations calculated using the nonlinear location tool NonLinLoc and station corrections. For the relocation, both differential travel times obtained by separate cross-correlation of P- and S-waveforms and manual readings of onset times are used. It is shown that after the relocation the inter-event distance for highly similar events is reduced. By comparing the results of the cluster analysis with results obtained from synthetic catalogs, in which the event rate, proportion and occurrence time of the aftershocks are varied, it is shown that the event
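
    The pairwise waveform-similarity analysis described here can be illustrated with a short Python sketch: build a cross-correlation matrix for aligned event waveforms, convert it to a distance, and cut a hierarchical clustering tree at a similarity threshold. The 0.8 threshold and the 'average' linkage are illustrative assumptions, not the study's settings.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      def cluster_by_waveform_similarity(events, cc_threshold=0.8):
          """Group events whose (aligned, filtered) waveforms are highly similar.

          events : (n_events, n_samples) array, one waveform per row
          Returns an integer cluster label per event.
          """
          n = len(events)
          cc = np.ones((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  a = events[i] - events[i].mean()
                  b = events[j] - events[j].mean()
                  cc[i, j] = cc[j, i] = np.dot(a, b) / (
                      np.linalg.norm(a) * np.linalg.norm(b))
          dist = 1.0 - cc                   # similarity -> distance
          np.fill_diagonal(dist, 0.0)
          z = linkage(squareform(dist, checks=False), method="average")
          return fcluster(z, t=1.0 - cc_threshold, criterion="distance")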

  2. Sources of Error and the Statistical Formulation of Ms:mb Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb) computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh wave magnitude (denoted Ms) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger Ms magnitude than explosions. This article proposes a hypothesis test (screening analysis) using Ms and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the announced 2009 Democratic People's Republic of Korea nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
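
    The structure of such a screening test can be sketched as a one-sided z-type test on the Ms - mb score, with the standard error inflated by a model-inadequacy term as the article advocates. All numeric constants below (decision line, error terms, significance level) are placeholders, not the IDC's operational values.

      from math import sqrt
      from scipy.stats import norm

      def ms_mb_screen(ms, mb, d0=-0.64, sigma_ms=0.2, sigma_mb=0.15,
                       sigma_model=0.1, alpha=0.05):
          """Illustrative Ms:mb screening test.

          H0: 'explosion characteristics'.  The event is screened out as
          earthquake-like when Ms - mb exceeds the decision line d0 by a
          significant margin; sigma_model inflates the standard error to
          represent physical correction model inadequacy.
          """
          se = sqrt(sigma_ms**2 + sigma_mb**2 + sigma_model**2)
          z = (ms - mb - d0) / se
          p_value = norm.sf(z)                 # one-sided test
          return z, p_value, p_value < alpha   # True => reject H0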

  3. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    PubMed Central

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background: Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings: We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD event risk (0.91, 0.85 to 0.97, per 0.1 mEq/L; P for nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P for nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance: There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD event risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
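
    The pooling step behind the reported relative risks can be illustrated with a fixed-effect inverse-variance sketch, recovering per-study standard errors from the 95% confidence intervals. This is a simplification of the study's methods, which also include random-effects, dose-response, and nonlinear models.

      import numpy as np

      def pool_relative_risks(rr, ci_low, ci_high):
          """Fixed-effect inverse-variance pooling of study relative risks.

          rr, ci_low, ci_high : arrays of per-study RRs and 95% CI bounds
          Returns the pooled RR and its 95% CI.
          """
          log_rr = np.log(rr)
          se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # from the CIs
          w = 1.0 / se**2
          pooled = np.sum(w * log_rr) / np.sum(w)
          pooled_se = np.sqrt(1.0 / np.sum(w))
          lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
          return np.exp(pooled), (np.exp(lo), np.exp(hi))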

  4. EVENT TREE ANALYSIS AT THE SAVANNAH RIVER SITE: A CASE HISTORY

    SciTech Connect

    Williams, R

    2009-05-25

    At the Savannah River Site (SRS), a Department of Energy (DOE) installation in west-central South Carolina, there is a unique geologic stratum at depth that has the potential to cause surface settlement resulting from a seismic event. In the past the stratum in question has been remediated via pressure grouting; however, the benefits of remediation have always been debatable. Recently the SRS has attempted to frame the issue in terms of risk via an event tree, or logic tree, analysis. This paper describes that analysis, including the input data required.
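
    Computationally, an event tree of this kind reduces to multiplying conditional branch probabilities along each path from the initiating event and summing end-state frequencies. The sketch below illustrates this; the branch structure and all numbers are invented placeholders, not SRS values.

      # Illustrative event tree: initiating-event frequency times the
      # conditional probabilities along each branch path.
      INITIATOR_FREQ = 1.0e-4      # seismic initiating event, per year (placeholder)

      sequences = {
          "no_settlement":          [0.90],
          "settlement_no_damage":   [0.10, 0.70],
          "settlement_with_damage": [0.10, 0.30],
      }

      for name, branches in sequences.items():
          freq = INITIATOR_FREQ
          for p in branches:
              freq *= p            # multiply along the path
          print(f"{name}: {freq:.2e} per year")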

  5. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    SciTech Connect

    Lisbeth A. Mitchell

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information", requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not-reportable events) identified at INL during the period of October 2012 through September 2013.

  6. Complete dose analysis of the November 12, 1960 solar cosmic ray event.

    PubMed

    Masley, A J; Goedeke, A D

    1963-01-01

    A detailed analysis of the November 12, 1960 solar cosmic ray event is presented as an integrated space flux and dose. This event is probably the most interesting solar cosmic ray event studied to date. Direct measurements were made of solar protons from 10 MeV to 6 GeV. During the double-peaked, high-energy part of the event, evidence is presented for the trapping of relativistic particles in a magnetic cloud. The proton energy spectrum is divided into 3 energy intervals, with separate energy power law exponents and time profiles carried through for each. The three groups are: (1) (30… Included in the analysis are the results of rocket measurements which determined the spectrum down to 10 MeV twice during the event; balloon results from Fort Churchill and Minneapolis; earth satellite measurements; neutron monitors in New Hampshire and at both the North and South Poles; and riometer results from Alaska and Kiruna, Sweden. The results are given in Table 1 [see text]. The results of our analyses of other solar cosmic ray events are also included with a general discussion of the solar flare hazards in space. PMID:12056429

  7. FFTF Event Fact Sheet root cause analysis calendar year 1985 through 1988

    SciTech Connect

    Griffin, G.B.

    1988-12-01

    The Event Fact Sheets written from January 1985 through mid August 1988 were reviewed to determine their root causes. The review group represented many of the technical disciplines present in plant operation. The review was initiated as an internal critique aimed at maximizing the "lessons learned" from the event reporting system. The root causes were subjected to a Pareto analysis to determine the significant causal factor groups. Recommendations for correction of the high frequency causal factors were then developed and presented to the FFTF Plant management. In general, the distributions of the causal factors were found to closely follow the industry averages. The impacts of the events were also studied and it was determined that we generally report events of a level of severity below that of the available studies. Therefore it is concluded that the recommendations for corrective action are ones to improve the overall quality of operations and not to correct significant operational deficiencies. 17 figs.

  8. Analysis of Pressurized Water Reactor Primary Coolant Leak Events Caused by Thermal Fatigue

    SciTech Connect

    Atwood, Corwin Lee; Shah, Vikram Naginbhai; Galyean, William Jospeh

    1999-09-01

    We present statistical analyses of pressurized water reactor (PWR) primary coolant leak events caused by thermal fatigue, and discuss their safety significance. Our worldwide data contain 13 leak events (through-wall cracking) in 3509 reactor-years, all in stainless steel piping with diameter less than 25 cm. Several types of data analysis show that the frequency of leak events (events per reactor-year) is increasing with plant age, and the increase is statistically significant. When an exponential trend model is assumed, the leak frequency is estimated to double every 8 years of reactor age, although this result should not be extrapolated to plants much older than 25 years. Difficulties in arresting this increase include lack of quantitative understanding of the phenomena causing thermal fatigue, lack of understanding of crack growth, and difficulty in detecting existing cracks.
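
    The exponential trend model mentioned here can be sketched as an inhomogeneous Poisson process with rate lambda(t) = exp(a + b*t), fitted by maximum likelihood; the doubling time is then ln(2)/b. The event ages and observation window below are invented for illustration, not the study's data.

      import numpy as np
      from scipy.optimize import minimize

      event_ages = np.array([8.0, 12.5, 14.0, 17.5, 19.0, 21.0, 23.5])  # invented
      t_max = 25.0                        # notional observation window (years)

      def neg_loglik(params):
          a, b = params
          # log L = sum_i log lambda(t_i) - integral_0^T lambda(t) dt
          integral = np.exp(a) * (np.exp(b * t_max) - 1.0) / b
          return -(np.sum(a + b * event_ages) - integral)

      res = minimize(neg_loglik, x0=np.array([-3.0, 0.05]))
      a_hat, b_hat = res.x
      print(f"estimated doubling time: {np.log(2) / b_hat:.1f} years")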

  9. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    SciTech Connect

    Attrill, Gemma D. R.

    2010-07-20

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of a purely MHD wave interpretation for the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions and result from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  10. Human factors analysis and design methods for nuclear waste retrieval systems. Volume IV. Computerized Event-Tree Analysis Technique

    SciTech Connect

    Deretsky, Z.; Casey, S.M.

    1980-08-01

    This document contains a program listing and brief description of CETAT, the Computerized Event-Tree Analysis Technique. CETAT was developed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities associated with the tasks required during the retrieval of spent fuel canisters. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks.

  11. Measurements and data analysis of suburban development impacts on runoff event characteristics and unit hydrographs

    NASA Astrophysics Data System (ADS)

    Sillanpää, Nora; Koivusalo, Harri

    2014-05-01

    Urbanisation strongly changes the catchment hydrological response to rainfall. Monitoring data on hydrological variables are most commonly available from rural and large areas, but less so from urban areas, and rarely from small catchments undergoing hydrological changes during the construction processes associated with urban development. Moreover, changes caused by urbanisation in the catchment hydrological response to snowmelt have not been widely studied. In this study, the changes occurring in runoff generation were monitored in a developing catchment under construction and in two urban control catchments. The developing catchment experienced extreme change from forest to a suburban residential area. The data used included rainfall and runoff observations from a five-year period (the years 2001-2006) with 2 to 10 minute temporal resolution. In total, 636 and 239 individual runoff events were investigated for summer and winter conditions, respectively. The changes occurring in runoff event characteristics such as event runoff volumes, peak flow rates, mean runoff intensities, and volumetric runoff coefficients were identified by means of exploratory data analysis and nonparametric comparison tests (the Kruskal-Wallis and the Mann-Whitney tests). The effect of urbanization on event runoff dynamics was investigated using instantaneous unit hydrographs (IUH) based on a two-parameter gamma distribution. The measurements and data analyses demonstrated how the impact of urbanization on runoff was best detected based on peak flow rates, volumetric runoff coefficients, and mean runoff intensities. Control catchments were essential to distinguish the hydrological impact caused by catchment characteristics from those caused by changes in the meteorological conditions or season. As the imperviousness of the developing catchment increased from 1.5% to 37%, significant increases were observed in event runoff depths and peak flows during rainfall-runoff events. At the
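
    The unit-hydrograph step can be sketched as a convolution of effective rainfall with a unit-area, two-parameter gamma IUH; the parameter names, units, and the truncation of the IUH tail below are illustrative assumptions.

      import numpy as np
      from scipy.stats import gamma

      def gamma_iuh_runoff(rain_eff, shape, scale, dt=1.0):
          """Convolve effective rainfall with a two-parameter gamma IUH.

          rain_eff : effective rainfall per time step
          shape, scale : gamma parameters (scale in the same unit as dt)
          """
          n = max(int(10 * shape * scale / dt), 1)
          t = dt * np.arange(1, n + 1)     # skip t = 0, where the pdf may diverge
          iuh = gamma.pdf(t, a=shape, scale=scale)   # unit-area response
          return np.convolve(rain_eff, iuh * dt)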

  12. Two Point Autocorrelation Analysis of Auger Highest Energy Events Backtracked in Galactic Magnetic Field

    NASA Astrophysics Data System (ADS)

    Petrov, Yevgeniy

    2009-10-01

    Searches for sources of the highest-energy cosmic rays traditionally have included looking for clusters of event arrival directions on the sky. The smallest cluster is a pair of events falling within some angular window. In contrast to the standard two-point (2-pt) autocorrelation analysis, this work takes into account the influence of the galactic magnetic field (GMF). The highest-energy events, those above 50 EeV, collected by the surface detector of the Pierre Auger Observatory between January 1, 2004 and May 31, 2009, are used in the analysis. Assuming protons as primaries, events are backtracked through the BSS/S, BSS/A, ASS/S and ASS/A versions of the Harari-Mollerach-Roulet (HMR) model of the GMF. For each version of the model, a 2-pt autocorrelation analysis is applied to the backtracked events and to 10^5 isotropic Monte Carlo realizations weighted by the Auger exposure. Scans in energy, separation angular window and different model parameters reveal clustering at different angular scales. Small-angle clustering at 2-3 deg is particularly interesting, and it is compared between different field scenarios. The strength of the autocorrelation signal at those angular scales differs between the BSS and ASS versions of the HMR model. The BSS versions of the model tend to defocus protons as they arrive at Earth, whereas the ASS versions, on the contrary, are more likely to focus them.
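
    The elementary operation in a 2-pt autocorrelation analysis is counting event pairs closer than a given angular separation; the same count over exposure-weighted isotropic Monte Carlo sets then calibrates the significance. A minimal sketch of the pair count (applied after backtracking, which is not shown) follows.

      import numpy as np

      def pair_count(ra_deg, dec_deg, max_sep_deg):
          """Count event pairs separated by less than max_sep_deg degrees."""
          ra, dec = np.radians(ra_deg), np.radians(dec_deg)
          v = np.column_stack([np.cos(dec) * np.cos(ra),   # unit vectors
                               np.cos(dec) * np.sin(ra),
                               np.sin(dec)])
          cosd = np.clip(v @ v.T, -1.0, 1.0)
          sep = np.degrees(np.arccos(cosd))
          iu = np.triu_indices(len(ra), k=1)               # each pair once
          return int(np.sum(sep[iu] < max_sep_deg))

    Comparing this count for the data with its distribution over the isotropic realizations gives the significance of any clustering signal at that angular scale.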

  13. Analysis of suprathermal proton events observed by STEREO/PLASTIC focusing on the observation of bow shock/magnetospheric events

    NASA Astrophysics Data System (ADS)

    Barry, J. A.; Galvin, A. B.; Popecki, M.; Klecker, B.; Kucharek, H.; Simunac, K.; Farrugia, C. J.; Luhmann, J. G.; Jian, L. K.

    2013-06-01

    The topic of suprathermal and energetic ion events upstream of the Earth's bow shock has been investigated since the late 1960s. Over the past 50 years, these events have been characterized as having energies ranging from just above solar wind energies up to 2 MeV, time spans of minutes to hours, and particle distributions ranging from field-aligned to isotropic. The seed particles of these events, accelerated within the magnetosphere and/or at the Earth's bow shock, have been shown to be ions originating in the magnetosphere and the solar wind, as well as ions energized in other heliospheric processes (such as solar energetic particle (SEP) events, corotating interaction regions (CIRs), pick-up ions, etc.). In this study we utilize STEREO/PLASTIC to examine bow shock/magnetospheric energetic proton events observed throughout 2007 in the region far upstream of the Earth's ion foreshock. To do this, we first employ an automated procedure to identify suprathermal proton events in the energy range of 4 keV up to 80 keV. The occurrence of events, magnetic connection to the Earth, and Compton-Getting-transformed energy spectra of 66 possible STA bow shock/magnetospheric events are investigated as a function of spacecraft-Earth separation.

  14. An Event History Analysis of Teacher Attrition: Salary, Teacher Tracking, and Socially Disadvantaged Schools

    ERIC Educational Resources Information Center

    Kelly, Sean

    2004-01-01

    In this event history analysis of the 1990-1991 Schools and Staffing Survey and the 1992 Teacher Follow-up Survey, a retrospective person-year database was constructed to examine teacher attrition over the course of the teaching career. Consistent with prior research, higher teacher salaries reduced attrition, but only slightly so. Teacher…

  15. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...
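
    The truncated abstract does not say which treatment it recommends, but one standard option is to treat artificially terminated events as right-censored durations, as in the minimal Kaplan-Meier sketch below.

      import numpy as np

      def kaplan_meier(durations, ended_by_insect):
          """Survival curve for behavior durations with right censoring.

          durations : array of event durations
          ended_by_insect : True where the insect ended the behavior
          (observed), False where the experimenter did (censored).
          Returns a list of (time, survival probability) pairs.
          """
          durations = np.asarray(durations, dtype=float)
          observed = np.asarray(ended_by_insect, dtype=bool)
          s, curve = 1.0, []
          for t in np.unique(durations[observed]):
              at_risk = np.sum(durations >= t)
              ended = np.sum(durations[observed] == t)
              s *= 1.0 - ended / at_risk
              curve.append((t, s))
          return curve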

  16. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    ERIC Educational Resources Information Center

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  17. Effects of Interventions on Relapse to Narcotics Addiction: An Event-History Analysis.

    ERIC Educational Resources Information Center

    Hser, Yih-Ing; And Others

    1995-01-01

    Event-history analysis was applied to the life history data of 581 male narcotics addicts to specify the concurrent, postintervention, and durational effects of social interventions on relapse to narcotics use. Results indicate the advisability of supporting methadone maintenance with other prevention strategies. (SLD)

  18. Uniting Secondary and Postsecondary Education: An Event History Analysis of State Adoption of Dual Enrollment Policies

    ERIC Educational Resources Information Center

    Mokher, Christine G.; McLendon, Michael K.

    2009-01-01

    This study, as the first empirical test of P-16 policy antecedents, reports the findings from an event history analysis of the origins of state dual enrollment policies adopted between 1976 and 2005. First, what characteristics of states are associated with the adoption of these policies? Second, to what extent do conventional theories on policy…

  19. Analysis of adverse events of sunitinib in patients treated for advanced renal cell carcinoma

    PubMed Central

    Cedrych, Ida; Jasiówka, Marek; Niemiec, Maciej; Skotnicki, Piotr

    2016-01-01

    Introduction: Treatment of the metastatic stage of renal cell carcinoma is specific because classical chemotherapy is not applicable here. The treatment is mainly based on molecularly targeted drugs, including inhibitors of tyrosine kinases. In many cases the therapy takes many months, and patients often report to general practitioners due to adverse events. In this article, the effectiveness and side effects of one of these drugs are presented. The aim of the study was to analyse the toxicity and safety of treatment with sunitinib malate in patients with clear cell renal cell carcinoma in the metastatic stage. Material and methods: Adverse events were analyzed using retrospective analysis of data collected in a group of 39 patients treated in the Department of Systemic and Generalized Malignancies in the Cancer Center in Krakow, Poland. Results: Toxicity of treatment affected 50% of patients. The most common side effects observed were hypertension, thrombocytopenia, stomatitis, diarrhea and weakness. Grade 3 serious adverse events according to Common Terminology Criteria for Adverse Events (CTCAE) version 4 affected up to 10% of patients. The most common serious adverse events were hypertension and fatigue. Conclusions: Sunitinib malate is characterized by a particular type of toxicity. Knowledge of the types and range of adverse events of this drug is an important part of oncological and internal medicine care. PMID:27186181

  20. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.

    2009-04-01

    This is a report for task one of the tail event analysis project for BPA. A tail event refers to a situation in a power system in which unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, so that the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; it is therefore referred to as a tail event. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify the sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) A large mismatch between generation and load can be caused by load forecast error, wind forecast error, and generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements. The resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and wind can both become the cause leading to serious issues; (4) A look-ahead tool evaluating the system balancing requirement during real-time operations and comparing it with available system resources would be very helpful to system operators in predicting the approach of similar events and planning ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.

  1. Analysis of extreme top event frequency percentiles based on fast probability integration

    SciTech Connect

    Staple, B.; Haskin, F.E.

    1993-10-01

    In risk assessments, a primary objective is to determine the frequency with which a collection of initiating and basic events, E_e, leads to some undesired top event, T. Uncertainties in the occurrence rates, x_t, assigned to the initiating and basic events cause uncertainty in the top event frequency, z_T. The quantification of the uncertainty in z_T is an essential part of risk assessment called uncertainty analysis. In the past, it has been difficult to evaluate the extreme percentiles of output variables like z_T. Analytic methods such as the method of moments do not provide estimates of output percentiles, and the Monte Carlo (MC) method can be used to estimate extreme output percentiles only by resorting to large sample sizes. A promising alternative to these methods is the fast probability integration (FPI) methods. These methods approximate the integrals of multivariate functions, representing percentiles of interest, without recourse to multi-dimensional numerical integration. FPI methods give precise results and have been demonstrated to be more efficient than MC methods for estimating extreme output percentiles. FPI allows the analyst to choose extreme percentiles of interest and perform sensitivity analyses in those regions. Such analyses can provide valuable insights as to the events driving the top event frequency response in extreme probability regions. In this paper, FPI methods are adapted (a) to precisely estimate extreme top event frequency percentiles and (b) to allow the quantification of sensitivity measures at these extreme percentiles. In addition, the relative precision and efficiency of alternative methods for treating lognormally distributed inputs is investigated. The methodology is applied to the top event frequency expression for the dominant accident sequence from a risk assessment of the Grand Gulf nuclear power plant.
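
    For contrast with FPI, the brute-force MC approach mentioned in the abstract looks like the sketch below: sample the basic-event rates, evaluate the top event expression, and read off extreme percentiles. The expression z_T = x1*x2 + x3*x4 and the lognormal parameters are invented for illustration; the point is the large sample size needed for stable extreme percentiles.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 1_000_000                  # MC needs many samples for extreme tails

      x = [rng.lognormal(mean=m, sigma=s, size=N)
           for m, s in [(-6, 1.0), (-2, 0.8), (-7, 1.2), (-1, 0.5)]]
      z_T = x[0] * x[1] + x[2] * x[3]      # illustrative top event expression

      for q in (99.0, 99.9, 99.99):
          print(f"{q:6.2f}th percentile: {np.percentile(z_T, q):.3e}")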

  2. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.

  3. Analysis of the longitudinal dependence of the downstream fluence of large solar energetic proton events

    NASA Astrophysics Data System (ADS)

    Pacheco, Daniel; Sanahuja, Blai; Aran, Angels; Agueda, Neus; Jiggens, Piers

    2016-07-01

    Simulations of the solar energetic particle (SEP) intensity-time profiles are needed to estimate the radiation environment for interplanetary missions. At present, the physics-based models applied for such a purpose, and including a moving source of particles, are not able to model the portion of the SEP intensity enhancement occurring after the coronal/interplanetary shock crossing by the observer (a.k.a. the downstream region). This is the case, for example, of the shock-and-particle model used to build the SOLPENCO2 code. SOLPENCO2 provides the statistical modelling tool developed in the ESA/SEPEM project for interplanetary missions with synthetic SEP event simulations for virtual spacecraft located at heliocentric distances between 0.2 AU and 1.6 AU (http://dev.sepem.oma.be/). In this work we present an analysis of 168 individual SEP events observed at 1 AU from 1988 to 2013. We identify the solar eruptive phenomena associated with these SEP events, as well as the in-situ passage of interplanetary shocks. For each event, we quantify the amount of fluence accounted for in the downstream region, i.e. after the passage of the shock, in the 11 SEPEM reference energy channels (i.e., from 5 to 300 MeV protons). First, from the subset of SEP events simultaneously detected by near-Earth spacecraft (using SEPEM reference data) and by one of the STEREO spacecraft, we select those events for which the downstream region can be clearly determined. From the 8 selected multi-spacecraft events, we find that the western observations of each event have a smaller downstream contribution than their eastern counterparts, and that the downstream-to-total fluence ratio of these events decreases as a function of energy. Hence, there is a variation of the downstream fluence with heliolongitude in SEP events. Based on this result, we study the variation of the downstream-to-total fluence ratios of the total set of individual events. We confirm the eastern-to-western decrease of the

  4. Analysis of Suprathermal Events Observed by STEREO/PLASTIC with a Focus on Upstream/Magnetospheric Events

    NASA Astrophysics Data System (ADS)

    Barry, J. A.; Galvin, A. B.; Popecki, M.; Ellis, L.; Klecker, B.; Lee, M. A.; Kucharek, H.

    2010-05-01

    Suprathermal and energetic ion events upstream of the Earth's bow shock have been a topic of investigation since the late 1960s. Over the past 50 years these events have been characterized as having energies ranging from just above solar wind energies up to 2 MeV, time spans of minutes to hours, and particle distribution functions ranging from field-aligned to isotropic. The possible sources of these ions include magnetospheric ions and solar wind ions accelerated between the Earth's bow shock and low-frequency, large-amplitude waves in the ion foreshock. Also, energetic ions from other heliospheric processes (such as solar energetic particle (SEP) events or corotating interaction regions (CIRs)) can be further accelerated at the Earth's bow shock. Utilizing the particularly quiet solar minimum and the unique orbit of STEREO-A (STA), drifting ahead of the Earth in its heliocentric orbit, we are able to examine field-aligned upstream/magnetospheric energetic ion events in the previously unexamined region far upstream of the Earth's ion foreshock. Using both the PLASTIC and IMPACT instruments on board STA, we have examined protons throughout 2007 in the energy range of 4 keV up to 80 keV. We find that the occurrence of automatically defined suprathermal events falls off with increasing STA-Earth separation. More importantly, it is shown through a crude approximation of the magnetic field via the Parker spiral that after a STA-Earth separation of about 3000 Re it is unlikely that the Earth and STA will be magnetically connected. This corresponds well with the observed cutoff in the occurrence of suprathermal events with field-aligned anisotropies. The detection of upstream/magnetospheric events at these large distances from the Earth's bow shock indicates that the ions propagate relatively scatter-free beyond the ion foreshock.

  5. Semiparametric Transformation Models with Random Effects for Joint Analysis of Recurrent and Terminal Events

    PubMed Central

    Zeng, Donglin; Lin, D. Y.

    2011-01-01

    Summary We propose a broad class of semiparametric transformation models with random effects for the joint analysis of recurrent events and a terminal event. The transformation models include proportional hazards/intensity and proportional odds models. We estimate the model parameters by the nonparametric maximum likelihood approach. The estimators are shown to be consistent, asymptotically normal, and asymptotically efficient. Simple and stable numerical algorithms are provided to calculate the parameter estimators and to estimate their variances. Extensive simulation studies demonstrate that the proposed inference procedures perform well in realistic settings. Applications to two HIV/AIDS studies are presented. PMID:18945267

  6. Characterization and Analysis of Networked Array of Sensors for Event Detection (CANARY-EDS)

    Energy Science and Technology Software Center (ESTSC)

    2011-05-27

    CANARY-EDS provides probabilistic event detection based on analysis of time-series data from water quality or other sensors. CANARY also can compare patterns against a library of previously seen data to indicate that a certain pattern has reoccurred, suppressing what would otherwise be considered an event. CANARY can be configured to analyze previously recorded data from files or databases, or it can be configured to run in real-time mode directly from a database or through the US EPA EDDIES software.

  7. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  8. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
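
    At its core, a discrete event simulator of this kind advances the clock by popping time-stamped events off a priority queue and letting handlers schedule follow-on events; the qualitative-modeling layers described above sit on top of such a loop. The generic sketch below illustrates the loop itself and is not the patented tool's design.

      import heapq

      def run_simulation(initial_events, handlers, t_end):
          """Minimal discrete-event loop: events are (time, name) pairs in a
          priority queue; each handler may schedule follow-on events."""
          queue = list(initial_events)
          heapq.heapify(queue)
          while queue:
              t, name = heapq.heappop(queue)
              if t > t_end:
                  break
              print(f"t={t:6.1f}: {name}")
              for new_t, new_name in handlers[name](t):
                  heapq.heappush(queue, (new_t, new_name))

      # Example: a valve that cycles between open and closed every 5 time units.
      handlers = {
          "open":  lambda t: [(t + 5.0, "close")],
          "close": lambda t: [(t + 5.0, "open")],
      }
      run_simulation([(0.0, "open")], handlers, t_end=20.0)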

  9. Signature Based Detection of User Events for Post-mortem Forensic Analysis

    NASA Astrophysics Data System (ADS)

    James, Joshua Isaac; Gladyshev, Pavel; Zhu, Yuandong

    This paper introduces a novel approach to user event reconstruction by showing the practicality of generating and implementing signature-based analysis methods to reconstruct high-level user actions from a collection of low-level traces found during a post-mortem forensic analysis of a system. Traditional forensic analysis and the inferences an investigator normally makes when given digital evidence are examined. It is then demonstrated that this natural process of inferring high-level events from low-level traces may be encoded using signature-matching techniques. Simple signatures using the defined method are created and applied for three popular Windows-based programs as a proof of concept.

  10. A canonical correlation analysis based method for contamination event detection in water sources.

    PubMed

    Li, Ruonan; Liu, Shuming; Smith, Kate; Che, Han

    2016-06-15

    In this study, a general framework integrating a data-driven estimation model is employed for contamination event detection in water sources. Sequential canonical correlation coefficients are updated in the model using multivariate water quality time series. The proposed method utilizes canonical correlation analysis to study the interplay between two sets of water quality parameters. The model is assessed by precision, recall and F-measure. The proposed method is tested using data from a laboratory contaminant injection experiment. The proposed method could detect a contamination event 1 minute after the introduction of a 1.600 mg l^-1 acrylamide solution. With optimized parameter values, the proposed method can correctly detect 97.50% of all contamination events with no false alarms. The robustness of the proposed method can be explained using the Bauer-Fike theorem. PMID:27264637
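
    The canonical correlation computation at the heart of this method can be sketched with plain numpy: whiten each block of water quality parameters and take the singular values of the whitened cross-covariance. The sketch below is generic CCA, not the paper's sequential-update algorithm.

      import numpy as np

      def canonical_correlations(X, Y):
          """Canonical correlations between two parameter sets.

          X : (n_samples, p) and Y : (n_samples, q) water quality blocks
          """
          X = X - X.mean(axis=0)
          Y = Y - Y.mean(axis=0)
          n = len(X)

          def inv_sqrt(c):                 # symmetric inverse square root
              w, v = np.linalg.eigh(c)
              return v @ np.diag(1.0 / np.sqrt(np.clip(w, 1e-12, None))) @ v.T

          cxx, cyy, cxy = X.T @ X / n, Y.T @ Y / n, X.T @ Y / n
          k = inv_sqrt(cxx) @ cxy @ inv_sqrt(cyy)
          return np.linalg.svd(k, compute_uv=False)

    Tracking the leading canonical correlation over a sliding window and flagging windows where it departs from its baseline is one plausible way such a detector could be driven.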

  11. The Logic of Surveillance Guidelines: An Analysis of Vaccine Adverse Event Reports from an Ontological Perspective

    PubMed Central

    Courtot, Mélanie; Brinkman, Ryan R.; Ruttenberg, Alan

    2014-01-01

    Background: When increased rates of adverse events following immunization are detected, regulatory action can be taken by public health agencies. However, to be interpreted, reports of adverse events must be encoded in a consistent way. Regulatory agencies rely on guidelines to help determine the diagnosis of the adverse events. Manual application of these guidelines is expensive, time consuming, and open to logical errors. Representing these guidelines in a format amenable to automated processing can make this process more efficient. Methods and Findings: Using the Brighton anaphylaxis case definition, we show that existing clinical guidelines used as standards in pharmacovigilance can be logically encoded using a formal representation such as the Adverse Event Reporting Ontology we developed. We validated the classification of vaccine adverse event reports using the ontology against existing rule-based systems and a manually curated subset of the Vaccine Adverse Event Reporting System. However, we encountered a number of critical issues in the formulation and application of the clinical guidelines. We report these issues and the steps being taken to address them in current surveillance systems, and in the terminological standards in use. Conclusions: By standardizing and improving the reporting process, we were able to automate diagnosis confirmation. By allowing medical experts to prioritize reports, such a system can accelerate the identification of adverse reactions to vaccines and the response of regulatory agencies. This approach of combining ontology and semantic technologies can be used to improve other areas of vaccine adverse event report analysis and should inform both the design of clinical guidelines and how they are used in the future. Availability: Sufficient material to reproduce our results is available, including documentation, ontology, code and datasets, at http://purl.obolibrary.org/obo/aero. PMID:24667848

  12. Analysis and visualization of single-trial event-related potentials

    NASA Technical Reports Server (NTRS)

    Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.

    2001-01-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image

  13. Analysis and visualization of single-trial event-related potentials.

    PubMed

    Jung, T P; Makeig, S; Westerfield, M; Townsend, J; Courchesne, E; Sejnowski, T J

    2001-11-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image
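
    The ICA decomposition described in the two preceding records can be demonstrated on synthetic data with scikit-learn's FastICA; the toy sources below merely stand in for oscillatory EEG, a blink-like artifact, and background noise.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0, 8, 2000)
      sources = np.column_stack([np.sin(7 * t),                 # oscillatory "EEG"
                                 np.sign(np.sin(0.7 * t)),      # blink-like artifact
                                 rng.standard_normal(len(t))])  # background noise
      mixing = rng.standard_normal((3, 3))
      recordings = sources @ mixing.T      # what the "electrodes" see

      ica = FastICA(n_components=3, random_state=0)
      components = ica.fit_transform(recordings)   # temporally independent parts
      # Artifactual components would now be identified (e.g. by scalp map or
      # spectrum) and removed before reconstructing the cleaned channels.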

  14. Determination of Main Periodicities in Solar Wind and Magnetosphere Data During HILDCAAs Events Using Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    de Souza, A. M.; Echer, E.; Bolzam, M. J. A.

    2015-12-01

    High-Intensity Long-Duration Continuous AE Activity (HILDCAA) events were first identified by Tsurutani and Gonzalez (1987) when they studied geomagnetic storms with a recovery phase longer than what is generally observed. They used four criteria for defining HILDCAA events: first, the AE index must reach 1000 nT at least once during the event; second, the event must be at least two days long; third, the AE index must not drop below 200 nT for longer than two hours at a time; finally, the event must occur outside of the main phase of a geomagnetic storm. Although several works have recently been done on HILDCAAs, the main periodicities in solar wind and magnetosphere parameters during these events are still not well known. It is the aim of this work to determine these periods. To conduct this study, the global wavelet spectrum was used to determine the main periods of HILDCAA events. The 1-minute AE index and the Bz component of the interplanetary magnetic field (IMF) were used to characterize the magnetosphere and the solar wind. We have used data from events that occurred between 1975 and 2011 for the AE index, and between 1995 and 2011 for the Bz component of the IMF (GSE and GSM coordinate systems). During HILDCAA events, the main periods found in the AE index were between 4 and 12 hours, corresponding to 50% of the total periods identified. For the Bz component, the main periods were ≤ 8 hours, independently of the coordinate system used. We conjecture that these periods can be associated with Alfvén waves, which present periods between 1 and 10 hours. These Alfvén waves are associated with coronal holes, because HILDCAA events occur more often in the descending phase of solar cycles, when high-speed streams emitted from coronal holes are dominant. Cross-wavelet analysis results between IMF Bz and AE are also presented and discussed.

  15. Final Report for Dynamic Models for Causal Analysis of Panel Data. Dynamic Analysis of Event Histories. Part III, Chapter 1.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; Hannan, Michael T.

    The document, part of a series of chapters described in SO 011 759, examines sociological research methods for the study of change. The advantages and procedures for dynamic analysis of event-history data (data giving the number, timing, and sequence of changes in a categorical dependent variable) are considered. The authors argue for grounding…

  16. Digital Instrumentation and Control Failure Events Derivation and Analysis by Frame-Based Technique

    SciTech Connect

    Hui-Wen Huang; Chunkuan Shih; Swu Yih; Yen-Chang Tzeng; Ming-Huei Chen

    2006-07-01

    A frame-based technique, including physical, logical, and cognitive frames, was adopted to derive and analyze digital I&C failure events for a generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I&C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I&C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I&C software failure events. The case study of this research includes (1) software CMF analysis for the major digital control systems; and (2) derivation of postulated ABWR digital I&C software failure events from actual non-ABWR digital I&C software failure events reported to the LER system of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  17. Using Simple Statistical Analysis of Historical Data to Understand Wind Ramp Events

    SciTech Connect

    Kamath, C

    2010-01-29

    As renewable resources start providing an increasingly larger percentage of our energy needs, we need to improve our understanding of these intermittent resources so we can manage them better. In the case of wind resources, large unscheduled changes in the energy output, called ramp events, make it challenging to keep the load and the generation balanced. In this report, we show that simple statistical analysis of the historical data on wind energy generation can provide insights into these ramp events. In particular, this analysis can help answer questions such as the time period during the day when these events are likely to occur, the relative severity of positive and negative ramps, and the frequency of their occurrence. As there are several ways in which ramp events can be defined and counted, we also conduct a detailed study comparing different options. Our results indicate that the statistics are relatively insensitive to these choices, but depend on utility-specific factors, such as the magnitude of the ramp and the time interval over which this change occurs. These factors reflect the challenges faced by schedulers and operators in keeping the load and generation balanced and can change over the years. We conduct our analysis using data from wind farms in the Tehachapi Pass region in Southern California and the Columbia Basin region in Northern Oregon; while the results for other regions are likely to be different, the report describes the benefits of conducting simple statistical analysis on wind generation data and the insights that can be gained through such analysis.
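
    The kind of simple statistics the report advocates can be computed directly from the generation time series. The sketch below counts up- and down-ramps for one definitional choice (a magnitude threshold over a fixed time window, the two utility-specific factors the report highlights); note that overlapping windows may flag the same physical ramp more than once.

      import numpy as np

      def count_ramp_events(power_mw, window, threshold_mw):
          """Count ramp events: changes over `window` samples whose magnitude
          is at least `threshold_mw`, up- and down-ramps separately."""
          p = np.asarray(power_mw, dtype=float)
          delta = p[window:] - p[:-window]
          ups = int(np.sum(delta >= threshold_mw))
          downs = int(np.sum(delta <= -threshold_mw))
          return ups, downs

      # e.g. hourly data, a 2-hour window, and a 100 MW magnitude:
      # ups, downs = count_ramp_events(hourly_mw, window=2, threshold_mw=100.0)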

  18. Efficacy and adverse events of cold vs hot polypectomy: A meta-analysis

    PubMed Central

    Fujiya, Mikihiro; Sato, Hiroki; Ueno, Nobuhiro; Sakatani, Aki; Tanaka, Kazuyuki; Dokoshi, Tatsuya; Fujibayashi, Shugo; Nomura, Yoshiki; Kashima, Shin; Gotoh, Takuma; Sasajima, Junpei; Moriichi, Kentaro; Watari, Jiro; Kohgo, Yutaka

    2016-01-01

    AIM: To systematically review previously reported randomized controlled trials (RCTs) of cold and hot polypectomy and to clarify the utility of cold polypectomy over hot polypectomy with respect to efficacy and adverse events. METHODS: A meta-analysis was conducted to evaluate the relative merits of cold and hot polypectomy for removing colon polyps. Published articles and abstracts from worldwide conferences were searched using the keywords "cold polypectomy". RCTs that compared the effects and/or adverse events of cold polypectomy with those of hot polypectomy were collected. The patients' demographics, endoscopic procedures, number of examined lesions, lesion size, macroscopic and histologic findings, rates of incomplete resection, bleeding amount, perforation, and length of procedure were extracted from each study. A forest plot analysis was used to verify the relative strength of the effects and adverse events of each procedure. A funnel plot was generated to assess the possibility of publication bias. RESULTS: Ultimately, six RCTs were selected. No significant differences were noted in the average lesion size (less than 10 mm) between the cold and hot polypectomy groups in each study. Further, the rates of complete resection and adverse events, including delayed bleeding, did not differ markedly between cold and hot polypectomy. The average procedural time in the cold polypectomy group was significantly shorter than in the hot polypectomy group. CONCLUSION: Cold polypectomy is a time-saving procedure for removing small polyps, with curability and safety markedly similar to hot polypectomy. PMID:27340361

  19. Ontology-based time information representation of vaccine adverse events in VAERS for temporal analysis

    PubMed Central

    2012-01-01

    Background The U.S. FDA/CDC Vaccine Adverse Event Reporting System (VAERS) provides a valuable data source for post-vaccination adverse event analyses. The structured data in the system has been widely used, but the information in the write-up narratives is rarely included in these kinds of analyses. In fact, the unstructured nature of the narratives makes the data embedded in them difficult to use in further studies. Results We developed an ontology-based approach to represent the data in the narratives in a “machine-understandable” way, so that it can be easily queried and further analyzed. Our focus is the time aspect of the data, for time-trending analysis. The Time Event Ontology (TEO), Ontology of Adverse Events (OAE), and Vaccine Ontology (VO) are leveraged for this semantic representation. A VAERS case report is presented as a use case for the ontological representations. The advantages of using our ontology-based Semantic Web representation and data analysis are emphasized. Conclusions We believe that representing both the structured data and the data from write-up narratives in an integrated, unified, and “machine-understandable” way can improve research for vaccine safety analyses, causality assessments, and retrospective studies. PMID:23256916

  20. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    NASA Astrophysics Data System (ADS)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events and droughts, or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal forecast prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in a world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations
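
    For orientation, the statistic usually computed from such paired ensembles is the probability ratio (and the fraction of attributable risk). A sketch with synthetic stand-in ensembles; the Gumbel parameters, threshold, and sample sizes are invented for illustration and are not WWA outputs:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Stand-ins for the "as observed" and "world that might have been" ensembles.
    factual = rng.gumbel(loc=30.0, scale=4.0, size=10_000)
    counterfactual = rng.gumbel(loc=28.5, scale=4.0, size=10_000)

    threshold = 40.0                           # magnitude of the observed event
    p1 = np.mean(factual >= threshold)         # exceedance prob. with climate change
    p0 = np.mean(counterfactual >= threshold)  # exceedance prob. without it
    print("PR  =", p1 / p0)                    # probability ratio
    print("FAR =", 1.0 - p0 / p1)              # fraction of attributable risk
    ```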

  1. Through the eyes of the other: using event analysis to build cultural competence.

    PubMed

    Kozub, Mary L

    2013-07-01

    Cultural competence requires more than the accumulation of information about cultural groups. An awareness of the nurse's own culture, beliefs, and values is considered by several transcultural nursing theorists to be essential to the development of cultural competence and the provision of quality patient care. Using Transformational Learning Theory, this article describes event analysis, an active learning tool that uses the nurse's own practice to explore multiple perspectives of an experience, with the goal of transforming the nurse's approach to diversity from an ethnocentric stance, to one of tolerance and consideration for the patient's needs, values, and beliefs with regard to quality of care. Furthermore, the application of the event analysis to multiple settings, including inpatient, educational, and administrative environments, is discussed. PMID:23545698

  2. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2016-06-01

    Mine systems such as the ventilation system, strata support system, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is preferably done during the planning and design stage. However, the existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system either at the component or system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
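
    A minimal sketch of how an FT top-event probability feeds an ET sequence, assuming independent basic events; the gate structure and numbers are invented for illustration and are not from the case study:

    ```python
    # Fault tree gates under the independence assumption.
    def or_gate(probs):
        """Probability that at least one independent basic event occurs."""
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p

    def and_gate(probs):
        """Probability that all independent basic events occur."""
        p = 1.0
        for q in probs:
            p *= q
        return p

    # FT: a safety barrier fails if either of two detectors fails
    # (illustrative failure probabilities, not from the study).
    p_barrier_fails = or_gate([1e-3, 5e-4])

    # ET: initiating event followed by failure of the barrier.
    p_initiator = 1e-2
    p_accident = p_initiator * p_barrier_fails
    print(p_accident)   # top hazard probability for this sequence
    ```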

  3. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617

  4. Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.

    2010-12-01

    Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick-slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements to determine whether events on any given fault elements show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length. Results are then tabulated and differenced with an expected correlation
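
    The windowed association test described above has a simple core. A sketch of the counting step only; the function name, sorting strategy, and the one-sided "A then B" convention are our assumptions, and the differencing against an expected correlation is omitted:

    ```python
    import numpy as np

    def associated_fraction(times_a, times_b, window):
        """Fraction of slips on element A that are followed by a slip on
        element B within `window` time units."""
        times_b = np.sort(np.asarray(times_b, dtype=float))
        hits = 0
        for t in times_a:
            i = np.searchsorted(times_b, t)          # first B-event at or after t
            if i < len(times_b) and times_b[i] - t <= window:
                hits += 1
        return hits / len(times_a)
    ```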

  5. Identification and Analysis of Storm Tracks Associated with Extreme Flood Events in Southeast and South Brazil

    NASA Astrophysics Data System (ADS)

    Lima, Carlos; Lopes, Camila

    2015-04-01

    Flood is the main natural disaster in Brazil, affecting practically all regions in the country and causing severe economic damage and loss of life. In traditional hydrology, the study of floods is focused on a frequency analysis of the extreme events and on the fit of statistical models to define flood quantiles associated with pre-specified return periods or exceedance probabilities. The basic assumptions are randomness and temporal stationarity of the streamflow data. In this paper we seek to advance the traditional flood frequency studies by using the ideas developed in the area of flood hydroclimatology, which is defined as the study of climate in the flood framework, i.e., the understanding of long-term changes in the frequency, magnitude, duration, location and seasonality of floods as driven by the interaction of regional and global patterns of ocean and atmospheric circulation. That being said, flood events are not treated as random and stationary but as resulting from a causal chain, where exceptional floods in water basins of different sizes are related to large-scale anomalies in atmospheric and ocean circulation patterns. Hence, such studies enrich the classical assumption of stationary flood hazard adopted in most flood frequency studies through a formal consideration of the physical mechanisms responsible for the generation of extreme floods, which implies recognizing the natural climate variability due to persistent and oscillatory regimes (e.g. ENSO, NAO, PDO) on many temporal scales (interannual, decadal, etc.), and climate fluctuations in response to anthropogenic changes in the atmosphere, soil use and vegetation cover. Under this framework and based on streamflow gauge and reanalysis data, we identify and analyze here the storm tracks that preceded extreme flood events in key flood-prone regions of the country (e.g. the Parana and Rio Doce River basins), with such events defined based on the magnitude, duration and volume of the

  6. Novel data-mining methodologies for adverse drug event discovery and analysis.

    PubMed

    Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

    2012-06-01

    An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Data-mining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis. PMID:22549283

  7. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    PubMed Central

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H.; Madigan, David; Ryan, Patrick; Friedman, Carol

    2013-01-01

    Introduction Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis. PMID:22549283

  8. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where differences in opinion between response team members contribute to defining the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the

  9. Analysis on Outcome of 3537 Patients with Coronary Artery Disease: Integrative Medicine for Cardiovascular Events

    PubMed Central

    Gao, Zhu-ye; Qiu, Yu; Jiao, Yang; Shang, Qing-hua; Shi, Da-zhuo

    2013-01-01

    Aims. To investigate the treatment of hospitalized patients with coronary artery disease (CAD) and the prognostic factors in Beijing, China. Materials and Methods. A multicenter prospective study was conducted through an integrative platform of clinical care and research at 12 hospitals in Beijing, China. The clinical information of 3537 hospitalized patients with CAD was collected from September 2009 to May 2011, and the efficacy of secondary prevention during a one-year follow-up was evaluated. In addition, a logistic regression analysis was performed to identify factors with an independent impact on prognosis. Results. The average age of the patients was 64.88 ± 11.97 years, and 65.42% were male. The medications for patients were as follows: antiplatelet drugs, 91.97%; statins, 83.66%; β-receptor blockers, 72.55%; ACEI/ARB, 58.92%; and revascularization (including PCI and CABG), 40.29%. The overall incidence of cardiovascular events was 13.26% (469/3537). The logistic stepwise regression analysis showed that heart failure (OR, 3.707, 95% CI = 2.756–4.986), age ≥ 65 years (OR, 2.007, 95% CI = 1.587–2.53), and myocardial infarction (OR, 1.649, 95% CI = 1.322–2.057) were independent risk factors for cardiovascular events during the one-year follow-up. Integrative medicine (IM) therapy showed a beneficial tendency toward decreasing the incidence of cardiovascular events, although no statistical significance was found (OR, 0.797, 95% CI = 0.613~1.036). Conclusions. Heart failure, age ≥ 65 years, and myocardial infarction were associated with an increased incidence of cardiovascular events, and treatment with IM showed a tendency toward decreasing that incidence. PMID:23983773

  10. Analysis of the Impact of Climate Change on Extreme Hydrological Events in California

    NASA Astrophysics Data System (ADS)

    Ashraf Vaghefi, Saeid; Abbaspour, Karim C.

    2016-04-01

    Estimating the magnitude and occurrence frequency of extreme hydrological events is required for taking preventive remedial actions against the impact of climate change on the management of water resources. Examples include: characterization of extreme rainfall events to predict urban runoff, determination of river flows, and the likely severity of drought events during the design life of a water project. In recent years California has experienced its most severe drought in recorded history, causing water stress, economic loss, and an increase in wildfires. In this paper we describe the development of a Climate Change Toolkit (CCT) and demonstrate its use in the analysis of dry and wet periods in California for the years 2020-2050, comparing the results with the historic period 1975-2005. CCT provides four modules to: i) manage big databases such as those of Global Climate Models (GCMs), ii) perform bias correction using observed local climate data, iii) interpolate gridded climate data to finer resolution, and iv) calculate continuous dry- and wet-day periods based on rainfall, temperature, and soil moisture for analysis of drought and flooding risks. We used bias-corrected meteorological data from five GCMs for the extreme CO2 emission scenario RCP8.5 for California to analyze the trend of extreme hydrological events. The findings indicate that the frequency of dry periods will increase in the central and southern parts of California. The assessment of the number of wet days and the frequency of wet periods suggests an increased risk of flooding in the northern and north-western parts of California, especially in the coastal strip. Keywords: Climate Change Toolkit (CCT), Extreme Hydrological Events, California
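
    The dry/wet-period counting in module iv reduces to run-length bookkeeping on a daily series. A minimal sketch using rainfall only; the 1 mm dry-day cutoff, the function name, and the omission of temperature and soil moisture are our simplifications:

    ```python
    import numpy as np

    def dry_spells(rain_mm, dry_threshold=1.0):
        """Lengths of consecutive dry-day runs in a daily rainfall series."""
        dry = np.asarray(rain_mm, dtype=float) < dry_threshold
        spells, run = [], 0
        for is_dry in dry:
            if is_dry:
                run += 1
            elif run:
                spells.append(run)
                run = 0
        if run:
            spells.append(run)
        return spells

    print(dry_spells([0.0, 0.2, 5.1, 0.0, 0.0, 0.4, 12.0]))  # -> [2, 3]
    ```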

  11. Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes

    NASA Astrophysics Data System (ADS)

    Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana

    2015-04-01

    The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, which constitute a usual tool in Statistical Process Control (SPC) but an unusual one in climate studies. Here, series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to conclude whether or not there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme observed rainfall days and normal observed rainfall days. The autocorrelation among maximum precipitation is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate the attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes over the rest of the years under study can then be monitored by such attributes control charts. The results of the application of this methodology show evidence of change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
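
    For orientation, the plain (autocorrelation-free) version of such a chart is just a p-chart on the annual fraction of extreme days. A minimal sketch that deliberately ignores the Markov-extended correction the authors introduce:

    ```python
    import numpy as np

    def p_chart_limits(p_bar, n):
        """3-sigma control limits for an annual fraction of extreme-rainfall
        days, assuming independent days (which the paper's model does not)."""
        sigma = np.sqrt(p_bar * (1.0 - p_bar) / n)
        return max(0.0, p_bar - 3.0 * sigma), p_bar + 3.0 * sigma

    # Illustrative: 3% extreme days on average, 365 days per year.
    print(p_chart_limits(0.03, 365))
    ```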

  12. Cardiovascular Events Following Smoke-Free Legislations: An Updated Systematic Review and Meta-Analysis

    PubMed Central

    Jones, Miranda R.; Barnoya, Joaquin; Stranges, Saverio; Losonczy, Lia; Navas-Acien, Ana

    2014-01-01

    Background Legislations banning smoking in indoor public places and workplaces are being implemented worldwide to protect the population from secondhand smoke exposure. Several studies have reported reductions in hospitalizations for acute coronary events following the enactment of smoke-free laws. Objective We set out to conduct a systematic review and meta-analysis of epidemiologic studies examining how legislations that ban smoking in indoor public places impact the risk of acute coronary events. Methods We searched MEDLINE, EMBASE, and relevant bibliographies including previous systematic reviews for studies that evaluated changes in acute coronary events, following implementation of smoke-free legislations. Studies were identified through December 2013. We pooled relative risk (RR) estimates for acute coronary events comparing post- vs. pre-legislation using inverse-variance weighted random-effects models. Results Thirty-one studies providing estimates for 47 locations were included. The legislations were implemented between 1991 and 2010. Following the enactment of smoke-free legislations, there was a 12 % reduction in hospitalizations for acute coronary events (pooled RR: 0.88, 95 % CI: 0.85–0.90). Reductions were 14 % in locations that implemented comprehensive legislations compared to an 8 % reduction in locations that only had partial restrictions. In locations with reductions in smoking prevalence post-legislation above the mean (2.1 % reduction) there was a 14 % reduction in events compared to 10 % in locations below the mean. The RRs for acute coronary events associated with enacting smoke-free legislation were 0.87 vs. 0.89 in locations with smoking prevalence pre-legislation above and below the mean (23.1 %), and 0.87 vs. 0.89 in studies from the Americas vs. other regions. Conclusion The implementation of smoke-free legislations was related to reductions in acute coronary event hospitalizations in most populations evaluated. Benefits are greater
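
    The pooling step described above (inverse-variance weighted random-effects on log relative risks) can be sketched compactly. A generic DerSimonian-Laird implementation under the usual assumptions; the three study inputs in the example are invented placeholders, not the 31 studies analyzed:

    ```python
    import numpy as np

    def pool_random_effects(rr, ci_low, ci_high):
        """DerSimonian-Laird random-effects pooling of relative risks.
        Inputs are per-study RRs with 95% CI bounds; returns the pooled
        RR and its 95% CI."""
        y = np.log(np.asarray(rr, dtype=float))
        se = (np.log(np.asarray(ci_high, dtype=float))
              - np.log(np.asarray(ci_low, dtype=float))) / (2.0 * 1.96)
        w = 1.0 / se**2                              # fixed-effect weights
        y_fe = np.sum(w * y) / np.sum(w)             # fixed-effect mean
        q = np.sum(w * (y - y_fe)**2)                # Cochran's Q
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
        w_re = 1.0 / (se**2 + tau2)                  # random-effects weights
        mu = np.sum(w_re * y) / np.sum(w_re)
        half = 1.96 * np.sqrt(1.0 / np.sum(w_re))
        return np.exp(mu), np.exp(mu - half), np.exp(mu + half)

    # Three made-up studies: RR with 95% CI.
    print(pool_random_effects([0.85, 0.92, 0.80],
                              [0.75, 0.84, 0.65],
                              [0.96, 1.01, 0.98]))
    ```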

  13. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    NASA Astrophysics Data System (ADS)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents, for both extreme and ordinary flood events, is a fundamental step in understanding the process, but also in driving proper decisions on protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with the associated sediment-graph, of a flood that on 14 October 2014 caused the formation of a small debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to note that a great part of the equations developed for computing bedload transport capacity, such as those of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events that heavily affect the river-bed armour. These formulas do not provide reliable results when applied to events, like the one under analysis, not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations for assessing sediment transport capacity overestimate the total volume of the event by several orders of magnitude. Consequently, the following experimental bedload transport equation (D'Agostino and Lenzi, 1999), valid for ordinary flood events, has been applied (q: unit water discharge; qc: unit discharge at bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs = 0.04 (q − qc) S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above-cited equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
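
    A quick numerical check of the reconstructed relation; the discharge and slope values below are invented for illustration, not the paper's back-calculated figures:

    ```python
    # D'Agostino-Lenzi (1999) relation as reconstructed above:
    #   qs = 0.04 * (q - qc) * S**1.5
    # Units: q, qc, qs in m^2/s (unit-width discharges); S dimensionless.
    q, qc, S = 1.2, 0.4, 0.20          # illustrative values only
    qs = 0.04 * (q - qc) * S**1.5      # unit bedload rate
    print(qs)                          # -> about 0.00286 m^2/s
    ```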

  14. Detailed chronological analysis of microevolution events in herds infected persistently by Mycobacterium bovis.

    PubMed

    Navarro, Yurena; Romero, Beatriz; Bouza, Emilio; Domínguez, Lucas; de Juan, Lucía; García-de-Viedma, Darío

    2016-02-01

    Various studies have analyzed microevolution events leading to the emergence of clonal variants in human infections by Mycobacterium tuberculosis. However, microevolution events in animal tuberculosis remain unknown. We performed a systematic analysis of microevolution events in eight herds that were chronically infected by Mycobacterium bovis for more than 12 months. We analyzed 88 animals using a systematic screening procedure based on discriminatory MIRU-VNTR genotyping at sequential time points during the infection. Microevolution was detected in half of the herds studied. Emergence of clonal variants did not require long infection periods or a high number of infected animals in the herd. Microevolution was not restricted to strains from specific spoligotypes, and the subtle variations detected involved different MIRU loci. The genetic locations of the subtle genotypic variations recorded in the clonal variants indicated potential functional significance. This finding was consistent with the dynamics of some clonal variants, which outcompeted the original strains, suggesting an advantageous phenotype. Our data constitute a first step in defining the thresholds of variability to be tolerated in molecular epidemiology studies of M. bovis. We could therefore ensure that related clonal variants emerging as a result of microevolution events are not going to be misinterpreted as unrelated isolates. PMID:26790941

  15. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    PubMed Central

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may mislead about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials including the recent RV144 trial exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  16. Urbanization and Fertility: An Event-History Analysis of Coastal Ghana

    PubMed Central

    WHITE, MICHAEL J.; MUHIDIN, SALUT; ANDRZEJEWSKI, CATHERINE; TAGOE, EVA; KNIGHT, RODNEY; REED, HOLLY

    2008-01-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field. PMID:19110898

  17. Urbanization and fertility: an event-history analysis of coastal Ghana.

    PubMed

    White, Michael J; Muhidin, Salut; Andrzejewski, Catherine; Tagoe, Eva; Knight, Rodney; Reed, Holly

    2008-11-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field. PMID:19110898

  18. Mixed-effects Poisson regression analysis of adverse event reports: the relationship between antidepressants and suicide.

    PubMed

    Gibbons, Robert D; Segawa, Eisuke; Karabatsos, George; Amatya, Anup K; Bhaumik, Dulal K; Brown, C Hendricks; Kapur, Kush; Marcus, Sue M; Hur, Kwan; Mann, J John

    2008-05-20

    A new statistical methodology is developed for the analysis of spontaneous adverse event (AE) reports from post-marketing drug surveillance data. The method involves both empirical Bayes (EB) and fully Bayes estimation of rate multipliers for each drug within a class of drugs, for a particular AE, based on a mixed-effects Poisson regression model. Both parametric and semiparametric models for the random-effect distribution are examined. The method is applied to data from Food and Drug Administration (FDA)'s Adverse Event Reporting System (AERS) on the relationship between antidepressants and suicide. We obtain point estimates and 95 per cent confidence (posterior) intervals for the rate multiplier for each drug (e.g. antidepressants), which can be used to determine whether a particular drug has an increased risk of association with a particular AE (e.g. suicide). Confidence (posterior) intervals that do not include 1.0 provide evidence for either significant protective or harmful associations of the drug and the adverse effect. We also examine EB, parametric Bayes, and semiparametric Bayes estimators of the rate multipliers and associated confidence (posterior) intervals. Results of our analysis of the FDA AERS data revealed that newer antidepressants are associated with lower rates of suicide adverse event reports compared with older antidepressants. We recommend improvements to the existing AERS system, which are likely to improve its public health value as an early warning system. PMID:18404622

  19. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis of hydrological data has been developing recently. In particular, the copula model has been used as an effective method because it places no limitation on the choice of marginal distributions. Time-series rainfall data can be characterized into rainfall events by an inter-event time definition, and each rainfall event has a rainfall depth and duration. In addition, changes in rainfall depth have been studied recently due to climate change. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed and their performance investigated in many studies. In the current study, bivariate frequency analysis is performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data to consider the effect of climate change. The parameter of the copula model is estimated by the inference function for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used as marginal distributions. As a result, level curves of the copula model are obtained, and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.
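
    A minimal sketch of the IFM-style workflow: fit the Gumbel margins first, then evaluate a Gumbel-Hougaard (Archimedean) copula at the probability-transformed observations. The data arrays are invented and the copula parameter is fixed rather than estimated, so this illustrates the structure, not the authors' estimation procedure:

    ```python
    import numpy as np
    from scipy import stats

    def gumbel_copula_cdf(u, v, theta):
        """Gumbel-Hougaard (Archimedean) copula CDF, theta >= 1."""
        return np.exp(-(((-np.log(u))**theta
                         + (-np.log(v))**theta)**(1.0 / theta)))

    # Illustrative event data: depth (mm) and duration (hours).
    depth = np.array([22.0, 35.5, 48.1, 15.2, 60.3])
    duration = np.array([6.0, 12.0, 20.0, 4.0, 26.0])

    # Stage 1 of IFM: fit the margins (stationary Gumbel here).
    u = stats.gumbel_r.cdf(depth, *stats.gumbel_r.fit(depth))
    v = stats.gumbel_r.cdf(duration, *stats.gumbel_r.fit(duration))

    # Stage 2 would estimate theta; we fix it for illustration.
    joint = gumbel_copula_cdf(u, v, theta=2.0)
    print(joint)
    ```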

  20. Application of satellite remote-sensing data for source analysis of fine particulate matter transport events.

    PubMed

    Engel-Cox, Jill A; Young, Gregory S; Hoff, Raymond M

    2005-09-01

    Satellite sensors have provided new datasets for monitoring regional and urban air quality. They provide comprehensive geospatial information on air quality with both qualitative imagery and quantitative data, such as aerosol optical depth. Yet there has been limited application of these new datasets in the study of air pollutant sources relevant to public policy. One promising approach to more directly link satellite sensor data to air quality policy is to integrate satellite sensor data with air quality parameters and models. This paper presents a visualization technique to integrate satellite sensor data, ground-based data, and back trajectory analysis relevant to a new rule concerning the transport of particulate matter across state boundaries. Overlaying satellite aerosol optical depth data and back trajectories in the days leading up to a known event of fine particulate matter with an aerodynamic diameter of <2.5 µm (PM2.5) may indicate whether transport or local sources appear to be most responsible for high PM2.5 levels in a certain location at a certain time. Events in five cities in the United States are presented as case studies. This type of analysis can be used to help understand the source locations of pollutants during specific events and to support regulatory compliance decisions in cases of long-distance transport. PMID:16259433

  1. Final Report for Dynamic Models for Causal Analysis of Panel Data. Approaches to the Censoring Problem in Analysis of Event Histories. Part III, Chapter 2.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; Hannan, Michael T.

    The document, part of a series of chapters described in SO 011 759, considers the problem of censoring in the analysis of event-histories (data on dated events, including dates of change from one qualitative state to another). Censoring refers to the lack of information on events that occur before or after the period for which data are available.…

  2. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. Only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement, in order to segment moving objects. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.

  3. Transcriptome Bioinformatical Analysis of Vertebrate Stages of Schistosoma japonicum Reveals Alternative Splicing Events

    PubMed Central

    Wang, Xinye; Xu, Xindong; Lu, Xingyu; Zhang, Yuanbin; Pan, Weiqing

    2015-01-01

    Alternative splicing is a molecular process that contributes greatly to the diversification of the proteome and to gene functions. Understanding the mechanisms of stage-specific alternative splicing can provide a better understanding of the development of eukaryotes and the functions of different genes. Schistosoma japonicum is an infectious blood-dwelling trematode with a complex lifecycle that causes the tropical disease schistosomiasis. In this study, we analyzed the transcriptome of Schistosoma japonicum to discover alternative splicing events in this parasite, by applying RNA-seq to cDNA libraries of adults and schistosomula. Results were validated by RT-PCR and sequencing. We found 11,623 alternative splicing events among 7,099 protein-encoding genes, and the average proportion of alternative splicing events per gene was 42.14%. We showed that exon skipping is the most common type of alternative splicing event, as found in higher eukaryotes, whereas intron retention is the least common. According to intron boundary analysis, the parasite possesses the same intron boundaries as other organisms, namely the classic “GT-AG” rule, and in alternatively spliced introns or exons this rule is less strict. We also attempted to detect alternative splicing events in genes encoding proteins with signal peptides and transmembrane helices, suggesting that alternative splicing could change the subcellular locations of specific gene products. Our results indicate that alternative splicing is prevalent in this parasitic worm, and that in this respect the worm is close to its hosts. The revealed secretome involved in alternative splicing implies a new perspective on understanding the interaction between the parasite and its host. PMID:26407301
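
    The “GT-AG” rule cited above can be stated in two lines of code. A trivial sketch; the function name and bare-string interface are ours:

    ```python
    def follows_gt_ag(intron_seq: str) -> bool:
        """Check the classic GT-AG rule: an intron's donor site starts with
        GT and its acceptor site ends with AG (DNA sense-strand alphabet)."""
        s = intron_seq.upper()
        return s.startswith("GT") and s.endswith("AG")

    print(follows_gt_ag("GTAAGTCCTTTCAG"))  # -> True
    ```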

  4. Transcriptome Bioinformatical Analysis of Vertebrate Stages of Schistosoma japonicum Reveals Alternative Splicing Events.

    PubMed

    Wang, Xinye; Xu, Xindong; Lu, Xingyu; Zhang, Yuanbin; Pan, Weiqing

    2015-01-01

    Alternative splicing is a molecular process that contributes greatly to the diversification of the proteome and to gene functions. Understanding the mechanisms of stage-specific alternative splicing can provide a better understanding of the development of eukaryotes and the functions of different genes. Schistosoma japonicum is an infectious blood-dwelling trematode with a complex lifecycle that causes the tropical disease schistosomiasis. In this study, we analyzed the transcriptome of Schistosoma japonicum to discover alternative splicing events in this parasite, by applying RNA-seq to cDNA libraries of adults and schistosomula. Results were validated by RT-PCR and sequencing. We found 11,623 alternative splicing events among 7,099 protein-encoding genes, and the average proportion of alternative splicing events per gene was 42.14%. We showed that exon skipping is the most common type of alternative splicing event, as found in higher eukaryotes, whereas intron retention is the least common. According to intron boundary analysis, the parasite possesses the same intron boundaries as other organisms, namely the classic "GT-AG" rule, and in alternatively spliced introns or exons this rule is less strict. We also attempted to detect alternative splicing events in genes encoding proteins with signal peptides and transmembrane helices, suggesting that alternative splicing could change the subcellular locations of specific gene products. Our results indicate that alternative splicing is prevalent in this parasitic worm, and that in this respect the worm is close to its hosts. The revealed secretome involved in alternative splicing implies a new perspective on understanding the interaction between the parasite and its host. PMID:26407301

  5. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables.

    PubMed

    Yin, Yu; Yao, Dezhong

    2016-01-01

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an "event of relation" with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals. PMID:27389921
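
    A minimal sketch of the kind of binomial test described: among n detected "events of relation", k show a non-zero delay, tested against a 50/50 null. The null proportion and the counts are illustrative assumptions, not the authors' exact formulation:

    ```python
    from scipy import stats

    k, n = 37, 50   # illustrative: 37 of 50 events of relation have non-zero delay
    result = stats.binomtest(k, n, p=0.5)   # null: zero and non-zero delay equally likely
    print(result.pvalue)
    ```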

  6. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables

    PubMed Central

    Yin, Yu; Yao, Dezhong

    2016-01-01

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an “event of relation” with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals. PMID:27389921

  7. Diagnostic analysis and spectral energetics of a blocking event in the GLAS climate model simulation

    NASA Technical Reports Server (NTRS)

    Chen, T.-C.; Shukla, J.

    1983-01-01

    A synoptic and spectral analysis of a blocking event is presented, with attention given to the temporal evolution, maintenance, and decay of the block. The GLAS numerical climate model was used to generate a blocking event by the introduction of SST anomalies. Wavenumbers 2 and 3 became stationary around their climatological locations, and their constructive interference produced persistent blocking ridges, one over the west coast of North America and the other over western Europe. Time variations of the kinetic and potential energies and of the energy conversions during the blocking were examined. Spectrally filtered Hovmoller diagrams were developed for the winter of 1976-77 and showed that long waves were stationary over most of the interval, which featured severe weather conditions.

  8. The May 17, 2012 solar event: back-tracing analysis and flux reconstruction with PAMELA

    NASA Astrophysics Data System (ADS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bravar, U.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Christian, E. C.; De Donato, C.; de Nolfo, G. A.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A. M.; Karelin, A. V.; Koldashov, S. V.; Koldobskiy, S.; Krutkov, S. Y.; Kvashnin, A. N.; Lee, M.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A. G.; Menn, W.; Mergè, M.; Mikhailov, V. V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S. B.; Ryan, J. M.; Sarkar, R.; Scotti, V.; Simon, M.; Sparvoli, R.; Spillantini, P.; Stochaj, S.; Stozhkov, Y. I.; Vacchi, A.; Vannuccini, E.; Vasilyev, G. I.; Voronov, S. A.; Yurkin, Y. T.; Zampa, G.; Zampa, N.; Zverev, V. G.

    2016-02-01

    The PAMELA space experiment is providing the first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data from the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  9. Non-parametric frequency analysis of extreme values for integrated disaster management considering probable maximum events

    NASA Astrophysics Data System (ADS)

    Takara, K. T.

    2015-12-01

    This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with computer-intensive statistics (CIS) resampling such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than the design level of flood control. Traditional parametric frequency analysis methods for extreme values include the following steps: Step 1: collecting and checking extreme-value data; Step 2: enumerating probability distributions that would fit the data well; Step 3: parameter estimation; Step 4: testing goodness of fit; Step 5: checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: selection of the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years. Overestimation examples are also demonstrated. The bootstrap resampling can perform bias correction for the NPM and can also give the estimation accuracy as the bootstrap standard error. The NPM has the advantage of avoiding various difficulties in the above-mentioned steps of the traditional PM. Probable maximum events are also incorporated into the NPM as an upper bound of the hydrological variable. Probable maximum precipitation (PMP) and probable maximum flood (PMF) can be combined with the NPM as new parameter values. An idea for how to incorporate these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea stimulates a more integrated approach by geoscientists and statisticians and encourages practitioners to consider the worst cases of disasters in their disaster management planning and practices.
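
    A minimal sketch of the NPM core: an empirical quantile with a bootstrap standard error. The function name and the 2000-replicate default are our choices, and the PMP/PMF upper bound is omitted:

    ```python
    import numpy as np

    def bootstrap_quantile(sample, prob, n_boot=2000, seed=0):
        """Non-parametric T-year quantile (e.g. prob=0.99 for the 100-year
        event) with a bootstrap standard error."""
        rng = np.random.default_rng(seed)
        sample = np.asarray(sample, dtype=float)
        estimate = np.quantile(sample, prob)
        boots = [np.quantile(rng.choice(sample, size=len(sample), replace=True),
                             prob)
                 for _ in range(n_boot)]
        return estimate, float(np.std(boots, ddof=1))   # estimate, bootstrap SE

    rng = np.random.default_rng(7)
    annual_max = rng.gumbel(100.0, 25.0, size=120)  # illustrative 120-year record
    print(bootstrap_quantile(annual_max, 0.99))
    ```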

  10. Sensitivity to censored-at-random assumption in the analysis of time-to-event endpoints.

    PubMed

    Lipkovich, Ilya; Ratitch, Bohdana; O'Kelly, Michael

    2016-05-01

    Over the past years, significant progress has been made in developing statistically rigorous methods to implement clinically interpretable sensitivity analyses for assumptions about the missingness mechanism in clinical trials for continuous and (to a lesser extent) for binary or categorical endpoints. Studies with time-to-event outcomes have received much less attention. However, such studies can be similarly challenged with respect to the robustness and integrity of primary analysis conclusions when a substantial number of subjects withdraw from treatment prematurely prior to experiencing an event of interest. We discuss how the methods that are widely used for primary analyses of time-to-event outcomes could be extended in a clinically meaningful and interpretable way to stress-test the assumption of ignorable censoring. We focus on a 'tipping point' approach, the objective of which is to postulate sensitivity parameters with a clear clinical interpretation and to identify a setting of these parameters unfavorable enough towards the experimental treatment to nullify a conclusion that was favorable to that treatment. Robustness of primary analysis results can then be assessed based on clinical plausibility of the scenario represented by the tipping point. We study several approaches for conducting such analyses based on multiple imputation using parametric, semi-parametric, and non-parametric imputation models and evaluate their operating characteristics via simulation. We argue that these methods are valuable tools for sensitivity analyses of time-to-event data and conclude that the method based on piecewise exponential imputation model of survival has some advantages over other methods studied here. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26997353
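
    A toy sketch of the tipping-point scan for censored subjects, using a single exponential hazard rather than the piecewise-exponential imputation model the authors prefer; all names and numbers are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def impute_beyond_censoring(censor_times, base_hazard, delta, rng):
        """Impute event times after censoring under a hazard inflated by
        `delta` (delta = 1 recovers censoring-at-random; larger delta is
        less favorable to the experimental arm). Exponential model only."""
        extra = rng.exponential(1.0 / (delta * base_hazard),
                                size=len(censor_times))
        return np.asarray(censor_times) + extra

    # Scan delta upward; at each setting the completed data would be
    # re-analyzed (e.g. with a log-rank test) until the favorable
    # conclusion is nullified -- that delta is the tipping point.
    censored = [3.2, 5.1, 7.8]   # illustrative censoring times (months)
    for delta in (1.0, 1.5, 2.0, 3.0):
        imputed = impute_beyond_censoring(censored, base_hazard=0.05,
                                          delta=delta, rng=rng)
        print(delta, imputed)
    ```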

  11. Computational analysis reveals a correlation of exon-skipping events with splicing, transcription and epigenetic factors.

    PubMed

    Ye, Zhenqing; Chen, Zhong; Lan, Xun; Hara, Stephen; Sunkel, Benjamin; Huang, Tim H-M; Elnitski, Laura; Wang, Qianben; Jin, Victor X

    2014-03-01

    Alternative splicing (AS), in higher eukaryotes, is one of the mechanisms of post-transcriptional regulation that generate multiple transcripts from the same gene. One particular mode of AS is the skipping event where an exon may be alternatively excluded or constitutively included in the resulting mature mRNA. Both transcript isoforms from this skipping event site, i.e. in which the exon is either included (inclusion isoform) or excluded (skipping isoform), are typically present in one cell, and maintain a subtle balance that is vital to cellular function and dynamics. However, how the prevailing conditions dictate which isoform is expressed and what biological factors might influence the regulation of this process remain areas requiring further exploration. In this study, we have developed a novel computational method, graph-based exon-skipping scanner (GESS), for de novo detection of skipping event sites from raw RNA-seq reads without prior knowledge of gene annotations, as well as for determining the dominant isoform generated from such sites. We have applied our method to publicly available RNA-seq data in GM12878 and K562 cells from the ENCODE consortium and experimentally validated several skipping site predictions by RT-PCR. Furthermore, we integrated other sequencing-based genomic data to investigate the impact of splicing activities, transcription factors (TFs) and epigenetic histone modifications on splicing outcomes. Our computational analysis found that splice sites within the skipping-isoform-dominated group (SIDG) tended to exhibit weaker MaxEntScan-calculated splice site strength around middle, 'skipping', exons compared to those in the inclusion-isoform-dominated group (IIDG). We further showed the positional preference pattern of splicing factors, characterized by enrichment in the intronic splice sites immediately bordering middle exons. Finally, our analysis suggested that different epigenetic factors may introduce a variable obstacle in the

  12. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    SciTech Connect

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J.

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses, which include: both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transient without scram (ATWS). Steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment-bypass nature. After being benchmarked with the APETs, in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.

  13. Uncertainty Analysis of Climate Change Impact on Extreme Rainfall Events in the Apalachicola River Basin, Florida

    NASA Astrophysics Data System (ADS)

    Wang, D.; Hagen, S.; Bacopoulos, P.

    2011-12-01

    Climate change impact on rainfall patterns during the summer season (May-August) in the Apalachicola River basin (Florida Panhandle coast) is assessed using ensemble regional climate models (RCMs). Rainfall data for both baseline and future years (30-year periods) are obtained from the North American Regional Climate Change Assessment Program (NARCCAP), where the A2 emission scenario is used. Trend analysis is conducted based on historical rainfall data from three weather stations. Two methods are used to assess the climate change impact on the rainfall intensity-duration-frequency (IDF) curves: the maximum intensity percentile-based method, and the sequential bias correction and maximum intensity percentile-based method. As a preliminary result from one RCM, extreme rainfall intensity is found to increase significantly, with the increase becoming more dramatic with closer proximity to the coast. The projected changes in rainfall patterns (spatial and temporal, mean and extreme values) provide guidance for developing adaptation and mitigation strategies for water resources management and ecosystem protection. More rainfall events move from July to June during future years for all three stations; in the upstream, the variability in the timing of extreme rainfall increases, and more extreme events occur in June and August instead of May. These temporal shifts of extreme rainfall events will increase the probability of simultaneous heavy rainfall in the downstream and the upstream in June, during which flooding will be enhanced. The uncertainty analysis of the climate change impact on extreme rainfall events will be presented based on the simulations from the ensemble of RCMs.
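
    The percentile-based correction referred to above can be illustrated with a minimal quantile-mapping sketch: each future model value is mapped through the baseline-model and observed empirical distributions. All data below are synthetic stand-ins for NARCCAP output and gauge records, and this is only a caricature of the paper's sequential bias correction method.

        import numpy as np

        def quantile_map(model_future, model_baseline, observed):
            """Map future model values through baseline-model -> observed CDFs."""
            # Empirical non-exceedance probability of each future value within
            # the baseline-model climatology...
            probs = np.searchsorted(np.sort(model_baseline), model_future)
            probs = np.clip(probs / len(model_baseline), 0.0, 1.0)
            # ...then read the observed distribution at those probabilities.
            return np.quantile(observed, probs)

        rng = np.random.default_rng(0)
        observed = rng.gamma(2.0, 8.0, 3000)        # gauge intensities (mm/h)
        model_baseline = rng.gamma(2.0, 6.0, 3000)  # RCM baseline (biased low)
        model_future = rng.gamma(2.0, 7.5, 3000)    # RCM future scenario

        corrected = quantile_map(model_future, model_baseline, observed)
        print("raw future 99th percentile:      ", np.percentile(model_future, 99))
        print("corrected future 99th percentile:", np.percentile(corrected, 99))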

  14. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    NASA Astrophysics Data System (ADS)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days was identified in the Klang Valley region during the study period. A total of 126,822 deaths was recorded for natural mortality, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and among elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had both immediate and delayed effects on mortality.
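
    Such lag-specific odds ratios come from conditional analyses of matched case and control periods; for a simplified 1:1 matched design the estimate reduces to the ratio of discordant pairs, sketched below with invented counts (the study itself used time-stratified sampling with several control days per case).

        import math

        def matched_or(exposed_case_only, exposed_control_only, z=1.96):
            """OR and Wald CI from discordant matched pairs (1:1 matching)."""
            or_hat = exposed_case_only / exposed_control_only
            se_log = math.sqrt(1.0 / exposed_case_only + 1.0 / exposed_control_only)
            lo = math.exp(math.log(or_hat) - z * se_log)
            hi = math.exp(math.log(or_hat) + z * se_log)
            return or_hat, lo, hi

        # Hypothetical counts: 60 deaths exposed to haze on the case day but not
        # the control day, 42 exposed on the control day but not the case day.
        or_hat, lo, hi = matched_or(60, 42)
        print(f"OR = {or_hat:.2f}; 95% CI = {lo:.2f}-{hi:.2f}")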

  15. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L. (Sandia National Labs., Albuquerque, NM)

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs.

  16. Time-frequency analysis of event-related potentials: a brief tutorial.

    PubMed

    Herrmann, Christoph S; Rach, Stefan; Vosskuhl, Johannes; Strüber, Daniel

    2014-07-01

    Event-related potentials (ERPs) reflect cognitive processes and are usually analyzed in the so-called time domain. Additional information on cognitive functions can be assessed when analyzing ERPs in the frequency domain and treating them as event-related oscillations (EROs). This procedure results in frequency spectra but lacks information about the temporal dynamics of EROs. Here, we describe a method, called time-frequency analysis, that allows analyzing both the frequency of an ERO and its evolution over time. In a brief tutorial, the reader will learn how to use wavelet analysis in order to compute time-frequency transforms of ERP data. Basic steps as well as potential artifacts are described. Rather than in terms of formulas, descriptions are given in textual form, with numerous figures illustrating the topics. Recommendations on how to present frequency and time-frequency data in journal articles are provided. Finally, we briefly review studies that have applied time-frequency analysis to mismatch negativity paradigms. The deviant stimulus of such a paradigm evokes an ERO in the theta frequency band that is stronger than for the standard stimulus. Conversely, the standard stimulus evokes a stronger gamma-band response than does the deviant. This is interpreted in the context of the so-called match-and-utilization model. PMID:24194116
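
    A minimal sketch of the wavelet-based time-frequency transform the tutorial describes: the signal is convolved with complex Morlet wavelets, and squared magnitudes give time-frequency power. The "ERP" here is a synthetic theta burst; real analyses would average across trials and baseline-correct.

        import numpy as np
        from scipy.signal import fftconvolve

        fs = 500.0                              # sampling rate (Hz)
        t = np.arange(-0.2, 0.8, 1.0 / fs)      # epoch time axis (s)
        rng = np.random.default_rng(1)
        # Synthetic "ERP": a theta (5 Hz) burst around 300 ms plus noise.
        signal = np.exp(-(t - 0.3) ** 2 / 0.01) * np.cos(2 * np.pi * 5 * t)
        signal = signal + 0.1 * rng.standard_normal(t.size)

        freqs = np.arange(2, 31)                # analyse 2-30 Hz
        n_cycles = 5                            # time/frequency resolution trade-off
        power = np.empty((freqs.size, t.size))

        for i, f in enumerate(freqs):
            sigma_t = n_cycles / (2 * np.pi * f)    # Gaussian envelope width (s)
            wt = np.arange(-3 * sigma_t, 3 * sigma_t, 1.0 / fs)
            wavelet = np.exp(2j * np.pi * f * wt - wt**2 / (2 * sigma_t**2))
            wavelet /= np.abs(wavelet).sum()        # crude amplitude normalisation
            # mode="same" keeps the output aligned with the input epoch.
            power[i] = np.abs(fftconvolve(signal, wavelet, mode="same")) ** 2

        fi, ti = np.unravel_index(power.argmax(), power.shape)
        print(f"peak power at {freqs[fi]} Hz, t = {t[ti] * 1000:.0f} ms")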

  17. An event-related analysis of P300 by simultaneous EEG/fMRI

    NASA Astrophysics Data System (ADS)

    Wang, Li-qun; Wang, Mingshi; Mizuhara, Hiroaki

    2006-09-01

    In this study, P300 responses induced by visual stimuli were examined with simultaneous EEG/fMRI. To combine the best temporal resolution with the best spatial resolution in estimating brain function, event-related analysis was applied in this methodological trial. A 64-channel MRI-compatible EEG amplifier (BrainAmp, Brain Products GmbH, Germany) was used for measurement simultaneously with fMRI scanning. The reference channel was located between Fz, Cz and Pz. The sampling rate of the raw EEG was 5 kHz, and MRI noise reduction was performed. EEG recording was synchronized with the MRI scan by our original stimulus system, and an oddball paradigm (four-oriented Landolt ring presentation) was performed in the standard manner. After P300 segmentation, the timing of P300 was exported to the event-related analysis of the fMRI data with the SPM99 software. In a single-subject study, significant activations appeared in the left superior frontal area, in Broca's area and on both sides of the parietal lobule when P300 occurred. This suggests that P300 may reflect an integration carried out by top-down signals from the frontal cortex to the parietal lobule, which regulates an attention-logical judgment process. Compared with other current methods, event-related analysis by simultaneous EEG/fMRI is distinguished by its ability to describe cognitive processes realistically, unifying temporal and spatial information. It is expected that examination and demonstration of the obtained results will promote this powerful method.

  18. Scientific Content Analysis (SCAN) Cannot Distinguish Between Truthful and Fabricated Accounts of a Negative Event.

    PubMed

    Bogaard, Glynis; Meijer, Ewout H; Vrij, Aldert; Merckelbach, Harald

    2016-01-01

    The Scientific Content Analysis (SCAN) is a verbal veracity assessment method that is currently used worldwide by investigative authorities. Yet, research investigating the accuracy of SCAN is scarce. The present study tested whether SCAN was able to accurately discriminate between true and fabricated statements. To this end, 117 participants were asked to write down one true and one fabricated statement about a recent negative event that happened in their lives. All statements were analyzed using 11 criteria derived from SCAN. Results indicated that SCAN was not able to correctly classify true and fabricated statements. Lacking empirical support, the application of SCAN in its current form should be discouraged. PMID:26941694

  19. Recent developments in methods of chemical analysis in investigations of firearm-related events.

    PubMed

    Zeichner, Arie

    2003-08-01

    A review of recent (approximately the last ten years) developments in the methods used for chemical analysis in investigations of firearm-related events is provided. This review discusses: examination of gunshot (primer) residues (GSR) and gunpowder (propellant) residues on suspects and their clothing; detection of firearm imprints on the hands of suspects; identification of bullet entry holes and estimation of shooting distance; linking weapons and/or fired ammunition to gunshot entries; and estimation of the time since discharge. PMID:12811451

  20. Using Fluctuation Analysis to Establish Causal Relations between Cellular Events without Experimental Perturbation

    PubMed Central

    Welf, Erik S.; Danuser, Gaudenz

    2014-01-01

    Experimental perturbations are commonly used to establish causal relationships between the molecular components of a pathway and their cellular functions; however, this approach suffers from inherent limitations. Especially in pathways with a significant level of nonlinearity and redundancy among components, such perturbations induce compensatory responses that obscure the actual function of the targeted component in the unperturbed pathway. A complementary approach uses constitutive fluctuations in component activities to identify the hierarchy of information flow through pathways. Here, we review the motivation for using perturbation-free approaches and highlight recent advances made in using perturbation-free fluctuation analysis as a means to establish causality among cellular events. PMID:25468328

  1. High fold computer disk storage DATABASE for fast extended analysis of γ-ray events

    NASA Astrophysics Data System (ADS)

    Stézowski, O.; Finck, Ch.; Prévost, D.

    1999-03-01

    Recently, spectacular technical developments have been achieved to increase the resolving power of large γ-ray spectrometers. With these new eyes, physicists are able to study the intricate nature of atomic nuclei. Concurrently, more and more complex multidimensional analyses are needed to investigate very weak phenomena. In this article, we first present a software package (DATABASE) allowing high-fold coincidence γ-ray events to be stored on hard disk. Then, a non-conventional method of analysis, the anti-gating procedure, is described. Two physical examples are given to explain how it can be used, and Monte Carlo simulations have been performed to test the validity of this method.

  3. Recurrent event data analysis with intermittently observed time-varying covariates.

    PubMed

    Li, Shanshan; Sun, Yifei; Huang, Chiung-Yu; Follmann, Dean A; Krause, Richard

    2016-08-15

    Although recurrent event data analysis is a rapidly evolving area of research, rigorous studies on estimation of the effects of intermittently observed time-varying covariates on the risk of recurrent events have been lacking. Existing methods for analyzing recurrent event data usually require that the covariate processes be observed throughout the entire follow-up period. However, covariates are often observed periodically rather than continuously. We propose a novel semiparametric estimator for the regression parameters in the popular proportional rate model. The proposed estimator is based on an estimated score function in which we kernel-smooth the mean covariate process. We show that the proposed semiparametric estimator is asymptotically unbiased and normally distributed, and we derive the asymptotic variance. Simulation studies are conducted to compare the performance of the proposed estimator with that of simple methods that carry forward the last observed covariate. The different methods are applied to an observational study designed to assess the effect of group A streptococcus on pharyngitis among school children in India. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26887664
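
    The kernel-smoothing idea can be sketched in a few lines: a covariate observed only at sparse visit times is estimated on a fine grid by Nadaraya-Watson smoothing. The data and bandwidth below are illustrative, and the paper's estimator smooths inside an estimated score function across subjects rather than per subject as shown here.

        import numpy as np

        def nw_smooth(obs_times, obs_values, eval_times, bandwidth):
            """Nadaraya-Watson estimate of a covariate process at eval_times."""
            diffs = (eval_times[:, None] - obs_times[None, :]) / bandwidth
            weights = np.exp(-0.5 * diffs**2)       # Gaussian kernel
            return (weights * obs_values).sum(axis=1) / weights.sum(axis=1)

        rng = np.random.default_rng(2)
        visit_times = np.sort(rng.uniform(0, 24, 8))  # months of 8 clinic visits
        biomarker = 1.0 + 0.1 * visit_times + rng.normal(0, 0.2, 8)

        grid = np.linspace(0, 24, 49)
        smoothed = nw_smooth(visit_times, biomarker, grid, bandwidth=3.0)
        print(np.round(smoothed[::8], 2))  # covariate path every 4 months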

  4. Analysis of recurrent events with an associated informative dropout time: Application of the joint frailty model.

    PubMed

    Rogers, Jennifer K; Yaroshinsky, Alex; Pocock, Stuart J; Stokar, David; Pogoda, Janice

    2016-06-15

    This paper considers the analysis of a repeat event outcome in clinical trials of chronic diseases in the context of dependent censoring (e.g. mortality). It has particular application in the context of recurrent heart failure hospitalisations in trials of heart failure. Semi-parametric joint frailty models (JFMs) simultaneously analyse recurrent heart failure hospitalisations and time to cardiovascular death, estimating distinct hazard ratios whilst individual-specific latent variables induce associations between the two processes. A simulation study was carried out to assess the suitability of the JFM versus marginal analyses of recurrent events and cardiovascular death using standard methods. Hazard ratios were consistently overestimated when marginal models were used, whilst the JFM produced accurate, well-estimated results. An application to the Candesartan in Heart failure: Assessment of Reduction in Mortality and morbidity programme was considered. The JFM gave unbiased estimates of treatment effects in the presence of dependent censoring. We advocate the use of the JFM for future trials that consider recurrent events as the primary outcome. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. PMID:26751714

  5. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005: 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008: 87,652 deaths), and Haiti (magnitude 7.0, 2010: 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has seen the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, making it the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  6. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    NASA Astrophysics Data System (ADS)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, as one possible originating cause of tsunamis, and propose to analyze Twitter messages to capture eyewitness reports at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets and thus being able to observe correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the Forecast Points of the tsunami scenario. We also intend to use Twitter analysis for situation picture

  7. Mechanical Engineering Safety Note: Analysis and Control of Hazards Associated with NIF Capacitor Module Events

    SciTech Connect

    Brereton, S

    2001-08-01

    the total free oil available in a capacitor (approximately 10,900 g), on the order of 5% or less. The estimates of module pressure were used to estimate the potential overpressure in the capacitor bays after an event. It was shown that the expected capacitor bay overpressure would be less than the structural tolerance of the walls. Thus, it does not appear necessary to provide any pressure relief for the capacitor bays. The ray-tracing analysis showed the new module concept to be 100% effective at containing fragments generated during the events. The analysis demonstrated that all fragments would impact an energy-absorbing surface on the way out of the module. Thus, there is high confidence that energetic fragments will not escape the module. However, since the module was not tested, it was recommended that a form of secondary containment on the walls of the capacitor bays (e.g., 1.0 inch of fire-retardant plywood) be provided. Any doors to the exterior of the capacitor bays should be of an equivalent thickness of steel or suitably armored with a thickness of plywood. Penetrations in the ceiling of the interior bays (leading to the mechanical equipment room) do not require additional protection to form a secondary barrier. The mezzanine and the air handling units (penetrations lead directly to the air handling units) provide a sufficient second layer of protection.

  8. Analysis of phreatic events at Ruapehu volcano, New Zealand using a new SOM approach

    NASA Astrophysics Data System (ADS)

    Carniel, Roberto; Jolly, Arthur D.; Barbui, Luca

    2013-03-01

    We apply Self-Organising Maps (SOMs) to assess the low-level seismic activity prior to small-scale phreatic events at Ruapehu volcano, New Zealand. The SOM approach allows automatic pattern recognition, virtually independent of a priori knowledge. Volcanic tremor spectra are randomly presented to the network in a competitive iterative training process, followed by hierarchical clustering of the SOM nodes. Spectra are then projected, ordered by time, onto clusters on the map. A coherent time evolution of the data through the clusters can highlight the existence of different regimes and the transitions between them. Two Ruapehu events were examined: a phreatic event on 4 October 2006, which displaced the crater lake producing a 4 m high wave on the lake edge, and the more energetic 25 September 2007 phreatic eruption. The SOM analysis provides a classification of tremor spectral patterns that clusters into three regimes, which we label by colours. The pattern for both eruptions is consistent with a pre-eruption spectral pattern including enhanced spectral energy in the range of 4 to 6 Hz, labelled 'green' tremor. This gives way to spectra having broader energy between 2 and 6 Hz, the so-called 'red' tremor, just prior to the eruption. The post-eruption pattern includes spectral peaks at generally lower frequencies of 2 to 4 Hz, the so-called 'blue' tremor. Clustering into only three groups yields highly non-unique solutions which cannot explain the variety of processes operating at Ruapehu over long time periods. Regardless, the approach highlights noteworthy similarities that may be explained by a pattern of slow pressurisation under a hydrothermal or magmatic seal ('green'), followed by seal failure ('red') and subsequent de-pressurisation ('blue') for the two events studied. Although the application shown here is limited, we think it demonstrates the power of this classification approach.
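
    A minimal sketch of the competitive training loop behind a SOM, applied to synthetic caricatures of the 'green' (4-6 Hz), 'red' (2-6 Hz) and 'blue' (2-4 Hz) spectra described above; it is not the authors' implementation, and the subsequent hierarchical clustering of nodes is only indicated.

        import numpy as np

        rng = np.random.default_rng(3)
        freqs = np.linspace(0, 10, 64)

        def spectrum(lo, hi, n):
            """n noisy synthetic spectra with energy between lo and hi Hz."""
            band = ((freqs > lo) & (freqs < hi)).astype(float)
            return band + 0.2 * rng.random((n, freqs.size))

        data = np.vstack([spectrum(4, 6, 50), spectrum(2, 6, 50), spectrum(2, 4, 50)])
        rng.shuffle(data)

        # Train a small 1-D SOM (a chain of 12 nodes) by competitive learning.
        n_nodes = 12
        weights = rng.random((n_nodes, freqs.size))
        for epoch in range(30):
            lr = 0.5 * (1 - epoch / 30)                        # decaying learning rate
            radius = max(1.0, n_nodes / 2 * (1 - epoch / 30))  # shrinking neighbourhood
            for x in data[rng.permutation(len(data))]:
                bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
                dist = np.abs(np.arange(n_nodes) - bmu)
                h = np.exp(-(dist ** 2) / (2 * radius ** 2))       # neighbourhood kernel
                weights += lr * h[:, None] * (x - weights)

        # Each spectrum is labelled by its best-matching node; nearby nodes would
        # then be merged into regimes ('colours') by hierarchical clustering.
        labels = np.array([np.argmin(((weights - x) ** 2).sum(axis=1)) for x in data])
        print("spectra per SOM node:", np.bincount(labels, minlength=n_nodes))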

  9. Post-Tyrrhenian deformation analysis in the Sahel coast (Eastern Tunisia): seismotectonic event implications

    NASA Astrophysics Data System (ADS)

    Mejrei, H.; Ghribi, R.; Bouaziz, S.; Balescu, S.

    2012-04-01

    The eastern coast of Tunisia is characterized by Pleistocene coastal deposits considered as a reference for interglacial high sea levels. In this region, the stratigraphy of the Tunisian Pleistocene deposits was first established on the basis of geomorphological, lithostratigraphic and biostratigraphic criteria and U/Th data. They have been subdivided into three superimposed formations, from the oldest to the most recent: Douira, Rejiche and Chebba, including coastal marine (Strombus bubonius), lagoonal and eolian sediments. These marine formations are organized into bars parallel to the present shoreline, unconformably overlying the Mio-Pliocene and "Villafranchian" deposits. A luminescence dating method (IRSL) applied to alkali feldspar grains from the two sandy marine units of the Douira formation demonstrates for the first time the presence of two successive interglacial high sea level events, correlative of MIS 7 and MIS 9. These sandy marine units are separated by a major erosional surface and by a continental pedogenised loamy deposit related to a low sea level event which might be assigned to MIS 8. Variations in the height of these marine units (+13 to +32 m) along the Sahel coast reflect significant tectonic deformation and provide valuable geomorphological and tectonic markers. An extensive analysis of brittle deformation has been carried out at several sites. A detailed analysis of fracturing, based on studies of fault-slip data populations and of joint sets, allows reconstruction of post-Tyrrhenian stress regimes, which are characterized by N170-016 compression and N095-100 extension. In this paper, the combination of IRSL data from these raised marine deposits with a reconstruction of the tectonic evolution, in terms of stress pattern evolution since the Tyrrhenian, allows us to establish an accurate recent tectonic calendar. These reconstructed events will be placed and discussed in the regional setting of the seismotectonic activity of the north

  10. Analysis of the variation of the 0°C isothermal altitude during rainfall events

    NASA Astrophysics Data System (ADS)

    Zeimetz, Fränz; Garcìa, Javier; Schaefli, Bettina; Schleiss, Anton J.

    2016-04-01

    In numerous countries of the world (USA, Canada, Sweden, Switzerland,…), dam safety verifications for extreme floods are carried out by referring to the so-called Probable Maximum Flood (PMF). According to the World Meteorological Organization (WMO), this PMF is determined based on the PMP (Probable Maximum Precipitation). The PMF estimation is performed with a hydrological simulation model by routing the PMP. The PMP-PMF simulation is normally event-based; therefore, if no further information is known, the simulation needs assumptions concerning the initial soil conditions, such as saturation or snow cover. In addition, temperature series are also of interest for PMP-PMF simulations. Temperature values can be deduced not only from temperature measurements: using the temperature gradient method, the 0°C isothermal altitude can also provide temperature estimates at the ground. For practitioners, using the isothermal altitude to refer to temperature is convenient and simpler, because one value can give information over a large region under the assumption of a certain temperature gradient. The analysis of the evolution of the 0°C isothermal altitude during rainfall events is the aim here, based on meteorological soundings from the two sounding stations Payerne (CH) and Milan (I). Furthermore, hourly rainfall and temperature data are available from 110 pluviometers spread over the Swiss territory. The analysis of the evolution of the 0°C isothermal altitude is undertaken for different precipitation durations based on the meteorological measurements mentioned above. The results show that, on average, the isothermal altitude tends to decrease during rainfall events and that a correlation exists between the duration of the altitude loss and the duration of the rainfall. A significant difference in altitude loss appears when the soundings from Payerne and Milan are compared.
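
    The temperature-gradient method mentioned above amounts to one line of arithmetic: given the 0°C isothermal altitude, the temperature at a ground elevation follows from an assumed lapse rate. The 0.65 K per 100 m used below is a common standard-atmosphere assumption, not a value fitted to the sounding data.

        LAPSE_RATE = 0.0065  # K per metre, assumed constant

        def temperature_at(elevation_m, freezing_level_m, lapse_rate=LAPSE_RATE):
            """Temperature (°C) at a ground elevation, given the 0°C isotherm."""
            return (freezing_level_m - elevation_m) * lapse_rate

        # Example: freezing level at 2800 m a.s.l. during a rainfall event.
        for z in (500, 1500, 2500, 3500):
            print(f"{z:>5} m: {temperature_at(z, 2800):+.1f} °C")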

  11. Detection and analysis of high-temperature events in the BIRD mission

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2005-01-01

    The primary mission objective of the new small Bi-spectral InfraRed Detection (BIRD) satellite is the detection and quantitative analysis of high-temperature events like fires and volcanoes. The absence of saturation in the BIRD infrared channels makes it possible to improve false alarm rejection as well as to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area and radiative energy release. Examples are given of the detection and analysis of wildfires and coal seam fires, of volcanic activity, as well as of oil fires in Iraq. The smallest fires detected by BIRD, which were verified on the ground, had an area of 12 m2 in daytime and 4 m2 at night.
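
    The retrieval that unsaturated mid- and thermal-infrared channels permit can be sketched as a Dozier-type bi-spectral solve for effective fire temperature and fire pixel fraction. The band centres, background temperature and simulated radiances below are illustrative assumptions, not BIRD's operational algorithm.

        import numpy as np
        from scipy.optimize import fsolve

        H, C, K = 6.626e-34, 2.998e8, 1.381e-23

        def planck(wavelength_m, temp_k):
            """Blackbody spectral radiance (W m^-2 sr^-1 m^-1)."""
            return (2 * H * C**2 / wavelength_m**5 /
                    (np.exp(H * C / (wavelength_m * K * temp_k)) - 1.0))

        L_MIR, L_TIR = 3.8e-6, 8.9e-6       # assumed band centres (m)
        T_BG = 290.0                         # assumed background temperature (K)
        T_FIRE_TRUE, P_TRUE = 750.0, 1e-3    # "truth" used to simulate the pixel

        mir = P_TRUE * planck(L_MIR, T_FIRE_TRUE) + (1 - P_TRUE) * planck(L_MIR, T_BG)
        tir = P_TRUE * planck(L_TIR, T_FIRE_TRUE) + (1 - P_TRUE) * planck(L_TIR, T_BG)

        def residuals(params):
            t_fire, p = params
            return (p * planck(L_MIR, t_fire) + (1 - p) * planck(L_MIR, T_BG) - mir,
                    p * planck(L_TIR, t_fire) + (1 - p) * planck(L_TIR, T_BG) - tir)

        t_est, p_est = fsolve(residuals, x0=(600.0, 1e-4))
        print(f"effective fire temperature ~{t_est:.0f} K, fire fraction ~{p_est:.1e}")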

  12. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    NASA Astrophysics Data System (ADS)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

    Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and might aid in identifying hot-spot regions for potential climate-triggered violent social conflicts.
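
    A minimal sketch of event coincidence analysis: count the fraction of conflict onsets preceded by a climate extreme within a tolerance window, then judge significance against surrogates with randomised extreme-event times. The event series below are synthetic, not the Munich Re or UCDP data.

        import numpy as np

        rng = np.random.default_rng(4)
        days = 365 * 30                                  # 30-year study period
        extremes = np.sort(rng.choice(days, 60, replace=False))
        # Conflicts: some triggered within 90 days of an extreme, some random.
        conflicts = np.sort(np.concatenate([
            rng.choice(extremes, 12) + rng.integers(0, 90, 12),
            rng.choice(days, 25, replace=False),
        ]))

        def coincidence_rate(a, b, window):
            """Fraction of events in b preceded by an event in a within window."""
            return np.mean([np.any((e - a >= 0) & (e - a <= window)) for e in b])

        obs = coincidence_rate(extremes, conflicts, window=90)
        null = [coincidence_rate(np.sort(rng.choice(days, extremes.size, replace=False)),
                                 conflicts, 90) for _ in range(2000)]
        p_value = np.mean(np.array(null) >= obs)
        print(f"observed coincidence rate {obs:.2f}, permutation p = {p_value:.3f}")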

  13. Detecting event-related recurrences by symbolic analysis: applications to human language processing

    PubMed Central

    beim Graben, Peter; Hutt, Axel

    2015-01-01

    Quasi-stationarity is ubiquitous in complex dynamical systems. In brain dynamics, there is ample evidence that event-related potentials (ERPs) reflect such quasi-stationary states. In order to detect them from time series, several segmentation techniques have been proposed. In this study, we elaborate a recent approach for detecting quasi-stationary states as recurrence domains by means of recurrence analysis and subsequent symbolization methods. We address two pertinent problems of contemporary recurrence analysis: optimizing the size of recurrence neighbourhoods and identifying symbols from different realizations for sequence alignment. As possible solutions for these problems, we suggest a maximum entropy criterion and a Hausdorff clustering algorithm. The resulting recurrence domains for single-subject ERPs are obtained as partition cells reflecting quasi-stationary brain states. PMID:25548270
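
    A minimal sketch of the recurrence side of such an analysis: a recurrence matrix is built from a signal with two regimes, with the neighbourhood size chosen by scanning for maximal entropy of the per-sample recurrence-rate distribution, a crude stand-in for the maximum entropy criterion discussed above. The data are synthetic, not ERP recordings.

        import numpy as np

        rng = np.random.default_rng(5)
        t = np.linspace(0, 10, 400)
        # Signal with two quasi-stationary regimes separated at t = 5.
        x = np.where(t < 5, np.sin(2 * np.pi * t), 0.3 * np.sin(6 * np.pi * t))
        x = x + 0.05 * rng.standard_normal(t.size)

        dist = np.abs(x[:, None] - x[None, :])     # 1-D state-space distances

        def recurrence_entropy(eps):
            """Entropy of the distribution of per-sample recurrence rates."""
            rec = dist < eps
            p = rec.mean(axis=1)
            hist, _ = np.histogram(p, bins=20, range=(0, 1))
            q = hist[hist > 0] / hist.sum()
            return -(q * np.log(q)).sum()

        # Scan neighbourhood sizes and keep the entropy-maximising one.
        eps_grid = np.linspace(0.05, 1.0, 40)
        eps_star = eps_grid[np.argmax([recurrence_entropy(e) for e in eps_grid])]
        rec = dist < eps_star
        print(f"chosen eps = {eps_star:.2f}, recurrence rate = {rec.mean():.2f}")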

  14. One-stage parametric meta-analysis of time-to-event outcomes

    PubMed Central

    Siannis, F; Barrett, J K; Farewell, V T; Tierney, J F

    2010-01-01

    Methodology for the meta-analysis of individual patient data with survival end-points is proposed. Motivated by questions about the reliance on hazard ratios as summary measures of treatment effects, a parametric approach is considered and percentile ratios are introduced as an alternative to hazard ratios. The generalized log-gamma model, which includes many common time-to-event distributions as special cases, is discussed in detail. Likelihood inference for percentile ratios is outlined. The proposed methodology is used for a meta-analysis of glioma data that was one of the studies which motivated this work. A simulation study exploring the validity of the proposed methodology is available electronically. Copyright © 2010 John Wiley & Sons, Ltd. PMID:20963770

  15. Analysis of Loss-of-Offsite-Power Events 1998–2013

    SciTech Connect

    Schroeder, John Alton

    2015-02-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant's ability to achieve and maintain safe shutdown conditions. Risk analyses suggest that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP events and the subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience during calendar years 1997 through 2013. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator failure modes considered are failure to start, failure to load and run, and failure to run for more than 1 hour. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates of reliability are provided for the entire active period. No statistically significant trends in LOOP frequencies over the 1997-2013 period are identified. There is a possibility that a significant trend in grid-related LOOP frequency exists that is not easily detected by a simple analysis. Statistically significant increases in recovery times after grid- and switchyard-related LOOPs are identified.
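
    The frequency estimates such a report relies on are Poisson rates with exact confidence intervals, which can be sketched directly; the counts and reactor-year exposure below are illustrative, not the report's data.

        from scipy.stats import chi2

        def poisson_rate_ci(n_events, exposure_ry, conf=0.90):
            """MLE rate and exact two-sided CI for a Poisson count / exposure."""
            alpha = 1.0 - conf
            rate = n_events / exposure_ry
            lower = (chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure_ry)
                     if n_events else 0.0)
            upper = chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / (2 * exposure_ry)
            return rate, lower, upper

        categories = {  # hypothetical counts over ~1600 reactor-years
            "plant-centered": 11, "switchyard-centered": 32,
            "grid-related": 24, "weather-related": 9,
        }
        for cat, n in categories.items():
            r, lo, hi = poisson_rate_ci(n, 1600.0)
            print(f"{cat:>20}: {r:.4f}/ry (90% CI {lo:.4f}-{hi:.4f})")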

  16. Numerical analysis of seismic events distributions on the planetary scale and celestial bodies astrometrical parameters

    NASA Astrophysics Data System (ADS)

    Bulatova, Dr.

    2012-04-01

    Modern research in the Earth sciences is developing from descriptions of individual natural phenomena towards systematic, complex research in interdisciplinary areas. For studies of this kind, in the form of numerical analysis of three-dimensional (3D) systems, the author proposes a space-time technology (STT), based on a Ptolemaic geocentric system, consisting of two modules, each with its own coordinate system: (1) a 3D model of the Earth, whose coordinates are provided by databases of the Earth's events (here seismic), and (2) a compact model of the relative motion of celestial bodies in space-time relative to the Earth, known as the "Method of a moving source" (MDS), which was developed (Bulatova, 1998-2000) for 3D space. Module (2) was developed as a continuation of the geocentric Ptolemaic system of the world, built on the astronomical parameters of celestial bodies. Based on the aggregation of data from the space and Earth sciences, their systematization, and cooperative analysis, this is an attempt to establish a cause-effect relationship between the positions of celestial bodies (Moon, Sun) and the Earth's seismic events.

  17. Modeling propensity to move after job change using event history analysis and temporal GIS

    NASA Astrophysics Data System (ADS)

    Vandersmissen, Marie-Hélène; Séguin, Anne-Marie; Thériault, Marius; Claramunt, Christophe

    2009-03-01

    The research presented in this paper analyzes the emergent residential behaviors of individual actors in a context of profound social changes in the work sphere. It incorporates a long-term view in the analysis of the relationships between social changes in the work sphere and these behaviors. The general hypothesis is that social changes produce complex changes in the long-term dynamics of residential location behavior. More precisely, the objective of this paper is to estimate the propensity of professional workers to move house after a change of workplace. Our analysis draws on data from a biographical survey using a retrospective questionnaire that enables a posteriori reconstitution of the familial, professional and residential lifelines of professional workers since their departure from their parents' home. The survey was conducted in 1996 in the Quebec City Metropolitan Area, which, much like other Canadian cities, has experienced a substantial increase in "unstable" work, even for professionals. The approach is based on event history analysis, a Temporal Geographic Information System and exploratory spatial analysis of the model's residuals. Results indicate that 48.9% of respondents moved after a job change and that the most important factors influencing the propensity to move house after a job change are home tenure (for lone adults as well as for couples) and number of children (for couples only). We also found that moving is associated with changing neighborhood for owners, while tenants or co-tenants tend to stay in the same neighborhood. The probability of moving 1 year after a job change is 0.10 for lone adults and couples, while after 2 years the household structure seems to have an impact: the probability increases to 0.23 for lone adults and to 0.21 for couples. The outcome of this research contributes to furthering our understanding of a familial decision (to move) following a professional event (change of job), controlling for household structure

  18. Broadband Array Analysis of the 2005 Episodic Tremor and Slip Event in Northern Cascadia

    NASA Astrophysics Data System (ADS)

    Wech, A.; Creager, K.; McCausland, W.; Frassetto, A.; Qamar, A.; Derosier, S.; Carmichael, J.; Malone, S.; Johnson, D.

    2005-12-01

    The region of Cascadia from the Olympic Mountains through southern Vancouver Island and down-dip of the subduction megathrust has repeatedly experienced episodes of slow slip. This episodic slip, which has been observed to take place over a period of two to several weeks, is accompanied by a seismic tremor signal. Based on the average recurrence interval of 14 months, the next episodic tremor and slip (ETS) event should occur within six weeks of mid-September, 2005. Indeed, it appears to have begun on September 3, as this abstract was being written. In order to record this anticipated event, we deployed an array of 11 three-component seismometers on the northern side of the Olympic Peninsula, augmenting Pacific Northwest Seismographic Network stations as well as the first few EarthScope BigFoot stations and Plate Boundary Observatory borehole seismometers. This seismic array comprised six short-period and five broadband instruments with spacings of 500 m and 2200 m, respectively. In conjunction with this EarthScope seismic deployment, we also installed a dense network of 29 temporary, continuous GPS stations across the entire Olympic Peninsula to integrate seismic and geodetic observations. One of the primary goals of this research is to utilize the broadband instrumentation in the array to investigate the possible correlation of low-frequency energy with the rest of the tremor activity. ETS has been carefully investigated at high frequency (seismic tremor at 2-6 Hz) and very low frequency (slip occurring over weeks, observed by GPS). An important goal of this experiment is to investigate the possibility that the tremor generates intermediate, low-frequency signals. Preliminary analysis of short-period array recordings of the July 2004 ETS event suggests that the tremor displays signs of lower-frequency energy (~0.5 Hz) correlated with its higher-frequency activity. Our array should enable us to distinguish low-frequency signals originating in the direction

  19. BOLIVAR-tool for analysis and simulation of metocean extreme events

    NASA Astrophysics Data System (ADS)

    Lopatoukhin, Leonid; Boukhanovsky, Alexander

    2015-04-01

    Metocean extreme events are caused by the combination of multivariate and multiscale processes which depend on each other at different scales (due to short-term, synoptic, annual, and year-to-year variability). There is no simple method for their estimation with controllable tolerance. Thus, in practice, extreme analysis is sometimes reduced to the exploration of various methods and models with respect to decreasing the uncertainty of the estimates. A researcher therefore needs multifaceted computational tools covering the various branches of extreme analysis. BOLIVAR is multi-functional computational software for researchers and engineers who explore extreme environmental conditions to design and build offshore structures and floating objects. It contains a set of computational modules implementing various methods for extreme analysis, and a set of modules for the stochastic and hydrodynamic simulation of metocean processes. In this sense, BOLIVAR is a Problem Solving Environment (PSE). BOLIVAR is designed for extreme event analysis and contains computational modules for the IDM, AMS, POT, MENU, and SINTEF methods, and modules for the stochastic simulation of metocean processes at various scales. BOLIVAR is a tool to simplify the resource-consuming computational experiments needed to explore metocean extremes in univariate and multivariate cases. There are field ARMA models for short-term variability, a spatial-temporal random pulse model for synoptic variability (alternation of storms and calms), and a cyclostationary model of annual and year-to-year variability. The combination of the above-mentioned modules and data sources allows one to estimate: omnidirectional and directional extremes (with T-year return periods); multivariate extremes (sets of parameters) and their impacts on marine structures and floating objects; and extremes of spatial-temporal fields (including the trajectories of T-year storms). An employment of concurrent methods for
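
    One module such a toolbox needs, the POT method listed above, can be sketched as a generalised Pareto fit to threshold exceedances and the T-year return level it implies; the wave-height record below is synthetic.

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(6)
        years = 30
        hs = rng.gumbel(2.0, 0.6, size=years * 365)  # synthetic daily wave heights (m)

        u = np.quantile(hs, 0.98)                    # POT threshold
        exceedances = hs[hs > u] - u
        lam = exceedances.size / years               # exceedance rate per year

        xi, _, sigma = genpareto.fit(exceedances, floc=0.0)

        def return_level(T):
            """T-year return level from the fitted GPD over threshold u."""
            return u + sigma / xi * ((lam * T) ** xi - 1.0)

        for T in (10, 50, 100):
            print(f"{T:>3}-year wave height: {return_level(T):.2f} m")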

  20. Multi-Sensory Aerosol Data and the NRL NAAPS model for Regulatory Exceptional Event Analysis

    NASA Astrophysics Data System (ADS)

    Husar, R. B.; Hoijarvi, K.; Westphal, D. L.; Haynes, J.; Omar, A. H.; Frank, N. H.

    2013-12-01

    Beyond scientific exploration and analysis, multi-sensory observations along with models are finding increasing application in operational air quality management. EPA's Exceptional Event (EE) Rule allows the exclusion of data strongly influenced by impacts from "exceptional events," such as smoke from wildfires or dust from abnormally high winds. The EE Rule encourages the use of satellite observations and other non-standard data, along with models, as evidence for the formal documentation of EE samples for exclusion. Thus, the implementation of the EE Rule is uniquely suited to the direct application of integrated multi-sensory observations, and indirectly through their assimilation into an aerosol simulation model. Here we report the results of a project: NASA and NAAPS Products for Air Quality Decision Making. The project uses observations from multiple satellite sensors, surface-based aerosol measurements and the NRL Aerosol Analysis and Prediction System (NAAPS) model, which assimilates key satellite observations. The satellite sensor data for detecting and documenting smoke and dust events include: MODIS AOD and images; OMI Aerosol Index and tropospheric NO2; and AIRS CO. The surface observations include the EPA regulatory PM2.5 network; the IMPROVE/STN aerosol chemical network; the AIRNOW PM2.5 mass network; and surface meteorological data. Within this application, a crucial role is assigned to the NAAPS model for estimating the surface concentration of windblown dust and biomass smoke. The operational model assimilates quality-assured daily MODIS data, using 2DVAR to adjust the model concentrations and CALIOP-based climatology to adjust the vertical profiles at 6-hour intervals. The assimilation of data from multiple satellites significantly contributes to the usefulness of NAAPS for EE analysis. The NAAPS smoke and dust simulations were evaluated using the IMPROVE/STN chemical data. The multi-sensory observations along with the model simulations are integrated into a web

  1. Impact of including or excluding both-armed zero-event studies on using standard meta-analysis methods for rare event outcome: a simulation study

    PubMed Central

    Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana

    2016-01-01

    Objectives: There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handle them differently depending on the choice of effect measures and authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Method: We simulated 2500 data sets for different scenarios, varying the parameters of baseline event rate, treatment effect, number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis (Peto, Mantel-Haenszel with fixed-effects and random-effects models, and the inverse variance method with fixed-effects and random-effects models) using bias, root mean square error, length of the 95% CI and coverage. Results: The overall performance of the approaches of including or excluding BA0E studies in meta-analysis varied according to the magnitude of the true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the estimates from the approach of excluding BA0E studies showed smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions: We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment
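
    The Peto method referred to above pools hypergeometric O-E statistics across trials, and a both-armed zero-event study contributes zero to both sums, so its inclusion leaves the Peto estimate unchanged, as the sketch below shows with invented trial counts.

        import math

        def peto_pooled_or(trials, z=1.96):
            """trials: list of (events_trt, n_trt, events_ctl, n_ctl) tuples."""
            sum_oe, sum_v = 0.0, 0.0
            for o_t, n_t, o_c, n_c in trials:
                n = n_t + n_c
                o_tot = o_t + o_c
                e = o_tot * n_t / n                    # expected treatment events
                v = o_tot * (n - o_tot) * n_t * n_c / (n**2 * (n - 1))
                sum_oe += o_t - e
                sum_v += v
            log_or = sum_oe / sum_v                    # Peto log odds ratio
            half = z / math.sqrt(sum_v)
            return math.exp(log_or), math.exp(log_or - half), math.exp(log_or + half)

        trials = [(2, 150, 6, 148), (1, 80, 3, 82), (4, 200, 9, 199)]
        ba0e = (0, 120, 0, 118)                        # both-armed zero-event study

        print("excluding BA0E:", tuple(round(x, 2) for x in peto_pooled_or(trials)))
        print("including BA0E:", tuple(round(x, 2) for x in peto_pooled_or(trials + [ba0e])))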

  2. Regional frequency analysis for mapping drought events in north-central Chile

    NASA Astrophysics Data System (ADS)

    Núñez, J. H.; Verbist, K.; Wallis, J. R.; Schaefer, M. G.; Morales, L.; Cornelis, W. M.

    2011-08-01

    Droughts are among the most important natural disasters, particularly in the arid and semiarid regions of the world. Proper management of droughts requires knowledge of the expected frequency of specific low-magnitude precipitation totals for a variety of durations. Probabilistic approaches have often been used to estimate the average recurrence period of a given drought event. However, probabilistic model fitting by conventional methods, such as product moments or maximum likelihood, often produces highly unreliable estimates in areas with low availability of long records. Recognizing the need for adequate estimates of the return periods of severe droughts in the arid and semiarid region of Chile, a regional frequency analysis method based on L-moments (RFA-LM) was used for estimating and mapping drought frequency. Some adaptations of the existing procedures for forming homogeneous regions were found necessary. In addition, a new 3-parameter distribution, the Gaucho, which is a special case of the 4-parameter Kappa distribution, was introduced, and the analysis procedure was improved by the development of two new software tools: L-RAP, to perform the RFA-LM analysis, and L-MAP, to map the resulting drought maps. Eight homogeneous sub-regions were delineated using the Gaucho distribution and used to construct return period maps for drought events with 80% and 40% of normal precipitation. The study confirms the importance of a sub-regional homogeneity test and the usefulness of the Gaucho distribution. The RFA-LM showed that droughts with 40% of normal precipitation have return periods that range from 4 years at the northern, arid boundary of the study area to 22 years at the southern, sub-humid boundary. The results demonstrate the need for different thresholds for declaring a drought than those currently in use for drought characterization in north-central Chile.
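
    The sample L-moments underlying RFA-LM are simple combinations of probability-weighted moments, sketched below on a synthetic 50-year precipitation record; regional analysis then averages such statistics across sites and tests their homogeneity.

        import numpy as np

        def sample_l_moments(x):
            """Return l1, l2, L-CV and L-skewness of a sample."""
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            i = np.arange(1, n + 1)
            # Unbiased probability-weighted moments b0, b1, b2.
            b0 = x.mean()
            b1 = ((i - 1) / (n - 1) * x).mean()
            b2 = ((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x).mean()
            l1 = b0
            l2 = 2 * b1 - b0
            l3 = 6 * b2 - 6 * b1 + b0
            return l1, l2, l2 / l1, l3 / l2

        rng = np.random.default_rng(7)
        annual_precip = rng.gamma(4.0, 60.0, size=50)   # 50 years, one station (mm)
        l1, l2, t, t3 = sample_l_moments(annual_precip)
        print(f"l1={l1:.1f}  l2={l2:.1f}  L-CV={t:.3f}  L-skew={t3:.3f}")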

  3. Reinvestigation and analysis of a landslide dam event in 2012 using UAV

    NASA Astrophysics Data System (ADS)

    Wang, Kuo-Lung; Huang, Zji-Jie; Lin, Jun-Tin

    2015-04-01

    The geological conditions of Taiwan are highly fractured, and the island lies in the Pacific Rim seismic zone. Typhoons usually strike during summer, and the steep mountains are highly weathered, which induces landslides in mountainous areas. This situation has occurred more frequently in recent years due to the effects of climate change. Most landslides are very far away from residential areas. Field investigation is time-consuming, expensive, dangerous, and yields limited data. Investigation with satellite images has disadvantages such as a limited picture of the actual situation and poor resolution. Thus, the possibility of slope investigation with UAVs is proposed and discussed in this research. UAVs have been adopted for hazard investigation and monitoring in recent years, as they offer advantages such as light weight, small volume, high mobility, safety, easy maintenance and low cost, so investigations can be executed in high-risk areas. Mature aerial photogrammetry combines aerial photos with control points: digital surface models (DSMs) and orthophotos can be produced once control points are aligned. The resolution can be better than 5 cm, so the products can be used for temporal monitoring of slope creep before a landslide happens. A large landslide site at the 75 km mark of Road No. 14 was investigated in this research. The landslide happened in June 2012 under heavy rainfall, and a landslide dam formed quickly afterwards. The failure and mechanism of this landslide are discussed using DEMs produced from aerial photos prior to the event and from UAV imagery after it. Residual slope stability analysis is then carried out using strength parameters obtained from the analysis described above, so that advice on potential future landslide conditions can be provided.

  4. Paraesthesia after local anaesthetics: an analysis of reports to the FDA Adverse Event Reporting System.

    PubMed

    Piccinni, Carlo; Gissi, Davide B; Gabusi, Andrea; Montebugnoli, Lucio; Poluzzi, Elisabetta

    2015-07-01

    This study aimed to evaluate possible alert signals of paraesthesia caused by local anaesthetics, focusing on those used in dentistry. A case/non-case study of spontaneous adverse events recorded in FAERS (FDA Adverse Event Reporting System) between 2004 and 2011 was performed. Cases were represented by reports of reactions grouped under the term 'Paraesthesias and dysaesthesias' involving local anaesthetics (ATC: N01B*); non-cases were all other reports for the same drugs. Reporting odds ratios (ROR) with the relevant 95% confidence intervals (95CI) were calculated. An alert signal was considered present when the number of cases was >3 and the lower limit of the ROR 95CI was >1. To estimate the specificity of the signals for dentistry, the analysis was restricted to the specific term "Oral Paraesthesia" and to reports concerning dental practice. Overall, 528 reports of 'Paraesthesias and dysaesthesias' were retrieved, corresponding to 573 drug-reaction pairs (247 lidocaine, 99 bupivacaine, 85 articaine, 30 prilocaine, 112 others). The signal was significant only for articaine (ROR = 18.38; 95CI = 13.95-24.21) and prilocaine (2.66; 1.82-3.90). The analysis of the specific term "Oral Paraesthesia" retrieved 82 reports corresponding to 90 drug-reaction pairs (37 articaine, 19 lidocaine, 14 prilocaine, 7 bupivacaine, 13 others) and confirmed the signal for articaine (58.77; 37.82-91.31) and prilocaine (8.73; 4.89-15.57). The analysis of reports concerning dental procedures retrieved a signal for articaine, both for all procedures (8.84; 2.79-27.97) and for non-surgical ones (15.79; 1.87-133.46). In conclusion, among local anaesthetics, only articaine and prilocaine generated a signal of paraesthesia, especially when used in dentistry. PMID:25420896
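
    The ROR used above is a 2x2 disproportionality measure with a Wald interval on the log scale, easy to sketch; the counts in this example are invented, not FAERS data.

        import math

        def ror(cases_drug, noncases_drug, cases_other, noncases_other, z=1.96):
            """ROR = (a/b) / (c/d) with a Wald CI on the log scale."""
            a, b, c, d = cases_drug, noncases_drug, cases_other, noncases_other
            estimate = (a / b) / (c / d)
            se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            lo = math.exp(math.log(estimate) - z * se)
            hi = math.exp(math.log(estimate) + z * se)
            return estimate, lo, hi

        # Hypothetical: 37 paraesthesia reports among 500 for the drug of
        # interest, versus 53 among 12000 for all other drugs in the class.
        est, lo, hi = ror(37, 500 - 37, 53, 12000 - 53)
        signal = est > 1 and lo > 1 and 37 > 3     # the paper's signal criteria
        print(f"ROR = {est:.2f} (95% CI {lo:.2f}-{hi:.2f}); signal: {signal}")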

  5. Analysis of a snowfall event produced by mountain waves in the Guadarrama Mountains (Spain)

    NASA Astrophysics Data System (ADS)

    Gascón, Estíbaliz; Sánchez, José Luis; Fernández-González, Sergio; Merino, Andrés; López, Laura; García-Ortega, Eduardo

    2014-05-01

    Heavy snowfall events are fairly uncommon precipitation processes in the Iberian Peninsula. When large amounts of snow accumulate in large cities with populations that are unaccustomed to or unprepared for heavy snow, these events have a major impact on daily activities. On 16 January 2013, an extreme snowstorm occurred in the Guadarrama Mountains (Madrid, Spain) during an experimental winter campaign carried out as part of the TECOAGUA Project. Strong northwesterly winds, high precipitation and temperatures close to 0°C were detected throughout the whole day. During this episode, it was possible to continuously measure different variables involved in the development of the convection using a multichannel microwave radiometer (MMWR). The significant increase in cloud thickness observed vertically by the MMWR and the 43 mm of precipitation registered in 24 hours at the Navacerrada station (Madrid) led us to conclude that we were facing an episode of strong winter convection. Images from the Meteosat Second Generation (MSG) satellite suggested that the main source of the convection was the formation of mountain waves on the south face of the Guadarrama Mountains. The event was simulated in high resolution using the WRF mesoscale model, and an analysis is presented based on the simulations and observational data. Finally, the continuous measurements obtained with the MMWR allowed us to monitor the vertical situation above the Guadarrama Mountains with a temporal resolution of 2 minutes. This instrument has a clear advantage for monitoring short-term episodes of this kind in comparison to radiosondes, which usually provide data only at 0000 and 1200 UTC. Acknowledgements: This study was supported by the following grants: GRANIMETRO (CGL2010-15930); MICROMETEO (IPT-310000-2010-22). The authors would like to thank the Regional Government of Castile-León for its financial support through the project LE220A11-2.

  6. Quantitative fibrosis estimation by image analysis predicts development of decompensation, composite events and defines event-free survival in chronic hepatitis B patients.

    PubMed

    Bihari, Chhagan; Rastogi, Archana; Sen, Bijoya; Bhadoria, Ajeet Singh; Maiwall, Rakhi; Sarin, Shiv K

    2016-09-01

    The extent of fibrosis is a major determinant of the clinical outcome in patients with chronic liver diseases. We undertook this study to explore whether the degree of fibrosis in baseline liver biopsies predicts clinical outcomes in chronic hepatitis B (CHB) patients. Fibrosis quantification was done by image analysis on Masson's trichrome-stained sections and correlated with clinical and biochemical parameters, liver stiffness and hepatic vein pressure gradient (n = 96). Follow-up information was collected on clinical outcomes. A total of 964 cases was analyzed. Median quantitative fibrosis (QF) was 3.7% (interquartile range, 1.6%-9.7%), with substantial variation across stages: median QF was F0, 1% (0.7%-1.65%); F1, 3.03% (2.07%-4.0%); F2, 7.1% (5.6%-8.7%); F3, 12.7% (10.15%-16.7%); F4, 26.9% (20.3%-36.4%). QF correlated positively with METAVIR stage, liver stiffness measurement, and hepatic vein pressure gradient. Eighty-nine cases developed liver-related events: decompensation, hepatocellular carcinoma, liver transplantation and death. In Cox regression analysis adjusted for METAVIR stage, QF, albumin and AST for composite events, QF and albumin for decompensation, and QF alone for hepatocellular carcinoma were found to be significant predictors of clinical outcomes. QF was categorized into five stages: QF1, 0%-5%; QF2, 5.1%-10%; QF3, 10.1%-15%; QF4, 15.1%-20%; QF5, >20.1%. In patients with advanced stages of QF, the probability of event-free survival was found to be low. Quantitative fibrosis in a baseline liver biopsy predicts progression of the disease and disease outcome in CHB patients, and QF defines the probability of event-free survival in CHB cases. PMID:27189343

  7. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    NASA Astrophysics Data System (ADS)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by an average distance of 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to view a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera version 9.1, set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5-5.6 G in the stationary sensors, and model AF-S ED 24 mm 1:1.4 in the mobile sensor). All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four days of manual RAMMER operation during the 2012 and 2013 campaigns. On February 18th the data set comprised 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On February 19th a total of 5 flashes were registered by two cameras and 1 flash by three cameras. On February 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can introduce imprecision during the optical analysis; this work therefore aims to evaluate the effect of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  8. Monitoring As A Helpful Means In Forensic Analysis Of Dams Static Instability Events

    NASA Astrophysics Data System (ADS)

    Solimene, Pellegrino

    2013-04-01

    Monitoring is a means of controlling the behavior of a structure, which during its operational life is subject to external actions, both ordinary loading conditions and disturbances; these factors combine in the random manner characterized statistically by the return period. The analysis of monitoring data is crucial to forming a reasoned opinion on the reliability of the structure and its components, and it also makes it possible to identify, within the overall operational scenario, the moment at which to prepare interventions aimed at maintaining optimum levels of functionality and safety. This preventive role of monitoring is complemented by the activity of the forensic engineer who, appointed by the judiciary after an accident, applies his experience (the "scientific knowledge") to an "inverse analysis" in which he sums up the results of a survey, drawing also on the data sets produced by the continuous monitoring of causes and effects, so as to determine the correlations between these factors. His activity aims to contribute to establishing the typicality of an event, which, together with the causal link between conduct and event and its unlawfulness, is among the factors for judging whether a hypothesis of crime exists and who is liable according to law. In Italy there are about 10,000 dams of varying sizes, but only a small portion of them are classified as "large dams" and subjected to a rigorous program of regular inspections and monitoring under specific rules. The rest ("small" dams, conventionally defined as such by the standard, but not small in their impact on the territory) receives a heterogeneous response from the local authorities entrusted with this task: there is therefore a scenario of high potential risk, determined by the presence of incompletely controlled structures, some of which stand above heavily populated areas. Risk can be traced back to acceptable levels if they were implemented with the

  9. Analysis of Loss-of-Offsite-Power Events 1998–2012

    SciTech Connect

    T. E. Wierman

    2013-10-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant's ability to achieve and maintain safe shutdown conditions. Risk analyses show that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP event frequencies and the subsequent restoration of offsite power are therefore important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants, based on operating experience from fiscal year 1998 through 2012. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator (EDG) failure modes considered are failure to start, failure to load and run, and failure to run for more than 1 hour. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly reliability estimates are provided for the entire active period. A statistically significant increase in industry performance was identified for plant-centered and switchyard-centered LOOP frequencies. There is no statistically significant trend in LOOP durations.
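
    The report's own estimation procedure is not reproduced here, but event frequencies of this kind are commonly estimated as a Poisson rate with a Jeffreys prior, a convention used in NRC parameter estimation; the sketch below assumes that approach and uses illustrative counts, not the report's data.

```python
# Hedged sketch: estimating a LOOP event frequency (events per
# reactor-critical-year) with a Jeffreys prior, giving a Gamma posterior.
# The counts below are illustrative, not the report's data.
from scipy.stats import gamma

n_events = 24        # hypothetical grid-related LOOP events
exposure = 1500.0    # hypothetical reactor-critical-years, 1998-2012

posterior = gamma(a=n_events + 0.5, scale=1.0 / exposure)
mean = posterior.mean()
lo, hi = posterior.ppf([0.05, 0.95])
print(f"frequency: {mean:.2e}/rcry (90% interval {lo:.2e} to {hi:.2e})")
```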

  10. Comparing evapotranspiration partitioning after different types of rain events using stable isotopes and lagrangian dispersion analysis

    NASA Astrophysics Data System (ADS)

    Hogan, Patrick; Parajka, Juraj

    2016-04-01

    The eddy covariance (EC) method has become one of the standard methods for measuring evapotranspiration (ET) at the field scale; however, it cannot separate transpiration from evaporation, and it is also limited within plant canopies due to distortion of the turbulent wind fields. Possible solutions to these limitations include combining EC measurements made above the canopy with either source/sink distribution models or stable isotope ET partitioning models. During the summer of 2014 the concentration and isotopic ratio of water vapour within the canopy of a growing maize field at the Hydrological Open Air Laboratory (HOAL) catchment were measured using a Picarro field sampling device. A tripod-mounted eddy covariance device was used to calculate the ET value for the field. The first objective of this experiment is to compare the ET partitioning results obtained with the stable isotope Keeling plot method within a canopy to two different Lagrangian dispersion analysis methods, the Localised Near Field theory of Raupach (1989a) and the Warland and Thurtell (2000) dispersion model. Preliminary results show good agreement during dry conditions, with the dispersion methods overestimating the fraction of transpiration directly after a rain event. The second objective is then to analyse and compare the soil evaporation response for two different kinds of rain events using the stable isotope results.
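
    The Keeling plot method mentioned above infers the isotopic signature of the ET flux as the intercept of a regression of vapour isotope ratio against inverse vapour concentration; a minimal NumPy sketch follows, with illustrative numbers rather than HOAL data.

```python
# Keeling-plot sketch: the isotopic signature of the ET source is the
# intercept of a regression of the vapour isotope ratio against the
# inverse of the vapour concentration. Values are illustrative only.
import numpy as np

c = np.array([14.0, 16.5, 18.2, 21.0, 24.3])           # vapour conc. (mmol/mol)
delta = np.array([-18.4, -17.1, -16.3, -15.2, -14.5])  # delta-18O (permil)

slope, intercept = np.polyfit(1.0 / c, delta, 1)
print(f"Keeling intercept (delta of ET source): {intercept:.1f} permil")
```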

  11. Kickoff to Conflict: A Sequence Analysis of Intra-State Conflict-Preceding Event Structures

    PubMed Central

    D'Orazio, Vito; Yonamine, James E.

    2015-01-01

    While many studies have suggested or assumed that the periods preceding the onset of intra-state conflict are similar across time and space, few have empirically tested this proposition. Using the Integrated Crisis Early Warning System's domestic event data for Asia from 1998 to 2010, we subject this proposition to empirical analysis. We score the similarity of government-rebel interactions in sequences preceding the onset of intra-state conflict to those preceding further periods of peace using three different metrics: Euclidean, Levenshtein, and mutual information. These scores are then used as predictors in a bivariate logistic regression to forecast whether we are likely to observe conflict in neither, one, or both of the states. We find that our model accurately classifies cases where both sequences precede peace, but struggles to distinguish between cases in which one sequence escalates to conflict and cases where both sequences escalate to conflict. These findings empirically suggest that generalizable patterns exist among event sequences that precede peace. PMID:25951105
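
    One of the three similarity metrics named above is the Levenshtein (edit) distance between coded event sequences; the standard dynamic-programming implementation is short, and the event codes in the example are hypothetical.

```python
# Standard dynamic-programming Levenshtein distance between two coded
# event sequences, one of the three similarity metrics named above.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, start=1):
        curr = [i]
        for j, y in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]

# Hypothetical government-rebel interaction codes from two windows.
print(levenshtein(["protest", "arrest", "riot"],
                  ["protest", "riot", "clash"]))  # -> 2
```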

  12. Observations and Analysis of Mutual Events between the Uranus Main Satellites

    NASA Astrophysics Data System (ADS)

    Assafin, M.; Vieira-Martins, R.; Braga-Ribas, F.; Camargo, J. I. B.; da Silva Neto, D. N.; Andrei, A. H.

    2009-04-01

    Every 42 years, the Earth and the Sun pass through the plane of the orbits of the main satellites of Uranus. On these occasions, mutual occultations and eclipses between these bodies can be seen from the Earth. The current Uranus equinox from 2007 to 2009 offers a precious opportunity to observe these events. Here, we present the analysis of five occultations and two eclipses observed from Brazil during 2007. For the reduction of the CCD images, we developed a digital coronagraphic method that removed the planet's scattered light around the satellites. A simple geometric model of the occultation/eclipse was used to fit the observed light curves. Dynamical quantities such as the impact parameter, the relative speed, and the central time of the event were then obtained with precisions of 7.6 km, 0.18 km s-1, and 2.9 s, respectively. These results can be further used to improve the parameters of the dynamical theories of the main Uranus satellites. Based on observations made at Laboratório Nacional de Astrofísica (LNA), Itajubá-MG, Brazil.

  13. Forecasting and nowcasting process: A case study analysis of severe precipitation event in Athens, Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, Ioannis; Nastos, Panagiotis; Avgoustoglou, Euripides; Gofa, Flora; Pytharoulis, Ioannis; Kamberakis, Nikolaos

    2016-04-01

    An early warning process is the result of the interplay between forecasting and nowcasting. Therefore, (1) accurate measurement and prediction of the spatial and temporal distribution of rainfall over an area and (2) an efficient and appropriate description of the catchment properties are important issues for atmospheric hazards (severe precipitation, floods, flash floods, etc.). In this paper, a forecasting and nowcasting analysis is presented for a severe precipitation event that took place on September 21, 2015 in Athens, Greece. The severe precipitation caused a flash flood in the suburbs of Athens, with significant impacts on the local society. Quantitative precipitation forecasts from the European Centre for Medium-Range Weather Forecasts and from the COSMO.GR atmospheric model, including ensemble precipitation forecasts and probabilistic approaches, are analyzed as tools in the forecasting process. Satellite remote sensing data close to and six hours prior to the flash flood are presented, accompanied by radar products from the Hellenic National Meteorological Service, illustrating the ability to depict the convection process.

  14. ERPLAB: an open-source toolbox for the analysis of event-related potentials

    PubMed Central

    Lopez-Calderon, Javier; Luck, Steven J.

    2014-01-01

    ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations. PMID:24782741
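
    ERPLAB itself runs in MATLAB; as a language-neutral illustration of the core averaging step it automates, here is a minimal NumPy sketch (synthetic data, hypothetical event latencies) of cutting epochs around event markers, baseline-correcting, and averaging into an ERP.

```python
# Core of ERP averaging, sketched in NumPy (ERPLAB itself is a MATLAB
# toolbox): cut fixed windows around event samples, baseline-correct,
# and average. Data and event latencies are synthetic.
import numpy as np

fs = 250                                     # sampling rate (Hz)
eeg = np.random.randn(64, 60 * fs)           # 64 channels, 60 s of fake data
events = np.array([1000, 3500, 7200, 9100])  # event onsets (samples)
pre, post = int(0.2 * fs), int(0.8 * fs)     # -200..800 ms window

epochs = np.stack([eeg[:, s - pre:s + post] for s in events])
baseline = epochs[:, :, :pre].mean(axis=2, keepdims=True)
erp = (epochs - baseline).mean(axis=0)       # channels x time average
print(erp.shape)                             # (64, 250)
```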

  15. Large solar energetic particle event that occurred on 2012 March 7 and its VDA analysis

    NASA Astrophysics Data System (ADS)

    Ding, Liu-Guan; Cao, Xin-Xin; Wang, Zhi-Wei; Le, Gui-Ming

    2016-08-01

    On 2012 March 7, the STEREO Ahead and Behind spacecraft, along with near-Earth spacecraft (e.g. SOHO, Wind) situated between the two STEREO spacecraft, observed an extremely large global solar energetic particle (SEP) event in Solar Cycle 24. Two successive coronal mass ejections (CMEs) were detected close together in time. From the multi-point in-situ observations, it can be concluded that this SEP event was caused by the first CME; the second was not involved. Using velocity dispersion analysis (VDA), we find that at a well magnetically connected point, the energetic protons and electrons are released nearly at the same time. The path lengths to STEREO-B (STB) for protons and electrons differ distinctly and deviate remarkably from the nominal Parker spiral path length, which is likely due to the presence of interplanetary magnetic structures between the source and STB. Moreover, the VDA method seems to yield reasonable results only at well-connected locations, where the inferred release times of energetic particles in different energy channels are similar. We suggest that a good connection is crucial for obtaining both an accurate release time and an accurate path length simultaneously, in agreement with the modeling result of Wang & Qin (2015).
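
    VDA rests on the assumption that particles of all energies are released simultaneously and travel the same scatter-free path length L, so the onset time obeys t(E) = t0 + L/v(E); regressing onset times against 1/beta then yields both t0 and L. The sketch below uses relativistic proton kinematics with illustrative energies and onsets, not the event's data.

```python
# Velocity dispersion analysis (VDA) sketch: onset time t(E) = t0 + L/v(E),
# so a linear fit of onset times against 1/beta gives the release time
# (intercept) and the path length (slope times c). Energies and onset
# times below are illustrative.
import numpy as np

MP_MEV = 938.272     # proton rest energy (MeV)
C_AU_MIN = 0.1202    # speed of light in AU per minute

e_kin = np.array([10.0, 20.0, 40.0, 60.0])    # proton kinetic energy (MeV)
t_onset = np.array([79.0, 59.0, 45.0, 39.0])  # onset times (minutes)

gamma = 1.0 + e_kin / MP_MEV
beta = np.sqrt(1.0 - 1.0 / gamma**2)

slope, t0 = np.polyfit(1.0 / beta, t_onset, 1)
print(f"release time t0 = {t0:.1f} min, path length L = {slope * C_AU_MIN:.2f} AU")
```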

  16. Prediction of Heart Failure Decompensation Events by Trend Analysis of Telemonitoring Data.

    PubMed

    Henriques, J; Carvalho, P; Paredes, S; Rocha, T; Habetha, J; Antunes, M; Morais, J

    2015-09-01

    This paper aims to assess the predictive value of physiological data collected daily in a telemonitoring study for the early detection of heart failure (HF) decompensation events. The main hypothesis is that physiological time series with similar progressions (trends) may have prognostic value for future clinical states (decompensation or normal condition). The strategy is composed of two main steps: a trend similarity analysis and a predictive procedure. The similarity scheme combines the Haar wavelet decomposition, in which signals are represented as linear combinations of a set of orthogonal bases, with the Karhunen-Loève transform, which allows the selection of the reduced set of bases that capture the fundamental behavior of the time series. The prediction process assumes that the future evolution of the current condition can be inferred from the progression of past physiological time series. Therefore, based on the trend similarity measure, a set of time series presenting a progression similar to the current condition is identified in the historical dataset, and this set is then employed, through a nearest neighbor approach, in the current prediction. The strategy is evaluated using physiological data from the myHeart telemonitoring study, namely blood pressure, respiration rate, heart rate, and body weight collected from 41 patients (15 decompensation events and 26 normal conditions). The results suggest, in general, that the physiological data have predictive value and, in particular, that the proposed scheme is well suited to the early detection of HF decompensation. PMID:25248206
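
    The pipeline described above (Haar decomposition, Karhunen-Loève reduction, nearest-neighbour prediction) can be sketched with pywt and scikit-learn; this is a schematic reading of the method with synthetic data, implementing the KLT as PCA, not the authors' code.

```python
# Sketch of the trend-similarity pipeline described above: Haar wavelet
# decomposition of each physiological time series, a Karhunen-Loeve
# transform (implemented here as PCA) keeping the dominant bases, and a
# nearest-neighbour vote over historical outcomes. Data are synthetic.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
series = rng.standard_normal((41, 64))  # 41 patients x 64 daily samples
labels = rng.integers(0, 2, size=41)    # 1 = decompensation event

def haar_features(x):
    return np.concatenate(pywt.wavedec(x, "haar", level=3))

X = np.array([haar_features(s) for s in series])
X = PCA(n_components=8).fit_transform(X)  # KLT: keep 8 dominant bases

clf = KNeighborsClassifier(n_neighbors=3).fit(X[:-1], labels[:-1])
print("predicted state for newest series:", clf.predict(X[-1:]))
```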

  17. OBSERVATIONS AND ANALYSIS OF MUTUAL EVENTS BETWEEN THE URANUS MAIN SATELLITES

    SciTech Connect

    Assafin, M.; Vieira-Martins, R.; Braga-Ribas, F.; Camargo, J. I. B.; Da Silva Neto, D. N.; Andrei, A. H. E-mail: rvm@on.br

    2009-04-15

    Every 42 years, the Earth and the Sun pass through the plane of the orbits of the main satellites of Uranus. On these occasions, mutual occultations and eclipses between these bodies can be seen from the Earth. The current Uranus equinox from 2007 to 2009 offers a precious opportunity to observe these events. Here, we present the analysis of five occultations and two eclipses observed from Brazil during 2007. For the reduction of the CCD images, we developed a digital coronagraphic method that removed the planet's scattered light around the satellites. A simple geometric model of the occultation/eclipse was used to fit the observed light curves. Dynamical quantities such as the impact parameter, the relative speed, and the central time of the event were then obtained with precisions of 7.6 km, 0.18 km s-1, and 2.9 s, respectively. These results can be further used to improve the parameters of the dynamical theories of the main Uranus satellites.

  18. Classifying onset durations of early VLF events: Scattered field analysis and new insights

    NASA Astrophysics Data System (ADS)

    Kotovsky, D. A.; Moore, R. C.

    2015-08-01

    The physical processes responsible for a variety of early VLF scattering events have not yet been satisfactorily identified. Properly categorizing the early VLF event type is imperative to understand the causative physical processes involved. In this paper, the onset durations of 26 exceptionally high signal-to-noise ratio early VLF scattering events are analyzed, using scattered fields to classify events. New observations of events that exhibit "slow" amplitude changes, but "fast" scattered field changes are presented, which call into question previous analyses of early/slow events. We separately identify and analyze three early VLF events that definitively exhibit slow scattered field behavior. Additionally, we identify a significant number of events which have onset durations between the current definitions of fast and slow. Four events are observed which unambiguously exhibit a rapid initial rotation of the scattered field phasor during the first few seconds of the recovery stage. Possible physical mechanisms are discussed.
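
    The scattered-field construction used above is the phasor difference between the perturbed signal and the pre-event ambient signal; a minimal NumPy sketch follows, with illustrative amplitude and phase values rather than recorded data.

```python
# Sketch of the scattered-field construction: the scattered field is the
# phasor difference between the perturbed VLF signal and the pre-event
# (ambient) signal. Amplitudes and phases below are illustrative.
import numpy as np

amp_db = np.array([52.0, 52.1, 53.4, 53.3])     # received amplitude (dB)
phase_deg = np.array([10.0, 10.2, 14.5, 14.4])  # received phase (deg)

total = 10 ** (amp_db / 20.0) * np.exp(1j * np.radians(phase_deg))
ambient = total[:2].mean()         # pre-event reference phasor
scattered = total - ambient        # scattered-field phasor vs time
print(np.abs(scattered), np.degrees(np.angle(scattered)))
```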

  19. Error Analysis of Satellite Precipitation-Driven Modeling of Complex Terrain Flood Events

    NASA Astrophysics Data System (ADS)

    Mei, Y.; Nikolopoulos, E. I.; Anagnostou, E. N.; Zoccatelli, D.; Borga, M., Sr.

    2015-12-01

    The error characteristics of satellite-precipitation-driven flood event simulations over mountainous basins are evaluated in this study for eight different global satellite products. A methodology is devised to match the observed records of the flood events with the corresponding satellite and reference rainfall and runoff simulations. The flood events are sorted according to flood type (i.e. rain flood and flash flood) and the basin's antecedent conditions, represented by the event's runoff-to-precipitation ratio. The satellite precipitation products and runoff simulations are evaluated using systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e. cumulative volume, timing and shape). Overall, satellite-driven event runoff exhibits better error metrics than the satellite precipitation. Better error metrics are also found for the rain flood events relative to the flash flood events. The event timing and shape derived from satellite precipitation agree well with the reference; the cumulative volume is mostly underestimated. In terms of error propagation, the study shows a dampening effect in both the systematic and random error components of the satellite-driven runoff time series relative to the satellite-retrieved event precipitation. This dampening effect is less pronounced for the flash flood events and for the rain flood events with high runoff coefficients. This study provides, for the first time, event-scale characterization of satellite precipitation error propagation in flood modeling, which has implications for the application of the Global Precipitation Measurement mission in mountain flood hydrology.

  20. Voluntary electronic reporting of laboratory errors: an analysis of 37,532 laboratory event reports from 30 health care organizations.

    PubMed

    Snydman, Laura K; Harubin, Beth; Kumar, Sanjaya; Chen, Jack; Lopez, Robert E; Salem, Deeb N

    2012-01-01

    Laboratory testing is essential for diagnosis, evaluation, and management. The objective was to describe the types of laboratory events reported in hospitals using a voluntary electronic error reporting system (e-ERS), via a cross-sectional analysis of laboratory events reported by 30 health care organizations throughout the United States (January 1, 2000, to December 31, 2005). A total of 37,532 laboratory-related events were reported, accounting for 14.1% of all reported quality events. Preanalytic laboratory events were the most common (81.1%); the top 3 were specimen not labeled (18.7%), specimen mislabeled (16.3%), and improper collection (13.2%). A small number (0.08%) of laboratory events caused permanent harm or death; 8% caused temporary harm. Most laboratory events (55%) did not cause harm. Laboratory errors thus constitute about 1 in 7 reported quality events. They are often caused by events that precede the specimen's arrival in the laboratory and should be preventable with better labeling processes and education. Most laboratory errors do not lead to patient harm. PMID:21918013

  1. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    A record of hazardous torrent events dating back several decades is analysed. Precipitation thresholds varying in space and time are established using highly resolved INCA data from the Austrian weather service. Parameters possibly controlling the basic susceptibility of catchments are evaluated in a regional GIS analysis (vegetation, geology, topography, stream network, proxies for sediment availability). Similarity measures are then used to group catchments into sensitivity classes. Applying different climate scenarios, the spatiotemporal distribution of catchments sensitive to heavier and more frequent precipitation can be determined, giving valuable guidance for planning and managing mountain protection zones.

  2. Hydroacoustic monitoring of a salt cavity: analysis of precursory events of the collapse

    NASA Astrophysics Data System (ADS)

    Lebert, François; Bernardie, Séverine; Mainsant, Guénolé

    2010-05-01

    One of the main purposes of "post-mining" research concerns the methods and means available for monitoring mine-degradation processes that may, as a consequence, directly threaten surface infrastructures. GISOS, a French scientific interest group concerned with the impact and safety of underground works in the post-mining field, aims among other things at developing techniques for monitoring underground cavities that grow through salt dissolution and eventually collapse. One method for monitoring the stability of a salt cavity is to record the microseismic-precursor signals that indicate the onset of rock failure. This study, in particular, seeks to identify and evaluate the capacity of the hydroacoustic technique for monitoring salt cavities; more specifically, the purpose is to determine the criteria for the change in behaviour and state of the rock likely to occur as a precursory sign before the collapse of the cavity. Three types of signal were recorded in a salt mine in Lorraine (France) during the monitored collapse of a salt cavity of about 800,000 m3 at 120 m depth: the RMS (root mean square) levels, i.e. time recordings of the RMS power in four frequency bands (total signal; 30 Hz - 3 kHz; 3 kHz - 30 kHz; 30 kHz - 180 kHz); the low-frequency monitoring, which records events from cracking to block falls in the 30 Hz - 3 kHz band; and the high-frequency monitoring, which deals with events occurring in the 30 kHz - 180 kHz band. The hydroacoustic data highlight some interesting precursory signals before the collapse of the cavity. Indeed, the evolution of the cumulative energy of both low- and high-frequency events seems to be a good indicator of the mechanical state of the cavity. Moreover, the analysis of the recordings shows a new family of events, which occurs a few hours before the failure phase. Finally, correlations have been performed between the hydroacoustic recordings and other measurements acquired at the same time on the site.
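
    The band-limited RMS tracking described above is straightforward to reproduce in outline: band-pass each record and log its RMS level. The sketch below uses SciPy with a synthetic signal; the bands are those quoted in the abstract and the sampling rate is an assumption chosen to cover 180 kHz.

```python
# Sketch of band-limited RMS tracking: band-pass a hydroacoustic record
# in each monitoring band and compute its RMS level. The signal here is
# synthetic; the bands are those quoted in the abstract.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 400_000                  # Hz, assumed; must exceed 2 x 180 kHz
x = np.random.randn(fs)       # 1 s of synthetic signal
bands = {"LF": (30, 3_000), "MF": (3_000, 30_000), "HF": (30_000, 180_000)}

for name, (lo, hi) in bands.items():
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    y = sosfiltfilt(sos, x)
    rms = np.sqrt(np.mean(y**2))
    print(f"{name} band {lo}-{hi} Hz: RMS = {rms:.4f}")
```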

  3. Hydroacoustic monitoring of a salt cavity: analysis of precursory events of the collapse

    NASA Astrophysics Data System (ADS)

    Bernardie, S.; Lebert, F.; Mainsant, G.

    2009-12-01

    One of the main purposes of "post-mining" research concerns the methods and means available for monitoring mine-degradation processes that may, as a consequence, directly threaten surface infrastructures. GISOS, a French scientific interest group concerned with the impact and safety of underground works in the post-mining field, aims among other things at developing techniques for monitoring underground cavities that grow through salt dissolution and eventually collapse. One method for monitoring the stability of a salt cavity is to record the microseismic-precursor signals that indicate the onset of rock failure. This study, in particular, seeks to identify and evaluate the capacity of the hydroacoustic technique for monitoring salt cavities; more specifically, the purpose is to determine the criteria for the change in behaviour and state of the rock likely to occur as a precursory sign before the collapse of the cavity. Three types of signal are investigated: the RMS (root mean square) levels, i.e. time recordings of the RMS power in four frequency bands (total signal; 30 Hz - 3 kHz; 3 kHz - 30 kHz; 30 kHz - 180 kHz); the low-frequency monitoring, which records events from cracking to block falls in the 30 Hz - 3 kHz band; and the high-frequency monitoring, which deals with events occurring in the 30 kHz - 180 kHz band. The hydroacoustic data highlight some interesting precursory signals before the collapse of the cavity. Indeed, the evolution of the cumulative energy of both low- and high-frequency events seems to be a good indicator of the mechanical state of the cavity. Moreover, the analysis of the recordings shows a new family of events, which occurs a few hours before the failure phase. Finally, correlations have been performed between the hydroacoustic recordings and other measurements acquired at the same time on the site, including strain measurements and hydrostatic pressure.

  4. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate for identifying ozone extremes and describing the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, of chemical factors such as cold Arctic vortex ozone losses, and of the major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. The findings described above could also be confirmed for five other long-term total ozone records (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle), showing that strong influence of atmospheric
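
    In the spirit of the extreme-value tools cited above (Coles, 2001), a minimal sketch is to fit a generalized extreme value (GEV) distribution to block maxima and read off a return level; the series below is synthetic, not the Arosa record.

```python
# Minimal extreme-value sketch: fit a GEV distribution to annual maxima
# of total ozone and compute a 50-year return level. Series is synthetic.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
annual_max = 380 + 25 * rng.gumbel(size=80)  # synthetic annual maxima (DU)

shape, loc, scale = genextreme.fit(annual_max)
rl_50 = genextreme.isf(1.0 / 50.0, shape, loc=loc, scale=scale)
print(f"50-year return level: {rl_50:.0f} DU")
```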

  5. Localization of the event-related potential novelty response as defined by principal components analysis.

    PubMed

    Dien, Joseph; Spencer, Kevin M; Donchin, Emanuel

    2003-10-01

    Recent research indicates that novel stimuli elicit at least two distinct components, the Novelty P3 and the P300. The P300 is thought to be elicited when a context-updating mechanism is activated by a wide class of deviant events. The functional significance of the Novelty P3 is uncertain. Identification of the generator sources of the two components could provide additional information about their functional significance. Previous localization efforts have yielded conflicting results. The present report demonstrates that the use of principal components analysis (PCA) results in better convergence with knowledge about functional neuroanatomy than did previous localization efforts. The results are also more convincing than those obtained by two alternative methods, MUSIC-RAP and the Minimum Norm. Source modeling on 129-channel data with BESA and BrainVoyager suggests the P300 has sources in the temporal-parietal junction, whereas the Novelty P3 has sources in the anterior cingulate. PMID:14561451

  6. Retrospective Analysis of Communication Events - Understanding the Dynamics of Collaborative Multi-Party Discourse

    SciTech Connect

    Cowell, Andrew J.; Haack, Jereme N.; McColgin, Dave W.

    2006-06-08

    This research is aimed at understanding the dynamics of collaborative multi-party discourse across multiple communication modalities. Before we can truly make significant strides in devising collaborative communication systems, there is a need to understand how typical users utilize computationally supported communications mechanisms such as email, instant messaging, video conferencing, chat rooms, etc., both singularly and in conjunction with traditional means of communication such as face-to-face meetings, telephone calls and postal mail. Attempting to understand an individual's communications profile with access to only a single modality is challenging at best and often futile. Here, we discuss the development of RACE - Retrospective Analysis of Communications Events - a test-bed prototype to investigate issues relating to multi-modal multi-party discourse.

  7. Migration Experience and Premarital Sexual Initiation in Urban Kenya: An Event History Analysis

    PubMed Central

    Luke, Nancy; Xu, Hongwei; Mberu, Blessing U.; Goldberg, Rachel E.

    2013-01-01

    Migration during the formative adolescent years can affect important life-course transitions, including the initiation of sexual activity. In this study, we use life history calendar data to investigate the relationship between changes in residence and timing of premarital sexual debut among young people in urban Kenya. By age 18, 64 percent of respondents had initiated premarital sex, and 45 percent had moved at least once between the ages of 12 and 18. Results of the event history analysis show that girls and boys who move during early adolescence experience the earliest onset of sexual activity. For adolescent girls, however, other dimensions of migration provide protective effects, with greater numbers of residential changes and residential changes in the last one to three months associated with later sexual initiation. To support young people’s ability to navigate the social, economic, and sexual environments that accompany residential change, researchers and policymakers should consider how various dimensions of migration affect sexual activity. PMID:23175950

  8. Dynamics of the 1054 UT March 22, 1979, substorm event - CDAW 6. [Coordinated Data Analysis Workshop

    NASA Technical Reports Server (NTRS)

    Mcpherron, R. L.; Manka, R. H.

    1985-01-01

    The primary objective of the Coordinated Data Analysis Workshop (CDAW 6) is to trace the flow of energy from the solar wind through the magnetosphere to its ultimate dissipation in the ionosphere. An essential role in this energy transfer is played by magnetospheric substorms; however, the details are not yet completely understood. The International Magnetospheric Study (IMS) has provided an ideal database for the study conducted by CDAW 6. The present investigation is concerned with the 1054 UT March 22, 1979, substorm event, which had been selected for detailed examination in connection with the studies performed by CDAW 6. The observations of this substorm are discussed, taking into account solar wind conditions, ground magnetic activity on March 22, 1979, observations at synchronous orbit, observations in the near geomagnetic tail, and the onset of the 1054 UT expansion phase. Substorm development and magnetospheric dynamics are discussed on the basis of a synthesis of the observations.

  9. Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer

    SciTech Connect

    Anderson, Johan; Halpern, Federico D.; Ricci, Paolo; Furno, Ivo; Xanthopoulos, Pavlos

    2014-12-15

    The turbulence observed in the scrape-off layer of a tokamak is often characterized by intermittent events of a bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It thus appears necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of the tails of the probability distribution functions. The method followed here is to generate statistical information from time traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.
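
    An exponential tail of the kind reported above can be checked in outline by histogramming a time trace and fitting log P(x) against x over the tail, where a straight line indicates P(x) ~ exp(-x/x0); the trace below is a synthetic stand-in, not simulation output.

```python
# Illustrative check for an exponential PDF tail: histogram a (synthetic)
# density time trace and fit log P(x) vs x over the tail region.
import numpy as np

rng = np.random.default_rng(5)
signal = rng.exponential(scale=1.0, size=100_000)  # stand-in time trace

hist, edges = np.histogram(signal, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
tail = (centers > 2.0) & (hist > 0)                # fit the far tail only

slope, _ = np.polyfit(centers[tail], np.log(hist[tail]), 1)
print(f"tail e-folding scale: {-1.0 / slope:.2f}")  # ~1.0 here
```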

  10. Efficacy of forensic statement analysis in distinguishing truthful from deceptive eyewitness accounts of highly stressful events.

    PubMed

    Morgan, Charles A; Colwell, Kevin; Hazlett, Gary A

    2011-09-01

    Laboratory-based deception detection research suggests that truthful statements differ from deceptive ones. This non-laboratory study tested whether forensic statement analysis (FSA) methods would distinguish genuine from false eyewitness accounts of exposure to a highly stressful event. A total of 35 military participants were assigned to truthful or deceptive eyewitness conditions. Genuine eyewitnesses reported truthfully about exposure to interrogation stress. Deceptive eyewitnesses studied transcripts of genuine eyewitnesses for 24 h and falsely claimed to have been interrogated. Cognitive Interviews were recorded, transcribed, and assessed by FSA raters blind to the status of participants. Genuine accounts contained more unique words, more external and contextual referents, and a greater total word count than deceptive statements. The type-token ratio was lower in genuine statements. The classification accuracy using FSA techniques was 82%. FSA methods may be effective in real-world circumstances and are relevant to professionals in law enforcement, security, and criminal justice. PMID:21854383
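
    The lexical features reported above (total words, unique words, type-token ratio) are simple to compute; a toy sketch follows, with the tokenization rule being an assumption rather than the study's protocol.

```python
# The lexical FSA features reported above (total words, unique words,
# type-token ratio) computed for a toy statement; the tokenizer is a
# simplifying assumption.
import re

def statement_metrics(text):
    tokens = re.findall(r"[a-z']+", text.lower())
    types = set(tokens)
    return {
        "total_words": len(tokens),
        "unique_words": len(types),
        "type_token_ratio": len(types) / len(tokens) if tokens else 0.0,
    }

print(statement_metrics("He asked me twice, and twice I told him the same thing."))
```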

  11. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    NASA Astrophysics Data System (ADS)

    Trigo, Ricardo; Varino, Filipa; Ramos, Alexandre; Valente, Maria; Zêzere, José; Vaquero, José; Gouveia, Célia; Russo, Ana

    2014-04-01

    The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberian stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the precipitation registered between 28 November and 7 December was so remarkable that the 1876 episode still holds the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of the time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, the precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also performed a multivariable analysis of surface and upper-air fields in order to shed some light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by an atmospheric-river tropical moisture flow over the central Atlantic Ocean.

  12. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains.

    PubMed

    Torre, Emiliano; Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz; Grün, Sonja

    2016-07-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
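
    The core object in the method described above is the intersection matrix, whose entry (i, j) measures the overlap of active neurons between time bins i and j; a minimal NumPy sketch on a synthetic raster follows, using one simple normalization (the published method's statistical steps are not reproduced).

```python
# Core object of the method: an intersection matrix whose entry (i, j)
# counts neurons active in both time bin i and time bin j; repeated
# synchronous sequences show up as diagonal structures. Synthetic raster.
import numpy as np

rng = np.random.default_rng(2)
raster = rng.random((100, 200)) < 0.05     # 100 neurons x 200 time bins

counts = raster.sum(axis=0)
overlap = raster.T.astype(int) @ raster.astype(int)  # shared active neurons
norm = np.minimum.outer(counts, counts)              # simple normalization
intersection = np.divide(overlap, norm,
                         out=np.zeros(overlap.shape), where=norm > 0)
print(intersection.shape)                  # (200, 200)
```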

  13. Uncertainty Analysis for a De-pressurised Loss of Forced Cooling Event of the PBMR Reactor

    SciTech Connect

    Jansen van Rensburg, Pieter A.; Sage, Martin G.

    2006-07-01

    This paper presents an uncertainty analysis for a De-pressurised Loss of Forced Cooling (DLOFC) event that was performed with the systems CFD (Computational Fluid Dynamics) code Flownex for the PBMR reactor. An uncertainty analysis was performed to determine the variation in maximum fuel, core barrel and reactor pressure vessel (RPV) temperatures due to variations in model input parameters. Some of the input parameters that were varied are: thermo-physical properties of helium and the various solid materials, decay heat, neutron and gamma heating, pebble bed pressure loss, pebble bed Nusselt number and pebble bed bypass flows. The Flownex model of the PBMR reactor is a 2-dimensional axisymmetric model. It is simplified in terms of geometry and some other input values; however, it is believed that the model adequately indicates the effect of changes in certain input parameters on the fuel temperature and other components during a DLOFC event. Firstly, a sensitivity study was performed in which input variables were varied individually according to predefined uncertainty ranges and the results were ranked by their effect on maximum fuel temperature. In the sensitivity study, only seven variables had a significant effect on the maximum fuel temperature (greater than 5 deg. C). The most significant are the power distribution profile, decay heat, reflector properties and effective pebble bed conductivity. Secondly, Monte Carlo analyses were performed in which twenty variables were varied simultaneously within predefined uncertainty ranges. For a one-tailed 95% confidence level, the conservatism that should be added to the best-estimate calculation of the maximum fuel temperature for a DLOFC was determined to be 53 deg. C. This value will probably increase after some model refinements in the future. Flownex was found to be a valuable tool for uncertainty analyses, facilitating both sensitivity studies and Monte Carlo analyses. (authors)
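
    The shape of the Monte Carlo step described above (sample uncertain inputs within their ranges, run the model, take the one-tailed 95th percentile of peak fuel temperature) can be sketched as follows; the `peak_fuel_temp` function and all ranges are hypothetical stand-ins, since the real calculation is a Flownex reactor model.

```python
# Shape of the Monte Carlo uncertainty analysis described above: sample
# uncertain inputs, evaluate the model, and take the one-tailed 95th
# percentile of peak fuel temperature. `peak_fuel_temp` is a stand-in
# response surface, NOT the Flownex model; ranges are illustrative.
import numpy as np

rng = np.random.default_rng(3)
N = 5000

decay_heat = rng.uniform(0.95, 1.05, N)       # multipliers on best estimate
bed_conductivity = rng.uniform(0.90, 1.10, N)
reflector_k = rng.uniform(0.90, 1.10, N)

def peak_fuel_temp(dh, kb, kr):
    return 1550.0 + 400.0 * (dh - 1) - 250.0 * (kb - 1) - 120.0 * (kr - 1)

temps = peak_fuel_temp(decay_heat, bed_conductivity, reflector_k)
best_estimate = peak_fuel_temp(1.0, 1.0, 1.0)
print(f"conservatism at 95%: {np.percentile(temps, 95) - best_estimate:.1f} K")
```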

  14. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    PubMed Central

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734

  15. Assessment of (sub-) seasonal prediction skill using a canonical event analysis

    NASA Astrophysics Data System (ADS)

    Wanders, N.; Wood, E. F.

    2015-12-01

    Hydrological extremes regularly occur in all regions of the world and as such are globally relevant phenomena with large impacts on society. Seasonal and sub-seasonal predictions could increase preparedness for these extreme events. We investigated the skill of five seasonal forecast models from the NMME-II ensemble for the period 1982-2012 at a range of temporal and spatial scales. A canonical event analysis is used to enable model validation beyond a single temporal and spatial scale. The model predictions are compared to two reference datasets at the seasonal and sub-seasonal scales, and we evaluate their capability to reproduce observed daily precipitation and temperature. It is shown that the skill of the models depends largely on the temporal aggregation and the lead time. Longer temporal aggregation increases the forecast skill for both precipitation and temperature. Seasonal precipitation forecasts show no skill beyond a lead time of 6 months, while seasonal temperature forecast skill does extend beyond 6 months. Overall, the highest skill is found over South America and Australia, whereas the skill over Europe and North America is relatively low for both variables. At the sub-seasonal scale (two-week aggregation) we find a strong decrease in prediction skill after the first 2 weeks from initialization; however, the models retain skill up to 1-2 months for precipitation and 3-4 months for temperature, with the highest sub-seasonal skill in South America, Asia and Oceania. Skill differs greatly among models for both the sub-seasonal and seasonal forecasts, indicating that a (weighted) multi-model ensemble is preferable to single-model forecasts. This work shows that an analysis at multiple temporal and spatial scales can enhance our understanding of the added value of (sub-) seasonal forecast models and their applicability, which is important when these models are applied to forecasting (hydrological) extremes.

  16. Predictors of Adverse Events for Ankle Fractures: An Analysis of 6800 Patients.

    PubMed

    Dodd, Ashley C; Lakomkin, Nikita; Attum, Basem; Bulka, Catherine; Karhade, Aditya V; Douleh, Diana G; Mir, Hassan; Jahangir, A Alex; Obremskey, William T; Sethi, Manish K

    2016-01-01

    Ankle fractures are among the most common injuries seen by orthopedic surgeons, so it is essential to understand the risks associated with their treatment. Using the American College of Surgeons National Surgical Quality Improvement Program® database from 2006 to 2013, patient demographics, comorbidities, and 30-day complications were collected for 5 types of ankle fractures. A bivariate analysis was used to compare patient demographics, comorbidities, and complications across all Current Procedural Terminology codes. A multivariable logistic regression model was then used to assess the odds of minor and major postoperative complications within 30 days after open treatment. A total of 6865 patients were included in the analysis. Of these patients, 2507 (36.5%) had bimalleolar ankle fractures. The overall rate of adverse events for ankle fractures was low. Bimalleolar fractures had the greatest rates of major (2.6%, n = 64), minor (3.8%, n = 94), and total (5.7%, n = 143) complications. When controlling for individual patient characteristics, bimalleolar fractures were associated with 4.92 times the odds (95% confidence interval 1.80 to 13.5; p = .002) of developing a complication compared with medial malleolar fractures. The risk factors driving postoperative complications for all ankle fractures were age >65 years, obesity, diabetes, American Society of Anesthesiologists score >2, and functional status (p < .05). Although the overall rate of adverse events for ankle fractures was low, bimalleolar fractures were associated with 5 times the odds of developing a complication compared with medial malleolar fractures. Orthopedic surgeons must be aware of the risk factors that increase the rate of ankle fracture complications in order to improve patients' quality of care. PMID:27086177
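
    Odds ratios like the 4.92 quoted above come from exponentiating multivariable logistic regression coefficients; a minimal statsmodels sketch follows, on synthetic data rather than the NSQIP database.

```python
# Minimal sketch of the multivariable logistic model behind odds ratios
# like those quoted above, using statsmodels; the data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 2000
bimalleolar = rng.integers(0, 2, n)
age_over_65 = rng.integers(0, 2, n)
logit = -3.5 + 1.4 * bimalleolar + 0.8 * age_over_65
complication = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([bimalleolar, age_over_65]))
fit = sm.Logit(complication.astype(int), X).fit(disp=False)
print(np.exp(fit.params[1:]))  # odds ratios for the two predictors
```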

  17. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.; Sandia National Labs., Albuquerque, NM )

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and who need the details for use in further studies. The mean core damage frequency is 4.5E-6, with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency, with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs.

  18. Top-down and bottom-up definitions of human failure events in human reliability analysis

    SciTech Connect

    Boring, Ronald Laurids

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down—defined as a subset of the PRA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  19. Potential of breastmilk analysis to inform early events in breast carcinogenesis: rationale and considerations.

    PubMed

    Murphy, Jeanne; Sherman, Mark E; Browne, Eva P; Caballero, Ana I; Punska, Elizabeth C; Pfeiffer, Ruth M; Yang, Hannah P; Lee, Maxwell; Yang, Howard; Gierach, Gretchen L; Arcaro, Kathleen F

    2016-05-01

    This review summarizes methods related to the study of human breastmilk in etiologic and biomarkers research. Despite the importance of reproductive factors in breast carcinogenesis, factors that act early in life are difficult to study because young women rarely require breast imaging or biopsy, and analysis of critical circulating factors (e.g., hormones) is often complicated by the requirement to accurately account for menstrual cycle date. Accordingly, novel approaches are needed to understand how events such as pregnancy, breastfeeding, weaning, and post-weaning breast remodeling influence breast cancer risk. Analysis of breastmilk offers opportunities to understand mechanisms related to carcinogenesis in the breast, and to identify risk markers that may inform efforts to identify high-risk women early in the carcinogenic process. In addition, analysis of breastmilk could have value in early detection or diagnosis of breast cancer. In this article, we describe the potential for using breastmilk to characterize the microenvironment of the lactating breast with the goal of advancing research on risk assessment, prevention, and detection of breast cancer. PMID:27107568

  20. Analysis and Prediction of West African Moist Events during the Boreal Spring of 2009

    NASA Astrophysics Data System (ADS)

    Mera, Roberto Javier

    Weather and climate in Sahelian West Africa are dominated by two major wind systems, the southwesterly West African Monsoon (WAM) and the northeasterly (Harmattan) trade winds. In addition to the agricultural benefit of the WAM, the public health sector is affected, given the relationship between the onset of moisture and the end of meningitis outbreaks. Knowledge and prediction of the moisture distribution during the boreal spring are vital to the mitigation of meningitis by providing guidance for vaccine dissemination. The goal of the present study is to (a) develop a climatology and conceptual model of the moisture regime during the boreal spring, (b) investigate the role of extra-tropical systems and convectively coupled equatorial waves (CCEWs) in the modulation of westward-moving synoptic waves and (c) determine the efficacy of a regional model as a tool for predicting moisture variability. Medical reports during 2009, along with continuous meteorological observations at Kano, Nigeria, showed that the advent of high humidity correlated with cessation of the disease. Further analysis of the 2009 boreal spring elucidated the presence of short-term moist events that modulated surface moisture on temporal scales relevant to the health sector. The May moist event (MME) provided insight into the interplay among climate anomalies, extra-tropical systems, equatorially trapped waves and westward-propagating synoptic disturbances. The synoptic disturbance initiated on 7 May and traveled westward to the coast by 12 May. There was a marked, semi-stationary moist anomaly in the precipitable water field (kg m-2) east of 10°E through late April and early May, which moved westward at the time of the MME. Further inspection revealed that a mid-latitude system may have played a role in increasing the latitudinal amplitude of the MME. CCEWs were also found to have an impact on the MME: a coherent Kelvin wave propagated through the region, providing increased monsoonal flow and heightened convection.

  1. Frequency analysis and its spatiotemporal characteristics of precipitation extreme events in China during 1951-2010

    NASA Astrophysics Data System (ADS)

    Shao, Yuehong; Wu, Junmei; Ye, Jinyin; Liu, Yonghe

    2015-08-01

    This study investigates the frequency and spatiotemporal characteristics of precipitation extremes based on annual maximum daily precipitation (AMP) data from 753 observation stations in China during the period 1951-2010. Several statistical methods, including L-moments, the Mann-Kendall test (MK test), Student's t-test and analysis of variance (F-test), are used to study different statistical properties related to the frequency and spatiotemporal characteristics of precipitation extremes. The results indicate that the AMP series of most sites have no linear trends at the 90% confidence level, but there is a distinctive decreasing trend in the Beijing-Tianjin-Tangshan region. The analysis of abrupt changes shows no significant changes at most sites, and no distinctive regional patterns among the mutation sites either. An innovation relative to previous studies is that shifts in the mean and the variance are also examined, in order to further analyze changes in strong and weak precipitation extreme events. The shift analysis shows that more attention should be paid to drought in North China and to both flood control and drought in South China, especially in regions that have no clear trend but show a significant shift in the variance. More importantly, this study conducts a comprehensive analysis of a complete set of quantile estimates and their spatiotemporal characteristics in China. The spatial distribution of quantile estimates based on the AMP series shows that values gradually increase from the Northwest to the Southeast with increasing duration and return period, while the rate of increase is smooth in arid and semiarid regions and rapid in humid regions. Frequency estimates for the 50-year return period agree with the maximum observations of the AMP series at most stations, which can provide a more quantitative and scientific basis for decision making.
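
    A minimal sketch of the Mann-Kendall trend test applied to an annual-maximum precipitation series follows. The data are synthetic and the no-ties variance formula is assumed, so this illustrates the test named above rather than reproducing the authors' procedure.

    ```python
    # Mann-Kendall trend test on a synthetic AMP series (no-ties variance).
    import numpy as np
    from scipy import stats

    def mann_kendall(x):
        """Return the MK Z statistic and two-sided p-value for series x."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        # S counts concordant minus discordant pairs over all i < j.
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance assuming no ties
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        p = 2 * (1 - stats.norm.cdf(abs(z)))
        return z, p

    rng = np.random.default_rng(0)
    amp = rng.gumbel(loc=50, scale=15, size=60)  # 60 years of synthetic AMP data
    z, p = mann_kendall(amp)
    print(f"MK Z = {z:.2f}, p = {p:.3f}")  # no trend at the 90% level if p > 0.10
    ```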

  2. Climate change impacts on extreme events in the United States: an uncertainty analysis

    EPA Science Inventory

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  3. Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor

    SciTech Connect

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L.; Baxter, J.T.; Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P.; Brosseau, D.A.

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-05 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  4. Exact meta-analysis approach for discrete data and its application to 2 × 2 tables with rare events

    PubMed Central

    Liu, Dungang; Liu, Regina Y.

    2014-01-01

    This paper proposes a general exact meta-analysis approach for synthesizing inferences from multiple studies of discrete data. The approach combines the p-value functions (also known as significance functions) associated with the exact tests from individual studies. It encompasses a broad class of exact meta-analysis methods, as it permits broad choices for the combining elements, such as the tests used in individual studies, and any parameter of interest. The approach yields statements that explicitly account for the impact of individual studies on the overall inference, in terms of efficiency/power and the type I error rate. Those statements also give rise to empirical methods for further enhancing the combined inference. Although the proposed approach is for general discrete settings, for convenience it is illustrated throughout using the setting of meta-analysis of multiple 2 × 2 tables. In the context of rare events data, such as observing few, zero, or zero-total (i.e., zero events in both arms) outcomes in binomial trials or 2 × 2 tables, most existing meta-analysis methods rely on large-sample approximations, which may yield invalid inference. The commonly used corrections to zero outcomes in rare events data, aiming to improve numerical performance, can also incur undesirable consequences. The proposed approach applies readily to any rare event setting, including even zero-total-event studies, without any artificial correction. While debates continue on whether or how zero-total-event studies should be incorporated in meta-analysis, the proposed approach has the advantage of automatically including those studies and thus making use of all available data. Through numerical studies in rare events settings, the proposed exact approach is shown to be efficient and, generally, to outperform commonly used meta-analysis methods, including the Mantel-Haenszel and Peto methods. PMID:25620825
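
    The idea of pooling exact per-study evidence without large-sample corrections can be sketched as follows. Fisher's exact test and Fisher's combination of p-values serve here as a simplified stand-in for the paper's p-value-function machinery, and the 2 × 2 counts are invented.

    ```python
    # Exact per-study p-values for 2x2 tables, combined across studies.
    import numpy as np
    from scipy import stats

    # Each study: [[events_trt, nonevents_trt], [events_ctl, nonevents_ctl]].
    # Synthetic rare-event data, including a zero-event arm (the exact test
    # needs no continuity correction for zero cells).
    studies = [
        [[1, 199], [4, 196]],
        [[0, 150], [3, 147]],
        [[2, 300], [5, 295]],
    ]

    # One-sided exact p-value per study (treatment reduces events).
    pvals = [stats.fisher_exact(t, alternative="less")[1] for t in studies]
    stat, p_combined = stats.combine_pvalues(pvals, method="fisher")
    print("per-study exact p:", np.round(pvals, 3))
    print(f"combined p = {p_combined:.4f}")
    ```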

  5. Rain-on-snow Events in Southwestern British Columbia: A Long-term Analysis of Meteorological Conditions and Snowpack Response

    NASA Astrophysics Data System (ADS)

    Trubilowicz, J. W.; Moore, D.

    2015-12-01

    Snowpack dynamics and runoff generation in coastal mountain regions are complicated by rain-on-snow (ROS) events. During major ROS events associated with warm, moist air and strong winds, turbulent heat fluxes can produce substantial melt to supplement rainfall, but previous studies suggest this may not be true for smaller, more frequent events. The internal temperature and water content of the snowpack are also expected to influence runoff generation during ROS events: a cold snowpack with no liquid water content will have the ability to store significant amounts of rainfall, whereas a 'ripe' snowpack may begin to melt and generate outflow with little rain input. However, it is not well understood how antecedent snowpack conditions and energy fluxes differ between ROS events that cause large runoff events and those that do not, in large part because major flood-producing ROS events occur infrequently, and thus are often not sampled during short-term research projects. To generate greater understanding of runoff generation over the spectrum of ROS magnitudes and frequencies, we analyzed data from Automated Snow Pillow (ASP) sites, which record hourly air temperature, precipitation and snowpack water equivalent and offer up to several decades of data at each site. We supplemented the ASP data with output from the North American Regional Reanalysis (NARR) product to support point scale snow modeling for 335 ROS event records from six ASP sites in southwestern BC from 2003 to 2013. Our analysis reconstructed the weather conditions, surface energy exchanges, internal mass and energy states of the snowpack, and generation of snow melt and water available for runoff (WAR) for each ROS event. Results indicate that WAR generation during large events is largely independent of the snowpack conditions, but for smaller events, the antecedent snow conditions play a significant role in either damping or enhancing WAR generation.

  6. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    SciTech Connect

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditional, human-factors-driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure that a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  7. Retrospective Analysis of Recent Flood Events With Persistent High Surface Runoff From Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Joshi, S.; Hakeem, K. Abdul; Raju, P. V.; Rao, V. V.; Yadav, A.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    /locations with probable flooding conditions. These thresholds were refined through an iterative process by comparison with satellite-derived flood maps of the 2013 and 2014 monsoon seasons over India. India encountered many cyclonic flood events during Oct-Dec 2013, among which Phailin, Lehar, and Madi were rated very severe cyclonic storms. The path and intensity of these cyclonic events were very well captured by the model, and areas were marked with persistent coverage of high runoff risk/flooded area. These thresholds were used to monitor floods in Jammu and Kashmir during 4-5 Sep and in Odisha during 8-9 Aug 2014. The analysis indicated the need to vary the thresholds across space to account for terrain and geographical conditions. Accordingly, a sub-basin-wise study was made based on terrain characteristics (slope, elevation) using the ASTER DEM. It was found that basins at higher elevation require higher thresholds than basins at lower elevation. The results show very promising correlation with the satellite-derived flood maps. Further refinement and optimization of the thresholds, varying them spatially to account for topographic/terrain conditions, would lead to estimation of high-runoff/flood-risk areas for both riverine and drainage-congested areas. Use of weather forecast data (NCMWRF, GEFS/R, etc.) would enhance the scope for developing early warning systems.

  8. Loss Modeling with a Data-Driven Approach in Event-Based Rainfall-Runoff Analysis

    NASA Astrophysics Data System (ADS)

    Chua, L. H. C.

    2012-04-01

    is completely impervious and the losses are small. Thus, the good agreement between the ANN and KW model results demonstrates the applicability of the ANN model in modeling the loss rate. Comparing the modeled runoff with the measured runoff for the Upper Bukit Timah catchment, it was found that the KW model was not able to reproduce the runoff from the catchment accurately due to the improper prescription of the loss rate. This is because the loss rate varies over a wide range of values in a real catchment, and using the loss rate for an average event did not provide truly representative values for the catchment. Although the same dataset was used in training the ANN model, the ANN model was able to produce hydrographs with significantly higher Nash-Sutcliffe coefficients than the KW model. This analysis demonstrates that the ANN model is better able to model the highly variable loss rate during storm events, especially if the data used for calibration are limited. ACKNOWLEDGEMENT Funding received from the DHI-NTU Water & Environment Research Centre and Education Hub is gratefully acknowledged.
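
    The Nash-Sutcliffe coefficient used above to compare the ANN and KW hydrographs is simple to compute; below is a minimal sketch with made-up runoff values.

    ```python
    # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than
    # predicting the observed mean.
    import numpy as np

    def nash_sutcliffe(observed, simulated):
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
            (observed - observed.mean()) ** 2
        )

    obs = np.array([0.2, 1.5, 4.8, 3.1, 1.2, 0.4])  # observed runoff (m3/s)
    sim = np.array([0.3, 1.2, 4.2, 3.5, 1.0, 0.5])  # modeled runoff (m3/s)
    print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")
    ```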

  9. Full Moment Tensor Analysis of Western US Explosions, Earthquakes, Collapses, and Volcanic Events Using a Regional Waveform Inversion

    NASA Astrophysics Data System (ADS)

    Ford, S. R.; Dreger, D. S.; Walter, W. R.

    2006-12-01

    Seismic moment tensor analysis at regional distances commonly involves solving for the deviatoric moment tensor and decomposing it to characterize the tectonic earthquake source. The full seismic moment tensor solution can also recover the isotropic component of the seismic source, which is theoretically dominant in explosions and collapses, and present in volcanic events. Analysis of events with demonstrably significant isotropic energy can aid in understanding the source processes of volcanic and geothermal seismic events and the monitoring of nuclear explosions. Using a regional time-domain waveform inversion for the complete moment tensor we calculate the deviatoric and isotropic source components for several explosions at the Nevada Test Site (NTS) and earthquakes, collapses, and volcanic events in the surrounding region of the NTS (Western US). The events separate into specific populations according to their deviation from a pure double-couple and ratio of isotropic to deviatoric energy. The separation allows for anomalous event identification and discrimination of explosions, earthquakes, and collapses. Analysis of the source principal axes can characterize the regional stress field, and tectonic release due to explosions. Error in the moment tensor solutions and source parameters is also calculated. We investigate the sensitivity of the moment tensor solutions to Green's functions calculated with imperfect Earth models, inaccurate event locations, and data with a low signal-to-noise ratio. We also test the performance of the method under a range of recording conditions from excellent azimuthal coverage to cases of sparse coverage as might be expected for smaller events. This analysis will be used to determine the magnitude range where well-constrained solutions can be obtained.
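
    The isotropic/deviatoric split at the heart of such source-type analysis reduces to simple linear algebra; the sketch below uses an arbitrary example tensor, not data from the study.

    ```python
    # Decompose a full moment tensor into isotropic and deviatoric parts.
    import numpy as np

    M = np.array([[1.8, 0.2, 0.1],
                  [0.2, 1.5, 0.3],
                  [0.1, 0.3, 1.7]]) * 1e15  # hypothetical explosion-like source (N m)

    M_iso = (np.trace(M) / 3.0) * np.eye(3)  # isotropic part (volume change)
    M_dev = M - M_iso                        # deviatoric (trace-free) remainder

    # A simple size measure for each component, e.g. for population plots of
    # the isotropic-to-deviatoric ratio described above.
    norm_iso = np.linalg.norm(M_iso)
    norm_dev = np.linalg.norm(M_dev)
    print(f"isotropic fraction = {norm_iso / (norm_iso + norm_dev):.2f}")
    ```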

  10. SYSTEMS SAFETY ANALYSIS FOR FIRE EVENTS ASSOCIATED WITH THE ECRB CROSS DRIFT

    SciTech Connect

    R. J. Garrett

    2001-12-12

    The purpose of this analysis is to systematically identify and evaluate fire hazards related to the Yucca Mountain Site Characterization Project (YMP) Enhanced Characterization of the Repository Block (ECRB) East-West Cross Drift (commonly referred to as the ECRB Cross-Drift). This analysis builds upon prior Exploratory Studies Facility (ESF) System Safety Analyses and incorporates Topopah Springs (TS) Main Drift fire scenarios and ECRB Cross-Drift fire scenarios. Accident scenarios involving the fires in the Main Drift and the ECRB Cross-Drift were previously evaluated in ''Topopah Springs Main Drift System Safety Analysis'' (CRWMS M&O 1995) and the ''Yucca Mountain Site Characterization Project East-West Drift System Safety Analysis'' (CRWMS M&O 1998). In addition to listing required mitigation/control features, this analysis identifies the potential need for procedures and training as part of defense-in-depth mitigation/control features. The inclusion of this information in the System Safety Analysis (SSA) is intended to assist the organization(s) (e.g., Construction, Environmental Safety and Health, Design) responsible for these aspects of the ECRB Cross-Drift in developing mitigation/control features for fire events, including Emergency Refuge Station(s). This SSA was prepared, in part, in response to Condition/Issue Identification and Reporting/Resolution System (CIRS) item 1966. The SSA is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach is used which incorporates operating experiences and recommendations from vendors, the constructor and the operating contractor. The risk assessment in this analysis characterizes the scenarios associated with fires in terms of relative risk and includes recommendations for mitigating all identified hazards. The priority for recommending and implementing mitigation control features is: (1) Incorporate measures

  11. Using Wavelet Analysis To Assist in Identification of Significant Events in Molecular Dynamics Simulations.

    PubMed

    Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E

    2016-07-25

    Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA to well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions but also, by changing the maximum time scale of the analysis, provide a more complete characterization of those motions. This allows motions of different time scales to be identified or ignored as desired. PMID:27286268
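
    As an illustration of the technique (not the authors' code), the sketch below runs a continuous wavelet transform on a toy trajectory observable with PyWavelets; the time step and signal are invented.

    ```python
    # CWT of a synthetic MD-like observable: slow drift plus a fast transient.
    import numpy as np
    import pywt

    dt = 0.1  # ns per frame (hypothetical)
    t = np.arange(0, 200, dt)
    signal = np.sin(2 * np.pi * 0.01 * t)               # slow background motion
    burst = (t > 80) & (t < 120)                        # transient fast motion
    signal[burst] += np.sin(2 * np.pi * 0.5 * t[burst])

    scales = np.arange(1, 128)
    coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=dt)
    power = np.abs(coeffs) ** 2

    # The (scale, time) cell with maximum power locates the dominant fast event.
    k, j = np.unravel_index(power.argmax(), power.shape)
    print(f"strongest motion near t = {t[j]:.1f} ns at ~{freqs[k]:.2f} cycles/ns")
    ```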

  12. Spatio-Temporal Information Analysis of Event-Related BOLD Responses

    PubMed Central

    Alpert, Galit Fuhrmann; Handwerker, Dan; Sun, Felice T.; D'Esposito, Mark; Knight, Robert T.

    2009-01-01

    A new approach for analysis of event related fMRI (BOLD) signals is proposed. The technique is based on measures from information theory and is used both for spatial localization of task related activity, as well as for extracting temporal information regarding the task dependent propagation of activation across different brain regions. This approach enables whole brain visualization of voxels (areas) most involved in coding of a specific task condition, the time at which they are most informative about the condition, as well as their average amplitude at that preferred time. The approach does not require prior assumptions about the shape of the hemodynamic response function (HRF), nor about linear relations between BOLD response and presented stimuli (or task conditions). We show that relative delays between different brain regions can also be computed without prior knowledge of the experimental design, suggesting a general method that could be applied for analysis of differential time delays that occur during natural, uncontrolled conditions. Here we analyze BOLD signals recorded during performance of a motor learning task. We show that during motor learning, the BOLD response of unimodal motor cortical areas precedes the response in higher-order multimodal association areas, including posterior parietal cortex. Brain areas found to be associated with reduced activity during motor learning, predominantly in prefrontal brain regions, are informative about the task typically at significantly later times. PMID:17188515
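
    The core quantity, mutual information between a binned voxel response and the task condition as a function of time, can be sketched on synthetic data as follows; the toy response model and the equal-width binning are assumptions for illustration.

    ```python
    # Mutual information between task condition and BOLD amplitude per lag.
    import numpy as np
    from sklearn.metrics import mutual_info_score

    rng = np.random.default_rng(1)
    n_trials, n_lags = 200, 5
    condition = rng.integers(0, 2, n_trials)        # two task conditions
    bold = rng.normal(0, 1, (n_trials, n_lags))     # toy single-voxel responses
    bold[:, 2] += 1.5 * condition                   # informative only at lag 2

    def mi_at_lag(x, labels, bins=8):
        binned = np.digitize(x, np.histogram_bin_edges(x, bins=bins))
        return mutual_info_score(labels, binned)

    mi = [mi_at_lag(bold[:, k], condition) for k in range(n_lags)]
    print("MI per lag:", np.round(mi, 3))  # peaks at the informative lag
    ```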

  13. 2005 Caribbean mass coral bleaching event: A sea surface temperature empirical orthogonal teleconnection analysis

    NASA Astrophysics Data System (ADS)

    Simonti, Alicia L.; Eastman, J. Ronald

    2010-11-01

    This study examined the effects of climate teleconnections on the massive Caribbean coral bleaching and mortality event of 2005. A relatively new analytical procedure known as empirical orthogonal teleconnection (EOT) analysis, based on a 26 year monthly time series of observed sea surface temperature (SST), was employed. Multiple regression analysis was then utilized to determine the relative teleconnection contributions to SST variability in the southern Caribbean. The results indicate that three independent climate teleconnections had significant impact on southern Caribbean anomalies in SST and that their interaction was a major contributor to the anomalously high temperatures in 2005. The primary and approximately equal contributors were EOT-5 and EOT-2, which correlate most strongly with the tropical North Atlantic (TNA) and Atlantic multidecadal oscillation (AMO) climate indices, respectively. The third, EOT-9, was most strongly related to the Atlantic meridional mode. However, although statistically significant, the magnitude of its contribution to southern Caribbean variability was small. While there is debate over the degree to which the recent AMO pattern represents natural variability or global ocean warming, the results presented here indicate that natural variability played a strong role in the 2005 coral bleaching conditions. They also argue for a redefinition of the geography of TNA variability.

  14. Parametric studies of penetration events: a design and analysis of experiments approach.

    SciTech Connect

    Chiesa, Michael L.; Marin, Esteban B.; Booker, Paul M.

    2005-02-01

    A numerical screening study of the interaction between a penetrator and a geological target with a preformed hole has been carried out to identify the main parameters affecting the penetration event. The planning of the numerical experiment was based on the orthogonal array OA(18,7,3,2), which allows 18 simulation runs with 7 parameters at 3 levels each. The strength-2 property of the array also allows for two-factor interaction studies. The seven parameters chosen for this study are: penetrator offset, hole diameter, hole taper, vertical and horizontal velocity of the penetrator, angle of attack of the penetrator and target material. The analysis of the simulation results has been based on main effects plots and analysis of variance (ANOVA), and it has been performed using three metrics: the maximum values of the penetration depth, penetrator deceleration and plastic strain in the penetrator case. This screening study shows that target material has a major influence on penetration depth and penetrator deceleration, while penetrator offset has the strongest effect on the maximum plastic strain.

  15. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    SciTech Connect

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.

  16. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

    Geostationary Coastal and Air pollution Events (GEO-CAPE) is a NASA decadal survey mission to be designed to provide surface reflectance at high spectral, spatial, and temporal resolutions from a geostationary orbit, as necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to global and regional NOx emissions and improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from the GEO-CAPE OSSE framework for seven observation scenarios and three instrument systems.

  17. ANTARES: The Arizona-NOAO Temporal Analysis and Response to Events System

    NASA Astrophysics Data System (ADS)

    Matheson, T.; Saha, A.; Snodgrass, R.; Kececioglu, J.

    The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. The goal is to build the software infrastructure necessary to process and filter alerts produced by time-domain surveys, with the ultimate source of such alerts being the Large Synoptic Survey Telescope (LSST). ANTARES will add value to alerts by annotating them with information from external sources such as previous surveys from across the electromagnetic spectrum. In addition, the temporal history of annotated alerts will provide further annotation for analysis. These alerts will go through a cascade of filters to select interesting candidates. For the prototype, 'interesting' is defined as the rarest or most unusual alert, but future systems will accommodate multiple filtering goals. The system is designed to be flexible, allowing users to access the stream at multiple points throughout the process, and to insert custom filters where necessary. We will describe the basic architecture of ANTARES and the principles that will guide development and implementation.

  18. Microseismic event location using the Double-difference technique for multiplet analysis

    NASA Astrophysics Data System (ADS)

    Castellanos Jurado, Fernando Rafael

    Microseismic event location provides a plethora of information about underground processes such as hydraulic fracturing, steam injection, mining, and volcanic activity. Nevertheless, accuracy is limited by acquisition geometry and errors in the velocity model and time picks. Although microseismic events can happen anywhere, they tend to recur in the same zone. This thesis describes a post-processing technique to relocate events originating in the same source region based on the double-difference method. This technique includes a cross-correlation procedure to detect similar events and correct time-picking errors. The performance of the algorithm is tested on synthetic data and a set of microseismic events recorded in a mine. The method significantly improves locations of similar events, compared to a conventional grid-search algorithm, revealing seismicity patterns likely associated with routine mining operations. The method also includes plots used for quality control of time picking and event location, facilitating geological interpretations.
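
    The differential arrival times that drive a double-difference relocation are typically measured by waveform cross-correlation; below is a minimal sketch on synthetic multiplet traces, with the sampling rate and noise level invented.

    ```python
    # Relative arrival-time shift between two similar waveforms via
    # cross-correlation.
    import numpy as np
    from scipy import signal

    fs = 500.0                        # sampling rate (Hz), hypothetical
    t = np.arange(0, 2, 1 / fs)
    wavelet = np.exp(-((t - 0.5) ** 2) / 0.002) * np.sin(2 * np.pi * 25 * t)

    true_shift = 0.024                # seconds; the second event arrives later
    shifted = np.interp(t - true_shift, t, wavelet)

    rng = np.random.default_rng(2)
    tr1 = wavelet + 0.05 * rng.standard_normal(t.size)
    tr2 = shifted + 0.05 * rng.standard_normal(t.size)

    cc = signal.correlate(tr2, tr1, mode="full")
    lag = (cc.argmax() - (t.size - 1)) / fs   # lag of tr2 relative to tr1 (s)
    print(f"measured shift = {lag * 1e3:.1f} ms (true {true_shift * 1e3:.1f} ms)")
    ```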

  19. Regularized Deterministic Annealing Hidden Markov Models for Identification and Analysis of Seismic and Aseismic Events.

    NASA Astrophysics Data System (ADS)

    Granat, R. A.; Clayton, R.; Kedar, S.; Kaneko, Y.

    2003-12-01

    We employ a robust hidden Markov model (HMM) based technique to perform statistical pattern analysis of suspected seismic and aseismic events in the poorly explored period band of minutes to hours. The technique allows us to classify known events and provides a statistical basis for finding and cataloging similar events represented elsewhere in the observations. In this work, we focus on data collected by the Southern California TriNet system. The hidden Markov model (HMM) approach assumes that the observed data have been generated by an unobservable dynamical statistical process. The process is of a particular form such that each observation is coincident with the system being in a particular discrete state. The dynamics of the model are constructed so that the next state is directly dependent only on the current state -- it is a first-order Markov process. The model is completely described by a set of parameters: the initial state probabilities, the first-order Markov chain state-to-state transition probabilities, and the probability distribution of observable outputs associated with each state. Application of the model to data involves optimizing these model parameters with respect to some function of the observations, typically the likelihood of the observations given the model. Our work focused on the fact that this objective function has a number of local maxima that is exponential in the model size (the number of states). This means that not only is it very difficult to discover the global maximum, but also that results can vary widely between applications of the model. For some domains which employ HMMs for such purposes, such as speech processing, sufficient a priori information about the system is available to avoid this problem. However, for seismic data in general such a priori information is not available. Our approach involves analytical location of sub-optimal local maxima; once the locations of these maxima have been found, then we can employ a
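
    For orientation, a minimal forward-algorithm likelihood for a small discrete-output HMM is sketched below; the parameter values are arbitrary, and this shows only the basic model class, not the regularized deterministic-annealing training the abstract describes.

    ```python
    # Scaled forward algorithm: log P(observations | HMM).
    import numpy as np

    pi = np.array([0.6, 0.4])         # initial state probabilities
    A = np.array([[0.9, 0.1],         # state-to-state transition probabilities
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.2, 0.1],    # P(symbol | state)
                  [0.1, 0.3, 0.6]])

    obs = [0, 0, 1, 2, 2, 1, 0]       # observed symbol sequence

    def forward_loglik(pi, A, B, obs):
        alpha = pi * B[:, obs[0]]
        c = alpha.sum()
        log_lik, alpha = np.log(c), alpha / c
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            c = alpha.sum()           # rescale to avoid numerical underflow
            log_lik += np.log(c)
            alpha /= c
        return log_lik

    print(f"log P(obs | model) = {forward_loglik(pi, A, B, obs):.3f}")
    ```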

  20. Analysis of Scaling Parameters of Event Magnitudes by Fluid Injections in Reservoirs

    NASA Astrophysics Data System (ADS)

    Dinske, Carsten; Krüger, Oliver; Shapiro, Serge

    2014-05-01

    We continue to elaborate scaling parameters of observed frequency-magnitude distributions of injection-induced seismicity. In addition to pumped fluid mass, b-value and seismogenic index (Shapiro et al., 2010, Dinske and Shapiro, 2013), one more scaling was recognised in the analysis of the induced event magnitudes. A frequently observed under-representation of events with larger magnitudes in comparison with the Gutenberg-Richter relation is explained by the geometry and the dimensions of the hydraulically stimulated rock volume (Shapiro et al., 2011, 2013). This under-representation, however, introduces a bias in b-value estimations, which should then be considered as an apparent and transient b-value depending on the size of the perturbed rock volume. We study in detail in which way the seismogenic index estimate is affected by the apparent b-value. For this purpose, we compare b-value and seismogenic index estimates using two different approaches. First, we perform standard Gutenberg-Richter power-law fitting and second, we apply frequency-magnitude lower-bound probability fitting as proposed by Shapiro et al. (2013). The latter takes into account the finite size of the perturbed rock volume. Our results reveal that the smaller the perturbed rock volume, the larger the deviations between the two sets of derived parameters. This means that the magnitude statistics of the induced events are most affected for low injection volumes and/or short injection times. At sufficiently large stimulated volumes both fitting approaches provide comparable b-value and seismogenic index estimates. In particular, the b-value is then in the range of b-values universally obtained for tectonic earthquakes (i.e., 0.8 - 1.2). Based on our findings, we introduce the specific magnitude, which is a seismotectonic characteristic for a reservoir location. Defined as the ratio of seismogenic index and b-value, the specific magnitude is found to be a magnitude scaling parameter which is
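
    A standard maximum-likelihood b-value estimate (the Aki/Utsu estimator) is sketched below on a synthetic binned catalog; this is the conventional Gutenberg-Richter fit, not the finite-volume lower-bound fitting of Shapiro et al. (2013).

    ```python
    # Aki/Utsu maximum-likelihood b-value from binned magnitudes above Mc.
    import numpy as np

    rng = np.random.default_rng(3)
    b_true, mc, dm = 1.0, 0.5, 0.1            # dm = magnitude bin width
    beta = b_true * np.log(10)
    # Continuous magnitudes above the lower bin edge, then binned catalog-style.
    mags = (mc - dm / 2) + rng.exponential(1 / beta, size=2000)
    mags = np.round(mags / dm) * dm

    # Utsu's bin-corrected form of Aki's estimator.
    b_hat = np.log10(np.e) / (mags.mean() - (mc - dm / 2))
    print(f"estimated b = {b_hat:.2f} (true {b_true})")
    ```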

  1. Integrated survival analysis using an event-time approach in a Bayesian framework.

    PubMed

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field, where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, including interval-censored times, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
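
    Survival under a piece-wise constant hazard, the functional form used in the plover analysis, reduces to summing hazard-weighted exposure over age intervals; the daily hazards below are invented, not the fitted values from the paper.

    ```python
    # Survival function S(t) = exp(-cumulative hazard) for a piece-wise
    # constant hazard over age intervals.
    import numpy as np

    breaks = np.array([0.0, 5.0, 20.0, 60.0])  # age interval edges (days)
    hazard = np.array([0.09, 0.03, 0.01])      # per-day hazard in each interval

    def survival(t, breaks, hazard):
        # Time spent in each interval before age t.
        exposure = np.clip(t - breaks[:-1], 0.0, np.diff(breaks))
        return np.exp(-(hazard * exposure).sum())

    for day in (3, 5, 10, 30):
        print(f"S({day:>2} d) = {survival(day, breaks, hazard):.3f}")
    ```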

  2. 'HESPERIA' HORIZON 2020 project: High Energy Solar Particle Events foRecastIng and Analysis

    NASA Astrophysics Data System (ADS)

    Malandraki, Olga; Klein, Karl-Ludwig; Vainio, Rami; Agueda, Neus; Nunez, Marlon; Heber, Bernd; Buetikofer, Rolf; Sarlanis, Christos; Crosby, Norma; Bindi, Veronica; Murphy, Ronald; Tyka, Allan J.; Rodriguez, Juan

    2016-04-01

    Solar energetic particles (SEPs) are of prime interest for fundamental astrophysics. However, due to their high energies they are a space weather concern for technology in space as well as for human space exploration, calling for reliable tools with predictive capabilities. The two-year EU HORIZON 2020 project HESPERIA (High Energy Solar Particle Events foRecastIng and Analysis, http://www.hesperia-space.eu/) will produce two novel operational SEP forecasting tools based upon proven concepts (UMASEP, REleASE). At the same time, the project will advance our understanding of the physical mechanisms that result in high-energy SEP events through the systematic exploitation of the high-energy gamma-ray observations of the FERMI mission and other novel published datasets (PAMELA, AMS), together with in situ SEP measurements near 1 AU. By using multi-frequency observations and performing simulations, the project will address the chain of processes from particle acceleration in the corona, through particle transport in the magnetically complex corona and interplanetary space, to their detection near 1 AU. Furthermore, HESPERIA will explore the possibility of incorporating the derived results into future innovative space weather services. Publicly available software to invert neutron monitor observations of relativistic SEPs to physical parameters, giving information on the high-energy processes occurring at or near the Sun during solar eruptions, will be provided for the first time. The results of this inversion software will complement the space-borne measurements at adjacent higher energies. In order to achieve these goals, HESPERIA will exploit already existing large datasets that are stored in databases built under the EU FP7 projects NMDB and SEPServer. The structure of the HESPERIA project, its main objectives and forecasting operational tools, as well as the added value to SEP research, will be presented and discussed. Acknowledgement: This project has received funding from the

  3. Metamizole-Associated Adverse Events: A Systematic Review and Meta-Analysis

    PubMed Central

    Fässler, Margrit; Blozik, Eva; Linde, Klaus; Jüni, Peter; Reichenbach, Stephan; Scherer, Martin

    2015-01-01

    Background Metamizole is used to treat pain in many parts of the world. Information on the safety profile of metamizole is scarce; no conclusive summary of the literature exists. Objective To determine whether metamizole is clinically safe compared to placebo and other analgesics. Methods We searched CENTRAL, MEDLINE, EMBASE, CINAHL, and several clinical trial registries. We screened the reference lists of included trials and previous systematic reviews. We included randomized controlled trials that compared the effects of metamizole, administered to adults in any form and for any indication, to other analgesics or to placebo. Two authors extracted data regarding trial design and size, indications for pain medication, patient characteristics, treatment regimens, and methodological characteristics. Adverse events (AEs), serious adverse events (SAEs), and dropouts were assessed. We conducted separate meta-analyses for each metamizole comparator, using standard inverse-variance random effects meta-analysis to pool the estimates across trials, reported as risk ratios (RRs). We calculated the DerSimonian and Laird variance estimate τ² to measure heterogeneity between trials. The pre-specified primary end point was any AE during the trial period. Results Of the 696 potentially eligible trials, 79 trials including almost 4000 patients with short-term metamizole use of less than two weeks met our inclusion criteria. Fewer AEs were reported for metamizole compared to opioids, RR = 0.79 (confidence interval 0.79 to 0.96). We found no differences between metamizole and placebo, paracetamol and NSAIDs. Only a few SAEs were reported, with no difference between metamizole and other analgesics. No agranulocytosis or deaths were reported. Our results were limited by the mediocre overall quality of the reports. Conclusion For short-term use in the hospital setting, metamizole seems to be a safe choice when compared to other widely used analgesics. High-quality, adequately sized
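
    The random-effects pooling behind such a meta-analysis can be sketched with the DerSimonian-Laird estimator; the 2 × 2 counts below are invented, not the trial data of the review.

    ```python
    # DerSimonian-Laird random-effects pooling of log risk ratios.
    import numpy as np

    # (events, n) per arm for a few hypothetical trials.
    trt = np.array([[4, 120], [2, 80], [6, 200]], dtype=float)
    ctl = np.array([[7, 118], [4, 82], [9, 195]], dtype=float)

    log_rr = np.log((trt[:, 0] / trt[:, 1]) / (ctl[:, 0] / ctl[:, 1]))
    var = 1 / trt[:, 0] - 1 / trt[:, 1] + 1 / ctl[:, 0] - 1 / ctl[:, 1]

    w = 1 / var                                   # fixed-effect weights
    q = np.sum(w * (log_rr - np.sum(w * log_rr) / w.sum()) ** 2)
    c = w.sum() - np.sum(w**2) / w.sum()
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)  # between-trial variance

    w_re = 1 / (var + tau2)                       # random-effects weights
    pooled = np.sum(w_re * log_rr) / w_re.sum()
    se = np.sqrt(1 / w_re.sum())
    lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
    print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```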

  4. Integrated survival analysis using an event-time approach in a Bayesian framework

    PubMed Central

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field, where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, including interval-censored times, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  5. Adverse events in coronary artery bypass graft (CABG) trials: a systematic review and analysis

    PubMed Central

    Nalysnyk, L; Fahrbach, K; Reynolds, M W; Zhao, S Z; Ross, S

    2003-01-01

    Objectives: To quantify the incidence of major adverse events (AEs) occurring in hospital or within 30 days after surgery in patients undergoing coronary artery bypass graft (CABG) surgery and to identify risk factors for these AEs. Methods: Systematic review and analysis of studies published in English since 1990. Studies of isolated standard CABG reporting postoperative incidence of myocardial infarction (MI), stroke, gastrointestinal bleeding, renal failure, or death in hospital or within 30 days were eligible for inclusion. Incidence of these events was calculated overall and for selected patient groups defined by all elective CABG versus mixed (some non-elective); mean ejection fraction ⩽ 50% versus > 50%; mean age ⩽ 60 versus > 60 years; primary CABG versus some reoperations; randomised controlled trials versus cohort studies; and single centre versus multicentre studies. Odds ratios of selected AEs were computed according to group risk factors. Results: 176 studies (205 717 patients) met all inclusion criteria. The average incidence of major AEs occurring in-hospital was death (1.7%); non-fatal MI (2.4%); non-fatal stroke (1.3%); gastrointestinal bleeding (1.5%); and renal failure (0.8%). Thirty day mortality was 2.1%. Meta-analyses show that age > 70, female sex, low ejection fraction, history of stroke, MI, or heart surgery, and presence of diabetes or hypertension are all associated with increased 30 day mortality after CABG. Conclusion: The incidence of major AEs in patients after CABG varies widely across studies and patient populations, and this heterogeneity must be controlled when using the literature to benchmark safety. PMID:12807853

  6. Analysis of the observed and forecast rainfall intensity structure in a precipitation event

    NASA Astrophysics Data System (ADS)

    Bech, Joan; Molinié, Gilles; Karakasidis, Theodoros; Anquentin, Sandrine; Creutin, Jean Dominique; Pinty, Jean-Pierre; Escobar, Juan

    2014-05-01

    During the last decades a number of studies have been devoted to examining the temporal and spatial structure of the precipitation field, given that rainfall exhibits large variability at all scales (see for example Ceresetti et al. 2011, 2012). The objective of this study is to examine the rainfall field structure at high temporal (15 minute) and spatial (1 km) resolution. We focus on rainfall properties such as intermittency, using the auto-correlation of precipitation time series to assess whether it can be modelled assuming fractal behaviour and considering different scales. Based on the results and methodology of previous studies applied to observational precipitation data such as raingauge, weather radar and disdrometer observations (see for example Molinié et al., 2011, 2013), here we employ high-resolution numerical forecast data. In particular, our approach uses a transitive covariogram, given the limited number of samples available in single precipitation events. Precipitation forecasts are derived at 15 minute intervals from 1-km grid length nested simulations of the non-hydrostatic mesoscale atmospheric model of the French research community, Meso-NH, using AROME-WestMed model data as initial and boundary conditions. The analysis also considers existing data available in the HyMeX (HYdrological cycle in the Mediterranean EXperiment) database. Results are presented for a precipitation event that took place in the Rhône Valley (France) in November 2011. This case allows us to study, with the proposed methodology, the effect of a number of factors (different orography along the Rhône Valley, turbulence, microphysical processes, etc.) on the observed and simulated precipitation field. References Ceresetti D., E. Ursu, J. Carreau, S. Anquetin, J. D. Creutin, L. Gardes, S. Girard, and G. Molinié, 2012: Evaluation of classical spatial-analysis schemes of extreme rainfall. Natural Hazards and Earth System Sciences, 12, 3229-3240, http

  7. A new strategy for sensitivity analysis when modelling extreme events in the geosciences

    NASA Astrophysics Data System (ADS)

    Pianosi, Francesca; Wagener, Thorsten

    2014-05-01

    Natural hazard models - used to predict and evaluate extreme events like prolonged droughts, floods, windstorms, etc. - are affected by unavoidable and potentially large uncertainty. Uncertainty sources are manifold, including simplifying assumptions in the model structure (e.g. coarse spatial resolution), uncertain parameter values, measurement errors, etc. Global Sensitivity Analysis (GSA) can be used to assess the relative contributions of these different sources to the uncertainty in the model predictions. By providing insights into the model behavior and potential for simplification, GSA indicates where further data collection and research is needed or would be beneficial, and enhances the credibility of the modelling results. In this work we present a novel Regional-Global approach for Sensitivity Analysis. The method is "global" in the model inputs and "regional" in the output; that is, it considers variations of the uncertain inputs across their entire feasibility range but can be focused on their effects on a specific region of the model response, e.g. extreme values. The method is therefore especially promising for natural hazard applications where the focus is on the effect of uncertain inputs on a specific range of values of the model output. The main underlying idea is to measure sensitivity by the distance between the unconditional distribution of the model output (i.e. when all input factors vary) and the conditional distribution when one of the input factors is fixed. Such sensitivity measures can be computed either over the entire range of the output distribution or tuned to consider only a sub-range, for instance the tail of the distribution. We use several natural hazard examples to demonstrate the approach and compare it to other widely applied GSA methods like Sobol and Regional Sensitivity Analysis.
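
    The "global in the inputs, regional in the output" idea can be sketched by comparing unconditional and conditional output distributions with a Kolmogorov-Smirnov distance; the two-input model below is made up, and conditioning at a single value is a simplification of the full method.

    ```python
    # Distribution-based sensitivity, optionally restricted to the upper tail.
    import numpy as np
    from scipy import stats

    def model(x1, x2):
        return np.exp(1.5 * x1) + 0.3 * x2   # x1 dominates the extremes

    rng = np.random.default_rng(4)
    n = 5000
    x1, x2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)
    y_all = model(x1, x2)

    # Condition on one input at a time (fixed here at its midpoint, 0.5).
    y_fix1 = model(np.full(n, 0.5), x2)
    y_fix2 = model(x1, np.full(n, 0.5))
    print(f"S(x1) = {stats.ks_2samp(y_all, y_fix1).statistic:.2f}")
    print(f"S(x2) = {stats.ks_2samp(y_all, y_fix2).statistic:.2f}")

    # "Regional" variant: compare only the upper tails of the two samples.
    def tail(y):
        return y[y > np.quantile(y, 0.9)]

    s_tail = stats.ks_2samp(tail(y_all), tail(y_fix1)).statistic
    print(f"upper-tail S(x1) = {s_tail:.2f}")
    ```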

  8. Copy number and targeted mutational analysis reveals novel somatic events in metastatic prostate tumors.

    PubMed

    Robbins, Christiane M; Tembe, Waibov A; Baker, Angela; Sinari, Shripad; Moses, Tracy Y; Beckstrom-Sternberg, Stephen; Beckstrom-Sternberg, James; Barrett, Michael; Long, James; Chinnaiyan, Arul; Lowey, James; Suh, Edward; Pearson, John V; Craig, David W; Agus, David B; Pienta, Kenneth J; Carpten, John D

    2011-01-01

    Advanced prostate cancer can progress to systemic metastatic tumors, which are generally androgen insensitive and ultimately lethal. Here, we report a comprehensive genomic survey for somatic events in systemic metastatic prostate tumors using both high-resolution copy number analysis and targeted mutational survey of 3508 exons from 577 cancer-related genes using next generation sequencing. Focal homozygous deletions were detected at 8p22, 10q23.31, 13q13.1, 13q14.11, and 13q14.12. Key genes mapping within these deleted regions include PTEN, BRCA2, C13ORF15, and SIAH3. Focal high-level amplifications were detected at 5p13.2-p12, 14q21.1, 7q22.1, and Xq12. Key amplified genes mapping within these regions include SKP2, FOXA1, and AR. Furthermore, targeted mutational analysis of normal-tumor pairs has identified somatic mutations in genes known to be associated with prostate cancer including AR and TP53, but has also revealed novel somatic point mutations in genes including MTOR, BRCA2, ARHGEF12, and CHD5. Finally, in one patient where multiple independent metastatic tumors were available, we show common and divergent somatic alterations that occur at both the copy number and point mutation level, supporting a model for a common clonal progenitor with metastatic tumor-specific divergence. Our study represents a deep genomic analysis of advanced metastatic prostate tumors and has revealed candidate somatic alterations, possibly contributing to lethal prostate cancer. PMID:21147910

  9. Copy number and targeted mutational analysis reveals novel somatic events in metastatic prostate tumors

    PubMed Central

    Robbins, Christiane M.; Tembe, Waibov A.; Baker, Angela; Sinari, Shripad; Moses, Tracy Y.; Beckstrom-Sternberg, Stephen; Beckstrom-Sternberg, James; Barrett, Michael; Long, James; Chinnaiyan, Arul; Lowey, James; Suh, Edward; Pearson, John V.; Craig, David W.; Agus, David B.; Pienta, Kenneth J.; Carpten, John D.

    2011-01-01

    Advanced prostate cancer can progress to systemic metastatic tumors, which are generally androgen insensitive and ultimately lethal. Here, we report a comprehensive genomic survey for somatic events in systemic metastatic prostate tumors using both high-resolution copy number analysis and targeted mutational survey of 3508 exons from 577 cancer-related genes using next generation sequencing. Focal homozygous deletions were detected at 8p22, 10q23.31, 13q13.1, 13q14.11, and 13q14.12. Key genes mapping within these deleted regions include PTEN, BRCA2, C13ORF15, and SIAH3. Focal high-level amplifications were detected at 5p13.2-p12, 14q21.1, 7q22.1, and Xq12. Key amplified genes mapping within these regions include SKP2, FOXA1, and AR. Furthermore, targeted mutational analysis of normal-tumor pairs has identified somatic mutations in genes known to be associated with prostate cancer including AR and TP53, but has also revealed novel somatic point mutations in genes including MTOR, BRCA2, ARHGEF12, and CHD5. Finally, in one patient where multiple independent metastatic tumors were available, we show common and divergent somatic alterations that occur at both the copy number and point mutation level, supporting a model for a common clonal progenitor with metastatic tumor-specific divergence. Our study represents a deep genomic analysis of advanced metastatic prostate tumors and has revealed candidate somatic alterations, possibly contributing to lethal prostate cancer. PMID:21147910

  10. An Internal Evaluation of the National FFA Agricultural Mechanics Career Development Event through Analysis of Individual and Team Scores from 1996-2006

    ERIC Educational Resources Information Center

    Franklin, Edward A.; Armbruster, James

    2012-01-01

    The purpose of this study was to conduct an internal evaluation of the National FFA Agricultural Mechanics Career Development Event (CDE) through analysis of individual and team scores from 1996-2006. Data were analyzed by overall and sub-event area scores for individual contestants and team events. To facilitate the analysis process, scores were…

  11. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

    Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted at the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems that cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second of each other, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. The importance of resolving the problem in case 1 is connected with correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can be approached to some degree with cepstral methods, the second case can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is a blind source separation method that we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as with signals from strong teleseismic events (the Sumatra, April 2012, Mw=8.6, and Tohoku, March 2011, Mw=9.0 earthquakes). The data was recorded by seismic arrays of the
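
    A toy FastICA unmixing of two overlapping synthetic signals illustrates the BSS idea; this uses the scikit-learn implementation, the sources, mixing matrix and noise are invented, and it is not the processing applied to the actual array data.

    ```python
    # Recover two synthetic sources from two mixed "channels" with FastICA.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(5)
    t = np.linspace(0, 8, 2000)
    s1 = np.sign(np.sin(3 * t))             # source 1: square-wave-like signal
    s2 = np.sin(5 * t) * np.exp(-0.2 * t)   # source 2: decaying oscillation
    S = np.c_[s1, s2] + 0.02 * rng.standard_normal((t.size, 2))

    A = np.array([[1.0, 0.6], [0.4, 1.0]])  # unknown mixing matrix
    X = S @ A.T                             # two observed mixtures

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)            # recovered sources (up to scale/order)
    print("correlation of recovered vs true sources:")
    print(np.round(np.corrcoef(S_est.T, S.T)[:2, 2:], 2))
    ```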

  12. Northern San Andreas Earthquake Recurrence: Rupture lengths, Correlations and Constrained OxCal Analysis of Event Ages

    NASA Astrophysics Data System (ADS)

    Morey, A. E.; Goldfinger, C.; Erhardt, M.; Nelson, C. H.; Johnson, J. E.; Gutierrez-Pastor, J.

    2005-12-01

    We are using multiple proxies, including XRF analysis, to determine hemipelagic thickness between turbidite events recorded in cores along the offshore northern San Andreas margin. Inter-event times are calculated from these improved estimates of sediment thickness and regression-determined sedimentation rates, and used along with known stratigraphic information to constrain calibrated radiocarbon age ranges using Bayesian statistical methods within the program OxCal. OxCal can also be used to combine multiple ages for the same event. Multiple ages are given "credit" where age ranges overlap, resulting in reduced 1- or 2-sigma age ranges compared to averaging peak ages and propagating errors. These methods reduce calendar age variability of events along strike that are thought to correlate. We tested three methods of estimating calendar ages, using the most recent events in a Noyo Canyon core. These methods are: 1. unconstrained radiocarbon age calibration, 2. age determination using known dates and inter-event time calculated from hemipelagic thickness and the regression-determined sedimentation rate, and 3. (preferred method) use OxCal's sequence option to calibrate and constrain radiocarbon ages given all available stratigraphic information, including date of collection, historical or geological datums, inter-event times and radiocarbon ages. The upper-most event was chosen for these tests because it is known to be the 1906 earthquake and the 20th century reservoir correction is well known in this area. The penultimate event was chosen because it has been dated at multiple land sites. 1906 event: Unconstrained calibration: calibration of the radiocarbon age of the 1906 event yields an age of ~1913, (1σ: 1898-1940). Sedimentation time: subtracting the time represented by the hemipelagic thickness above the 1906 event from the date of collection (1999) yields an age of ~1904. OxCal sequence: constrained calibration yields an age of ~1902 (1σ: 1880

  13. The analysis of the events of stellar visibility in Pliny's "Natural History"

    NASA Astrophysics Data System (ADS)

    Nickiforov, M. G.

    2016-07-01

    Book XVIII of Pliny's "Natural History" contains about a hundred descriptions of events of stellar visibility, which were used for the needs of the agricultural calendar. The comparison between the calculated date of each event and the date given by Pliny shows that the actual events of stellar visibility occurred systematically ~10 days later than the specified times. This discrepancy cannot be explained by errors of the calendar.

  14. Incidence and pattern of 12 years of reported transfusion adverse events in Zimbabwe: a retrospective analysis

    PubMed Central

    Mafirakureva, Nyashadzaishe; Khoza, Star; Mvere, David A.; Chitiyo, McLeod E.; Postma, Maarten J.; van Hulst, Marinus

    2014-01-01

    Background Haemovigilance hinges on a systematically structured reporting system, which unfortunately does not always exist in resource-limited settings. We determined the incidence and pattern of transfusion-related adverse events reported to the National Blood Service Zimbabwe. Materials and methods A retrospective review of the transfusion-event records of the National Blood Service Zimbabwe was conducted covering the period from 1 January 1999 to 31 December 2011. All transfusion-related event reports received during the period were analysed. Results A total of 308 transfusion adverse events (0.046%) were reported for 670,625 blood components distributed. The majority (61.6%) of the patients who experienced an adverse event were female. The median age was 36 years (range, 1–89 years). The majority (68.8%) of the adverse events were acute transfusion reactions consisting of febrile non-haemolytic transfusion reactions (58.5%), minor allergies (31.6%), haemolytic reactions (5.2%), severe allergic reactions (2.4%), anaphylaxis (1.4%) and hypotension (0.9%). Two-thirds (66.6%) of the adverse events occurred following administration of whole blood, although only 10.6% of the blood was distributed as whole blood. Packed cells, which accounted for 75% of blood components distributed, were associated with 20.1% of the events. Discussion The incidence of suspected transfusion adverse events was generally lower than the incidences reported globally in countries with well-established haemovigilance systems. The administration of whole blood was disproportionately associated with transfusion adverse events. The pattern of the transfusion adverse events reported here highlights the probable differences in practice between different settings. Under-reporting of transfusion events is rife in passive reporting systems. PMID:24887217

  15. Dealing With Major Life Events and Transitions: A Systematic Literature Review on and Occupational Analysis of Spirituality.

    PubMed

    Maley, Christine M; Pagana, Nicole K; Velenger, Christa A; Humbert, Tamera Keiter

    2016-01-01

    This systematic literature review analyzed the construct of spirituality as perceived by people who have experienced or are experiencing a major life event or transition. The researchers investigated studies that used narrative analysis or a phenomenological methodology related to the topic. Thematic analysis resulted in three major themes: (1) avenues to and through spirituality, (2) the experience of spirituality, and (3) the meaning of spirituality. The results provide insights into the intersection of spirituality, meaning, and occupational engagement as understood by people experiencing a major life event or transition and suggest further research that addresses spirituality in occupational therapy and interdisciplinary intervention. PMID:27294990

  16. Analysis of post-blasting source mechanisms of mining-induced seismic events in Rudna copper mine, Poland.

    NASA Astrophysics Data System (ADS)

    Caputa, Alicja; Rudzinski, Lukasz; Talaga, Adam

    2016-04-01

    Copper ore exploitation in the Lower Silesian Copper District, Poland (LSCD), is connected with many specific hazards. The most hazardous is induced seismicity and the rockbursts which follow strong mining seismic events. One of the most effective methods of reducing seismic activity is blasting in potentially hazardous mining panels. This way, small to moderate tremors are provoked and stress accumulation is substantially reduced. This work presents an analysis of post-blasting events at the Rudna mine, Poland, using full moment tensor (MT) inversion of a signal dataset recorded by the underground seismic network. We show that focal mechanisms for events that occurred after blasts exhibit common features in the MT solution. The strong isotropic and small double-couple (DC) components of the MT indicate that these events were provoked by detonations. On the other hand, the post-blasting MT differs considerably from the MT obtained for common strong mining events. We believe that seismological analysis of provoked and unprovoked events can be a very useful tool in confirming the effectiveness of blasting in seismic hazard reduction in mining areas.
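
    For readers unfamiliar with the decomposition, the hedged Python sketch below splits a made-up moment tensor into its isotropic part and a double-couple fraction of the deviatoric part, following a common eigenvalue-based convention; it illustrates the diagnostic quoted above (strong ISO, small DC) and is not the mine's MT inversion itself.

    ```python
    # Illustrative ISO/DC/CLVD decomposition of a symmetric moment tensor.
    import numpy as np

    M = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.8, 0.2],
                  [0.1, 0.2, 1.5]])        # hypothetical full moment tensor

    iso = np.trace(M) / 3.0                # isotropic (volumetric) part
    M_dev = M - iso * np.eye(3)            # deviatoric remainder

    # Deviatoric eigenvalues, ordered by absolute size.
    eigs = np.linalg.eigvalsh(M_dev)
    order = np.argsort(np.abs(eigs))
    lam_small, lam_large = eigs[order[0]], eigs[order[2]]

    eps = -lam_small / abs(lam_large)      # 0 = pure DC, +/-0.5 = pure CLVD
    dc_fraction = 1.0 - 2.0 * abs(eps)
    print(f"ISO part: {iso:.2f}, DC fraction of deviatoric part: {dc_fraction:.0%}")
    ```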

  17. Statistical-noise reduction in correlation analysis of high-energy nuclear collisions with event-mixing

    NASA Astrophysics Data System (ADS)

    Ray, R. L.; Bhattarai, P.

    2016-06-01

    The error propagation and statistical-noise reduction method of Reid and Trainor for two-point correlation applications in high-energy collisions is extended to include particle-pair references constructed by mixing two particles from all event-pair combinations within event subsets of arbitrary size. The Reid-Trainor method is also applied to other particle-pair mixing algorithms commonly used in correlation analysis of particle production from high-energy nuclear collisions. The statistical-noise reduction, inherent in the Reid-Trainor event-mixing procedure, is shown to occur for these other event-mixing algorithms as well. Monte Carlo simulation results are presented which verify the predicted degree of noise reduction. In each case the final errors are determined by the bin-wise particle-pair number, rather than by the bin-wise single-particle count.
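
    A toy sketch of the mixed-pair construction described above, taking one particle from each of two different events over all event-pair combinations within a subset, is shown below. The "particles" are single synthetic azimuth values, and the code illustrates only the pair-counting idea, not the Reid-Trainor error propagation itself.

    ```python
    # Hedged sketch: mixed-event particle pairs for a correlation reference.
    import itertools
    import numpy as np

    rng = np.random.default_rng(1)
    # Each "event" is an array of particle azimuths (toy one-variable particles).
    events = [rng.uniform(0, 2 * np.pi, size=rng.integers(5, 15)) for _ in range(8)]

    def mixed_pairs(events):
        """Yield (phi_a, phi_b) with the two particles from different events."""
        for ev_a, ev_b in itertools.combinations(events, 2):
            for phi_a in ev_a:
                for phi_b in ev_b:
                    yield phi_a, phi_b

    # Histogram of pair-angle differences for the mixed-event reference.
    diffs = [a - b for a, b in mixed_pairs(events)]
    ref_hist, _ = np.histogram(diffs, bins=36)
    print(ref_hist.sum(), "mixed pairs binned")
    ```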

  18. Assessment of Adverse Events in Protocols, Clinical Study Reports, and Published Papers of Trials of Orlistat: A Document Analysis

    PubMed Central

    Schroll, Jeppe Bennekou; Penninga, Elisabeth I.; Gøtzsche, Peter C.

    2016-01-01

    filters, though six of seven papers stated that “all adverse events were recorded.” For one trial, we identified an additional 1,318 adverse events that were not listed or mentioned in the CSR itself but could be identified through manually counting individual adverse events reported in an appendix. We discovered that the majority of patients had multiple episodes of the same adverse event that were only counted once, though this was not described in the CSRs. We also discovered that participants treated with orlistat experienced twice as many days with adverse events as participants treated with placebo (22.7 d versus 14.9 d, p-value < 0.0001, Student’s t test). Furthermore, compared with the placebo group, adverse events in the orlistat group were more severe. None of this was stated in the CSR or in the published paper. Our analysis was restricted to one drug tested in the mid-1990s; our results might therefore not be applicable for newer drugs. Conclusions In the orlistat trials, we identified important disparities in the reporting of adverse events between protocols, clinical study reports, and published papers. Reports of these trials seemed to have systematically understated adverse events. Based on these findings, systematic reviews of drugs might be improved by including protocols and CSRs in addition to published articles. PMID:27529343

  19. Role of Stratospheric Air in a Severe Weather Event: Analysis of Potential Vorticity and Total Ozone

    NASA Technical Reports Server (NTRS)

    Goering, Melissa A.; Gallus, William A., Jr.; Olsen, Mark A.; Stanford, John L.

    2001-01-01

    The role of dry stratospheric air descending to low and middle tropospheric levels in a severe weather outbreak in the midwestern United States is examined using ACCEPT Eta model output, Rapid Update Cycle (RUC) analyses, and Earth probe Total Ozone Mapping Spectrometer (EP/TOMS) total ozone data. While stratospheric air was not found to play a direct role in the convection, backward trajectories show stratospheric air descended to 800 hPa just west of the convection. Damaging surface winds not associated with thunderstorms also occurred in the region of greatest stratospheric descent. Small-scale features in the high-resolution total ozone data compare favorably with geopotential heights and potential vorticity fields, supporting the notion that stratospheric air descended to near the surface. A detailed vertical structure in the potential vorticity appears to be captured by small-scale total ozone variations. The capability of the total ozone to identify mesoscale features assists model verification. The total ozone data suggest biases in the RUC analysis and Eta forecast of this event. The total ozone is also useful in determining whether potential vorticity is of stratospheric origin or is diabatically generated in the troposphere.

  20. Automated detection and analysis of depolarization events in human cardiomyocytes using MaDEC.

    PubMed

    Szymanska, Agnieszka F; Heylman, Christopher; Datta, Rupsa; Gratton, Enrico; Nenadic, Zoran

    2016-08-01

    Optical imaging-based methods for assessing the membrane electrophysiology of in vitro human cardiac cells allow for non-invasive temporal assessment of the effect of drugs and other stimuli. Automated methods for detecting and analyzing the depolarization events (DEs) in image-based data allow quantitative assessment of these different treatments. In this study, we use 2-photon microscopy of fluorescent voltage-sensitive dyes (VSDs) to capture the membrane voltage of actively beating human induced pluripotent stem cell-derived cardiomyocytes (hiPS-CMs). We built custom, freely available Matlab software, called MaDEC, to detect, quantify, and compare DEs of hiPS-CMs treated with the β-adrenergic drugs propranolol and isoproterenol. The efficacy of our software is quantified by comparing detection results against manual DE detection by expert analysts, and by comparing DE analysis results to known drug-induced electrophysiological effects. The software accurately detected DEs with true positive rates of 98-100% and false positive rates of 1-2%, at signal-to-noise ratios (SNRs) of 5 and above. The MaDEC software was also able to distinguish control DEs from drug-treated DEs both immediately and 10 min after drug administration. PMID:27281718
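
    A generic, hedged stand-in for this kind of DE detection is a threshold-and-spacing peak detector on a noisy trace. The sketch below uses scipy.signal.find_peaks on a synthetic voltage-dye trace; it is not the MaDEC algorithm, and all parameters (frame rate, beat rate, thresholds) are invented for illustration.

    ```python
    # Simple threshold-based depolarization-event detector on a synthetic trace.
    import numpy as np
    from scipy.signal import find_peaks

    rng = np.random.default_rng(2)
    fs = 500.0                                   # hypothetical frame rate, Hz
    t = np.arange(0, 10, 1 / fs)
    beat_times = np.arange(0.5, 10, 1.0)         # ~1 Hz beating
    clean = sum(np.exp(-((t - bt) ** 2) / 0.002) for bt in beat_times)
    noisy = clean + 0.05 * rng.standard_normal(t.size)   # SNR well above 5

    # Require a minimum height and a refractory spacing between events.
    peaks, _ = find_peaks(noisy, height=0.5, distance=int(0.3 * fs))
    print(f"Detected {peaks.size} events at t = {t[peaks].round(2)}")
    ```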

  1. Analysis of the relationship between longitudinal gene expressions and ordered categorical event data

    PubMed Central

    Rajicic, Natasa; Finkelstein, Dianne M.; Schoenfeld, David A.

    2013-01-01

    The NIH project "Inflammatory and Host Response to Injury" (Glue) is being conducted to study the changes in the body over time in response to trauma and burn. Patients are monitored for changes in their clinical status, such as the onset of and recovery from organ failure. Blood samples are drawn over the first days and weeks after the injury to obtain gene expression levels over time. Our goal was to develop a method of selecting genes that are differentially expressed in patients who either improved or experienced organ failure. For this, we needed a test for the association between longitudinal gene expressions and the time to the occurrence of ordered categorical outcomes indicating recovery, stable disease, and organ failure. We propose a test in which the relationship between the gene expression and the events is modeled using the cumulative proportional odds model, as a generalization of the Pooling Repeated Observation (PRO) method. Given the high dimensionality of the microarray data, it was necessary to control for the multiplicity of the testing. To control the false discovery rate (FDR), we applied both a permutational approach and Efron's empirical estimation methods. We explore our method through simulations and provide the analysis of the multi-center, longitudinal study of immune response to inflammation and trauma (http://www.gluegrant.org). PMID:19618375
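
    As one concrete multiplicity correction for gene-level p values, the Benjamini-Hochberg step-up procedure is sketched below. The paper itself uses permutation and Efron's empirical methods for FDR control, so this is a simpler stand-in to illustrate the idea, and the p values are hypothetical.

    ```python
    # Benjamini-Hochberg FDR control on hypothetical gene-level p values.
    import numpy as np

    def benjamini_hochberg(pvals, alpha=0.05):
        """Return a boolean mask of rejected hypotheses at FDR level alpha."""
        p = np.asarray(pvals)
        n = p.size
        order = np.argsort(p)
        thresholds = alpha * np.arange(1, n + 1) / n
        passed = p[order] <= thresholds
        k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
        reject = np.zeros(n, dtype=bool)
        reject[order[:k]] = True          # reject the k smallest p values
        return reject

    pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
    print(benjamini_hochberg(pvals))      # first two rejected at FDR 0.05
    ```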

  2. Men’s and women’s migration in coastal Ghana: An event history analysis

    PubMed Central

    Reed, Holly E.; Andrzejewski, Catherine S.; White, Michael J.

    2013-01-01

    This article uses life history calendar (LHC) data from coastal Ghana and event history statistical methods to examine inter-regional migration for men and women, focusing on four specific migration types: rural-urban, rural-rural, urban-urban, and urban-rural. Our analysis is unique because it examines how key determinants of migration— including education, employment, marital status, and childbearing—differ by sex for these four types of migration. We find that women are significantly less mobile than men overall, but that more educated women are more likely to move (particularly to urban areas) than their male counterparts. Moreover, employment in the prior year is less of a deterrent to migration among women. While childbearing has a negative effect on migration, this impact is surprisingly stronger for men than for women, perhaps because women’s search for assistance in childcare promotes migration. Meanwhile, being married or in union appears to have little effect on migration probabilities for either men or women. These results demonstrate the benefits of a LHC approach and suggest that migration research should further examine men’s and women’s mobility as it relates to both human capital and household and family dynamics, particularly in developing settings. PMID:24298203

  3. Analysis of Individual Molecular Events of DNA Damage Response by Flow and Image Assisted Cytometry

    PubMed Central

    Darzynkiewicz, Zbigniew; Traganos, Frank; Zhao, Hong; Halicka, H. Dorota; Skommer, Joanna; Wlodkowic, Donald

    2010-01-01

    This chapter describes molecular mechanisms of the DNA damage response (DDR) and presents flow- and image-assisted cytometric approaches to assess these mechanisms and measure the extent of DDR in individual cells. DNA damage was induced by cell treatment with oxidizing agents, UV light, DNA topoisomerase I or II inhibitors, cisplatin, tobacco smoke, and by exogenous and endogenous oxidants. Chromatin relaxation (decondensation) is an early event of DDR that involves modification of high mobility group proteins (HMGs) and histone H1; it was detected by cytometry through analysis of the susceptibility of DNA in situ to denaturation using the metachromatic fluorochrome acridine orange. Translocation of the MRN complex, consisting of Meiotic Recombination 11 Homolog A (Mre11), Rad50 homolog and Nijmegen Breakage Syndrome 1 (NBS1), into DNA damage sites was assessed by laser scanning cytometry as the increase in the intensity of the maximal pixel as well as the integral value of Mre11 immunofluorescence. Examples of cytometric detection of activation of Ataxia telangiectasia mutated (ATM) and Checkpoint kinase 2 (Chk2) protein kinases using phospho-specific Abs targeting Ser1981 and Thr68 of these proteins, respectively, are also presented. We also discuss approaches to correlate activation of ATM and Chk2 with phosphorylation of p53 on Ser15 and histone H2AX on Ser139, as well as with cell cycle position and DNA replication. The capability of laser scanning cytometry to quantify individual foci of phosphorylated H2AX and/or ATM, which provides a more dependable assessment of the presence of DNA double-strand breaks, is outlined. The new microfluidic Lab-on-a-Chip platforms for interrogation of individual cells offer a novel approach for DDR cytometric analysis. PMID:21722802

  4. Subtropical influence on January 2009 major sudden stratospheric warming event: diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Schneidereit, Andrea; Peters, Dieter; Grams, Christian; Wolf, Gabriel; Riemer, Michael; Gierth, Franziska; Quinting, Julian; Keller, Julia; Martius, Olivia

    2015-04-01

    In January 2009 a major sudden stratospheric warming (MSSW) event occurred, with the strongest NAM anomaly ever observed at 10 hPa. Stratospheric Eliassen-Palm flux convergence and zonal mean eddy heat fluxes of ultra-long waves at the 100 hPa layer were also unusually strong in the mid-latitudes just before and after the onset of the MSSW. Besides internal interactions between the background flow and planetary waves, and among the planetary waves themselves, the subtropical tropospheric forcing of these enhanced heat fluxes is still an open question. This study investigates in more detail the dynamical reasons for the pronounced heat fluxes, based on ERA-Interim re-analysis data. Investigating the regional contributions of the eddy heat flux to the northern hemispheric zonal mean revealed a distinct spatial pattern in that period, with maxima over the Eastern Pacific/North America and the Eastern North Atlantic/Europe. The first region is related to an almost persistent tropospheric blocking high (BH) over the Gulf of Alaska dominating the upper-level flow, and the second region to a weaker BH over Northern Europe. The evolution of the BH over the Gulf of Alaska can be explained by a chain of tropospheric weather events linked to and maintained by subtropical and tropical influences: the MJO (phase 7-8) and the developing cold phase of ENSO (La Niña), which act coherently over the Eastern Pacific and favor enhanced subtropical baroclinicity. In turn, extratropical cyclone activity increases and shifts poleward, associated with an increase in the frequency of warm conveyor belts (WCBs). These WCBs support enhanced poleward-directed eddy heat fluxes in the Eastern Pacific/North American region. The Eastern North Atlantic/European positive heat flux anomaly is associated with a blocking high over Scandinavia. This BH is maintained by an eastward propagating Rossby wave train emanating from the block over the Gulf of Alaska. Eddy feedback processes support this high pressure

  5. A Prospective Analysis of Life Events, Problem Behaviours and Depression in Adults with Intellectual Disability

    ERIC Educational Resources Information Center

    Esbensen, A. J.; Benson, B. A.

    2006-01-01

    Background: Life events have consistently been found to be associated with behaviour problems and depression among individuals with intellectual disability (ID). However, prior findings have typically been based on correlational or retrospective analyses of case files. The current study attempted to replicate prior findings from life events with…

  6. Event based analysis of Chlorothalonil concentrations following application to managed turf

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Chlorothalonil concentrations exceeding acute toxicity levels for certain organisms have been measured in surface water discharge events from managed turf watersheds. However, the duration of exceedence and the timing of those events with respect to precipitation/runoff and time since application ha...

  7. Multidimensional tensor array analysis of multiphase flow during a hydrodynamic ram event

    NASA Astrophysics Data System (ADS)

    Lingenfelter, A.; Liu, D.

    2015-12-01

    Flow visualization is necessary to characterize the fluid flow properties during a hydrodynamic ram event. The multiphase flow during a hydrodynamic ram event can make traditional image processing techniques such as contrast feature detection and PIV difficult. By stacking the imagery to form a multidimensional tensor array, flow-field velocities can be determined through feature detection and visualized.
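
    A minimal sketch of the stacking step is shown below: frames are collected into a 3-D NumPy array (a rank-3 tensor) and simple frame differencing serves as a crude motion feature. This is illustrative only, with random frames standing in for high-speed imagery, and is not the authors' tensor-array method.

    ```python
    # Stack video frames into a (time, rows, cols) tensor and difference them.
    import numpy as np

    rng = np.random.default_rng(3)
    frames = [rng.random((64, 64)) for _ in range(100)]   # stand-in imagery

    stack = np.stack(frames, axis=0)            # shape (100, 64, 64)
    motion = np.abs(np.diff(stack, axis=0))     # frame-to-frame intensity change
    print(stack.shape, float(motion.mean()))
    ```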

  8. A latent class analysis of adolescent adverse life events based on a Danish national youth probability sample.

    PubMed

    Shevlin, Mark; Elklit, Ask

    2008-01-01

    The aim of this study was to determine if there are meaningful clusters of individuals with similar experiences of adverse life events in a nationally representative sample of Danish adolescents. Latent class analysis (LCA) was used to identify such clusters or latent classes. In addition, the relationships between the latent classes and living arrangements and diagnosis of post-traumatic stress disorder (PTSD) were estimated. A four-class solution was found to be the best description of multiple adverse life events, and the classes were labelled "Low Risk", "Intermediate Risk", "Pregnancy" and "High Risk". Compared with the Low Risk class, the other classes were found to be significantly more likely to have a diagnosis of PTSD and to live with only one parent. This paper demonstrated how trauma research can focus on the individual as the unit of analysis rather than on traumatic events. PMID:18609032

  9. Multi-Resolution Clustering Analysis and Visualization of Around One Million Synthetic Earthquake Events

    NASA Astrophysics Data System (ADS)

    Kaneko, J. Y.; Yuen, D. A.; Dzwinel, W.; Boryszko, K.; Ben-Zion, Y.; Sevre, E. O.

    2002-12-01

    The study of seismic patterns with synthetic data is important for analyzing the seismic hazard of faults because one can precisely control the spatial and temporal domains. Using modern clustering analysis from statistics and recently introduced visualization software, AMIRA, we have examined the multi-resolution nature of a total assemblage involving 922,672 earthquake events in 4 numerically simulated models, which have different constitutive parameters, with 2 disparately different time intervals in a 3D spatial domain. The evolution of stress and slip on the fault plane was simulated with 3D elastic dislocation theory for a configuration representing the central San Andreas Fault (Ben-Zion, J. Geophys. Res., 101, 5677-5706, 1996). The 4 different models represent various levels of fault zone disorder and have the following brittle properties and names: uniform properties (model U), a Parkfield-type asperity (A), fractal properties (F), and multi-size heterogeneities (model M). We employed the MNN (mutual nearest neighbor) clustering method and developed a C program that simultaneously calculates a number of parameters related to the locations of the earthquakes and their magnitude values. Visualization was then used to look at the geometrical locations of the hypocenters and the evolution of seismic patterns. We wrote an AmiraScript that allows us to pass the parameters in an interactive format. With data sets consisting of 150-year time intervals, we have unveiled the distinctly multi-resolutional nature in the spatial-temporal pattern of small and large earthquake correlations shown previously by Eneva and Ben-Zion (J. Geophys. Res., 102, 24513-24528, 1997). In order to search for clearer possible stationary patterns and substructures within the clusters, we have also carried out the same analysis for corresponding data sets with time extending to several thousand years. The larger data sets were studied with finer and finer time intervals and multi
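
    The core of mutual-nearest-neighbour linking is small enough to sketch: two events are linked when each is the other's nearest neighbour. The Python toy below does this for synthetic hypocentre coordinates with a k-d tree; the study's C implementation also folds in magnitude and time, which this sketch omits.

    ```python
    # Toy mutual-nearest-neighbour (MNN) pairing of synthetic hypocentres.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(4)
    xyz = rng.random((500, 3)) * [100.0, 100.0, 15.0]   # synthetic hypocentres, km

    tree = cKDTree(xyz)
    # k=2 because each point's nearest neighbour is itself at distance 0.
    _, idx = tree.query(xyz, k=2)
    nn = idx[:, 1]

    mutual_pairs = [(i, j) for i, j in enumerate(nn) if nn[j] == i and i < j]
    print(f"{len(mutual_pairs)} mutual nearest-neighbour pairs")
    ```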

  10. Assessment of realistic nowcasting lead-times based on predictability analysis of Mediterranean Heavy Precipitation Events

    NASA Astrophysics Data System (ADS)

    Bech, Joan; Berenguer, Marc

    2014-05-01

    ' precipitation forecasts showed some skill (improvement over persistence) for lead times up to 60' for moderate intensities (up to 1 mm in 30') and up to 2.5 h for lower rates (above 0.1 mm). However, an important event-to-event variability has been found, as illustrated by the fact that hit rates of rain/no-rain forecasts reached the 60% value at 90' in the 7 September 2005 case and at only 40' in the 2 November 2008 case. The discussion of these results provides useful information on the potential application of nowcasting systems and realistic values to be contrasted with specific end-user requirements. This work has been done in the framework of the HyMeX research programme and has been partly funded by the ProFEWS project (CGL2010-15892).
    References:
    Bech J, N Pineda, T Rigo, M Aran, J Amaro, M Gayà, J Arús, J Montanyà, O van der Velde, 2011: A Mediterranean nocturnal heavy rainfall and tornadic event. Part I: Overview, damage survey and radar analysis. Atmospheric Research 100:621-637. http://dx.doi.org/10.1016/j.atmosres.2010.12.024
    Bech J, R Pascual, T Rigo, N Pineda, JM López, J Arús, M Gayà, 2007: An observational study of the 7 September 2005 Barcelona tornado outbreak. Natural Hazards and Earth System Science 7:129-139. http://dx.doi.org/10.5194/nhess-7-129-2007
    Berenguer M, C Corral, R Sánchez-Diezma, D Sempere-Torres, 2005: Hydrological validation of a radar-based nowcasting technique. Journal of Hydrometeorology 6:532-549. http://dx.doi.org/10.1175/JHM433.1
    Berenguer M, D Sempere, G Pegram, 2011: SBMcast - An ensemble nowcasting technique to assess the uncertainty in rainfall forecasts by Lagrangian extrapolation. Journal of Hydrology 404:226-240. http://dx.doi.org/10.1016/j.jhydrol.2011.04.033
    Pierce C, A Seed, S Ballard, D Simonin, Z Li, 2012: Nowcasting. In Doppler Radar Observations (J Bech, JL Chau, eds.), Ch. 13, 98-142. InTech, Rijeka, Croatia. http://dx.doi.org/10.5772/39054

  11. 77 FR 5857 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ...: On November 2, 2011 (76 FR 67764), the U.S. Nuclear Regulatory Commission (NRC) published for public comment Draft NUREG, ``Common- Cause Failure Analysis in Event and Condition Assessment: Guidance and...-2011-0254. Discussion On November 2, 2011 (76 FR 67764), the NRC published for public comment...

  12. Arrests, Recent Life Circumstances, and Recurrent Job Loss for At-Risk Young Men: An Event-History Analysis

    ERIC Educational Resources Information Center

    Wiesner, Margit; Capaldi, Deborah M.; Kim, Hyoun K.

    2010-01-01

    This study used longitudinal data from 202 at-risk young men to examine effects of arrests, prior risk factors, and recent life circumstances on job loss across a 7-year period in early adulthood. Repeated failure-time continuous event-history analysis indicated that occurrence of job loss was primarily related to prior mental health problems,…

  13. Time-to-Event Analysis of Individual Variables Associated with Nursing Students' Academic Failure: A Longitudinal Study

    ERIC Educational Resources Information Center

    Dante, Angelo; Fabris, Stefano; Palese, Alvisa

    2013-01-01

    Empirical studies and conceptual frameworks presented in the extant literature offer a static picture of academic failure. Time-to-event analysis, which captures the dynamism of individual factors as they come to determine failure and so allows timely strategies to be properly tailored, requires longitudinal studies, which are still lacking within the field. The…

  14. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    NASA Astrophysics Data System (ADS)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking and network socialisation have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study the phenomena and disciplines of the natural sciences has prompted many specialists and scholars to innovate in their research. Although politics has been integrally involved in hyperlinked-world issues since the 1990s, and the automatic assembly of different geospatial web services and distributed geospatial information systems utilizing service chaining has recently been explored and built, the information collection and data visualisation of geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpected characteristics of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology and text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationships of text information and to trace the dissemination of information regarding geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion in political events.
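
    A bare-bones sketch of the word-frequency step is shown below: fetch a page, strip tags, count words. The paper's crawling is done with Heritrix (Java); the requests/BeautifulSoup combination here is a stand-in, and the URL is a placeholder, not a source used in the study.

    ```python
    # Minimal word-frequency pass over one fetched web page (illustrative only).
    from collections import Counter
    import re

    import requests
    from bs4 import BeautifulSoup

    def word_frequencies(url, top=20):
        """Return the `top` most common words of length >= 3 on a page."""
        html = requests.get(url, timeout=10).text
        text = BeautifulSoup(html, "html.parser").get_text(" ")
        words = re.findall(r"[a-zA-Z]{3,}", text.lower())
        return Counter(words).most_common(top)

    # Example call (placeholder URL):
    # print(word_frequencies("https://example.com/news/huangyan-island"))
    ```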

  15. Robust non-parametric one-sample tests for the analysis of recurrent events.

    PubMed

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. PMID:21170908
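
    A drastically simplified, unweighted version of the basic idea (the standardized distance between observed and expected event counts under a reference rate) can be written in a few lines. The paper's tests add weights, robust variance estimation and stratification, none of which appear in this sketch, and the data below are invented.

    ```python
    # Hedged sketch: one-sample Z test of recurrent-event counts vs a reference rate.
    import numpy as np
    from scipy.stats import norm

    def one_sample_poisson_test(n_events, follow_up_years, ref_rate):
        """Compare total observed events with the count expected under a
        reference rate (events per person-year), assuming Poisson variance."""
        expected = ref_rate * np.sum(follow_up_years)
        observed = np.sum(n_events)
        z = (observed - expected) / np.sqrt(expected)
        return z, 2 * norm.sf(abs(z))

    n_events = np.array([0, 2, 1, 4, 0, 3])            # hypothetical infections
    follow_up = np.array([1.0, 2.0, 1.5, 2.0, 0.5, 1.0])
    print(one_sample_poisson_test(n_events, follow_up, ref_rate=0.8))
    ```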

  16. Multi-instrumental analysis of large sprite events and their producing storm in southern France

    NASA Astrophysics Data System (ADS)

    Soula, S.; Iacovella, F.; van der Velde, O.; Montanyà, J.; Füllekrug, M.; Farges, T.; Bór, J.; Georgis, J.-F.; NaitAmor, S.; Martin, J.-M.

    2014-01-01

    During the night of 1-2 September 2009, seventeen distinct sprite events, including 3 halos, were observed above a storm in the north-western Mediterranean Sea with a video camera at Pic du Midi (42.93N; 0.14E; 2877 m). The sprites occurred at distances between 280 and 390 km, estimated from their parent CG locations. The MCS-type storm was characterized by a trailing-stratiform structure and a very circular shape, with a size of about 70,000 km2 (cloud-top temperature lower than -35 °C) when the TLEs were observed. The cloud-to-ground (CG) flash rate was large (45 min-1) one hour before the TLE observations and very low (<5 min-1) during them. Out of the 17 sprite events, 15 parent +CG (P+CG) strokes have been identified; their average peak current is 87 kA (67 kA for the 14 events without halo), while the associated charge moment changes (CMCs) that could be determined range from 424 to 2088 ± 20% C km. Several 2-second videos contain multiple sprite events: one with four events, one with three events and three with two events. Column and carrot type sprites are identified, either together or separately. All P+CG strokes are clearly located within the stratiform region of the storm, and the second P+CG stroke of a multiple event falls back within the stratiform region. Groups of large and bright carrots reach ~70 km height and ~80 km horizontal extent. These groups are associated with a second pulse of electric field radiation in the ELF range, which occurs ~5 ms after the P+CG stroke and exhibits the same polarity, evidence for current in the sprite body. VLF perturbations associated with the sprite events were recorded with a station in Algiers.

  17. Reduced-Order Modeling and Wavelet Analysis of Turbofan Engine Structural Response Due to Foreign Object Damage (FOD) Events

    NASA Technical Reports Server (NTRS)

    Turso, James; Lawrence, Charles; Litt, Jonathan

    2004-01-01

    The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
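
    A hedged sketch of one possible wavelet feature for this kind of problem is shown below: the energy of the finest-scale detail coefficients of a discrete wavelet transform spikes at an impulsive event buried in broadband noise. It uses PyWavelets with invented signal parameters and is illustrative only, not the authors' feature extraction technique.

    ```python
    # Detail-coefficient energy as a crude impact-detection feature (PyWavelets).
    import numpy as np
    import pywt

    rng = np.random.default_rng(5)
    fs = 10_000.0
    t = np.arange(0, 1, 1 / fs)
    signal = 0.5 * np.sin(2 * np.pi * 60 * t) + 0.3 * rng.standard_normal(t.size)
    signal[5000:5020] += 3.0 * np.exp(-np.arange(20) / 5.0)   # simulated impact

    coeffs = pywt.wavedec(signal, "db4", level=4)
    d1 = coeffs[-1]                       # finest-scale detail coefficients
    energy = d1 ** 2
    print("peak detail-energy index:", int(np.argmax(energy)), "of", energy.size)
    ```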

  18. Reduced-Order Modeling and Wavelet Analysis of Turbofan Engine Structural Response Due to Foreign Object Damage "FOD" Events

    NASA Technical Reports Server (NTRS)

    Turso, James A.; Lawrence, Charles; Litt, Jonathan S.

    2007-01-01

    The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite-element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.

  19. Multi-field data exploitation: drought and flood events analysis using the MEA platform

    NASA Astrophysics Data System (ADS)

    Natali, Stefano; Scremin, Alessandro; Mantovani, Simone; Folegani, Marco

    2014-05-01

    Since the launch of the first artificial satellite with the scope of observing the Earth-Atmosphere system, the amount of data retrieved from space platforms has grown continuously: nowadays Earth Observation data collected from space platforms provide gigabytes of data per year and cover all geophysical fields. Nevertheless, most of the scientific and application communities (land, ocean, atmosphere, hydrology, vegetation and so on) have worked separately (or with just few contacts) for tens of years, developing sensors, algorithms, data formats and datasets (petabytes of data) in an almost independent way. The need to jointly use data coming from the different communities and from different data sources (such as EO products and on-ground data) to allow multi-disciplinary studies has been recognized by the European Space Agency since 2008: the Multi-sensor Evolution Analysis (MEA) platform (https://mea.eo.esa.int/) has been developed with the scope of demonstrating the feasibility of an advanced tool for long-term multi-field / multi-resolution / multi-temporal data management, and has been used by the FP7 project EarthServer to exploit long time series of EO and ground-retrieved climate data. MEA is now available for multi-temporal and multi-field data visualization and exploitation, containing tens of terabytes of data from the land and atmosphere domains (https://mea.eo.esa.int/data_availability.html) and allowing users to integrate the modeling approach with an intensive data exploitation approach. In the present work, the usability of the MEA platform is described, and some use cases demonstrating the combined use of atmospheric (precipitation), vegetation (NDVI) and soil (soil moisture) data are provided for drought and flooding events.

  20. Time-frequency analysis of the event-related potentials associated with the Stroop test.

    PubMed

    Ergen, Mehmet; Saban, Sara; Kirmizi-Alsan, Elif; Uslu, Atilla; Keskin-Ergen, Yasemin; Demiralp, Tamer

    2014-12-01

    Multiple executive processes are suggested to be engaged by the Stroop test, and time-frequency analysis is acknowledged to improve the informative utility of EEG in cognitive brain research. We aimed to investigate event-related oscillations associated with the Stroop test. EEG data were collected from 23 healthy volunteers while they performed a computerized version of the Stroop test. Both evoked (phase-locked) and total (phase-locked + non-phase-locked) oscillatory responses in the EEG were analyzed by wavelet transform. Data from the congruent (color-word matching) and incongruent (color-word non-matching) conditions were compared. In the incongruent condition, the N450 wave was more negative and the amplitude of the late slow wave was more positive. In the time-frequency plane, the fronto-central total theta amplitude (300-700 ms) was larger in the incongruent condition. The evoked delta (250-600 ms) was larger in the congruent condition, particularly over parieto-occipital regions. The larger frontal theta response in the incongruent condition was associated with the detection of interference and inhibition of the response to task-irrelevant features, while the larger evoked delta in the congruent condition was suggestive of an easier decision process owing to congruency between the physical attribute and the verbal meaning of the stimuli. Furthermore, in the incongruent condition, the amplitude of the occipital total alpha in the very late phase (700-900 ms) was smaller. This prolonged desynchronization in the alpha band could reflect augmentation of attentional filters in the visual modality for the next stimulus. These multiple findings in the EEG time-frequency plane provide an improved description of the overlapping processes in the Stroop test. PMID:25135670
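
    A generic time-frequency sketch for a single EEG channel is shown below: complex Morlet continuous wavelet transform amplitudes averaged over the theta band, using PyWavelets. All parameter choices (sampling rate, wavelet, band edges) are illustrative assumptions and not those of the study.

    ```python
    # Theta-band amplitude from a complex Morlet CWT of a synthetic EEG epoch.
    import numpy as np
    import pywt

    rng = np.random.default_rng(6)
    fs = 250.0
    t = np.arange(-0.2, 1.0, 1 / fs)                   # one epoch around stimulus
    theta_burst = ((t > 0.3) & (t < 0.7)) * np.sin(2 * np.pi * 6 * t)
    eeg = theta_burst + 0.5 * rng.standard_normal(t.size)

    wavelet = "cmor1.5-1.0"
    freqs = np.arange(4.0, 31.0, 1.0)                  # 4-30 Hz
    fc = pywt.central_frequency(wavelet)               # centre freq, cycles/sample
    scales = fc * fs / freqs
    coefs, _ = pywt.cwt(eeg, scales, wavelet, sampling_period=1 / fs)

    theta_amp = np.abs(coefs[(freqs >= 4) & (freqs <= 7)]).mean(axis=0)
    print(theta_amp.shape)                             # amplitude per time sample
    ```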

  1. Negative life events and depression in adolescents with HIV: a stress and coping analysis.

    PubMed

    Lewis, Jennifer V; Abramowitz, Susan; Koenig, Linda J; Chandwani, Sulachni; Orban, Lisa

    2015-01-01

    The prevalence of negative life events (NLE) and daily hassles, and their direct and moderated associations with depression, were examined among HIV-infected adolescents. Specifically, we examined whether the negative association with depression of NLE, daily hassles, and/or passive coping was moderated by social support or active coping strategies. Demographic characteristics, depression, coping, social support, NLE, and daily hassles were collected at baseline as part of the Adolescent Impact intervention via face-to-face and computer-assisted interviews. Of 166 HIV-infected adolescents, 53% were female, 72.9% black, and 59.6% had perinatally acquired HIV (PIY); the most commonly reported NLE were death in the family (81%), violence exposure (68%), school relocation (67%), and hospitalization (61%), and the most common daily hassle was "not having enough money" (65%). Behaviorally infected youth (BIY; those who acquired HIV later in life) were significantly more likely to experience extensive (14-21) lifetime NLE (38.8% vs. 16.3%, p < .012) than PIY. In multiple stepwise regression analysis, the model accounting for the greatest variability in depression scores (32%) included (in order of entry): daily hassles, low social support, behaviorally acquired HIV, minority sexual orientation, and passive coping. A significant passive coping-by-social support interaction revealed that the association between passive coping and depression was exacerbated when social support was low. Social support moderated the effect of NLE, such that NLE were associated with greater depression when social support was low, although the effect did not remain statistically significant when the main effects of other variables were accounted for. Daily hassles, poor coping, and limited social support can adversely affect the psychological well-being of HIV-infected adolescents, particularly sexual minority youth with behaviorally acquired HIV. Multimodal interventions that enhance social support and teach adaptive coping

  2. Tracing footprints of environmental events in tree ring chemistry using neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Sahin, Dagistan

    The aim of this study is to identify environmental effects on tree-ring chemistry. It is known that industrial pollution, volcanic eruptions, dust storms, acid rain and similar events can cause substantial changes in soil chemistry. Establishing whether a particular group of trees is sensitive to these changes in the soil environment and registers them in the elemental chemistry of contemporary growth rings is the overriding goal of any dendrochemistry research. In this study, elemental concentrations were measured in tree-ring samples of eleven absolutely dated modern forest trees grown in the Mediterranean region of Turkey, collected and dated by the Malcolm and Carolyn Wiener Laboratory for Aegean and Near Eastern Dendrochronology at Cornell University. Correlations between measured elemental concentrations in the tree-ring samples were analyzed using statistical tests to answer two questions. Does the current concentration of a particular element depend on any other element within the tree? And are there any elements showing correlated abnormal concentration changes across the majority of the trees? Based on the detailed analysis results, the low mobility of sodium and bromine, positive correlations between calcium, zinc and manganese, and positive correlations between the trace elements lanthanum, samarium, antimony, and gold within tree-rings were recognized. Moreover, zinc, lanthanum, samarium and bromine showed strong, positive correlations among the trees and were identified as possible environmental signature elements. The new dendrochemistry information found in this study would also be useful in explaining tree physiology and elemental chemistry in Pinus nigra species grown in Turkey. Elemental concentrations in tree-ring samples were measured using Neutron Activation Analysis (NAA) at the Pennsylvania State University Radiation Science and Engineering Center (RSEC). Through this study, advanced methodologies for methodological, computational and

  3. A METHOD FOR SELECTING SOFTWARE FOR DYNAMIC EVENT ANALYSIS I: PROBLEM SELECTION

    SciTech Connect

    J. M. Lacy; S. R. Novascone; W. D. Richins; T. K. Larson

    2007-08-01

    New nuclear power reactor designs will require resistance to a variety of possible malevolent attacks, as well as traditional dynamic accident scenarios. The design/analysis team may be faced with a broad range of phenomena including air and ground blasts, high-velocity penetrators or shaped charges, and vehicle or aircraft impacts. With a host of software tools available to address these high-energy events, the analysis team must evaluate and select the software most appropriate for their particular set of problems. The accuracy of the selected software should then be validated with respect to the phenomena governing the interaction of the threat and structure. In this paper, we present a method for systematically comparing current high-energy physics codes for specific applications in new reactor design. Several codes are available for the study of blast, impact, and other shock phenomena. Historically, these packages were developed to study specific phenomena such as explosives performance, penetrator/target interaction, or accidental impacts. As developers generalize the capabilities of their software, legacy biases and assumptions can remain that could affect the applicability of the code to other processes and phenomena. R&D institutions generally adopt one or two software packages and use them almost exclusively, performing benchmarks on a single-problem basis. At the Idaho National Laboratory (INL), new comparative information was desired to permit researchers to select the best code for a particular application by matching its characteristics to the physics, materials, and rate scale (or scales) representing the problem at hand. A study was undertaken to investigate the comparative characteristics of a group of shock and high-strain rate physics codes including ABAQUS, LS-DYNA, CTH, ALEGRA, ALE-3D, and RADIOSS. A series of benchmark problems were identified to exercise the features and capabilities of the subject software. To be useful, benchmark problems

  4. Two damaging hydrogeological events in Calabria, September 2000 and November 2015. Comparative analysis of causes and effects

    NASA Astrophysics Data System (ADS)

    Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela

    2016-04-01

    Each year, especially during the winter season, episodes of intense rain affect Calabria, the southernmost Italian peninsular region, triggering flash floods and mass movements that cause damage and fatalities. This work presents a comparative analysis of two events that affected the southeast sector of the region, in 2000 and 2015, respectively. The event that occurred between the 9th and 10th of September 2000 is known in Italy as the Soverato event, after the name of the municipality where it reached its highest damage severity. In the Soverato area, more than 200 mm of rain that fell in 24 hours caused a disastrous flood that swept away a campsite at about 4 a.m., killing 13 people and injuring 45. Moreover, the rain affected a larger area, causing damage in 89 (out of 409) municipalities of the region. Flooding was the most common process, damaging housing and trade. Landslides mostly affected the road network, housing and cultivation. The more recent event affected the same regional sector between 30th October and 2nd November 2015. The daily rain recorded at some of the rain gauges in the area almost reached 400 mm. Out of the 409 municipalities of Calabria, 109 suffered damage. The most frequent types of processes were flash floods and landslides. The most heavily damaged element was the road network: the representative picture of the event is a railway bridge destroyed by the river flow. Housing was damaged too, and 486 people were temporarily evacuated from their homes. The event also caused one fatality, a victim killed by a flood. The event-centred study approach aims to highlight differences and similarities in both the causes and the effects of the two events, which occurred at a temporal distance of 15 years. The comparative analysis focuses on three main aspects: the intensity of the triggering rain, the modifications of urbanised areas, and the evolution of emergency management. The comparative analysis of rain is made by comparing the return period of both daily and

  5. Meta-Analysis of Mental Stress—Induced Myocardial Ischemia and Subsequent Cardiac Events in Patients With Coronary Artery Disease

    PubMed Central

    Wei, Jingkai; Rooks, Cherie; Ramadan, Ronnie; Shah, Amit J.; Bremner, J. Douglas; Quyyumi, Arshed A.; Kutner, Michael; Vaccarino, Viola

    2014-01-01

    Mental stress—induced myocardial ischemia (MSIMI) has been associated with adverse prognosis in patients with coronary artery disease (CAD), but whether this is a uniform finding across different studies has not been described. We conducted a systematic review and meta-analysis of prospective studies examining the association between MSIMI and adverse outcome events in patients with stable CAD. We searched PubMed, EMBASE, Web of Science, and PsycINFO databases for English language prospective studies of patients with CAD who underwent standardized mental stress testing to determine presence of MSIMI and were followed up for subsequent cardiac events or total mortality. Our outcomes of interest were CAD recurrence, CAD mortality, or total mortality. A summary effect estimate was derived using a fixed-effects meta-analysis model. Only 5 studies, each with a sample size of <200 patients and fewer than 50 outcome events, met the inclusion criteria. The pooled samples comprised 555 patients with CAD (85% male) and 117 events with a range of follow-up from 35 days to 8.8 years. Pooled analysis showed that MSIMI was associated with a twofold increased risk of a combined end point of cardiac events or total mortality (relative risk 2.24, 95% confidence interval 1.59 to 3.15). No heterogeneity was detected among the studies (Q = 0.39, I2 = 0.0%, p = 0.98). In conclusion, although few selected studies have examined the association between MSIMI and adverse events in patients with CAD, all existing investigations point to approximately a doubling of risk. Whether this increased risk is generalizable to the CAD population at large and varies in patient subgroups warrant further investigation. PMID:24856319
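
    The inverse-variance fixed-effect pooling behind a summary estimate like the one reported above fits in a few lines. The study-level relative risks and standard errors below are made-up values for illustration, not the five reviewed studies.

    ```python
    # Fixed-effect (inverse-variance) pooling of log relative risks.
    import numpy as np

    rr = np.array([1.9, 2.8, 2.0, 2.6, 2.1])           # hypothetical study RRs
    se_log_rr = np.array([0.35, 0.40, 0.30, 0.45, 0.38])

    w = 1.0 / se_log_rr**2                             # inverse-variance weights
    log_pooled = np.sum(w * np.log(rr)) / np.sum(w)
    se_pooled = 1.0 / np.sqrt(np.sum(w))
    ci = np.exp(log_pooled + np.array([-1.96, 1.96]) * se_pooled)
    print(f"pooled RR = {np.exp(log_pooled):.2f}, 95% CI {ci.round(2)}")
    ```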

  6. Predictors of seeking emergency medical help during overdose events in a provincial naloxone distribution programme: a retrospective analysis

    PubMed Central

    Ambrose, Graham; Amlani, Ashraf; Buxton, Jane A

    2016-01-01

    Objectives This study sought to identify factors that may be associated with help-seeking by witnesses during overdoses where naloxone is administered. Setting Overdose events occurred in and were reported from the five regional health authorities across British Columbia, Canada. Naloxone administration forms completed following overdose events were submitted to the British Columbia Take Home Naloxone programme. Participants All 182 reported naloxone administration events, reported by adult men and women and occurring between 31 August 2012 and 31 March 2015, were considered for inclusion in the analysis. Of these, 18 were excluded: 10 events which were reported by the person who overdosed, and 8 events for which completed forms did not indicate whether or not emergency medical help was sought. Primary and secondary outcome measures Seeking emergency medical help (calling 911), as reported by participants, was the sole outcome measure of this analysis. Results Medical help was sought (emergency services—911 called) in 89 (54.3%) of 164 overdoses where naloxone was administered. The majority of administration events occurred in private residences (50.6%) and on the street (23.4%), where reported rates of calling 911 were 27.5% and 81.1%, respectively. Overdoses occurring on the street (compared to private residence) were significantly associated with higher odds of calling 911 in multivariate analysis (OR=10.68; 95% CI 2.83 to 51.87; p<0.01), after adjusting for other variables. Conclusions Overdoses occurring on the street were associated with higher odds of seeking emergency medical help by responders. Further research is needed to determine if sex and stimulant use by the person who overdosed are associated with seeking emergency medical help. The results of this study will inform interventions within the British Columbia Take Home Naloxone programme and other jurisdictions to encourage seeking emergency medical help. PMID:27329442
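
    The kind of multivariate logistic model used to relate overdose location to calling 911 can be sketched as follows with statsmodels. The data frame is synthetic (with the same n = 164 as above), the variable names are placeholders rather than the programme's actual fields, and the simulated coefficients are arbitrary.

    ```python
    # Hedged sketch: logistic regression of "called 911" on overdose covariates.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    n = 164
    df = pd.DataFrame({
        "on_street": rng.integers(0, 2, n),        # 1 = overdose on the street
        "stimulant_use": rng.integers(0, 2, n),
    })
    logit_true = -1.0 + 2.4 * df.on_street + 0.3 * df.stimulant_use
    df["called_911"] = (rng.random(n) < 1 / (1 + np.exp(-logit_true))).astype(int)

    model = smf.logit("called_911 ~ on_street + stimulant_use", data=df).fit(disp=0)
    print(np.exp(model.params))    # exponentiated coefficients = odds ratios
    ```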

  7. Low Probability Tail Event Analysis and Mitigation in the BPA Control Area

    SciTech Connect

    Lu, Shuai; Brothers, Alan J.; McKinstry, Craig A.; Jin, Shuangshuang; Makarov, Yuri V.

    2010-10-31

    This report investigated the uncertainties associated with the operation of the power system and their contributions to tail events, especially under high penetration of wind. A Bayesian network model is established to quantify the impact of these uncertainties on system imbalance. A framework is presented for a decision support tool, which can help system operators better estimate the need for balancing reserves and prepare for tail events.

  8. Analysis of geohazards events along Swiss roads from autumn 2011 to present

    NASA Astrophysics Data System (ADS)

    Voumard, Jérémie; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    In Switzerland, roads and railways are threatened throughout the year by several natural hazards. Some of these events reach transport infrastructure many times per year, leading to the closure of transportation corridors, loss of access, detours and sometimes infrastructure damage and loss of human life (3 fatalities during the period considered). The aim of this inventory of events is to investigate the number of natural events affecting roads and railways in Switzerland from autumn 2011 until now. Natural hazards affecting roads and railways can be classified in five categories: rockfalls, landslides, debris flows, snow avalanches and floods. They potentially cause several important direct damages to transportation infrastructure (roads, railways), vehicles (slightly or badly damaged) or human life (slightly or seriously injured persons, deaths). These direct damages can easily be evaluated from press articles or from Swiss police press releases. Indirect damages such as detour costs are not taken into account in this work. During the last two and a half years, about 50 events affecting Swiss road and railway infrastructure were inventoried. The proportion of events due to rockfalls is 45%, to landslides 25%, to debris flows 15%, to snow avalanches 10% and to floods 5%. During this period, three people were killed and two injured, while 23 vehicles (cars, trains and a coach) and 24 roads and railways were damaged. We can see that floods occur mainly on the Swiss Plateau, whereas rockfalls, debris flows, snow avalanches and landslides are mostly located in the Alpine area. Most events occur on secondary mountain roads and railways. The events are well distributed over the whole Alpine area except for the Gotthard hotspot, where an important European North-South motorway (hit in 2003, with two fatalities) and railway (hit three times in 2012, with one fatality) are more frequently affected. According to the observed events in border regions of

  9. Non-linear time series analysis of precipitation events using regional climate networks for Germany

    NASA Astrophysics Data System (ADS)

    Rheinwalt, Aljoscha; Boers, Niklas; Marwan, Norbert; Kurths, Jürgen; Hoffmann, Peter; Gerstengarbe, Friedrich-Wilhelm; Werner, Peter

    2016-02-01

    Synchronous occurrences of heavy rainfall events and the study of their relation in time and space are of large socio-economic relevance, for instance for the agricultural and insurance sectors, but also for the general well-being of the population. In this study, the spatial synchronization structure is analyzed as a regional climate network constructed from precipitation event series. The similarity between event series is determined by the number of synchronous occurrences. We propose a novel standardization of this number that results in synchronization scores which are not biased by the number of events in the respective time series. Additionally, we introduce a new version of the network measure directionality, which measures the spatial directionality of weighted links while also taking account of the effects of the spatial embedding of the network. This measure provides an estimate of heavy-precipitation isochrones by pointing out directions along which rainfall events synchronize. We propose a climatological interpretation of this measure in terms of propagating fronts or event traces and confirm it for Germany by comparing our results to known atmospheric circulation patterns.
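
    A toy version of the synchronous-occurrence count between two stations is sketched below: events synchronize when they fall within a small time window of each other. The paper's contribution is a standardization that removes the event-number bias; here only the raw count and a naive normalization (which remains biased) are shown, on invented event days.

    ```python
    # Counting synchronous events between two hypothetical station event series.
    import numpy as np

    def sync_count(days_a, days_b, window=1):
        """Events in series A with at least one event in B within +/-window days."""
        days_b = np.asarray(days_b)
        return sum(np.any(np.abs(days_b - d) <= window) for d in days_a)

    a = [5, 40, 41, 200, 310]     # hypothetical event days at station A
    b = [6, 41, 180, 309]         # hypothetical event days at station B
    raw = sync_count(a, b)
    naive_score = raw / np.sqrt(len(a) * len(b))   # still biased by event counts
    print(raw, round(naive_score, 2))
    ```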

  10. Automatic Prediction of Cardiovascular and Cerebrovascular Events Using Heart Rate Variability Analysis

    PubMed Central

    Melillo, Paolo; Izzo, Raffaele; Orrico, Ada; Scala, Paolo; Attanasio, Marcella; Mirra, Marco; De Luca, Nicola; Pecchia, Leandro

    2015-01-01

    Background There is consensus that Heart Rate Variability is associated with the risk of vascular events. However, the predictive value of Heart Rate Variability for vascular events is not completely clear. The aim of this study is to develop novel predictive models based on data-mining algorithms to provide an automatic risk stratification tool for hypertensive patients. Methods A database of 139 Holter recordings with clinical data of hypertensive patients followed up for at least 12 months was collected ad hoc. Subjects who experienced a vascular event (i.e., myocardial infarction, stroke, syncopal event) were considered high-risk subjects. Several data-mining algorithms (such as support vector machines, tree-based classifiers, artificial neural networks) were used to develop automatic classifiers, and their accuracy was tested by assessing the receiver operating characteristic curve. Moreover, we tested the echographic parameters, which have been shown to be powerful predictors of future vascular events. Results The best predictive model was based on a random forest and enabled the identification of high-risk hypertensive patients with sensitivity and specificity rates of 71.4% and 87.8%, respectively. The Heart Rate Variability based classifier showed higher predictive values than the conventional echographic parameters, which are considered significant cardiovascular risk factors. Conclusions A combination of Heart Rate Variability measures, analyzed with data-mining algorithms, could be a reliable tool for identifying hypertensive patients at high risk of developing future vascular events. PMID:25793605
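
    A minimal stand-in for the best-performing model is a random forest classifying high- vs low-risk subjects from HRV features. The sketch below uses scikit-learn on simulated features and labels (not the 139-recording clinical dataset) and reports cross-validated ROC AUC rather than the paper's sensitivity/specificity pair.

    ```python
    # Random forest risk classifier on simulated HRV-like features.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(8)
    n = 139
    X = rng.standard_normal((n, 6))        # stand-ins for HRV features (e.g. SDNN)
    risk = (X[:, 0] - 0.8 * X[:, 1] + 0.5 * rng.standard_normal(n)) > 0.5
    y = risk.astype(int)                   # 1 = experienced a vascular event

    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"cross-validated AUC: {auc.mean():.2f}")
    ```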

  11. How does leaving home affect marital timing? An event-history analysis of migration and marriage in Nang Rong, Thailand.

    PubMed

    Jampaklay, Aree

    2006-11-01

    This study examines the effects of migration on marital timing in Thailand between 1984 and 2000 using prospective and retrospective survey data from Nang Rong. In contrast to previous results in the literature, event-history analysis of the longitudinal data reveals a positive, not a negative, effect of lagged migration experience on the likelihood of marriage. The findings also indicate gender differences. Migration's positive impact is independent of other life events for women but is completely "explained" by employment for men. PMID:17236543

  12. Multivariate spatial analysis of a heavy rain event in a densely populated delta city

    NASA Astrophysics Data System (ADS)

    Gaitan, Santiago; ten Veldhuis, Marie-claire; Bruni, Guenda; van de Giesen, Nick

    2014-05-01

    Delta cities account for half of the world's population and host key infrastructure and services for global economic growth. Due to the characteristic geography of delta areas, these cities face high vulnerability to extreme weather and pluvial flooding risks, which are expected to increase as climate change drives heavier rain events. Besides, delta cities are subject to fast urban densification processes that progressively make them more vulnerable to pluvial flooding. Delta cities need to be adapted to better cope with this threat. The mechanism leading to damage after heavy rains is not completely understood. For instance, current research has shown that rain intensities and volumes can only partially explain the occurrence and localization of rain-related insurance claims (Spekkers et al., 2013). The goal of this paper is to provide further insights into spatial characteristics of the urban environment that can significantly be linked to pluvial flooding impacts. To that end, a case study has been selected: from October 12 to 14, 2013, a heavy rain event triggered pluvial floods in Rotterdam, a densely populated city which is undergoing multiple climate adaptation efforts and is located in the Meuse river delta. While the average yearly precipitation in this city is around 800 mm, local rain gauge measurements ranged from approx. 60 to 130 mm just during these three days. More than 600 telephone complaints from citizens reported impacts related to rainfall. The registry of those complaints, which comprises around 300 calls made to the municipality and another 300 to the fire brigade, was made available for research. Other accessible information about this city includes a series of rainfall measurements with up to 1 min time-step at 7 different locations around the city, ground-based radar rainfall data (1 km² spatial resolution and 5 min time-step), a digital elevation model (50 cm of horizontal resolution), a model of overland-flow paths, cadastral

  13. Association between use of warfarin with common sulfonylureas and serious hypoglycemic events: retrospective cohort analysis

    PubMed Central

    Romley, John A; Gong, Cynthia; Jena, Anupam B; Goldman, Dana P; Williams, Bradley

    2015-01-01

    Study question Is warfarin use associated with an increased risk of serious hypoglycemic events among older people treated with the sulfonylureas glipizide and glimepiride? Methods This was a retrospective cohort analysis of pharmacy and medical claims from a 20% random sample of Medicare fee for service beneficiaries aged 65 years or older. It included 465 918 beneficiaries with diabetes who filled a prescription for glipizide or glimepiride between 2006 and 2011 (4 355 418 person quarters); 71 895 (15.4%) patients also filled a prescription for warfarin (416 479 person quarters with warfarin use). The main outcome measure was emergency department visit or hospital admission with a primary diagnosis of hypoglycemia in person quarters with concurrent fills of warfarin and glipizide/glimepiride compared with the rates in quarters with glipizide/glimepiride fills only. Multivariable logistic regression was used to adjust for individual characteristics. Secondary outcomes included fall related fracture and altered consciousness/mental status. Summary answer and limitations In quarters with glipizide/glimepiride use, hospital admissions or emergency department visits for hypoglycemia were more common in person quarters with concurrent warfarin use compared with quarters without warfarin use (294/416 479 v 1903/3 938 939; adjusted odds ratio 1.22, 95% confidence interval 1.05 to 1.42). The risk of hypoglycemia associated with concurrent use was higher among people using warfarin for the first time, as well as in those aged 65-74 years. Concurrent use of warfarin and glipizide/glimepiride was also associated with hospital admission or emergency department visit for fall related fractures (3919/416 479 v 20 759/3 938 939; adjusted odds ratio 1.47, 1.41 to 1.54) and altered consciousness/mental status (2490/416 479 v 14 414/3 938 939; adjusted odds ratio 1.22, 1.16 to 1.29). Unmeasured factors could be correlated with both warfarin use and

  14. On the analysis of an extreme Bora wind event over the northern Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Colucci, R. R.; Pucillo, A.

    2010-09-01

    On 10 March 2010 a severe Bora wind event affected the Friuli Venezia Giulia region, northeastern Italy, in particular the Gulf of Trieste area (northern Adriatic Sea). The event was driven by a widespread westward-moving cold pool aloft, coming from western Asia, that brought an intense potential vorticity anomaly over the western Mediterranean Sea. It led to deep cyclogenesis involving the whole troposphere. The pressure gradient force in the lowest layers forced a northeasterly wind to blow with noticeable strength over the Gulf of Trieste area and the Karst region. The mean ground wind velocity reached values above 27 m/s (about 100 km/h) for several hours, and maximum gusts exceeded 42 m/s (about 150 km/h) over the town of Trieste. The northeastern sector of the Adriatic Sea is frequently affected by strong Bora events, in particular during the winter semester. This is a characteristic local wind mostly influenced by the orography of the Karst relief to the east of Trieste. The aim of this work is to assess the climatological relevance of such an event by comparing it with the most representative events of the past. This was possible thanks to the long-term archive of meteorological observations at the Trieste site (I.R. Accademia di Commercio e Nautica, Regio Comitato Talassografico Italiano, Ministero dell'Agricoltura e Foreste, Consiglio Nazionale delle Ricerche): we found that this is one of the ten strongest Bora events of the 1871-2010 period. Considerations about the trend and frequency of severe Bora events are also presented.

  15. Survival analysis for recurrent event data: an application to childhood infectious diseases.

    PubMed

    Kelly, P J; Lim, L L

    2000-01-15

    Many extensions of survival models based on the Cox proportional hazards approach have been proposed to handle clustered or multiple event data. Of particular note are five Cox-based models for recurrent event data: Andersen and Gill (AG); Wei, Lin and Weissfeld (WLW); Prentice, Williams and Peterson, total time (PWP-CP) and gap time (PWP-GT); and Lee, Wei and Amato (LWA). Some authors have compared these models by observing differences that arise from fitting the models to real and simulated data. However, no attempt has been made to systematically identify the components of the models that are appropriate for recurrent event data. We propose a systematic way of characterizing such Cox-based models using four key components: risk intervals, baseline hazard, risk set, and correlation adjustment. From the definitions of risk interval and risk set there are conceptually seven such Cox-based models that are permissible, five of which are those previously identified. The two new variant models are termed the 'total time - restricted' (TT-R) and 'gap time - unrestricted' (GT-UR) models. The aim of the paper is to determine which models are appropriate for recurrent event data using the key components. The models are fitted to simulated data sets and to a data set of childhood recurrent infectious diseases. The LWA model is not appropriate for recurrent event data because it allows a subject to be at risk several times for the same event. The WLW model overestimates treatment effect and is not recommended. We conclude that PWP-GT and TT-R are useful models for analysing recurrent event data, providing answers to slightly different research questions. Further, applying a robust variance to any of these models does not adequately account for within-subject correlation. PMID:10623910
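
    For readers who want to try the PWP-GT model, a minimal sketch using the Python lifelines package follows (column names and the tiny dataset are illustrative, not from the paper): each row is one at-risk interval, the duration is the gap time since the previous event, stratification on the event number gives event-specific baseline hazards, and clustering on subject id yields a robust sandwich variance.

      import pandas as pd
      from lifelines import CoxPHFitter

      # One row per at-risk interval: 'gap' is time since the previous event
      # (the PWP-GT risk interval), 'episode' is the event number (1st, 2nd),
      # 'event' is 1 if the interval ended in an infection. Tiny illustrative
      # data; real analyses need far more observations.
      df = pd.DataFrame({
          "id":      [1, 1, 2, 2, 3, 3],
          "episode": [1, 2, 1, 2, 1, 2],
          "gap":     [30.0, 12.0, 60.0, 20.0, 90.0, 35.0],
          "event":   [1, 1, 1, 0, 1, 0],
          "treat":   [1, 1, 0, 0, 1, 1],
      })

      # PWP-GT: stratify on episode (separate baseline hazard per event
      # number) and use a robust variance clustered on subject id.
      cph = CoxPHFitter()
      cph.fit(df, duration_col="gap", event_col="event",
              strata=["episode"], cluster_col="id", robust=True,
              formula="treat")
      cph.print_summary()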

  16. Relationship between catchment events (earthquake and heavy rain) and sediment core analysis result in Taiwan.

    NASA Astrophysics Data System (ADS)

    Chen, Hsin-Ying; Lin, Jiun-Chuan

    2015-04-01

    Lake sediments contain material from the catchment. Some features in those sediments can indicate the characteristics or status of the catchment. These features were formed by different mechanisms, including events like earthquakes or heavy rain, which are very common in Taiwan. By analyzing and discussing these features there is a chance to identify historical events and reconstruct catchment history. In this study, we compare features of sediment cores (including density, mineral grain size, whole grain size, and biogenic silica content) with earthquake and precipitation records. Sediment cores were collected in 2014 from Emerald Peak Lake (24.514980, 121.605844; 77.5, 77.2, 64 cm depth), Liyutan Lake (23.959878, 120.996585; 43.2, 78.1 cm depth), Sun Moon Lake (23.847043, 120.909869; 181 cm depth), and Dongyuan Lake (22.205742, 120.854984; 45.1, 44.2 cm depth). We assume that there is a regular output of material and organic matter in the catchments, and that rain provides the impetus to move material into the lakes: the heavier the rain, the larger the material that can be moved. So, if there is a heavy rainfall event, the grain size of the lake sediment may increase. Earthquakes, in contrast, produce additional material with a lower organic content than usual, so we suggest that after an earthquake more material is stored in the catchment than usual, and subsequent rainfall events move it into the lakes, producing more sediment with a higher mineral content. Compared with the earthquake record (from 1949, by USGS) and the precipitation record (from 1940, by the Central Weather Bureau, Taiwan), few earthquakes of magnitude greater than 7 ML happened near the lakes. There were 28 rainfall events near Emerald Peak Lake, 32 near Liyutan Lake and Sun Moon Lake, and 58 near Dongyuan Lake (rainfall event: >250 mm/day). In the sediment analytical results, the ratio of whole to mineral grain size indeed shows trends similar to the earthquake record. However, rainfall

  17. The usefulness of the Global Navigation Satellite Systems (GNSS) in the analysis of precipitation events

    NASA Astrophysics Data System (ADS)

    Bonafoni, Stefania; Biondi, Riccardo

    2016-01-01

    It is well known that the use of the Global Navigation Satellite Systems (GNSS), with both ground-based and Low Earth Orbit (LEO) receivers, allows retrieving atmospheric parameters in all weather conditions. The ground-based GNSS technique provides the integrated precipitable water vapour (IPWV) with temporal continuity at a specific receiver station, while the GNSS LEO technique allows for Radio Occultation (RO) observations of the atmosphere, providing detailed atmospheric profiling but without temporal continuity at a specific site. In this work, several precipitation events that occurred in Italy were analysed by exploiting the potential of the two GNSS techniques (i.e. ground-based and space-based GNSS receivers). From ground-based receivers, time series of IPWV were produced at specific locations with the purpose of analysing the water vapour behaviour during precipitation events. From LEO receivers, the profiling potential was exploited to retrieve the cloud top altitude of convective events, taking into account that although GNSS RO can capture the dynamics of the atmosphere with high vertical resolution, the temporal resolution is not sufficient to continuously monitor such an event in a local area. Therefore, the GNSS technique can be considered a supplemental meteorological system useful in studying precipitation events, but with very different spatial and temporal features depending on the receiver positioning.

  18. Using Topological Analysis to Support Event-Guided Exploration in Urban Data.

    PubMed

    Doraiswamy, Harish; Ferreira, Nivan; Damoulas, Theodoros; Freire, Juliana; Silva, Cláudio T

    2014-12-01

    The explosion in the volume of data about urban environments has opened up opportunities to inform both policy and administration and thereby help governments improve the lives of their citizens, increase the efficiency of public services, and reduce the environmental harms of development. However, cities are complex systems and exploring the data they generate is challenging. The interaction between the various components in a city creates complex dynamics where interesting facts occur at multiple scales, requiring users to inspect a large number of data slices over time and space. Manual exploration of these slices is ineffective, time consuming, and in many cases impractical. In this paper, we propose a technique that supports event-guided exploration of large, spatio-temporal urban data. We model the data as time-varying scalar functions and use computational topology to automatically identify events in different data slices. To handle a potentially large number of events, we develop an algorithm to group and index them, thus allowing users to interactively explore and query event patterns on the fly. A visual exploration interface helps guide users towards data slices that display interesting events and trends. We demonstrate the effectiveness of our technique on two different data sets from New York City (NYC): data about taxi trips and subway service. We also report on the feedback we received from analysts at different NYC agencies. PMID:26356977
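
    A drastically simplified stand-in for the event-detection step (the paper uses computational topology; here plain local-maximum detection with a height threshold, and all names are illustrative) might look like:

      import numpy as np
      from scipy.ndimage import maximum_filter

      def find_events(slice2d, min_height, neighborhood=5):
          """Return (row, col) peaks of a scalar field: local maxima over a
          neighborhood that also exceed an absolute height threshold. A crude
          stand-in for the persistence-based event detection in the paper."""
          local_max = slice2d == maximum_filter(slice2d, size=neighborhood)
          peaks = local_max & (slice2d >= min_height)
          return np.argwhere(peaks)

      # e.g. one hour of taxi pickup counts binned on a grid (synthetic here)
      rng = np.random.default_rng(1)
      field = rng.poisson(3, size=(50, 50)).astype(float)
      field[20:23, 30:33] += 25          # implant a hotspot "event"
      print(find_events(field, min_height=20))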

  19. A network of discrete events for the representation and analysis of diffusion dynamics

    NASA Astrophysics Data System (ADS)

    Pintus, Alberto M.; Pazzona, Federico G.; Demontis, Pierfranco; Suffritti, Giuseppe B.

    2015-11-01

    We developed a coarse-grained description of the phenomenology of diffusive processes, in terms of a space of discrete events and its representation as a network. Once a proper classification of the discrete events underlying the diffusive process is carried out, their transition matrix is calculated on the basis of molecular dynamics data. This matrix can be represented as a directed, weighted network where nodes represent discrete events, and the weight of edges is given by the probability that one follows the other. The structure of this network reflects dynamical properties of the process of interest in such features as its modularity and the entropy rate of nodes. As an example of the applicability of this conceptual framework, we discuss here the physics of diffusion of small non-polar molecules in a microporous material, in terms of the structure of the corresponding network of events, and explain on this basis the diffusivity trends observed. A quantitative account of these trends is obtained by considering the contribution of the various events to the displacement autocorrelation function.
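
    A minimal sketch of the two central quantities, the transition matrix estimated from an observed event sequence and the entropy rate of each node's outgoing links, might look like the following (toy data; not the authors' code):

      import numpy as np

      def transition_matrix(event_seq, n_states):
          """Row-stochastic matrix P[i, j] = Prob(next event j | current
          event i), estimated from a sequence of discrete event labels."""
          counts = np.zeros((n_states, n_states))
          for a, b in zip(event_seq[:-1], event_seq[1:]):
              counts[a, b] += 1
          row_sums = counts.sum(axis=1, keepdims=True)
          return np.divide(counts, row_sums,
                           out=np.zeros_like(counts), where=row_sums > 0)

      def node_entropy_rate(P):
          """Shannon entropy (bits) of each node's outgoing distribution."""
          logs = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
          return -(P * logs).sum(axis=1)

      seq = [0, 1, 2, 1, 0, 1, 2, 2, 1, 0]   # toy sequence of event labels
      P = transition_matrix(seq, 3)
      print(node_entropy_rate(P))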

  20. Towards a unified study of extreme events using universality concepts and transdisciplinary analysis methods

    NASA Astrophysics Data System (ADS)

    Balasis, George; Donner, Reik V.; Donges, Jonathan F.; Radebach, Alexander; Eftaxias, Konstantinos; Kurths, Jürgen

    2013-04-01

    The dynamics of many complex systems is characterized by the same universal principles. In particular, systems which are otherwise quite different in nature show striking similarities in their behavior near tipping points (bifurcations, phase transitions, sudden regime shifts) and associated extreme events. Such critical phenomena are frequently found in diverse fields such as climate, seismology, or financial markets. Notably, the observed similarities include a high degree of organization, persistent behavior, and accelerated energy release, which are common to (among others) phenomena related to geomagnetic variability of the terrestrial magnetosphere (intense magnetic storms), seismic activity (electromagnetic emissions prior to earthquakes), solar-terrestrial physics (solar flares), neurophysiology (epileptic seizures), and socioeconomic systems (stock market crashes). It is an open question whether the spatial and temporal complexity associated with extreme events arises from the system's structural organization (geometry) or from the chaotic behavior inherent to the nonlinear equations governing the dynamics of these phenomena. On the one hand, the presence of scaling laws associated with earthquakes and geomagnetic disturbances suggests understanding these events as generalized phase transitions similar to nucleation and critical phenomena in thermal and magnetic systems. On the other hand, because of the structural organization of the systems (e.g., as complex networks) the associated spatial geometry and/or topology of interactions plays a fundamental role in the emergence of extreme events. Here, a few aspects of the interplay between geometry and dynamics (critical phase transitions) that could result in the emergence of extreme events, which is an open problem, will be discussed.

  1. Definition of Stratospheric Sudden Warming Events for Multi-Model Analysis and Its Application to the CMIP5

    NASA Astrophysics Data System (ADS)

    Kim, Junsu; Son, Seok-Woo; Park, Hyo-Seok

    2015-04-01

    The onset of major stratospheric sudden warming (SSW) events has often been defined as the date when the westerly at 10 hPa and 60°N turns to easterly during winter, corresponding to a polar stratosphere warmer than mid-latitudes. This simple definition effectively detects the observed characteristics of SSW, but its application to climate models, which have different background flow and temporal variability, is often challenging. For example, models whose stratospheric mean wind is too weak tend to overestimate the frequency of zonal-wind reversals and SSW events. In this study we propose a simple definition of major SSW events that is applicable to multi-model analysis. Specifically, SSW events are defined when the tendency of zonal-mean zonal wind at 10 hPa and 60°N crosses -1 m/s/day within 30 to 40 days while growing in magnitude. This tendency-based definition, which is independent of mean wind, is applied to both the ERA40 reanalysis and CMIP5 models. The models are further grouped into high-top models with a well-resolved stratosphere and low-top models with a relatively simple stratosphere. The new definition successfully reproduces the mean frequency of SSW events identified by the wind-reversal approach, i.e., about 6 events per decade in ERA40. High-top models capture this frequency well. Although low-top models underestimate the frequency, in contrast to previous studies, the difference from high-top models is not statistically significant. Likewise, no significant difference is found in the downward coupling between the high-top and low-top models. These results indicate that model vertical resolution itself may not be a key factor in simulating SSW events and the associated downward coupling.
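
    A simplified sketch of such a tendency-based detector follows (our own illustration: it flags days where the smoothed wind tendency first drops below -1 m/s/day, omitting the paper's full 30-40-day growth criterion):

      import numpy as np

      def ssw_onsets(u, smooth=5, thresh=-1.0, min_sep=60):
          """Flag candidate SSW onsets from a daily series u of zonal-mean
          zonal wind (m/s) at 10 hPa, 60N. Simplified: onset = first day the
          smoothed centered-difference tendency drops below `thresh` m/s/day,
          with at least `min_sep` days between successive events."""
          dudt = np.gradient(u)                   # centered difference, per day
          kernel = np.ones(smooth) / smooth
          dudt = np.convolve(dudt, kernel, mode="same")
          onsets, last = [], -min_sep
          for day in range(1, len(u)):
              if dudt[day] < thresh and dudt[day - 1] >= thresh and day - last >= min_sep:
                  onsets.append(day)
                  last = day
          return onsets

      # toy winter: steady westerlies that collapse around day 120
      t = np.arange(200)
      u = 30 - 25 / (1 + np.exp(-(t - 120) / 3))
      print(ssw_onsets(u))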

  2. Therapeutic potential and adverse events of everolimus for treatment of hepatocellular carcinoma - systematic review and meta-analysis.

    PubMed

    Yamanaka, Kenya; Petrulionis, Marius; Lin, Shibo; Gao, Chao; Galli, Uwe; Richter, Susanne; Winkler, Susanne; Houben, Philipp; Schultze, Daniel; Hatano, Etsuro; Schemmer, Peter

    2013-12-01

    Everolimus is an orally administered mammalian target of rapamycin (mTOR) inhibitor. Several large-scale randomized controlled trials (RCTs) have demonstrated the survival benefits of everolimus at a dose of 10 mg/day for solid cancers. Furthermore, mTOR-inhibitor-based immunosuppression is associated with survival benefits for patients with hepatocellular carcinoma (HCC) who have received liver transplantation. However, a low rate of tumor reduction and some adverse events have been pointed out. This review summarizes the antitumor effects and adverse events of everolimus and evaluates its possible application in advanced HCC. For the meta-analysis of adverse events, we used the RCTs for solid cancers. The odds ratios of adverse events were calculated using the Peto method. Many preclinical studies demonstrated that everolimus had antitumor effects such as antiproliferation and antiangiogenesis. However, some differences in the effects were observed among in vivo animal studies for HCC treatment. Meanwhile, clinical studies demonstrated that the response rate of single-agent everolimus was low, though survival benefits could be expected. The meta-analysis revealed the following odds ratios (95% confidence interval [CI]): stomatitis, 5.42 [4.31-6.73]; hyperglycemia, 3.22 [2.37-4.39]; anemia, 3.34 [2.37-4.67]; pneumonitis, 6.02 [3.95-9.16]; elevated aspartate aminotransferase levels, 2.22 [1.37-3.62]; and elevated serum alanine aminotransferase levels, 2.94 [1.72-5.02]. Everolimus at the dose of 10 mg/day significantly increased the risk of these adverse events. In order to enable its application to the standard conventional therapies of HCC, further studies are required to enhance the antitumor effects and manage the adverse events of everolimus. PMID:24403259
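
    The Peto one-step method referenced above pools log odds ratios as sum(O-E)/sum(V), with a hypergeometric variance per trial. A minimal sketch (counts are illustrative, not the paper's data):

      import math

      def peto_or(trials):
          """trials: list of (events_treat, n_treat, events_ctrl, n_ctrl).
          Returns the pooled Peto odds ratio and its 95% CI."""
          sum_oe, sum_v = 0.0, 0.0
          for o_t, n_t, o_c, n_c in trials:
              n = n_t + n_c
              o = o_t + o_c                    # total events in the trial
              e = o * n_t / n                  # expected events, treatment arm
              v = o * (n - o) * n_t * n_c / (n**2 * (n - 1))  # hypergeometric variance
              sum_oe += o_t - e
              sum_v += v
          log_or = sum_oe / sum_v
          se = 1 / math.sqrt(sum_v)
          return (math.exp(log_or),
                  math.exp(log_or - 1.96 * se),
                  math.exp(log_or + 1.96 * se))

      # illustrative counts: (events_everolimus, n, events_placebo, n)
      print(peto_or([(12, 200, 3, 100), (30, 400, 8, 200)]))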

  3. Analysis of Atmospheric Mercury Depletion Events during the NASA ARCTAS Field Campaign

    NASA Astrophysics Data System (ADS)

    Kim, S.; Talbot, R. W.; Mao, H.

    2009-12-01

    Atmospheric Mercury Depletion Events (MDEs) occur in Arctic springtime, but they have previously only been observed from ground-based measurements. We utilized the extensive measurement database from the NASA DC-8 flights in spring 2008 during the Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS) campaign and a box model customized for mercury chemistry to improve understanding of MDEs. We selected eight cases for study, which had low O3 (<10 ppbv), low Hg(0) (<50 ppqv), and high Br2 (>2 pptv). MDEs were always observed over the Arctic Ocean (70-90°N) or along the shoreline, and near the ice/snow surface (<0.7 km). The aircraft sampled MDEs over time frames of 4-30 minutes, demonstrating their occurrence over wide oceanic regions covering 225 km. Five-day backward trajectories indicated that air masses over the MDEs were transported mainly at surface level over the Arctic. In addition to Hg(0) and O3, the hydrocarbons C2H2, C2H6, C3H8, C4H10 and C5H12 also showed decreased mixing ratios in the MDE regions, where correlations of these species with Hg(0) and O3 indicated strong relationships (e.g., r2 = 0.73 for O3-C2H2). Very high NOx mixing ratios (up to 5 ppbv) in a couple of cases implied fresh emission influences from the Prudhoe Bay area. We conducted box modeling of each case to understand the role of halogen compounds during MDEs. Initial conditions of chemical species and photolysis rate constants were taken from flight data. Our analysis of the simulations indicates that bromine chemistry is very important for Hg(0) depletion, while O3 depletion is impacted by both bromine and chlorine chemistry. Furthermore, the different environments sampled and corresponding simulations indicate that photolysis rate constants predominantly affect the time to reach total depletion of Hg(0) and O3. We are conducting more in-depth analyses of the field data and simulation results.

  4. Inverse modeling of storm intensity based on grain-size analysis of hurricane-induced event beds

    NASA Astrophysics Data System (ADS)

    Castagno, K. A.; Donnelly, J. P.

    2015-12-01

    As the coastal population continues to grow in size and wealth, increased hurricane frequency and intensity present a growing threat of property damage and loss of life. Recent reconstructions of past intense-hurricane landfalls from sediment cores in southeastern New England identify a series of active intervals over the past 2,000 years, with the last few centuries among the most quiescent intervals. The frequency of intense-hurricane landfalls in southeastern New England is well constrained, but the intensity of these storms, particularly prehistoric events, is not. This study analyzes the grain sizes of major storm event beds along a transect of sediment cores in Salt Pond, Falmouth, MA. Several prehistoric events contain more coarse material than any of the deposits from the historical interval, suggesting that landfalling hurricanes in the northeastern United States may have been more intense than the historically encountered category 2 and 3 storms. The intensity of major storm events is estimated using grain-size analysis with a digital image processing, size, and shape analyzer. Since event deposits in Salt Pond result from a combination of coastal inundation and wave action, a large population of both historical and synthetic storms is used to assess the storm characteristics that could result in the wave heights inversely modeled from grain size trends. Intense-hurricane activity may be closely tied to warming in sea surface temperature. As such, the prehistoric intervals of increased frequency and intensity provide potential analogs for current and future hurricane risk in the northeastern United States.

  5. Tracking the evolution of stream DOM source during storm events using end member mixing analysis based on DOM quality

    NASA Astrophysics Data System (ADS)

    Yang, Liyang; Chang, Soon-Woong; Shin, Hyun-Sang; Hur, Jin

    2015-04-01

    The source of river dissolved organic matter (DOM) during storm events has not been well constrained, which is critical in determining the quality and reactivity of DOM. This study assessed temporal changes in the contributions of four end members (weeds, leaf litter, soil, and groundwater) present in a small forested watershed (the Ehwa Brook, South Korea) to the stream DOM during two storm events, using end member mixing analysis (EMMA) based on spectroscopic properties of DOM. The instantaneous export fluxes of dissolved organic carbon (DOC), chromophoric DOM (CDOM), and fluorescent components were all enhanced during peak flows. The DOC concentration increased with the flow rate, while CDOM and humic-like fluorescent components were diluted around the peak flows. Leaf litter was the dominant DOM source in event 2, which had higher rainfall, although there were temporal variations in the contributions of the four end members to the stream DOM in both events. The contribution of leaf litter peaked while that of deeper soils decreased to a minimum at peak flows. Our results demonstrate that EMMA based on DOM properties can be used to trace the DOM source, which is of fundamental importance for understanding the factors responsible for river DOM dynamics during storm events.
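
    Conceptually, EMMA solves for non-negative end-member fractions that sum to one and reproduce the observed DOM properties. A minimal sketch with a soft sum-to-one constraint (toy numbers, not the study's data):

      import numpy as np
      from scipy.optimize import nnls

      def emma_fractions(E, y, weight=100.0):
          """Estimate end-member contributions f >= 0 with sum(f) ~= 1 from
          E (n_properties x n_end_members; columns = spectroscopic signatures
          of e.g. weeds, litter, soil, groundwater) and y (observed stream DOM
          properties). Sum-to-one is enforced softly via an appended row."""
          A = np.vstack([E, weight * np.ones(E.shape[1])])
          b = np.append(y, weight)
          f, _ = nnls(A, b)
          return f

      # toy example: 3 optical indices, 2 end members
      E = np.array([[1.0, 0.2],
                    [0.3, 0.9],
                    [0.5, 0.5]])
      y = E @ np.array([0.7, 0.3])     # a stream sample that is a 70/30 mixture
      print(emma_fractions(E, y))      # ~ [0.7, 0.3]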

  6. Readiness of the ATLAS Spanish Federated Tier-2 for the Physics Analysis of the early collision events at the LHC

    NASA Astrophysics Data System (ADS)

    Oliver, E.; Nadal, J.; Pardo, J.; Amorós, G.; Borrego, C.; Campos, M.; Del Cano, L.; Del Peso, J.; Espinal, X.; Fassi, F.; Fernández, A.; Fernández, P.; González, S.; Kaci, M.; Lamas, A.; March, L.; Muñoz, L.; Pacheco, A.; Salt, J.; Sánchez, J.; Villaplana, M.; Vives, R.

    2010-04-01

    In this contribution an evaluation of the readiness parameters for the Spanish ATLAS Federated Tier-2 is presented, regarding the ATLAS data taking expected to start by the end of 2009. Special attention is paid to Physics Analysis from different points of view: Data Management, Simulated Event Production and Distributed Analysis Tests. Several use cases of Distributed Analysis in GRID infrastructures and of local interactive analysis in non-Grid farms are provided, in order to evaluate the interoperability between both environments and to compare their performance. The prototypes for local computing infrastructures for data analysis are described. Moreover, information about local analysis facilities, called Tier-3s, is given.

  7. Cardiovascular Events of Electrical Cardioversion Under Optimal Anticoagulation in Atrial Fibrillation: The Multicenter Analysis

    PubMed Central

    Shin, Dong Geum; Cho, Iksung; Hartaigh, Bríain ó; Mun, Hee-Sun; Lee, Hye-Young; Hwang, Eui Seock; Park, Jin-Kyu; Uhm, Jae-Sun; Pak, Hui-Nam; Lee, Moon-Hyoung

    2015-01-01

    Purpose Electrical cardioversion has been successfully used in terminating symptomatic atrial fibrillation (AF). Nevertheless, large-scale studies of the acute cardiovascular events following electrical cardioversion of AF are lacking. This study was performed to evaluate the incidence, risk factors, and clinical consequences of acute cardiovascular events following electrical cardioversion of AF. Materials and Methods The study enrolled 1100 AF patients (mean age 60±11 years) who received cardioversion at four tertiary hospitals. Hospitalizations for stroke/transient ischemic attack, major bleeding, and arrhythmic events during the 30 days following electrical cardioversion were assessed. Results The mean duration of anticoagulation before cardioversion was 95.8±51.6 days. The mean International Normalized Ratio at the time of cardioversion was 2.4±0.9. The antiarrhythmic drugs at the time of cardioversion were class I (45%), amiodarone (40%), beta-blockers (53%), calcium-channel blockers (21%), and other medications (11%). The success rate of terminating AF via cardioversion was 87% (n=947). Following cardioversion, 5 strokes and 5 major bleeding events occurred. A history of stroke/transient ischemic attack (OR 6.23, 95% CI 1.69-22.90) and heart failure (OR 6.40, 95% CI 1.77-23.14) were among the predictors of thromboembolic or bleeding events. Eight patients were hospitalized for bradyarrhythmia; these patients were more likely to have had a lower heart rate prior to the procedure (p=0.045). Consequently, 3 of these patients were implanted with a permanent pacemaker. Conclusion Cardioversion appears to be a safe procedure with a reasonably acceptable cardiovascular event rate. However, to prevent cardiovascular events, several risk factors should be considered before cardioversion. PMID:26446636

  8. Antipsychotics-Associated Serious Adverse Events in Children: An Analysis of the FAERS Database

    PubMed Central

    Kimura, Goji; Kadoyama, Kaori; Brown, J.B.; Nakamura, Tsutomu; Miki, Ikuya; Nisiguchi, Kohshi; Sakaeda, Toshiyuki; Okuno, Yasushi

    2015-01-01

    Objective: The reports submitted to the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) from 1997 to 2011 were reviewed to assess serious adverse events induced by the administration of antipsychotics to children. Methods: Following pre-processing of FAERS data by elimination of duplicated records as well as adjustments to standardize drug names, reports involving haloperidol, olanzapine, quetiapine, clozapine, ziprasidone, risperidone, and aripiprazole were analyzed in children (age 0-12). Signals in the data that signified a drug-associated adverse event were detected via quantitative data mining algorithms. The algorithms applied in this study include the empirical Bayes geometric mean, the reporting odds ratio, the proportional reporting ratio, and the information component of a Bayesian confidence propagation neural network. The analysis focused on neuroleptic malignant syndrome (NMS), QT prolongation, leukopenia, and suicide attempt as serious adverse events. Results: In regard to NMS, the signal scores for haloperidol and aripiprazole were greater than for other antipsychotics. Significant signals of the QT prolongation adverse event were detected only for ziprasidone and risperidone. With respect to leukopenia, the association with clozapine was noteworthy. In the case of suicide attempt, signals for haloperidol, olanzapine, quetiapine, risperidone, and aripiprazole were detected. Conclusions: It was suggested that there is a level of diversity in the strength of the association between various first- and second-generation antipsychotics and the associated serious adverse events, which can possibly lead to fatal outcomes. We recommend that research be continued in order to gather a large variety and quantity of related information, and that both available and newly reported data be placed in the context of multiple medical viewpoints in order to lead to improved levels of care. PMID:25589889
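
    Two of the disproportionality measures named above, the reporting odds ratio (ROR) and the proportional reporting ratio (PRR), are simple functions of a 2x2 report-count table. A minimal sketch (counts are illustrative, not FAERS values):

      import math

      def disproportionality(a, b, c, d):
          """2x2 counts from a spontaneous-report database:
          a = target drug & target event, b = target drug & other events,
          c = other drugs & target event, d = other drugs & other events.
          Returns ROR and PRR with 95% CIs (normal approx. on the log scale)."""
          ror = (a * d) / (b * c)
          se_ror = math.sqrt(1/a + 1/b + 1/c + 1/d)
          prr = (a / (a + b)) / (c / (c + d))
          se_prr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
          ci = lambda x, se: (x * math.exp(-1.96 * se), x * math.exp(1.96 * se))
          return {"ROR": (ror, *ci(ror, se_ror)), "PRR": (prr, *ci(prr, se_prr))}

      # illustrative counts only, not FAERS data
      print(disproportionality(a=40, b=960, c=200, d=48800))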

  9. Multiple event location analysis of aftershock sequences in the Pannonian basin

    NASA Astrophysics Data System (ADS)

    Bekesi, Eszter; Sule, Balint; Bondar, Istvan

    2016-04-01

    Accurate seismic event location is crucial to understanding tectonic processes such as crustal faulting, which is most commonly investigated by studying seismic activity. Location errors can be significantly reduced using multiple event location methods. We applied the double difference method to relocate the earthquake that occurred near Oroszlány and its 200 aftershocks in order to identify the geometry of the related fault. We used the extended ISC location algorithm, iLoc, to determine the absolute single event locations for the Oroszlány aftershock sequence and applied the double difference algorithm to the new hypocenters. To improve location precision, we added differential times from waveform cross-correlation to the multiple event location process to increase the accuracy of arrival time readings. We also tested the effect of various local 1-D velocity models on the results. We compared hypoDD results for bulletin and iLoc hypocenters to investigate the effect of initial hypocenter parameters on the relocation process. We show that hypoDD collapses the initial, rather diffuse locations into a smaller cluster and that the vertical cross-sections show sharp images of seismicity. Unsurprisingly, the combined use of catalog and cross-correlation data sets provides the most accurate locations. Some of the relocated events in the cluster are of ground truth quality, with a location accuracy of 5 km or better. Having achieved accurate locations for the event cluster, we are able to resolve the fault plane ambiguity in the moment tensor solutions and determine the accurate strike of the fault.
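
    The cross-correlation step that supplies differential times can be sketched as follows (a simplified single-channel illustration, not the authors' processing chain):

      import numpy as np

      def differential_time(tr1, tr2, dt):
          """Relative arrival-time shift (seconds) between two equal-length
          waveforms recorded at one station for two nearby events, from the
          peak of their full cross-correlation. dt = sample interval (s)."""
          tr1 = (tr1 - tr1.mean()) / tr1.std()
          tr2 = (tr2 - tr2.mean()) / tr2.std()
          cc = np.correlate(tr1, tr2, mode="full")
          lag = cc.argmax() - (len(tr2) - 1)    # samples by which tr1 lags tr2
          return lag * dt, cc.max() / len(tr1)  # shift and normalized CC value

      # toy example: identical pulse shifted by 15 samples
      t = np.arange(400)
      pulse = np.exp(-((t - 200) / 10.0) ** 2)
      shifted = np.roll(pulse, 15)
      print(differential_time(shifted, pulse, dt=0.01))   # ~0.15 s, CC ~1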

  10. gPhoton: Time-tagged GALEX photon events analysis tools

    NASA Astrophysics Data System (ADS)

    Million, Chase C.; Fleming, S. W.; Shiao, B.; Loyd, P.; Seibert, M.; Smith, M.

    2016-03-01

    Written in Python, gPhoton calibrates and sky-projects the ~1.1 trillion ultraviolet photon events detected by the microchannel plates on the Galaxy Evolution Explorer Spacecraft (GALEX), archives these events in a publicly accessible database at the Mikulski Archive for Space Telescopes (MAST), and provides tools for working with the database to extract scientific results, particularly over short time domains. The software includes a re-implementation of core functionality of the GALEX mission calibration pipeline to produce photon list files from raw spacecraft data as well as a suite of command line tools to generate calibrated light curves, images, and movies from the MAST database.

  11. On computer-intensive simulation and estimation methods for rare-event analysis in epidemic models.

    PubMed

    Clémençon, Stéphan; Cousien, Anthony; Felipe, Miraine Dávila; Tran, Viet Chi

    2015-12-10

    This article focuses, in the context of epidemic models, on rare events that may possibly correspond to crisis situations from the perspective of public health. In general, no closed analytic form for their occurrence probabilities is available, and crude Monte Carlo procedures fail. We show how recent intensive computer simulation techniques, such as interacting branching particle methods, can be used for estimation purposes, as well as for generating model paths that correspond to realizations of such events. Applications of these simulation-based methods to several epidemic models fitted from real datasets are also considered and discussed thoroughly. PMID:26242476
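
    To see why crude Monte Carlo fails here, consider estimating the probability that a subcritical stochastic SIR epidemic ever infects 100 of 500 people: a plain estimator (sketched below with illustrative parameters) returns mostly zero hits, which is exactly the regime where the interacting-particle/splitting methods of the paper are needed.

      import numpy as np

      def sir_final_size(n, i0, beta, gamma, rng):
          """Markov-chain stochastic SIR: returns the total number ever infected."""
          s, i = n - i0, i0
          while i > 0:
              rate_inf = beta * s * i / n
              rate_rec = gamma * i
              # next transition: infection with prob rate_inf/(rate_inf+rate_rec)
              if rng.random() < rate_inf / (rate_inf + rate_rec):
                  s, i = s - 1, i + 1
              else:
                  i -= 1
          return n - s

      rng = np.random.default_rng(42)
      runs = 10_000
      sizes = np.array([sir_final_size(500, 1, beta=0.9, gamma=1.0, rng=rng)
                        for _ in range(runs)])
      p_hat = np.mean(sizes >= 100)      # crude MC estimate of the "crisis" event
      se = np.sqrt(p_hat * (1 - p_hat) / runs)
      print(f"p_hat = {p_hat:.4g} +/- {1.96 * se:.2g}")  # often 0 +/- 0: crude MC fails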

  12. Impacts of extreme temperature events on mortality: analysis over individual seasons

    NASA Astrophysics Data System (ADS)

    Kysely, J.; Plavcova, E.; Kyncl, J.; Kriz, B.; Pokorna, L.

    2009-04-01

    Extreme temperature events influence human society in many ways, including impacts on morbidity and mortality. While the effects of hot summer periods are relatively direct in mid-latitude regions, much less is known, and little consensus has been achieved, about the possible consequences of both positive and negative temperature extremes in other parts of the year. The study examines links between spells of hot and cold temperature anomalies and daily all-cause (total) mortality and mortality due to cardiovascular diseases in the population of the Czech Republic (central Europe) in individual seasons (DJF, MAM, JJA, SON). The datasets cover the period 1986-2006. Hot (cold) spells are defined in terms of anomalies of average daily temperature from the mean annual cycle as periods of at least 2 successive days on which the anomalies are above (below) the 95% (5%) quantile of the empirical distribution of the anomalies. Excess daily mortality is established by calculating deviations of the observed number of deaths from the expected number of deaths, which takes into account the effects of long-term changes in mortality and the annual cycle. Periods when mortality is affected by influenza and acute respiratory infection outbreaks have been identified and excluded from the datasets before the analysis. The study is carried out for several population groups in order to identify the dependence of the mortality impacts on age and gender; in particular, we focus on differences in the impacts on the elderly (70+ yrs) and younger age groups (0-69 yrs). Although results for hot- and cold-related mortality are less conclusive in the seasons outside summer, significant links are found in several cases. The analysis reveals that - the largest effects of either hot or cold spells are observed for hot spells in JJA, with a 14% (16%) increase in mortality for the 1-day lag for all ages (70+ yrs); - much smaller but still significant effects are associated with hot spells in MAM; - the
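
    The spell definition above is easy to operationalize; a minimal sketch for hot spells (toy anomaly series, our own illustration):

      import numpy as np

      def find_spells(anom, q=0.95, min_len=2):
          """Return (start, end) index pairs of hot spells: runs of at least
          `min_len` consecutive days with temperature anomaly above the
          q-quantile. Use q=0.05 and a flipped comparison for cold spells."""
          thresh = np.quantile(anom, q)
          above = anom > thresh
          spells, start = [], None
          for day, flag in enumerate(above):
              if flag and start is None:
                  start = day
              elif not flag and start is not None:
                  if day - start >= min_len:
                      spells.append((start, day - 1))
                  start = None
          if start is not None and len(above) - start >= min_len:
              spells.append((start, len(above) - 1))
          return spells

      rng = np.random.default_rng(3)
      anomalies = rng.normal(0, 3, size=365)   # toy daily anomalies from annual cycle
      print(find_spells(anomalies))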

  13. Analysis of postulated events for the revised ALMR/PRISM design

    SciTech Connect

    Slovik, G.C.; Van Tuyle, G.J.

    1991-12-31

    The Nuclear Regulatory Commission (NRC) is continuing a pre-application review of the 471 MWt liquid metal reactor PRISM by General Electric, with Brookhaven National Laboratory providing technical support. The revised design has been evaluated, using the SSC code, for an unscrammed loss of heat sink (ULOHS), an unscrammed loss of flow (ULOF) with and without the Gas Expansion Modules (GEMs), and a 40¢ unscrammed transient overpower (UTOP) event. The feedback effects for U-27Pu-10Zr metal fuel were modeled in SSC. The ULOHS accident was determined to be a benign event for the design, with the reactor power transitioning down to a decay heat level within 500 s. The power during the postulated ULOF events, with the GEMs functioning, transitioned to decay heat levels without fuel damage, and included a 300 K margin to sodium saturation. The case without the GEMs had only a 160 K margin to sodium saturation and higher fuel temperatures. In addition, the clad was predicted to quickly pass through the eutectic phase (between fuel and clad), and some clad wastage would result. The 40¢ UTOP was predicted to raise the power to 1.8 times rated power, later stabilizing near 1.2 times full power. SSC predicted some localized fuel melting for the event, but the significance of this localized damage has not yet been determined. If necessary, the vendor has options to reduce the maximum reactivity insertion below 40¢.

  15. Transition Region Explosive Events in He II 304Å: Observation and Analysis

    NASA Astrophysics Data System (ADS)

    Rust, Thomas; Kankelborg, Charles C.

    2016-05-01

    We present examples of transition region explosive events observed in the He II 304Å spectral line with the Multi Order Solar EUV Spectrograph (MOSES). With small (<5000 km) spatial scales and large non-thermal (100-150 km/s) velocities, these events satisfy the observational signatures of transition region explosive events. Derived line profiles show distinct blue and red velocity components with very little broadening of either component. We observe little to no emission from low-velocity plasma, making the plasmoid instability reconnection model unlikely as the plasma acceleration mechanism for these events. Rather, the single-speed, bi-directional jet characteristics suggested by these data are consistent with acceleration via Petschek reconnection. Observations were made during the first sounding rocket flight of MOSES in 2006. MOSES forms images in 3 orders of a concave diffraction grating. Multilayer coatings largely restrict the passband to the He II 303.8Å and Si XI 303.3Å spectral lines. The angular field of view is about 8.5'x17', or about 20% of the solar disk. These images constitute projections of the volume I(x,y,λ), the intensity as a function of sky-plane position and wavelength. Spectral line profiles are recovered via tomographic inversion of these projections. Inversion is carried out using a multiplicative algebraic reconstruction technique.
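
    The multiplicative algebraic reconstruction technique (MART) mentioned above updates the solution row by row with multiplicative corrections, which preserves positivity. A generic toy sketch (not the MOSES pipeline):

      import numpy as np

      def mart(A, b, iters=50, relax=1.0):
          """Multiplicative algebraic reconstruction for A @ x = b with
          nonnegative A, b, x (in the MOSES setting: A = projection operator
          of the spectrograph, b = observed images, x = intensity volume)."""
          x = np.ones(A.shape[1])
          for _ in range(iters):
              for i in range(A.shape[0]):
                  proj = A[i] @ x
                  if proj > 0:
                      # multiplicative row update keeps x strictly positive
                      x *= (b[i] / proj) ** (relax * A[i] / A[i].max())
          return x

      # toy 3-ray, 4-voxel problem
      A = np.array([[1., 1., 0., 0.],
                    [0., 1., 1., 0.],
                    [0., 0., 1., 1.]])
      x_true = np.array([2., 1., 3., 0.5])
      print(mart(A, A @ x_true))   # one consistent nonnegative solution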

  16. Intravenous immune globulin and thromboembolic adverse events: A systematic review and meta-analysis of RCTs.

    PubMed

    Ammann, Eric M; Haskins, Cole B; Fillman, Kelsey M; Ritter, Rebecca L; Gu, Xiaomei; Winiecki, Scott K; Carnahan, Ryan M; Torner, James C; Fireman, Bruce H; Jones, Michael P; Chrischilles, Elizabeth A

    2016-06-01

    Prior case reports and observational studies indicate that intravenous immune globulin (IVIg) products may cause thromboembolic events (TEEs), leading the FDA to require a boxed warning in 2013. The effect of IVIg treatment on the risk of serious TEEs (acute myocardial infarction, ischemic stroke, or venous thromboembolism) was assessed using adverse event data reported in randomized controlled trials (RCTs) of IVIg. RCTs of IVIg in adult patients from 1995 to 2015 were identified from PubMed, Embase, ClinicalTrials.gov, and two large prior reviews of IVIg's therapeutic applications. Trials at high risk of detection or reporting bias for serious adverse events were excluded. 31 RCTs with a total of 4,129 participants (2,318 IVIg-treated, 1,811 control) were eligible for quantitative synthesis. No evidence was found of increased TEE risk among IVIg-treated patients compared with control patients (odds ratio = 1.10, 95% CI: 0.44, 2.88; risk difference = 0.0%, 95% CI: -0.7%, 0.7%, I² = 0%). No significant increase in risk was found when arterial and venous TEEs were analyzed as separate endpoints. Trial publications provided little specific information concerning the methods used to ascertain potential adverse events. Care should be taken in extrapolating the results to patients with higher baseline risks of TEE. Am. J. Hematol. 91:594-605, 2016. © 2016 Wiley Periodicals, Inc. PMID:26973084

  17. An analysis of extreme intraseasonal rainfall events during January-March 2010 over eastern China

    NASA Astrophysics Data System (ADS)

    Yao, Suxiang; Huang, Qian

    2016-09-01

    The precipitation over eastern China during January-March 2010 exhibited a marked intraseasonal oscillation (ISO) with a dominant period of 10-60 days. There were two active intraseasonal rainfall periods. The physical mechanisms responsible for the onset of the two rainfall events were investigated using ERA-Interim data. In the first ISO event, anomalous ascending motion was triggered by vertically integrated (1000-300 hPa) warm temperature advection. In addition to southerly anomalies on the intraseasonal (10-60-day) timescale, synoptic-scale southeasterly winds helped advect warm air from the South China Sea and western Pacific into the rainfall region. In the second ISO event, anomalous convection was triggered by a convectively unstable stratification, which was caused primarily by anomalous moisture advection in the lower troposphere (1000-850 hPa) from the Bay of Bengal and the Indo-China Peninsula. Both the intraseasonal and the synoptic winds contributed to the anomalous moisture advection. Therefore, winter intraseasonal rainfall events over East Asia can be affected not only by intraseasonal activity but also by higher frequency disturbances.

  18. Further Analysis of Variables That Affect Self-Control with Aversive Events

    ERIC Educational Resources Information Center

    Perrin, Christopher J.; Neef, Nancy A.

    2012-01-01

    The purpose of this study was to examine variables that affect self-control in the context of academic task completion by elementary school children with autism. In the baseline assessment of Study 1, mathematics problem completion was shown to be an aversive event, and sensitivity to task magnitude, task difficulty, and delay to task completion…

  19. Descriptive Analysis of Classroom Setting Events on the Social Behaviors of Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Boyd, Brian A.; Conroy, Maureen A.; Asmus, Jennifer M.; McKenney, Elizabeth L. W.; Mancil, G. Richmond

    2008-01-01

    Children with Autism Spectrum Disorder (ASD) are characterized by extreme deficits in social relatedness with same-age peers. The purpose of this descriptive study was to identify naturally occurring antecedent variables (i.e., setting events) in the classroom environments of children with ASD that promoted their engagement in peer-related social…

  20. Meta-Analysis of Suicide-Related Behavior Events in Patients Treated with Atomoxetine

    ERIC Educational Resources Information Center

    Bangs, Mark E.; Tauscher-Wisniewski, Sitra; Polzer, John; Zhang, Shuyu; Acharya, Nayan; Desaiah, Durisala; Trzepacz, Paula T.; Allen, Albert J.

    2008-01-01

    A study examining suicide-related events in acute, double-blind, placebo-controlled trials with atomoxetine is conducted. Results indicate that suicide-related events were more frequent in children with ADHD treated with atomoxetine than in those treated with placebo.

  1. Parental Separation and Child Aggressive and Internalizing Behavior: An Event History Calendar Analysis

    ERIC Educational Resources Information Center

    Averdijk, Margit; Malti, Tina; Eisner, Manuel; Ribeaud, Denis

    2012-01-01

    This study investigated the relationship between parental separation and aggressive and internalizing behavior in a large sample of Swiss children drawn from the ongoing Zurich Project on the Social Development of Children and Youths. Parents retrospectively reported life events and problem behavior for the first 7 years of the child's life on a…

  2. Constructions of High Academic Achievement through the Analysis of Critical Events.

    ERIC Educational Resources Information Center

    Koro-Ljungberg, Mirka

    2002-01-01

    In this study, critical events in the career development of 26 Finnish Academy professors describe the trajectories of their achievement and success. Primary themes included mentors, academic years abroad, strong interest in the domain, and luck. Also described are transformative themes. Personal stories illustrate the individualistic nature of…

  3. The Successful Resolution of Armed Hostage/Barricade Events in Schools: A Qualitative Analysis

    ERIC Educational Resources Information Center

    Daniels, Jeffrey A.; Bradley, Mary C.; Cramer, Daniel P.; Winkler, Amy J.; Kinebrew, Kisha; Crockett, Deleska

    2007-01-01

    This article explores the perceptions and reactions of school and law enforcement personnel in the successful resolution of armed hostage and barricade events in schools. A total of 12 individuals from three schools were interviewed to determine (1) their salient roles related to the situations, (2) facilitative systemic conditions, (3) to what…

  4. An Analysis of Personal Event Narratives Produced by School-Age Children.

    ERIC Educational Resources Information Center

    Crow, Kristina M.; Ward-Lonergan, Jeannene M.

    This study compared and analyzed the language capabilities of 10 school-age children raised in either single parent homes resulting from divorce or in two parent families. More specifically, it compared the context and complexity of oral personal event narratives produced by both groups of children. The study also investigated the usefulness and…

  5. Medication errors: an analysis comparing PHICO's closed claims data and PHICO's Event Reporting Trending System (PERTS).

    PubMed

    Benjamin, David M; Pendrak, Robert F

    2003-07-01

    Clinical pharmacologists are all dedicated to improving the use of medications and decreasing medication errors and adverse drug reactions. However, quality improvement requires that some significant parameters of quality be categorized, measured, and tracked to provide benchmarks to which future data (performance) can be compared. One of the best ways to accumulate data on medication errors and adverse drug reactions is to look at medical malpractice data compiled by the insurance industry. Using data from PHICO insurance company, PHICO's Closed Claims Data, and PHICO's Event Reporting Trending System (PERTS), this article examines the significance and trends of the claims and events reported between 1996 and 1998. Those who misread history are doomed to repeat the mistakes of the past. From a quality improvement perspective, the categorization of the claims and events is useful for reengineering integrated medication delivery, particularly in a hospital setting, and for redesigning drug administration protocols on low therapeutic index medications and "high-risk" drugs. Demonstrable evidence of quality improvement is being required by state laws and by accreditation agencies. The state of Florida requires that quality improvement data be posted quarterly on the Web sites of the health care facilities. Other states have followed suit. The insurance industry is concerned with costs, and medication errors cost money. Even excluding costs of litigation, an adverse drug reaction may cost up to $2500 in hospital resources, and a preventable medication error may cost almost $4700. To monitor costs and assess risk, insurance companies want to know what errors are made and where the system has broken down, permitting the error to occur. Recording and evaluating reliable data on adverse drug events is the first step in improving the quality of pharmacotherapy and increasing patient safety. Cost savings and quality improvement evolve on parallel paths. The PHICO data

  6. Plasma properties from the multi-wavelength analysis of the November 1st 2003 CME/shock event

    PubMed Central

    Benna, Carlo; Mancuso, Salvatore; Giordano, Silvio; Gioannini, Lorenzo

    2012-01-01

    The analysis of the spectral properties and dynamic evolution of a CME/shock event observed on November 1st 2003 in white-light by the LASCO coronagraph and in the ultraviolet by the UVCS instrument operating aboard SOHO, has been performed to compute the properties of some important plasma parameters in the middle corona below about 2R⊙. Simultaneous observations obtained with the MLSO/Mk4 white-light coronagraph, providing both the early evolution of the CME expansion in the corona and the pre-shock electron density profile along the CME front, were also used to study this event. By combining the above information with the analysis of the metric type II radio emission detected by ground-based radio spectrographs, we finally derive estimates of the values of the local Alfvén speed and magnetic field strength in the solar corona. PMID:25685432

  7. Geohazard assessment through the analysis of historical alluvial events in Southern Italy

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo

    2015-04-01

    The risk associated with extreme water events such as flash floods results from a combination of overflow and landslide hazards. A multi-hazard approach has been utilized to analyze the 1773 flood, which occurred in conjunction with heavy rainfall, causing major damage in terms of lives lost and economic cost over an area of 200 km², including both the coastal strip between Salerno and Maiori and the Apennine hinterland (Campania region, Southern Italy). This area has been affected by a total of 40 flood events over the last five centuries, 26 of which occurred between 1900 and 2000. Streamflow events have produced severe impacts on Cava de' Tirreni (SA) and its territory; in particular, four catastrophic floods in 1581, 1773, 1899 and 1954 caused a pervasive pattern of destruction. In the study area, rainstorm events typically occur in small and medium-sized fluvial systems, characterized by small catchment areas and high-elevation drainage basins, causing the detachment of large amounts of volcaniclastic and siliciclastic cover from the carbonate bedrock. The mobilization of these deposits (slope debris), mixed with rising floodwaters along the water paths, can produce fast-moving streamflows of large proportions with significant hazardous implications (Violante et al., 2009). In this context the study of the 1773 historical flood allows the detection and definition of those areas where catastrophic events repeatedly took place over time. Moreover, it improves the understanding of the phenomena themselves, including some key elements in the management of risk mitigation, such as the restoration of the damage suffered by buildings and/or the environmental effects caused by the floods.

  8. Analysis of Extreme Events in Regional Climate Model Simulations for the Pacific Northwest using weatherathome

    NASA Astrophysics Data System (ADS)

    Mera, R. J.; Mote, P.; Weber, J.

    2011-12-01

    One of the most prominent impacts of climate change over the Pacific Northwest is the potential for an elevated number of extreme precipitation events over the region. Planning for natural hazards, such as the increasing number of floods related to high-precipitation events, has in general focused on avoiding development in floodplains and conditioning development to withstand inundation with a minimum of losses. Nationwide, the Federal Emergency Management Agency (FEMA) estimates that about one quarter of its payments cover damage that has occurred outside mapped floodplains. It is clear that traditional flood-based planning will not be sufficient to predict and avoid future losses resulting from climate-related hazards such as high-precipitation events. In order to address this problem, the present study employs regional climate model output for future climate change scenarios to aid the development of a map-based inventory of future hazard risks that can contribute to a "planning-scale" decision support system for the Oregon Department of Land Conservation and Development (DLCD). Climate model output is derived from the climateprediction.net (CPDN) weatherathome project, an innovative climate science experiment that utilizes volunteer computers from users worldwide to produce hundreds of thousands of ensemble members of regional climate simulations of the western United States from 1950 to 2050. The spatial and temporal distributions of extreme weather events are analyzed for the Pacific Northwest to diagnose the model's capabilities as an input for map products such as impacts on hydrology. Special attention is given to the intensity and frequency of Atmospheric River events in historical and future climate contexts.

  9. Development and application of a multi-targeting reference plasmid as calibrator for analysis of five genetically modified soybean events.

    PubMed

    Pi, Liqun; Li, Xiang; Cao, Yiwei; Wang, Canhua; Pan, Liangwen; Yang, Litao

    2015-04-01

    Reference materials are important for the accurate analysis of genetically modified organism (GMO) content in foods/feeds, and the development of novel reference plasmids is a new trend in research on GMO reference materials. Herein, we constructed a novel multi-targeting plasmid, pSOY, which contained seven event-specific sequences of five GM soybeans (MON89788-5', A2704-12-3', A5547-127-3', DP356043-5', DP305423-3', A2704-12-5', and A5547-127-5') and the sequence of the soybean endogenous reference gene Lectin. We evaluated the specificity, limits of detection and quantification, and applicability of pSOY in both qualitative and quantitative PCR analyses. The limit of detection (LOD) was as low as 20 copies in qualitative PCR, and the limit of quantification (LOQ) in quantitative PCR was 10 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and Lectin assays were higher than 90%, and the squared regression coefficients (R(2)) were greater than 0.999. The quantification bias varied from 0.21% to 19.29%, and the relative standard deviations ranged from 1.08% to 9.84% in the analysis of simulated samples. All the results demonstrated that the developed multi-targeting plasmid, pSOY, is a credible substitute for matrix-based reference materials and can be used as a reliable calibrator in the identification and quantification of multiple GM soybean events. PMID:25673245
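
    As a rough illustration of how assay metrics of this kind are derived, the Python sketch below computes PCR efficiency and R(2) from a standard-curve dilution series of a reference plasmid. The dilution levels and Cq values are hypothetical, not data from the pSOY study.

    import numpy as np

    # Hypothetical standard curve for one event-specific assay: a 10-fold
    # dilution series of a reference plasmid and the observed Cq values.
    copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
    cq = np.array([17.1, 20.5, 23.9, 27.3, 30.8])

    # Linear regression of Cq against log10(copy number).
    slope, intercept = np.polyfit(np.log10(copies), cq, 1)

    # Amplification efficiency from the slope (slope = -3.32 gives E = 100%).
    efficiency = 10 ** (-1.0 / slope) - 1.0

    # Squared regression coefficient of the standard curve.
    r2 = np.corrcoef(np.log10(copies), cq)[0, 1] ** 2

    print(f"slope = {slope:.3f}, E = {100 * efficiency:.1f}%, R^2 = {r2:.4f}")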

  10. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    SciTech Connect

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

    Precise analysis of (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that analyzes video compression statistics to detect and characterize events in large data sets. This software works by converting the data into a series of images, which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in its log file. This file contains statistics for each frame of the video, including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on the relevant statistic(s) for each data type.
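
    The abstract does not give the exact layout of the avconv first-pass log, so the sketch below assumes the per-frame statistics have already been parsed into an array and shows one plausible detection step: flagging frames whose statistic departs sharply from a trailing baseline (a rolling z-score test).

    import numpy as np

    def detect_events(stat, window=50, z_thresh=4.0):
        """Flag frames whose compression statistic (e.g. intra-texture
        bits per frame) deviates strongly from a trailing baseline."""
        events = []
        for i in range(window, len(stat)):
            baseline = stat[i - window:i]
            mu, sigma = baseline.mean(), baseline.std()
            if sigma > 0 and abs(stat[i] - mu) / sigma > z_thresh:
                events.append(i)
        return np.array(events)

    # Synthetic demo: a quiet statistic with an abrupt change near frame 300,
    # standing in for a sudden structural event in the imaged sample.
    rng = np.random.default_rng(0)
    stat = rng.normal(1000.0, 20.0, 600)
    stat[300:310] += 400.0
    print(detect_events(stat))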

  11. The Use of Qualitative Comparative Analysis for Critical Event Research in Alcohol and HIV in Mumbai, India

    PubMed Central

    Chandran, Devyani; Singh, S. K.; Berg, Marlene; Singh, Sharad; Gupta, Kamla

    2010-01-01

    In this paper we use Qualitative Comparative Analysis (QCA) in critical event analysis to identify the conditions under which alcohol is necessary for unprotected sex to occur. The paper is based on a set of in-depth interviews with 84 men aged 18-29, from three typical low-income communities in Mumbai, who reported using alcohol and having sex with at least one nonspousal partner once or more in the 30 days prior to the interview. The interviews included narratives of critical events, defined as recent (past 30-60 days) events involving sexual behavior with or without alcohol. The paper identifies themes related to alcohol, sexuality and condom use, uses QCA to identify and explain configurations leading to protected and unprotected sex, and explains the differences. The analysis shows that alcohol alone is not sufficient to explain any cases involving unprotected sex, but alcohol in combination with partner type and contextual factors does explain unprotected sex for subsets of married and unmarried men. PMID:20563636
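
    The core bookkeeping step of crisp-set QCA can be sketched in a few lines: group cases by their configuration of conditions and compute each configuration's consistency with the outcome. The condition names and data below are illustrative only, not the study's actual coding scheme.

    import pandas as pd

    # One row per critical event; 1 = condition/outcome present (illustrative).
    events = pd.DataFrame({
        "alcohol":        [1, 1, 0, 1, 0, 1, 1, 0],
        "casual_partner": [1, 1, 1, 0, 0, 1, 0, 1],
        "married":        [0, 0, 0, 1, 1, 0, 1, 0],
        "unprotected":    [1, 1, 0, 1, 0, 1, 0, 0],  # outcome
    })

    # Truth table: consistency of each configuration with the outcome.
    truth_table = (events
                   .groupby(["alcohol", "casual_partner", "married"])["unprotected"]
                   .agg(n_cases="count", consistency="mean")
                   .reset_index())
    print(truth_table)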

  12. Slow slip events in the Mexican subduction zone (Guerrero-Oaxaca) observed by the analysis of continuous GPS

    NASA Astrophysics Data System (ADS)

    Cotte, N.; Walpersdorf, A.; Kostoglodov, V.; Vergnolle, M.; Santiago, J.

    2007-05-01

    Aseismic slow slip events (SSE) have been observed with GPS over the last decade, in particular in the Guerrero-Oaxaca segment of the Mexican subduction zone. Thanks to the efforts of UNAM (Universidad Nacional Autonoma de Mexico), 21 permanent GPS stations are currently operating within or close to this zone. We present the results of an analysis of the available continuous GPS data, conducted using the Gamit/Globk software developed by MIT (Massachusetts Institute of Technology) and SIO (Scripps Institution of Oceanography). The analysis includes up-to-date atmospheric loading models and mapping functions, which help improve the precision of the positioning results. We have analyzed the complete time series running from 1997 (first GPS station installed in Cayaco) to the present. One of our major objectives is to study the time evolution of the slow slip events and to determine the amplitude and direction of the slips. A first major result from our analysis is that we do not find in the time series any clear evidence of a number of the smallest transient slip events (i.e., total displacements of 1-2 cm at the coast) that had been reported before in Guerrero-Oaxaca, and we suggest that some of them may actually be artefacts.

  13. Modeling and analysis of early events in T-lymphocyte antigen-activated intracellular-signaling pathways

    NASA Astrophysics Data System (ADS)

    Zheng, Yanan; Balakrishnan, Venkataramanan; Buzzard, Greg; Geahlen, Robert; Harrison, Marietta; Rundell, Ann

    2005-12-01

    The T-cell antigen-activated signaling pathway is a highly regulated intracellular biochemical system that is crucial for initiating an appropriate adaptive immune response. To improve the understanding of the complex regulatory mechanisms controlling the early events in T-cell signaling, a detailed mathematical model was developed that utilizes ordinary differential equations to describe chemical reactions of the signaling pathway. The model parameter values were constrained by experimental data on the activation of a specific signaling intermediate and indicated an initial rapid cascade of phosphorylation events followed by a comparatively slow signal downregulation. Nonlinear analysis of the model suggested that thresholding and bistability occur as a result of the embedded positive and negative feedback loops within the model. These nonlinear system properties may enhance the T-cell receptor specificity and provide sub-threshold noise filtering with switch-like behavior to ensure proper cell response. Additional analysis using a reduced second-order model led to further understanding of the observed system behavior. Moreover, the interactions between the positive and negative feedback loops enabled the model to exhibit, among a variety of other feasible dynamics, a sustained oscillation that corresponds to a stable limit cycle in the two-dimensional phase plane. Quantitative analysis in this paper has helped identify potential regulatory mechanisms in the early T-cell signaling events. This integrated approach provides a framework to quantify and discover the ensemble of interconnected T-cell antigen-activated signaling pathways from limited experimental data.
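
    A minimal caricature of the mechanism described above, two coupled ODEs with one positive and one slower negative feedback loop, already reproduces the switch-like thresholding; this is a sketch with made-up rate constants, not the paper's full reaction network.

    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, y, stim):
        """x: active kinase with positive feedback; p: phosphatase induced
        by x (negative feedback). Rate constants are illustrative."""
        x, p = y
        dx = stim + 2.0 * x**2 / (0.25 + x**2) - p * x - 0.5 * x
        dp = 0.4 * x - 0.2 * p
        return [dx, dp]

    # Sub-threshold vs supra-threshold stimulus gives a switch-like response.
    for stim in (0.01, 0.2):
        sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.0], args=(stim,))
        print(f"stimulus {stim}: final kinase activity = {sol.y[0, -1]:.3f}")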

  14. Analysis of Microphysics Mechanisms in Icing Aircraft Events: A Case Study

    NASA Astrophysics Data System (ADS)

    Sanchez, Jose Luis; Fernández, Sergio; Gascón, Estibaliz; Weigand, Roberto; Hermida, Lucia; Lopez, Laura; García-Ortega, Eduardo

    2013-04-01

    The appearance of supercooled large drops (SLD) can give rise to aircraft icing. In these cases, atmospheric icing causes an unusual loss of lift due to the rapid accumulation of ice on the wings or measurement instruments. There are two possible ways that SLD can form. The first is through a process called the "warm nose", followed by resupercooling; this process is usually associated with the entrance of warm fronts. The second possibility is that drops are formed by condensation and grow to sizes of at least 50 µm through collision-coalescence, in environments where the temperature remains below 0°C at all times but freezing does not occur. Some authors point out that approximately 75% of freezing precipitation events are produced as a consequence of this second situation. Within the framework of the TECOAGUA Project, a series of scientific flights was performed in order to collect data in cloud systems capable of producing precipitation during the winter period and of creating environments favorable to aircraft icing. These flights were carried out using a C 212-200 aircraft belonging to the National Institute of Aerospace Technology (INTA), with a CAPS installed. On 1 February 2012, the C 212-200 took off from the airport in Torrejón de Ardoz (Madrid) and flew about 70 km to position itself over the northern side of the Central System. At a flight level of 3500 m it encountered an elevated concentration of SLD at temperatures around -12°C, with liquid water content up to 0.44 g/m3, which caused ice to accumulate on the wing profiles and forced the flight to be aborted. Near the flight area, a microwave radiometer (MWR) was installed. An area of instability between 750 hPa and 600 hPa was identified in the vertical MWR profiles of temperature and humidity during the hour of the flight. It is mainly in this

  15. Final Report for Dynamic Models for Causal Analysis of Panel Data. Alternative Estimation Procedures for Event-History Analysis: A Monte Carlo Study. Part III, Chapter 5.

    ERIC Educational Resources Information Center

    Carroll, Glenn R.; And Others

    This document is part of a series of chapters described in SO 011 759. The chapter examines the merits of four estimators in the causal analysis of event-histories (data giving the number, timing, and sequence of changes in a categorical dependent variable). The four procedures are ordinary least squares, Kaplan-Meier least squares, maximum…

  16. From event analysis to global lessons: disaster forensics for building resilience

    NASA Astrophysics Data System (ADS)

    Keating, Adriana; Venkateswaran, Kanmani; Szoenyi, Michael; MacClune, Karen; Mechler, Reinhard

    2016-07-01

    With unprecedented growth in disaster risk, there is an urgent need for enhanced learning about and understanding of disasters, particularly in relation to the trends in the drivers of increasing risk. Building on the disaster forensics field, we introduce the post-event review capability (PERC) methodology for systematically and holistically analysing disaster events and identifying actionable recommendations. PERC responds to a need for learning about the successes and failures in disaster risk management and resilience, and uncovers the underlying drivers of increasing risk. We draw generalisable insights from seven applications of the methodology to date, finding that policy makers and practitioners in disaster risk management across the globe face strikingly similar challenges despite variations in context, which indicates encouraging potential for mutual learning. These lessons highlight the importance of integrated risk reduction strategies. We invite others to utilise the freely available PERC approach and contribute to building a repository of learning on disaster risk management and resilience.

  17. Analysis of radiation risk from alpha particle component of solar particle events

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Townsend, L. W.; Wilson, J. W.; Golightly, M. J.; Weyland, M.

    1994-01-01

    Solar particle events (SPEs) contain a primary alpha-particle component, representing a possible increase in the potential risk to astronauts during an SPE beyond that from the often-studied proton component. We discuss the physical interactions of alpha particles important in describing the transport of these particles through spacecraft and body shielding. Models of light ion reactions are presented and their effects on energy and linear energy transfer (LET) spectra in shielding are discussed. We present predictions of particle spectra, dose, and dose equivalent in organs of interest for SPE spectra typical of those occurring in recent solar cycles. The large events of solar cycle 19 are found to show a substantial increase in biological risk from alpha particles, including a large increase in secondary neutron production from alpha-particle breakup.

  18. ANALYSIS ON RECENT FLOOD EVENTS AND TREE VEGETATION COLLAPSES IN KAKO RIVER

    NASA Astrophysics Data System (ADS)

    Michioku, Kohji; Miyamoto, Hitoshi; Kanda, Keiichi; Ohchi, Yohei; Aga, Kazuho; Morioka, Jyunji; Uotani, Takuya; Yoshida, Kazuaki; Yoshimura, Satoshi

    Forestation of flood plains is a worldwide engineering issue in the middle to downstream reaches of many rivers. It brings not only degradation of flow conveyance capacity but also irreversible changes to river ecosystems. In order to obtain information on the behavior of tree vegetation during flood events, field data on flow fields and tree vegetation collapse were collected in the Kako River, where willows are heavily vegetated on the flood plain. After an H-ADCP flow measurement program started in 2009, small to medium-size flood events occurred frequently, which enabled us not only to verify an analytical model that reproduces flow fields inside and outside the vegetation, but also to examine tree vegetation collapses after flooding. The analytical solutions for the velocity profiles and the flow forces acting on trees were in good agreement with the H-ADCP measurements and the observed tree damage, respectively.

  20. Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier

    PubMed Central

    Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus

    2014-01-01

    National security has gained vital importance due to the increasing number of suspicious and terrorist events across the globe. The use of different subfields of information technology to design systems that can detect the key members actually responsible for such events has also attracted much attention from researchers and practitioners. In this paper, we present a novel method to predict key players in a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies a novel hybrid classifier to detect key players. Our system also applies anomaly detection to predict any terrorist activity, in order to help law enforcement agencies destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies, including two publicly available datasets and one local network. PMID:25136674
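
    The feature-extraction stage described above maps naturally onto standard graph tooling. The sketch below computes several per-node centrality measures with networkx on a toy graph; the paper's hybrid classifier, which is not specified in the abstract, would consume vectors like these.

    import networkx as nx

    # Toy covert network; edges represent observed communication links.
    G = nx.Graph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"),
                  ("d", "e"), ("e", "f"), ("d", "f"), ("c", "g")])

    # Per-node centrality measures of the kind used as classifier inputs.
    features = {
        "degree": nx.degree_centrality(G),
        "betweenness": nx.betweenness_centrality(G),
        "closeness": nx.closeness_centrality(G),
        "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
    }

    # One feature vector per node; a classifier would be trained on these.
    for node in G:
        print(node, [round(features[name][node], 3) for name in features])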

  1. a Database for On-Line Event Analysis on a Distributed Memory Machine

    NASA Astrophysics Data System (ADS)

    Argante, E.; Meesters, M. R. J.; van der Stok, P.; Willers, I.

    Parallel in-memory databases can enhance the structuring and parallelization of programs used in High Energy Physics (HEP). Efficient database access routines are used as communication primitives that hide the communication topology, in contrast to more explicit communication libraries such as PVM or MPI. A parallel in-memory database, called SPIDER, has been implemented on a 32-node Meiko CS-2 distributed memory machine. The SPIDER primitives generate lower overhead than PVM or MPI. The event reconstruction program CPREAD of the CPLEAR experiment has been used as a test case. Performance measurements showed that CPREAD interfaced to SPIDER can easily cope with the event rate generated by CPLEAR.

  2. Analysis of grain boundary dynamics using event detection and cumulative averaging.

    PubMed

    Gautam, A; Ophus, C; Lançon, F; Denes, P; Dahmen, U

    2015-04-01

    To analyze extended time series of high resolution images, we have employed automated frame-by-frame comparisons that are able to detect dynamic changes in the structure of a grain boundary in Au. Using cumulative averaging of images between events allowed high resolution measurements of the atomic relaxation in the interface with sufficient accuracy for comparison with atomistic models. Cumulative averaging was also used to observe the structural rearrangement of atomic columns at a moving step in the grain boundary. The technique of analyzing changing features in high resolution images by averaging between incidents can be used to deconvolute stochastic events that occur at random intervals and on time scales well beyond those accessible to single-shot imaging. PMID:25498139
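
    The two steps named in the abstract, frame-by-frame event detection followed by cumulative averaging between events, can be sketched as follows; the thresholding rule and synthetic image stack are assumptions for illustration, not the authors' actual pipeline.

    import numpy as np

    def segment_and_average(frames, z_thresh=5.0):
        """Detect change events from consecutive-frame differences, then
        cumulatively average frames within each quiescent segment."""
        diffs = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
        mu, sigma = diffs.mean(), diffs.std()
        events = np.where(diffs > mu + z_thresh * sigma)[0] + 1
        bounds = [0, *events.tolist(), len(frames)]
        averages = [frames[a:b].mean(axis=0)
                    for a, b in zip(bounds[:-1], bounds[1:]) if b > a]
        return events, averages

    # Synthetic stack: noisy frames with one abrupt rearrangement at frame 40.
    rng = np.random.default_rng(1)
    frames = rng.normal(0.0, 1.0, (80, 64, 64))
    frames[40:] += 3.0
    events, averages = segment_and_average(frames)
    print("events at:", events, "| segments averaged:", len(averages))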

  3. FURTHER ANALYSIS OF VARIABLES THAT AFFECT SELF-CONTROL WITH AVERSIVE EVENTS

    PubMed Central

    Perrin, Christopher J; Neef, Nancy A

    2012-01-01

    The purpose of this study was to examine variables that affect self-control in the context of academic task completion by elementary school children with autism. In the baseline assessment of Study 1, mathematics problem completion was shown to be an aversive event, and sensitivity to task magnitude, task difficulty, and delay to task completion were measured. The effects of manipulating the values of those parameters on self-control were then assessed. For all participants, self-control increased as a function of one or more changes in task parameter values. In Study 2, the effects of a commitment response on self-control were assessed. Results indicated that for all participants, levels of self-control were higher when the opportunity to commit to the immediate aversive event was available. PMID:22844138

  4. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach those skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of desktop modeling and simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
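
    The paper's models are built in Excel, but the fundamental functionality it refers to (a time-ordered event list driving state changes) fits in a short script in any language. Below is a minimal single-server queue simulation in Python, offered as a sketch of that core logic rather than a reproduction of the paper's supply-chain model.

    import heapq
    import random

    def mm1_simulation(n_customers=10000, arrival_rate=0.9, service_rate=1.0):
        """Minimal discrete-event simulation of an M/M/1 queue driven by a
        time-ordered event heap (approximate mean wait in queue)."""
        random.seed(42)
        events = [(random.expovariate(arrival_rate), "arrival")]
        waiting, server_busy = [], False
        arrived, total_wait = 0, 0.0

        while events and arrived < n_customers:
            t, kind = heapq.heappop(events)
            if kind == "arrival":
                arrived += 1
                if server_busy:
                    waiting.append(t)  # join the queue
                else:
                    server_busy = True
                    heapq.heappush(events,
                                   (t + random.expovariate(service_rate), "departure"))
                heapq.heappush(events,
                               (t + random.expovariate(arrival_rate), "arrival"))
            else:  # departure: serve the next waiting customer, if any
                if waiting:
                    total_wait += t - waiting.pop(0)
                    heapq.heappush(events,
                                   (t + random.expovariate(service_rate), "departure"))
                else:
                    server_busy = False
        return total_wait / arrived

    print(f"mean wait in queue: {mm1_simulation():.2f} time units")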

  5. Analysis of a flood event on a karst river by means of a distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Martina, Mario L. V.; de Waele, Jo; Sanna, Laura; Cabras, Salvatore; Antonello Cossu, Q.

    2010-05-01

    Fluviokarstic catchments are difficult to model, especially during a flood event, because of their hydrological complexity. The hydrological characterization is even more challenging for ungauged rivers. In the last five winters (2004-2008), several exceptional meteorological events producing flash floods were registered in Central-East Sardinia on ungauged or poorly gauged catchments. We present here an approach to estimating the peak discharge that takes the karst component into account. Peak discharge has been estimated with a distributed hydrological model and with empirical methods that consider geomorphic and sedimentological observations. The comparison between the results derived from these independent methods yields the best possible estimate of peak discharge. Differences between modelled and measured peak flows can be attributed to water losses and/or gains along the river channel due to interactions with the underground karst drainage network. An application to a catchment in Central-East Sardinia is discussed.

  6. Max-plus Algebraic Tools for Discrete Event Systems, Static Analysis, and Zero-Sum Games

    NASA Astrophysics Data System (ADS)

    Gaubert, Stéphane

    The max-plus algebraic approach to timed discrete event systems emerged in the eighties, after the discovery that synchronization phenomena can be modeled in a linear way in the max-plus setting. This led to a number of results, like the determination of long term characteristics (throughput, stationary regime) by spectral theory methods or the representation of the input-output behavior by rational series.
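
    The linearity of synchronization in the max-plus setting is easy to demonstrate: with addition replaced by max and multiplication by +, the firing times of a timed event graph evolve as x(k+1) = A ⊗ x(k). A small self-contained sketch, with illustrative timing numbers:

    import numpy as np

    def maxplus_matmul(A, B):
        """Max-plus matrix product: C[i, j] = max_k (A[i, k] + B[k, j])."""
        C = np.full((A.shape[0], B.shape[1]), -np.inf)
        for i in range(A.shape[0]):
            for j in range(B.shape[1]):
                C[i, j] = np.max(A[i, :] + B[:, j])
        return C

    # A[i][j]: processing-plus-transfer time from event j to event i.
    A = np.array([[3.0, 7.0],
                  [2.0, 4.0]])
    x = np.array([[0.0], [1.0]])  # earliest firing times at cycle k = 0

    # Synchronization is linear in max-plus: x(k+1) = A (max-plus product) x(k).
    for k in range(5):
        x = maxplus_matmul(A, x)
        print(f"cycle {k + 1}: firing times = {x.ravel()}")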

  7. On the identification of piston slap events in internal combustion engines using tribodynamic analysis

    NASA Astrophysics Data System (ADS)

    Dolatabadi, N.; Theodossiades, S.; Rothberg, S. J.

    2015-06-01

    Piston slap is a major source of vibration and noise in internal combustion engines. Therefore, better understanding of the conditions favouring piston slap can be beneficial for the reduction of engine Noise, Vibration and Harshness (NVH). Past research has attempted to determine the exact position of piston slap events during the engine cycle and correlate them to the engine block vibration response. Validated numerical/analytical models of the piston assembly can be very useful towards this aim, since extracting the relevant information from experimental measurements can be a tedious and complicated process. In the present work, a coupled simulation of piston dynamics and engine tribology (tribodynamics) has been performed using quasi-static and transient numerical codes. Thus, the inertia and reaction forces developed in the piston are calculated. The occurrence of piston slap events in the engine cycle is monitored by introducing six alternative concepts: (i) the quasi-static lateral force, (ii) the transient lateral force, (iii) the minimum film thickness occurrence, (iv) the maximum energy transfer, (v) the lubricant squeeze velocity and (vi) the piston-impact angular duration. The validation of the proposed methods is achieved using experimental measurements taken from a single cylinder petrol engine in laboratory conditions. The surface acceleration of the engine block is measured at the thrust- and anti-thrust side locations. The correlation between the theoretically predicted events and the measured acceleration signals has been satisfactory in determining piston slap incidents, using the aforementioned concepts. The results also exhibit good repeatability throughout the set of measurements obtained in terms of the number of events occurring and their locations during the engine cycle.

  8. An event-related analysis of awakening reactions due to nocturnal church bell noise.

    PubMed

    Brink, Mark; Omlin, Sarah; Müller, Christian; Pieren, Reto; Basner, Mathias

    2011-11-15

    The sleep-disturbing effects of nocturnal ambient non-traffic noises, such as bell strokes emitted from church bell towers, on nearby residents are presently unknown. Nonetheless, this specific noise source is suspected to cause sleep disturbances in a small but qualified minority of people living in the vicinity of bell towers that indicate the time with bell ringings throughout the night. A field study was carried out to elucidate whether acoustic properties of such bell strokes relate to awakening and to provide event-related exposure-effect functions between acoustical predictors and awakening probability. Awakening reactions were determined in 27 voluntary subjects, measured in their home setting for four consecutive nights with ambulatory polysomnography (PSG) and concurrent acoustic recordings inside and outside the dwelling. Results indicate that the bell ringing events increase awakenings in a fashion similar to what has previously been reported for transportation noise events, and that awakening probability first and foremost depends on the maximum sound pressure level of an event. The number of bell strokes and the personal variables gender, age, and noise sensitivity did not influence awakening probability significantly. Awakening probability tended to increase with elapsed time after sleep onset, and was decreased during slow wave sleep and REM sleep compared to S2 sleep. The results suggest that a reduction of the maximum sound pressure level or an interruption of ringings during nighttime might reduce awakenings. The determined exposure-effect relationships are compared with similar functions for impulsive noise and transportation noise, more specifically aircraft noise. The paper concludes with a few considerations regarding nighttime noise regulation. PMID:21978615
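
    Exposure-effect functions of the kind reported here are typically logistic in the maximum sound pressure level. The sketch below fits such a curve by maximum likelihood to synthetic event data; the coefficients and data are invented, not the study's results.

    import numpy as np
    from scipy.optimize import minimize

    # Synthetic events: maximum sound pressure level (dB) and awakening flag.
    rng = np.random.default_rng(3)
    lmax = rng.uniform(30.0, 70.0, 400)
    p_true = 1.0 / (1.0 + np.exp(-(-8.0 + 0.12 * lmax)))
    awake = rng.random(400) < p_true

    def neg_log_likelihood(beta):
        """Negative log-likelihood of a logistic exposure-effect model."""
        p = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * lmax)))
        p = np.clip(p, 1e-12, 1.0 - 1e-12)  # numerical safety
        return -np.sum(awake * np.log(p) + (~awake) * np.log(1.0 - p))

    fit = minimize(neg_log_likelihood, x0=[0.0, 0.0])
    b0, b1 = fit.x
    print(f"P(awakening) = 1 / (1 + exp(-({b0:.2f} + {b1:.3f} * Lmax)))")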

  9. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique. [CETAT computer program

    SciTech Connect

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program.
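
    The arithmetic a tool like CETAT automates is the standard event-tree bookkeeping: enumerate every path through the branch points and multiply the branch probabilities along it. A minimal sketch follows (the tree and its probabilities are invented, and this is not CETAT's input format):

    def enumerate_paths(tree, prob=1.0, path=()):
        """Walk an event tree given as nested dicts. Leaves are outcome
        labels; internal nodes map branch name -> (probability, subtree)."""
        if isinstance(tree, str):  # leaf: an end state
            yield path, tree, prob
            return
        for branch, (p, subtree) in tree.items():
            yield from enumerate_paths(subtree, prob * p, path + (branch,))

    event_tree = {
        "operator detects alarm": (0.95, {
            "valve closed in time": (0.90, "safe shutdown"),
            "valve not closed": (0.10, "partial release"),
        }),
        "alarm missed": (0.05, "uncontrolled release"),
    }

    for path, outcome, p in enumerate_paths(event_tree):
        print(f"{' -> '.join(path)}: {outcome} (p = {p:.4f})")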

  10. Analysis of Severe Weather Events by Integration of Civil Protection Operation Data

    NASA Astrophysics Data System (ADS)

    Heisterkamp, Tobias; Kox, Thomas

    2015-04-01

    In Germany, winter storms belong to the natural hazards responsible for the largest damages (GDV 2014). This is a huge challenge for civil protection, especially in metropolitan areas like Berlin. Nowadays, large-scale storm events can generally be predicted well, but detailed forecasts at the urban district or even street level are still out of range. Fire brigades, as the major stakeholder covering severe weather consequences, operate on this small scale and across the whole area due to their jurisdiction. For the forensic investigation of disasters, this presentation offers an additional approach that uses the documentation of fire brigade operations as a new data source. Hazard dimensions and consequences of severe weather events are reconstructed via GIS-based analyses of these operations. Local case studies of recent storms are used as a comparison and as additional information complementing the three WMO weather stations in Berlin. Thus, hot spots of the selected events can be identified through accumulations of operation sites. Further indicators for Berlin are added to detect aspects that determine vulnerabilities. The conclusion discusses the potential of this approach as well as possible benefits of integration into warning systems.

  11. Analysis of inter-event times for avalanches on a conical bead pile with cohesion

    NASA Astrophysics Data System (ADS)

    Lehman, Susan; Johnson, Nathan; Tieman, Catherine; Wainwright, Elliot

    2015-03-01

    We investigate the critical behavior of a 3D conical bead pile built from uniform 3 mm steel spheres. Beads are added to the pile by dropping them onto the apex one at a time; avalanches are measured through changes in pile mass. We investigate the dynamic response of the pile by recording avalanches from the pile over tens of thousands of bead drops. We have previously shown that the avalanche size distribution follows a power law for beads dropped onto the pile apex from a low drop height. We are now tuning the critical behavior of the system by adding cohesion from a uniform magnetic field and find an increase in both size and number for very large avalanches and decreases in the mid-size avalanches. The resulting bump in the avalanche distribution moves to larger avalanche size as the cohesion in the system is increased. We compare the experimental inter-event time distribution to both the Brownian passage-time and Weibull distributions, and observe a shift from the Weibull to Brownian passage-time as we raise the threshold from measuring time between events of all sizes to time between only the largest system-spanning events. These results are both consistent with those from a mean-field model of slip avalanches in a shear system [Dahmen, Nat Phys 7, 554 (2011)].
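
    The distribution comparison described above can be sketched with scipy: fit both candidates to the inter-event times and compare log-likelihoods. The Brownian passage-time distribution is represented here by scipy's inverse Gaussian, and the data are synthetic stand-ins for the bead-drop record.

    import numpy as np
    from scipy import stats

    # Synthetic inter-event times (arbitrary units).
    rng = np.random.default_rng(7)
    dt = rng.weibull(0.8, 2000) * 50.0

    # Fit both candidate distributions with the location fixed at zero.
    wb_shape, _, wb_scale = stats.weibull_min.fit(dt, floc=0.0)
    ig_mu, _, ig_scale = stats.invgauss.fit(dt, floc=0.0)  # Brownian passage time

    ll_wb = np.sum(stats.weibull_min.logpdf(dt, wb_shape, 0.0, wb_scale))
    ll_ig = np.sum(stats.invgauss.logpdf(dt, ig_mu, 0.0, ig_scale))
    print(f"Weibull: shape = {wb_shape:.2f}, log-likelihood = {ll_wb:.1f}")
    print(f"inverse Gaussian (BPT): log-likelihood = {ll_ig:.1f}")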

  12. Solar flare protection for manned lunar missions - Analysis of the October 1989 proton flare event

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.; Townsend, Lawrence W.; Sauer, Herbert H.

    1991-01-01

    Several large solar proton events occurred in the latter half of 1989. For a moderately shielded spacecraft in free space, the potential exposure would have been greatest for the flare which occurred from October 19 to 27, 1989. The temporal variations of the proton energy spectra at approximately 1 AU were monitored by the GOES-7 satellite. These data, recorded and processed at the NOAA-Boulder Space Environment Laboratory, provide the opportunity to analyze dose rates and cumulative doses which might be incurred by astronauts in transit to, or on, the moon. Of particular importance in such an event is the time development of exposure in the early phases of the flare, for which dose rates may range over many orders of magnitude in the first few hours. The cumulative dose as a function of time for the entire event is also predicted. In addition to basic shield calculations, dose rate contours are constructed for flare shelters in free-space and on the lunar surface.

  13. Transient analysis of a flywheel battery containment during a full rotor burst event.

    SciTech Connect

    Hsieh, B. J.

    1998-04-17

    Flywheels are being developed for use in an Advanced Locomotive Propulsion System (ALPS) targeted at high-speed passenger rail service. The ALPS combines high-performance, high-speed gas turbines, motor/generators and flywheels to provide a lightweight, fuel-efficient power system. Such a system is necessary to avoid the high cost of railway electrification, which is currently required for high-speed rail service (>100 mph) because diesels are too heavy. The lightweight flywheel rotors are made from multilayered composite materials and are operated at extremely high energy levels. Metal containment structures have been designed to enclose the rotors and encapsulate them during postulated failure events. One such event is a burst-mode failure of the rotor, in which the composite rim is assumed to burst into debris that impacts the containment. This paper presents a finite element simulation of the transient structural response of a subscale metal flywheel containment structure to a rotor burst event.

  14. A Spectral Analysis of a Rare "Dwarf Eat Dwarf" Cannibalism Event

    NASA Astrophysics Data System (ADS)

    Theakanath, Kuriakose; Toloba, E.; Guhathakurta, P.; Romanowsky, A. J.; Ramachandran, N.; Arnold, J.

    2014-01-01

    We have used Keck/DEIMOS to conduct the first detailed spectroscopic study of the recently discovered stellar stream in the Large Magellanic Cloud analog NGC 4449. Martinez-Delgado et al. (2012), using the tip of the red giant branch (TRGB), found that both objects, the stream and NGC 4449, are at the same distance, which suggests that this stream is the remnant of the first ongoing dwarf-dwarf cannibalism event known so far. Learning about the orbital properties of this event is a powerful way to constrain the physical conditions involved in dwarf-dwarf merger events. The low surface brightness of this structure makes it impossible to obtain integrated-light spectroscopic measurements, and its distance (3.8 Mpc) is too large to observe stars individually. In the color-magnitude diagram of the stellar stream there is an excess of objects brighter than the TRGB that are potential star blends. We designed our DEIMOS mask to contain as many of these objects as possible and, while some of them turned out to be background galaxies, a handful proved to be star blends in the stream. Our velocity measurements along the stream prove that it is gravitationally bound to NGC 4449 and put strong constraints on the orbital properties of the infall. This research was carried out under the auspices of UCSC's Science Internship Program. We thank the National Science Foundation for funding support. ET was supported by a Fulbright fellowship.

  15. Lag-Correlation analysis of the April 2013 flood event in Argentina

    NASA Astrophysics Data System (ADS)

    Alvizurez, J. J.; Kraatz, S. G.; Tesfagiorgis, K. B.

    2013-12-01

    Floods are one of the major causes of damage and loss of life around the world. In the United States alone, the average number of fatalities from 1977 to 2006 was 99 people per year. The ability to globally monitor flood events as they unfold makes it possible to assess their impacts more accurately, even when the floods occur in remote regions. Our case study is a flood event in Argentina that occurred during April 2013. It caused about 530.4 million pesos ($104 million) in damages and at least 51 deaths. The severity of flood events depends mainly on precipitation and soil moisture. Scientists at NOAA-CREST have recently developed a daily global flood observation system in an effort to monitor global floods accurately. The developed flood observation system is based on the concept of the Soil Wetness Variation Index (SWVI). The lag correlation between precipitation and SWVI is studied. Precipitation data were obtained from the Tropical Rainfall Measuring Mission (TRMM) 3B42v7, a TRMM-adjusted merged-infrared (IR) precipitation data set. The SWVI data used for flood observation are calculated from brightness temperature data collected by the Advanced Technology Microwave Sounder (ATMS) on board Suomi-NPP. Results indicate a lag correlation of 2 to 6 days.
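
    A lag correlation of the kind reported can be computed directly: correlate the precipitation series with the SWVI series shifted by 0..N days and look for the peak. The sketch below uses synthetic daily series in which SWVI responds to rain with a four-day delay.

    import numpy as np

    def lag_correlation(precip, swvi, max_lag=10):
        """Pearson correlation with SWVI lagged 'lag' days behind rain."""
        out = {}
        for lag in range(max_lag + 1):
            a = precip if lag == 0 else precip[:-lag]
            b = swvi if lag == 0 else swvi[lag:]
            out[lag] = np.corrcoef(a, b)[0, 1]
        return out

    # Synthetic daily series with a built-in ~4-day soil wetness response.
    rng = np.random.default_rng(11)
    precip = rng.gamma(0.3, 8.0, 365)
    swvi = np.roll(precip, 4) * 0.05 + rng.normal(0.0, 0.2, 365)

    corr = lag_correlation(precip, swvi)
    best = max(corr, key=corr.get)
    print({k: round(v, 2) for k, v in corr.items()}, "| best lag:", best)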

  16. Safety of biologics approved for treating rheumatoid arthritis: analysis of spontaneous reports of adverse events.

    PubMed

    Mendes, Diogo; Alves, Carlos; Batel Marques, Francisco

    2013-08-01

    Despite the effectiveness of the biologics approved for the treatment of rheumatoid arthritis, they have been associated with serious adverse events (AEs). Biologics are used under the close supervision of health care professionals, who in Portugal are legally required to report AEs occurring during treatment. This study aims to investigate post-marketing safety monitoring data for biologics in Portugal by comparing the frequency of adverse events spontaneously reported between 2009 and 2011 with the frequency of such events in the summary of product characteristics (SPC) of each biologic. Sales data for biologics were obtained from IMS Health and converted into defined daily doses/1,000 inhabitants/day in order to estimate the proportion of the population treated. The frequency of AEs was estimated as the percentage of patients in whom an AE may have occurred. The use of each biologic was estimated at 1,439 patients/year for adalimumab, 1,944 patients/year for etanercept, and 3,211 patients/year for infliximab. A total of 992 AEs were reported: 207 for adalimumab, 199 for etanercept, and 586 for infliximab. Of the 515 different spontaneously reported AEs, 194 were included for comparison with the SPCs. Of those, 31 (16.0%) were similarly frequent, and 163 (84.0%) occurred less frequently than in the SPC data. These results suggest insufficient post-marketing safety monitoring of biologics in Portugal. PMID:23604594

  18. A European digital accelerometric database: statistical analysis of engineering parameters of small to moderate magnitude events

    NASA Astrophysics Data System (ADS)

    Oliveira, Carlos S.; Gassol, Gerard; Goula, Xavier; Susagna, Teresa

    2014-12-01

    During the NERIES Project, an accelerometric database containing European digital information was developed. Besides event and station metadata, ground motion parameters computed in a homogeneous manner were assembled: PGA, PGV, AI, TD, CAV, HI and PSV(f, 5%) (19,961 components, 2629 events, 547 stations). Merging small and moderate magnitude events produced a unique database capable of providing important information such as: (i) Correlations between several ground motion parameters follow trends analogous to previous worldwide datasets, with slight corrections. (ii) Although PGA attenuation with distance shows great uncertainties, four recent GMPEs recommended for Europe fit the central 50% data interval quite well for the distance range 10 < R < 200 km; outside these distances, they do not fit. (iii) Soil amplification ratios indicate that weak motion (low magnitudes and larger distances) shows larger amplification than strong motion (short distances and large magnitudes), as represented in UBC97 for the USA but not in EC8 for Europe. (iv) Average spectral shapes are smaller than in EC8. (v) Differences in amplification factors for PGA, PGV and HI for EC8 soil classes B and C, and differences in spectral shapes for these soil classes, indicate that the EC8 Type 2 S-coefficient should be frequency dependent, as in UBC97.

  19. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    NASA Technical Reports Server (NTRS)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
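
    The conjugate form of this update is compact enough to sketch. Below, a beta prior built from analog-population counts is updated with newly observed data; all counts are hypothetical, for illustration only, and the per-person-year rate is treated as a binomial probability for simplicity.

    from scipy import stats

    # Hypothetical counts (not IMM data): events per unit of exposure.
    prior_events, prior_exposure = 12, 1000  # analog population
    new_events, new_exposure = 1, 150        # newly observed data

    # Beta prior from the analog counts; conjugate beta-binomial update.
    prior = stats.beta(prior_events + 1, prior_exposure - prior_events + 1)
    post = stats.beta(prior_events + new_events + 1,
                      (prior_exposure - prior_events)
                      + (new_exposure - new_events) + 1)

    lo, hi = post.interval(0.95)
    print(f"prior mean:     {prior.mean():.4f}")
    print(f"posterior mean: {post.mean():.4f}  (95% interval {lo:.4f}-{hi:.4f})")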

  20. Modeling and analysis of transient vehicle underhood thermo- hydrodynamic events using computational fluid dynamics and high performance computing.

    SciTech Connect

    Tentner, A.; Froehle, P.; Wang, C.; Nuclear Engineering Division

    2004-01-01

    This work has explored the preliminary design of a Computational Fluid Dynamics (CFD) tool for the analysis of transient vehicle underhood thermo-hydrodynamic events using high performance computing platforms. The goal of this tool will be to extend the capabilities of an existing established CFD code, STAR-CD, allowing the car manufacturers to analyze the impact of transient operational events on the underhood thermal management by exploiting the computational efficiency of modern high performance computing systems. In particular, the project has focused on the CFD modeling of the radiator behavior during a specified transient. The 3-D radiator calculations were performed using STAR-CD, which can perform both steady-state and transient calculations, on the cluster computer available at ANL in the Nuclear Engineering Division. Specified transient boundary conditions, based on experimental data provided by Adapco and DaimlerChrysler were used. The possibility of using STAR-CD in a transient mode for the entire period of time analyzed has been compared with other strategies which involve the use of STAR-CD in a steady-state mode at specified time intervals, while transient heat transfer calculations would be performed for the rest of the time. The results of these calculations have been compared with the experimental data provided by Adapco/DaimlerChrysler and recommendations for future development of an optimal strategy for the CFD modeling of transient thermo-hydrodynamic events have been made. The results of this work open the way for the development of a CFD tool for the transient analysis of underhood thermo-hydrodynamic events, which will allow the integrated transient thermal analysis of the entire cooling system, including both the engine block and the radiator, on high performance computing systems.

  2. Getting the right blood to the right patient: the contribution of near-miss event reporting and barrier analysis.

    PubMed

    Kaplan, H S

    2005-11-01

    Safety and reliability in blood transfusion are not static, but are dynamic non-events. Since performance deviations continually occur in complex systems, their detection and correction must be accomplished over and over again. Non-conformance must be detected early enough to allow for recovery or mitigation. Near-miss events afford early detection of possible system weaknesses and provide an early chance at correction. National event reporting systems, both voluntary and involuntary, have begun to include near-miss reporting in their classification schemes, raising awareness for their detection. MERS-TM is a voluntary safety reporting initiative in transfusion. Currently 22 hospitals submit reports anonymously to a central database which supports analysis of a hospital's own data and that of an aggregate database. The system encourages reporting of near-miss events, where the patient is protected from receiving an unsuitable or incorrect blood component by a planned or unplanned recovery step. MERS-TM data suggest approximately 90% of events are near-misses, with 10% caught after issue but before transfusion. Near-miss reporting may increase total reports ten-fold. The ratio of near-misses to events with harm is 339:1, consistent with other industries' ratio of 300:1, which has been proposed as a measure of reporting in event reporting systems. Use of a risk matrix and an event's relation to protective barriers allow prioritization of these events. Near-misses recovered by planned barriers occur ten times more frequently than unplanned recoveries. A bedside check of the patient's identity against that on the blood component is an essential, final barrier. How the typical two-person check is performed is critical. Even properly done, this check is ineffective against sampling and testing errors. Blood testing at the bedside just prior to transfusion minimizes the risk of such upstream events. However, even with simple and well designed devices, training may be a

  3. Forensic Analysis of Seismic Events in the Water; Submarines, Explosions and Impacts

    NASA Astrophysics Data System (ADS)

    Wallace, T. C.; Koper, K. D.

    2002-12-01

    Sudden pressure changes in a water column can generate significant seismic energy that may be recorded on land-based seismometers. In recent years a number of accidents and chemical explosions in the ocean or large lakes have been recorded at teleseismic distances, affording the opportunity to investigate the seismic source. The August 2000 sinking of the Russian attack submarine Kursk is the most famous example of an accident at sea in which seismology played a role, but many others exist, including: (1) the 1989 sinking and apparent implosion of the Soviet submarine Komsomolets, (2) the sudden sinking of a large oil drilling platform in the North Sea in 1991, (3) the 1972 explosion and sinking of a 700 ton cargo ship off the coast of southwestern England, and (4) the crash of Swissair Flight 111 off the coast of Nova Scotia in 1998. Enough empirical information has been collected to accurately characterize the size of most of these underwater (or water-surface) events. Further, many of the seismic signals contain a spectral scalloping that can be interpreted either as reverberation of seismic energy in the water column or as bubble pulses from underwater explosions. This information can be used to constrain the details of the seismic source. For example, the Kursk explosion had a pronounced spectral scalloping with a 1.45 Hz banding. Using a relationship between bubble pulse frequency, explosive yield and depth of detonation (developed and verified using a large population of chemical explosions in the 1940s), the Kursk detonation is estimated to have occurred at a depth of 85-100 m, with a yield of 3-5 tonnes equivalent TNT. This seismic result was confirmed almost exactly by the Russian government with the release of the official accident report on the Kursk in August 2002. Seismic events in the water column can be rich sources of information about the details of the source. Events as small as magnitude 1.2 are routinely recorded by
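
    The abstract does not spell out the empirical relationship it uses, but one commonly cited form of the bubble-pulse relation for underwater TNT explosions (often called the Rayleigh-Willis formula, with constants that vary slightly between references) reproduces the quoted Kursk numbers well:

    # Assumed form: T = 2.11 * W**(1/3) / (Z + 10.3)**(5/6), with T the first
    # bubble period (s), W the yield (kg TNT equivalent), Z the depth (m),
    # and 10.3 m approximately one atmosphere of water head.

    def bubble_period(yield_kg, depth_m):
        return 2.11 * yield_kg ** (1 / 3) / (depth_m + 10.3) ** (5 / 6)

    # Cross-check against the quoted 1.45 Hz banding for 3-5 t at 85-100 m.
    for w in (3000.0, 4000.0, 5000.0):
        for z in (85.0, 100.0):
            f = 1.0 / bubble_period(w, z)
            print(f"W = {w / 1000:.0f} t, Z = {z:.0f} m -> f = {f:.2f} Hz")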

  4. Effect of Statins on Venous Thromboembolic Events: A Meta-analysis of Published and Unpublished Evidence from Randomised Controlled Trials

    PubMed Central

    Rahimi, Kazem; Bhala, Neeraj; Kamphuisen, Pieter; Emberson, Jonathan; Biere-Rafi, Sara; Krane, Vera; Robertson, Michele; Wikstrand, John; McMurray, John

    2012-01-01

    Background It has been suggested that statins substantially reduce the risk of venous thromboembolic events. We sought to test this hypothesis by performing a meta-analysis of both published and unpublished results from randomised trials of statins. Methods and Findings We searched MEDLINE, EMBASE, and Cochrane CENTRAL up to March 2012 for randomised controlled trials comparing statin with no statin, or comparing high-dose versus standard-dose statin, with 100 or more randomised participants and at least 6 months' follow-up. Investigators were contacted for unpublished information about venous thromboembolic events during follow-up. Twenty-two trials of statin versus control (105,759 participants) and seven trials of an intensive versus a standard-dose statin regimen (40,594 participants) were included. In trials of statin versus control, allocation to statin therapy did not significantly reduce the risk of venous thromboembolic events (465 [0.9%] statin versus 521 [1.0%] control, odds ratio [OR] = 0.89, 95% CI 0.78–1.01, p = 0.08), with no evidence of heterogeneity between effects on deep vein thrombosis (266 versus 311, OR 0.85, 95% CI 0.72–1.01) and effects on pulmonary embolism (205 versus 222, OR 0.92, 95% CI 0.76–1.12). Exclusion of the trial result that provided the motivation for our meta-analysis (JUPITER) had little impact on the findings for venous thromboembolic events (431 [0.9%] versus 461 [1.0%], OR = 0.93 [95% CI 0.82–1.07], p = 0.32 among the other 21 trials). There was no evidence that higher-dose statin therapy reduced the risk of venous thromboembolic events compared with standard-dose statin therapy (198 [1.0%] versus 202 [1.0%], OR = 0.98, 95% CI 0.80–1.20, p = 0.87). Risk of bias overall was small, but a certain degree of effect underestimation due to random error cannot be ruled out. Conclusions The findings from this meta-analysis do not support the
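
    The headline odds ratio can be reproduced from the quoted event counts. The sketch below assumes, purely for illustration, that the 105,759 participants were split evenly between arms; the paper's actual analysis pools trial-level estimates rather than raw counts.

    import math

    # Event counts quoted above: 465 vs 521 venous thromboembolic events.
    a, b = 465, 521
    n1 = n2 = 105759 / 2  # assumed equal arm sizes (illustration only)

    odds_ratio = (a / (n1 - a)) / (b / (n2 - b))
    se_log_or = math.sqrt(1/a + 1/(n1 - a) + 1/b + 1/(n2 - b))
    lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")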

  5. Joint Utility of Event-Dependent and Environmental Crime Analysis Techniques for Violent Crime Forecasting

    ERIC Educational Resources Information Center

    Caplan, Joel M.; Kennedy, Leslie W.; Piza, Eric L.

    2013-01-01

    Violent crime incidents occurring in Irvington, New Jersey, in 2007 and 2008 are used to assess the joint analytical capabilities of point pattern analysis, hotspot mapping, near-repeat analysis, and risk terrain modeling. One approach to crime analysis suggests that the best way to predict future crime occurrence is to use past behavior, such as…

  6. Remote sensing analysis of the Tiber River sediment plume (Tyrrhenian Sea): spectral signature of erratic vs. persistent events

    NASA Astrophysics Data System (ADS)

    Falcini, Federico; Di Cicco, Annalisa; Pitarch, Jaime; Marullo, Salvatore; Colella, Simone; Volpe, Gianluca; Nardin, William; Margiotta, Francesca; Santoleri, Rosalia

    2016-04-01

    During the last decade, several regions along the western Tyrrhenian coast have been dramatically affected by intense river runoff, which delivered a significant amount of sediment offshore and alongshore. A crucial question that coastal geomorphologists and marine scientists need to face concerns the fate and impact of this impulsive sediment load, especially with respect to the historical trend, seasonal variability, and persistent events. A satellite-based analysis of these sediment discharges is a key ingredient for such a study, since it provides the primary dataset for the recognition of coastal patterns of Total Suspended Matter (TSM) that may reflect erosional or depositional processes along the coasts. To this end, we developed and implemented a regional TSM product from remote sensing, calibrated and validated with in situ measurements collected in the Tyrrhenian Sea. We discuss the spatial patterns and spectral signature of the TSM observed during the 2012 high-discharge event of the Tiber River. Our analysis gives some insight into the main differences in geomorphological impact between erratic and persistent events.

  7. Synoptic-mesoscale analysis and numerical modeling of a tornado event on 12 February 2010 in northern Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, I. T.; Nastos, P. T.; Pytharoulis, I.

    2011-07-01

    Tornadoes are violent convective weather phenomena, with their maximum frequency over Greece during the cold period (autumn, winter). This study analyzes the tornado event that occurred on 12 February 2010 near Vrastama village, in Chalkidiki prefecture, a non-urban area 45 km southeast of Thessaloniki in northern Greece. The tornado developed approximately between 17:10 and 17:35 UTC and was characterized as F2 (Fujita Scale). It caused damage to an industrial building and several olive-tree farms. A synoptic survey is presented along with satellite images, radar products and a vertical profile of the atmosphere. Additionally, the nonhydrostatic WRF-ARW atmospheric numerical model (version 3.2.0) was utilized in analysis and forecast mode at very high horizontal resolution (1.333 km × 1.333 km) in order to represent the ambient atmospheric conditions. A comparison of statistical errors between WRF-ARW forecasts and the ECMWF analysis is presented, accompanied by LGTS 12:00 UTC soundings (Thessaloniki Airport) and forecast soundings in order to verify the WRF-ARW model. A comparison between WRF-ARW and ECMWF thermodynamic indices is also presented. The high-resolution WRF-ARW model simulated this severe convective event with significant accuracy at a lead time of 18 h.

  8. A user's guide to GAETR: Sandia's "Graphical Analysis of Event Trees" software

    SciTech Connect

    Hays, K.M.

    1997-09-01

This document is a reference guide for GAETR, Graphical Analysis of Event Trees, a software package developed at Sandia National Laboratories. GAETR may be used as a stand-alone code or as a module in the ARRAMIS™ risk and reliability code suite. GAETR is designed to graphically create event trees and plot SETAC (Sandia Event Tree Analysis Code) output on IBM-compatible personal computers using the Microsoft® Windows™ 95/NT operating environment. This manual explains the fundamentals of creating an event tree, including formatting, saving sequence information, printing, editing, and importing graphics to other software packages.

  9. Event based analysis of chlorothalonil concentrations following application to managed turf.

    PubMed

    King, Kevin W; Balogh, James C

    2013-03-01

Chlorothalonil concentrations exceeding acute toxicity levels for certain organisms have been measured in surface water discharge events from managed turf watersheds. The duration of exceedance and the timing of these events relative to precipitation/runoff and time since application, however, have not been explored. Chlorothalonil concentrations were measured in discharge waters draining a managed turf watershed in Duluth, Minnesota, USA, between 2003 and 2009. The median chlorothalonil concentration was 0.58 µg/L. Approximately 2% of all measured concentrations exceeded the 7.6 µg/L median lethal concentration (LC50) acute toxicity level for rainbow trout. One-twentieth of the LC50, equivalent to the level of concern (0.38 µg/L) for endangered species, was exceeded 31% of the time during the present study. The concentrations that exceeded the LC50 threshold were associated with eight rainfall/runoff events. Low-dose exposures are a more important biological concern than acute occurrences. Exceedance concentrations associated with acute effects were significantly (p < 0.05) correlated with time since application and were measured only in the fall following extensive application. A conflict exists between the transportability of chlorothalonil as suggested by its chemical properties and the data collected in the present study. With respect to course-wide application on golf courses, delaying application until after the major autumn rainfall period but before the first snow cover is recommended to reduce the occurrence of chlorothalonil concentrations that exceed toxic levels associated with acute and chronic levels of concern. PMID:23233324

  10. Analysis of the March 30, 2011 Hail Event at Shuttle Launch Pad 39A

    NASA Technical Reports Server (NTRS)

    Lane, John E.; Doesken, Nolan J.; Kasparis, Takis C.; Sharp, David W.

    2012-01-01

The Kennedy Space Center (KSC) Hail Monitor System, a joint effort of the NASA KSC Physics Lab and the KSC Engineering Services Contract (ESC) Applied Technology Lab, was first deployed for operational testing in the fall of 2006. Volunteers from the Community Collaborative Rain, Hail, and Snow Network (CoCoRaHS), in conjunction with Colorado State University, have been instrumental in validation testing, using duplicate hail monitor systems at sites in the hail-prone high plains of Colorado. The KSC Hail Monitor System (HMS), consisting of three stations positioned approximately 500 ft from the launch pad and forming an approximate equilateral triangle, was first deployed to Pad 39B in support of STS-115. Two months later, the HMS was deployed to Pad 39A in support of STS-116. During support of STS-117 in late February 2007, an unusually intense (by Florida standards) hail event occurred in the immediate vicinity of the exposed space shuttle and launch pad. Hail data from this event were collected by the HMS and analyzed. Support of STS-118 revealed another important application of the hail monitor system. Ground Instrumentation personnel check the hail monitors daily when a vehicle is on the launch pad, with special attention after any storm suspected of containing hail. If no hail is recorded by the HMS, the vehicle and pad inspection team has no need to conduct a thorough inspection of the vehicle immediately following a storm. On the afternoon of July 13, 2007, hail on the ground was reported by observers at the Vertical Assembly Building (VAB) and Launch Control Center (LCC), about three miles west of Pad 39A, as well as at several other locations at KSC. The HMS showed no impact detections, indicating that the shuttle had not been damaged by any of the numerous hail events that occurred that day.

  11. Quantitative analysis of long-period events recorded during hydrofracture experiments at Fenton Hill, New Mexico

    NASA Astrophysics Data System (ADS)

    Ferrazzini, Valerie; Chouet, Bernard; Fehler, Mike; Aki, Keiiti

    1990-12-01

A three-dimensional fluid-filled crack model recently developed by Chouet is used to reproduce and explain the spectral characteristics of different classes of long-period events recorded during a hydrofracture experiment conducted at Fenton Hill, New Mexico. We study the dependence of the far-field P-wave radiation, due to the vibration of the fluid-filled crack, on the model parameters. Those parameters are the properties of the fluid and solid, the crack dimensions, the area and location of the crack surface over which the excess pressure is applied, the time history of this excess pressure, and the station location. In this model, the resonance of the crack is sustained by a very slow and dispersive wave called the "crack wave". The phase velocity of the crack wave depends critically on the impedance contrast between fluid and solid and on the crack dimensions. We are able to fit the dominant features of the Fenton Hill data in the time and frequency domains and to draw inferences about the impedance contrast between the fluid and solid. The various classes of events observed can be modeled by a single crack over which the geometry of the applied excess pressure changes. The length, width, and thickness of the crack are estimated to be on the order of 3 m, 1 m, and 3 mm, respectively. The observed spectral roll-off can be explained by a ramp-function time dependence of the pressure transient. The rise time necessary to simulate the observed data varies between 2 and 4 ms, depending on the event considered. Assuming a source-receiver distance of 700 m, the displacement amplitudes of the data agree with the model's predictions if the excess pressure applied on the crack is on the order of 20 bars.

  12. Spatial analysis of a large magnitude erosion event following a Sierran wildfire.

    PubMed

    Carroll, Erin M; Miller, Wally W; Johnson, Dale W; Saito, Laurel; Qualls, Robert G; Walker, Roger F

    2007-01-01

High-intensity wildfire due to long-term fire suppression and heavy fuel buildup can render watersheds highly susceptible to wind and water erosion. The 2002 "Gondola" wildfire, located just southeast of Lake Tahoe, NV-CA, was followed 2 wk later by a severe hail and rainfall event that deposited 7.6 to 15.2 mm of precipitation over a 3 to 5 h period. This resulted in a substantial upland ash and sediment flow with subsequent down-gradient riparian zone deposition. Point measurements and ESRI ArcView were applied to spatially assess source-area contributions and the extent of ash and sediment flow deposition in the riparian zone. A deposition mass of 380 Mg of ash and sediment over 0.82 ha and pre-wildfire surface bulk density measurements were used in conjunction with two source-area assessments to estimate an average depth of 10.1 mm of surface material eroded from the upland source area. Compared to previous measurements of erosion during rainfall simulation studies, the erosion of 1800 to 6700 g m^-2 mm^-1 determined in this study was as much as four orders of magnitude larger. Wildfire, followed by the single event documented in this investigation, enhanced soil water repellency and contributed 17 to 67% of the reported 15 to 60 mm ky^-1 of non-glacial, baseline erosion occurring in mountainous, granitic terrain in the Sierra Nevada. High fuel loads now common to the Lake Tahoe Basin increase the risk that similar erosion events will become more commonplace, potentially contributing to the accelerated degradation of Lake Tahoe's water clarity. PMID:17526890
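
    The depth estimate above is mass-balance arithmetic: eroded depth equals deposited mass divided by bulk density times source area. A minimal sketch; the deposited mass is from the abstract, while the bulk density and source area below are hypothetical (the study's actual source-area assessments are not given here):

      # Deposited mass from the abstract; density and area are hypothetical.
      deposit_mass_kg = 380e3          # 380 Mg of ash and sediment
      bulk_density_kg_m3 = 1200.0      # hypothetical pre-fire surface value
      source_area_m2 = 3.1e4           # hypothetical ~3.1 ha source area

      depth_m = deposit_mass_kg / (bulk_density_kg_m3 * source_area_m2)
      print(f"average eroded depth ~ {depth_m * 1e3:.1f} mm")   # ~10 mm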

  13. Analysis of a Monsoon Flood Event Effect on Surface and Groundwater Interactions in a Regional Semiarid Watershed

    NASA Astrophysics Data System (ADS)

    Bowman, R. S.; Vivoni, E. R.; Wyckoff, R.; Jakubowski, R.; Richards, K.

    2004-12-01

Although sporadic and infrequent, flooding events in ephemeral watersheds are a critical component of the water, sediment, and biogeochemical cycles in arid and semiarid regions. In the Southwestern United States, intense thunderstorms during the summer monsoon season interact with landscapes characterized by topographic complexity and soils of low infiltration capacity to produce large-magnitude floods and flash floods. In this study, we examine the hydrometeorological conditions and hydrologic response of an extreme monsoon flood event in the Río Puerco watershed of north-central New Mexico and its downstream effects in the Río Grande, a major continental-scale river basin. The summer storm of 4-11 September 2003 generated flash flooding in headwater basins and river flooding extending through the semiarid basin and downstream into the Río Grande for several tens of kilometers. We characterize the hydrometeorological conditions prior to the flood event using precipitation estimates from rain gauge records, NEXRAD radar data, and synoptic weather conditions over the 18,000 km² Río Puerco basin. We then present the spatial and temporal variability in hydrologic response based on a set of nested stream gauges in river channels and irrigation canals, as well as a network of instrumented well transects installed along the Río Grande alluvial aquifer. Our analysis illustrates the propagation, dampening, and attenuation of a large monsoonal storm through a semiarid ephemeral tributary into a regional river system from both a surface and a groundwater hydrology perspective, including the water exchanges observed between the two systems. By estimating the frequency of the rainfall and flood event relative to the historical record and known shifts in climate regime, we discuss the importance of extreme flood events in semiarid tributary systems and their downstream effects on the surface and groundwater interactions of regional river basins.

  14. Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, Banafshe; Sabeur, Zoheir

    2013-04-01

Current early tsunami warnings can be issued upon the detection of a seismic event that may occur at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best-matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains difficult and challenging to obtain the rupture parameters of tsunamigenic earthquakes in real time and to simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of the relatively low amplitude of a tsunami signal in deep water and the frequent occurrence of background signals and noise produces a generally low signal-to-noise ratio for the tsunami signal, which in turn makes its detection difficult. To improve the accuracy and confidence of detection, we apply a re-identification framework in which a tsunamigenic signal is detected by scanning a network of water-level-sensing hydrodynamic stations. The aim is to re-identify the same signatures as the tsunami wave spatially propagates through the sensing network. Re-identification of the tsunamigenic signal is technically possible because the open-ocean tsunami signal conserves the birthmarks relating it to the source event. Besides supporting the initial detection and improving the confidence of detection, a re-identified signal indicates the spatial range of the signal and can thereby help identify background signals, such as wind waves, which do not have as large a spatial reach as tsunamis. In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been achieved using open data from NOAA with a recorded
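
    One simple way to re-identify a propagating signature across two stations is lagged cross-correlation of the de-meaned water-level series. This is a minimal sketch, not the authors' pipeline; the signal shape, noise level, and 120-sample lag below are hypothetical:

      import numpy as np

      def best_lag(a, b):
          # Lag (in samples) at which series b best matches series a,
          # via full cross-correlation of de-meaned water levels.
          a = a - a.mean()
          b = b - b.mean()
          corr = np.correlate(b, a, mode="full")
          lags = np.arange(-len(a) + 1, len(b))
          score = corr.max() / (np.linalg.norm(a) * np.linalg.norm(b))
          return lags[np.argmax(corr)], score

      # Hypothetical: the same low-amplitude pulse reaches a second
      # station 120 samples later, buried in background noise.
      rng = np.random.default_rng(1)
      t = np.arange(2000)
      pulse = np.exp(-((t - 500) / 60.0) ** 2)
      s1 = pulse + 0.2 * rng.normal(size=t.size)
      s2 = np.roll(pulse, 120) + 0.2 * rng.normal(size=t.size)

      lag, score = best_lag(s1, s2)
      print(lag, round(score, 2))   # lag ~ 120 -> consistent propagation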

  15. The Strong Wind event of 24th January 2009 in Catalonia: a social impact analysis

    NASA Astrophysics Data System (ADS)

    Amaro, J.; Aran, M.; Barberia, L.; Llasat, M. C.

    2009-09-01

Although strong winds are frequent in Catalonia, one of the events with the strongest impact in recent years occurred on 24 January 2009. An explosive cyclogenesis process took place in the Atlantic: pressure fell 30 hPa in less than 24 hours. The strong wind storm pounded northern Spain and the south of France, with some fatalities and important economic losses in these regions. Several automatic weather stations recorded wind gusts higher than 100 km/h in Catalonia. Emergency services received more than 20,000 calls in 24 hours, and there were 497 interventions in only 12 hours. As a consequence of fallen and uprooted trees, railway and road infrastructure was damaged, and more than 30,000 customers were without electricity for 24 hours. Unfortunately, there were a total of 6 fatalities: two caused by fallen trees and the others occurring when a sports centre collapsed over a group of children. In Spain, insurance policies cover damages due to strong winds when fixed thresholds are exceeded, and, according to Royal Decree 300/2004 of 20 February, extraordinary risks are assumed by the Consorcio de Compensación de Seguros. Subsequently, the Public Weather Service (PWS) saw an increase in the number of requests received from people affected by this event and from insurance companies seeking to establish whether damages qualified for indemnity. For example, during the first month after the event, the Servei Meteorològic de Catalunya (SMC) received more than 600 requests related to these damages alone (on average, the PWS of the SMC receives a total of 400 requests per month). Following the research started by the Social Impact Research Group of the MEDEX project, the number of requests received can serve as a good indicator of vulnerability to a meteorological risk. This study uses the information received by the PWS of the SMC during the six months after the event, according to the criteria and methodology established in Gayà et al. (2008). The objective is to compare the vulnerability with the

  16. Bayesian analysis of recurrent event with dependent termination: an application to a heart transplant study.

    PubMed

    Ouyang, Bichun; Sinha, Debajyoti; Slate, Elizabeth H; Van Bakel, Adrian B

    2013-07-10

For a heart transplant patient, the risk of graft rejection and the risk of death are likely to be associated. Two fully specified Bayesian models for recurrent events with dependent termination are applied to investigate the potential relationships between these two types of risk, as well as their association with risk factors. We focus in particular on the choice of priors, selection of the appropriate prediction model, and prediction methods for these two types of risk for an individual patient. Our prediction tools are easy to implement and can help physicians set biopsy schedules for heart transplant patients. PMID:23280968
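
    As a deliberately simplified illustration of Bayesian estimation of a patient's recurrent-event rate (this ignores the paper's dependent termination and covariates; the prior and patient history below are hypothetical), a conjugate gamma-Poisson update gives the posterior in closed form:

      from scipy.stats import gamma

      # Gamma(a0, b0) prior on a patient's rejection rate (events per
      # patient-year); with a Poisson count over follow-up, the posterior
      # is Gamma(a0 + events, b0 + years).
      a0, b0 = 1.0, 2.0        # hypothetical prior: mean 0.5 events/yr
      events, years = 3, 4.2   # hypothetical patient history

      a_post, b_post = a0 + events, b0 + years
      lo, hi = gamma.ppf([0.025, 0.975], a_post, scale=1.0 / b_post)
      print(f"posterior mean {a_post / b_post:.2f}/yr, "
            f"95% CrI ({lo:.2f}, {hi:.2f})")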

  17. Quasifree expansion picture of break-up events: An analysis of ionizing systems

    SciTech Connect

    Errea, L.F.; Mendez, L.; Pons, B.; Riera, A.; Sevila, I.

    2003-02-01

We derive some general characteristics of the wave function representing a break-up event in the asymptotic region. These characteristics have a strong bearing on the validity of some classical pictures and on the correlation between spatial and momentum variables that develops in the course of the dissociation process, and they impose stringent requirements on the basis sets employed to approximate the wave function. Although other calculations are mentioned to underline the generality of our reasoning, we restrict most of the presentation, and all of the illustrations, to the case of ionization.

  18. Kinematics from footprints: Analysis of a possible dinosaur predation event in the Cretaceous Era

    NASA Astrophysics Data System (ADS)

    Lee, Scott

    2008-10-01

Motivation is enhanced by challenging students with interesting and open-ended questions. In this talk, a methodology for studying the locomotion of extinct animals based on their footprint trackways is developed and applied to a possible predation event recorded in a Cretaceous Era deposit [J.O. Farlow, "Lower Cretaceous Dinosaur Tracks, Paluxy River Valley, Texas," South Central Geological Society of America, Baylor University, 1987]. Students usually love learning about dinosaurs, an unexpected treat in a physics class. This example can be used in the classroom to help build critical thinking skills as the students decide whether the evidence supports a predation scenario or not.
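
    The abstract does not spell out the kinematic method, but the standard classroom approach to trackway speeds is Alexander's (1976) empirical relation v = 0.25 g^0.5 SL^1.67 h^-1.17, with stride length SL and hip height h (often taken as roughly four times footprint length). A minimal sketch with hypothetical trackway measurements:

      G = 9.81  # m/s^2

      def alexander_speed(stride_m, hip_height_m):
          # Alexander (1976): v = 0.25 * g**0.5 * SL**1.67 * h**-1.17
          return 0.25 * G**0.5 * stride_m**1.67 * hip_height_m**-1.17

      # Hypothetical trackway: 0.5 m prints, 3.0 m stride,
      # hip height taken as ~4x footprint length.
      v = alexander_speed(3.0, 4.0 * 0.5)
      print(f"estimated speed ~ {v:.1f} m/s")   # ~2 m/s, a brisk walk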

  19. Markovian Statistical Data Analysis of Single-Event Upsets Triggered by High Intensity Neutrons

    NASA Technical Reports Server (NTRS)

    Lakdawala, Anushka V.; Zhang, Hong; Gonzalex, Oscar R.; Gray, W. Steven

    2006-01-01

This paper analyzes data from a single-event upset experiment conducted at the Los Alamos National Laboratory. Statistical tools, based on well-known χ² hypothesis-testing theory, are used to determine whether sequences of upsets can be modeled as a homogeneous Markov chain of a specific order. The experiment consisted of irradiating a new experimental flight control computer (FCC) with a high-intensity neutron beam while the FCC controlled a simulation of a Boeing 737. The analyzed data form a sequence of states indicating when the FCC is under an upset condition.
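
    A minimal sketch of the kind of χ² test involved (not the Laboratory's actual pipeline): testing order 0 against order 1 amounts to a χ² independence test on the table of consecutive-state pairs. The state sequence below is simulated with hypothetical transition probabilities:

      import numpy as np
      from scipy.stats import chi2_contingency

      def transition_counts(states, n_states):
          # Tally consecutive-state pairs (s_t, s_{t+1}).
          counts = np.zeros((n_states, n_states))
          for s, t in zip(states[:-1], states[1:]):
              counts[s, t] += 1
          return counts

      # Hypothetical upset-state sequence: 0 = nominal, 1 = upset;
      # upsets persist, so the chain is genuinely first order.
      rng = np.random.default_rng(2)
      seq = [0]
      for _ in range(5000):
          p_upset = 0.02 if seq[-1] == 0 else 0.70
          seq.append(int(rng.random() < p_upset))

      chi2, p, dof, _ = chi2_contingency(transition_counts(np.array(seq), 2))
      print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
      # A small p rejects order 0 (independence) in favor of order >= 1.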

  20. Time-to-event analysis as a framework for quantifying fish passage performance: Chapter 9.1

    USGS Publications Warehouse

    Castro-Santos, Theodore R.; Perry, Russell W.

    2012-01-01

Fish passage is the result of a sequence of processes whereby fish must approach, enter, and pass a structure. Each of these processes takes time, and fishway performance is best quantified in terms of the rates at which each process is completed. Optimal performance is achieved by maximizing the rates of approach, entry, and passage through safe and desirable routes. Sometimes, however, it is necessary to reduce rates of passage through less desirable routes in order to increase the proportion passing through the preferred route. The effectiveness of operational or structural modifications for achieving either of these goals is best quantified by applying time-to-event analysis, commonly known as survival analysis, to telemetry data. This set of techniques allows for accurate estimation of passage rates and of covariate effects on those rates. Importantly, it allows researchers to quantify rates that vary over time, as well as the effects of covariates that also vary over time. Finally, these methods are able to control for competing risks, i.e., the presence of alternate passage routes, failure to pass, or other fates that remove fish from the pool of candidates available to pass through a particular route. In this chapter, we present a model simulation of telemetered fish passing a hydroelectric dam and provide step-by-step guidance and rationales for performing time-to-event analysis on the resulting data. We demonstrate how this approach removes bias from performance estimates that can result from methods that focus only on the proportions passing each route. Time-to-event analysis, coupled with multinomial models for measuring survival, provides a comprehensive set of techniques for quantifying fish passage and a framework within which performance among different sites can be better understood.
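
    A minimal sketch of this kind of time-to-event model using the lifelines Python library (an assumption; the chapter does not prescribe software). The telemetry records are hypothetical, with fish that never pass right-censored at the end of monitoring:

      import pandas as pd
      from lifelines import CoxPHFitter

      # Hypothetical telemetry: time to passage (h), whether passage was
      # observed (0 = censored: failed to pass or took another route),
      # and one covariate (attraction flow).
      df = pd.DataFrame({
          "hours_to_pass": [2.1, 5.4, 8.0, 1.2, 8.0, 3.3, 6.7, 8.0],
          "passed":        [1,   1,   0,   1,   0,   1,   1,   0],
          "flow_cms":      [1.8, 1.1, 1.3, 2.0, 0.7, 0.8, 0.9, 0.5],
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="hours_to_pass", event_col="passed")
      cph.print_summary()  # hazard ratio: effect of flow on passage rate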

  1. Incidence of adverse events in paediatric procedural sedation in the emergency department: a systematic review and meta-analysis

    PubMed Central

    Bellolio, M Fernanda; Puls, Henrique A; Anderson, Jana L; Gilani, Waqas I; Murad, M Hassan; Barrionuevo, Patricia; Erwin, Patricia J; Wang, Zhen; Hess, Erik P

    2016-01-01

Objective and design: We conducted a systematic review and meta-analysis to evaluate the incidence of adverse events in the emergency department (ED) during procedural sedation in the paediatric population. Randomised controlled trials and observational studies from the past 10 years were included. We adhere to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Setting: ED. Participants: Children. Interventions: Procedural sedation. Outcomes: Adverse events such as vomiting, agitation, hypoxia and apnoea. Meta-analysis was performed with a random-effects model and reported as incidence rates with 95% CIs. Results: A total of 1177 studies were retrieved for screening and 258 were selected for full-text review. 41 studies reporting on 13 883 procedural sedations in 13 876 children (≤18 years) were included. The most common adverse events (all reported per 1000 sedations) were: vomiting 55.5 (CI 45.2 to 65.8), agitation 17.9 (CI 12.2 to 23.7), hypoxia 14.8 (CI 10.2 to 19.3) and apnoea 7.1 (CI 3.2 to 11.0). The need to intervene with either bag valve mask, oral airway or positive pressure ventilation occurred in 5.0 per 1000 sedations (CI 2.3 to 7.6). The incidences of severe respiratory events were: 34 cases of laryngospasm among 8687 sedations (2.9 per 1000 sedations, CI 1.1 to 4.7; absolute rate 3.9 per 1000 sedations), 4 intubations among 9136 sedations and 0 cases of aspiration among 3326 sedations. 33 of the 34 cases of laryngospasm occurred in patients who received ketamine. Conclusions: Serious adverse respiratory events are very rare in paediatric procedural sedation in the ED. Emesis and agitation are the most frequent adverse events. Hypoxia, a late indicator of respiratory depression, occurs in 1.5% of sedations. Laryngospasm, though rare, happens most frequently with ketamine. The results of this study provide quantitative risk estimates to facilitate shared decision-making, risk communication, informed consent and
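
    Random-effects pooling of per-study incidence rates is typically done with a DerSimonian-Laird estimator. A minimal sketch; the five-study rates and variances below are hypothetical, not the paper's data:

      import numpy as np

      def dersimonian_laird(y, v):
          # Random-effects pooled estimate from per-study effects y and
          # within-study variances v, with DerSimonian-Laird tau^2.
          w = 1.0 / v
          fixed = np.sum(w * y) / np.sum(w)
          q = np.sum(w * (y - fixed) ** 2)
          c = np.sum(w) - np.sum(w**2) / np.sum(w)
          tau2 = max(0.0, (q - (len(y) - 1)) / c)
          w_re = 1.0 / (v + tau2)
          pooled = np.sum(w_re * y) / np.sum(w_re)
          se = np.sqrt(1.0 / np.sum(w_re))
          return pooled, pooled - 1.96 * se, pooled + 1.96 * se

      # Hypothetical per-study vomiting rates (per 1000 sedations)
      # and their within-study variances.
      rates = np.array([48.0, 62.0, 51.0, 70.0, 44.0])
      variances = np.array([25.0, 40.0, 16.0, 52.0, 30.0])
      print(dersimonian_laird(rates, variances))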

  2. Source space analysis of event-related dynamic reorganization of brain networks.

    PubMed

    Ioannides, Andreas A; Dimitriadis, Stavros I; Saridis, George A; Voultsidou, Marotesa; Poghosyan, Vahe; Liu, Lichan; Laskaris, Nikolaos A

    2012-01-01

Understanding how the brain works is nowadays synonymous with understanding how different parts of the brain work together, and with deriving mathematical descriptions of the functional connectivity patterns that can be objectively extracted from data of different neuroimaging techniques. In most cases, static networks are studied, often relying on resting-state recordings. Here, we present a quantitative study of dynamic reconfiguration of connectivity for event-related experiments. Our motivation is the development of a methodology that can be used for personalized monitoring of brain activity. In line with this motivation, we use data with visual stimuli from a typical subject who participated in different experiments that were previously analyzed with traditional methods. The earlier studies identified well-defined changes in specific brain areas at specific latencies related to attention, properties of stimuli, and task demands. Using a recently introduced methodology, we track the event-related changes in network organization at the source-space level, thus providing a more global and complete view of the stages of processing associated with the regional changes in activity. The results suggest time-evolving modularity as an additional brain code that is accessible with noninvasive means and hence available for personalized monitoring and clinical applications. PMID:23097678

  3. Source Space Analysis of Event-Related Dynamic Reorganization of Brain Networks

    PubMed Central

    Ioannides, Andreas A.; Dimitriadis, Stavros I.; Saridis, George A.; Voultsidou, Marotesa; Poghosyan, Vahe; Liu, Lichan; Laskaris, Nikolaos A.

    2012-01-01

Understanding how the brain works is nowadays synonymous with understanding how different parts of the brain work together, and with deriving mathematical descriptions of the functional connectivity patterns that can be objectively extracted from data of different neuroimaging techniques. In most cases, static networks are studied, often relying on resting-state recordings. Here, we present a quantitative study of dynamic reconfiguration of connectivity for event-related experiments. Our motivation is the development of a methodology that can be used for personalized monitoring of brain activity. In line with this motivation, we use data with visual stimuli from a typical subject who participated in different experiments that were previously analyzed with traditional methods. The earlier studies identified well-defined changes in specific brain areas at specific latencies related to attention, properties of stimuli, and task demands. Using a recently introduced methodology, we track the event-related changes in network organization at the source-space level, thus providing a more global and complete view of the stages of processing associated with the regional changes in activity. The results suggest time-evolving modularity as an additional brain code that is accessible with noninvasive means and hence available for personalized monitoring and clinical applications. PMID:23097678

  4. An analysis of strong wind events simulated in a GCM near Casey in the Antarctic

    SciTech Connect

Murphy, B.F.; Simmonds, I.

    1993-02-01

Strong wind events occurring near Casey (Antarctica) in a long July GCM simulation have been studied to determine the relative roles played by the synoptic situation and the katabatic flow in producing these episodes. It was found that the events are associated with strong katabatic and strong gradient flow operating together. Both components increase threefold on average during these strong winds, and although the geostrophic flow is the stronger of the two, it rarely produces strong winds unless the katabatic flow also becomes stronger than its mean. The two wind components do not flow in the same direction; indeed, there is some cancellation between them, since the katabatic flow acts in a predominantly downslope direction, while the geostrophic wind acts across slope. The stronger geostrophic flow is associated with higher-than-average pressures over the continent, the approach of a strong cyclonic system toward the coast, and a blocking system downstream. The anomalous synoptic patterns leading up to these events display a strong wavenumber-4 structure. The very strong katabatic flow appears to be related to a supply of cold air produced inland of Casey by stronger-than-average surface temperature inversions a few days before the strong winds occur. The acceleration of this negatively buoyant air mass down the steep ice-sheet escarpment results in strong katabatic flow near the coast. 24 refs., 11 figs.

  5. Words Analysis of Online Chinese News Headlines about Trending Events: A Complex Network Perspective

    PubMed Central

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

Because the volume of information available online is growing at breakneck speed, keeping up with the meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies who must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headline keywords and word relationships in online Chinese news, using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and adjacency relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we develop an integrated mechanism to analyze the features of word networks based on news headlines, one that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly. PMID:25807376
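
    A minimal sketch of the adjacency-based word-network construction described here, using networkx (an assumption; the authors do not name their network software) and a few hypothetical pre-segmented English headlines standing in for Chinese segmentation output:

      import networkx as nx

      # Hypothetical pre-segmented headlines (the paper used Simple
      # Chinese Word Segmentation; any tokenizer yielding word lists works).
      headlines = [
          ["Bohai", "Bay", "oil", "spill", "cleanup"],
          ["oil", "spill", "response", "criticized"],
          ["Bohai", "cleanup", "continues"],
      ]

      g = nx.Graph()
      for words in headlines:
          for a, b in zip(words[:-1], words[1:]):   # adjacent words -> edge
              w = g.get_edge_data(a, b, {"weight": 0})["weight"]
              g.add_edge(a, b, weight=w + 1)

      # Keywords as high-degree nodes; density as a whole-network feature.
      print(sorted(g.degree, key=lambda nd: nd[1], reverse=True)[:3])
      print(round(nx.density(g), 3))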

  6. Analysis of the 4-year IceCube high-energy starting events

    NASA Astrophysics Data System (ADS)

    Vincent, Aaron C.; Palomares-Ruiz, Sergio; Mena, Olga

    2016-07-01

After four years of data taking, the IceCube neutrino telescope has detected 54 high-energy starting events (HESE, or contained-vertex events) with deposited energies above 20 TeV. They represent the first detection of high-energy extraterrestrial neutrinos and, therefore, the first step in neutrino astronomy. To study the energy, flavor, and isotropy of the astrophysical neutrino flux arriving at Earth, we perform several analyses over two deposited-energy intervals, [10 TeV-10 PeV] and [60 TeV-10 PeV]. We first consider an isotropic unbroken power-law spectrum and constrain its shape, normalization, and flavor composition. Our results are in agreement with the preliminary IceCube results, although we obtain a slightly softer spectrum. We also find that current data are not sensitive to a possible neutrino-antineutrino asymmetry in the astrophysical flux. Then, we show that although a two-component power-law model leads to a slightly better fit, it does not represent a significant improvement with respect to a single power-law flux. Finally, we analyze the possible existence of a north-south asymmetry, hinted at by the combination of the HESE sample with the throughgoing muon data. If we use only HESE data, the scarce statistics from the Northern Hemisphere do not allow us to reach a conclusive answer, which indicates that the HESE sample alone is not driving the potential north-south asymmetry.
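
    A toy version of fitting an unbroken power-law flux to binned event counts by maximizing a Poisson likelihood (the bin edges, counts, and constant-exposure assumption below are hypothetical; a real analysis folds in the full detector response):

      import numpy as np
      from scipy.optimize import minimize

      edges = np.logspace(np.log10(60.0), 4.0, 8)   # bin edges, TeV (60 TeV-10 PeV)
      obs = np.array([20, 12, 9, 5, 3, 1, 0])        # hypothetical counts

      def expected_counts(log_a, gamma):
          # Integral of A*(E/100 TeV)**-gamma over each bin, with a
          # constant exposure absorbed into A (assumes gamma != 1).
          a = np.exp(log_a)
          lo, hi = edges[:-1], edges[1:]
          return a * 100.0**gamma * (hi**(1 - gamma) - lo**(1 - gamma)) / (1 - gamma)

      def neg_loglike(params):
          mu = expected_counts(*params)
          return np.sum(mu - obs * np.log(mu + 1e-12))

      fit = minimize(neg_loglike, x0=[0.0, 2.5], method="Nelder-Mead")
      print("best-fit spectral index:", round(fit.x[1], 2))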

  7. Possible Detection of Volcanic Activity on Europa: Analysis of An Optical Transient Event

    NASA Astrophysics Data System (ADS)

    de La Fuente Marcos, R.; Nissar, A.

    2002-06-01

Europa's low crater density suggests that geological activity has continued to the present epoch, leading to the possibility that current resurfacing events might be detectable. CCD observations were carried out with an ST-6 camera at the 0.5 m Mons Cassegrain telescope (Izaña Observatory, Tenerife, Canary Islands, Spain) during the night of 2-3 October 1999. Our images show a transient bright feature on the Galilean satellite. These images are analyzed here with the purpose of understanding the nature of the transient phenomenon, as it could be the result of explosive venting on the surface of the Jovian satellite. For comparison, we use NASA Infrared Telescope Facility images of two Io hot spots taken on 12 October 1990. Although we mainly restrict our discussion to a possible eruptive origin of the observed spots, we also consider other alternative mechanisms able to produce bright events. In particular, an interaction between charged material ejected from Europa and the Jovian magnetosphere cannot be entirely ruled out. If confirmed, this result would lend support to the existence of active resurfacing on Europa.

  8. Words analysis of online Chinese news headlines about trending events: a complex network perspective.

    PubMed

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

Because the volume of information available online is growing at breakneck speed, keeping up with the meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies who must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headline keywords and word relationships in online Chinese news, using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and adjacency relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we develop an integrated mechanism to analyze the features of word networks based on news headlines, one that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly. PMID:25807376

  9. Landscape-Scale Analysis of Wetland Sediment Deposition from Four Tropical Cyclone Events

    PubMed Central

    Tweel, Andrew W.; Turner, R. Eugene

    2012-01-01

Hurricanes Katrina, Rita, Gustav, and Ike deposited large quantities of sediment on coastal wetlands after making landfall in the northern Gulf of Mexico. We sampled sediments deposited on the wetland surface throughout the entire Louisiana and Texas depositional surfaces of Hurricanes Katrina, Rita, and Gustav, and the Louisiana portion of Hurricane Ike. We used spatial interpolation to model the total amount and spatial distribution of inorganic sediment deposition from each storm. The sediment deposition on coastal wetlands was an estimated 68, 48, and 21 million metric tons from Hurricanes Katrina, Rita, and Gustav, respectively. The spatial distribution decreased in a similar manner with distance from the coast for all hurricanes, but the relationship with distance from the storm track was more variable between events. The southeast-facing Breton Sound estuary had significant storm-derived sediment deposition west of the storm track, whereas sediment deposition along the south-facing coastline occurred primarily east of the storm track. Sediment organic content, bulk density, and grain size also decreased significantly with distance from the coast, but were more variable with respect to distance from the track. On average, eighty percent of the mineral deposition occurred within 20 km of the coast, and 58% occurred within 50 km of the track. These results highlight an important link between tropical cyclone events and coastal wetland sedimentation, and are useful in identifying a more complete sediment budget for coastal wetland soils. PMID:23185635
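
    A minimal sketch of the interpolate-then-integrate step described here, using scipy's griddata (an assumption; the paper's interpolation method is not specified in this abstract, and all sites and values below are hypothetical):

      import numpy as np
      from scipy.interpolate import griddata

      # Hypothetical coring sites (km) with deposition decaying inland (kg/m^2).
      rng = np.random.default_rng(3)
      xy = rng.uniform(0.0, 50.0, size=(40, 2))
      dep = 8.0 * np.exp(-xy[:, 1] / 15.0) + rng.normal(0.0, 0.3, 40)

      # Interpolate onto a 0.5 km grid, then integrate for a total mass.
      gx, gy = np.meshgrid(np.linspace(0, 50, 101), np.linspace(0, 50, 101))
      grid = griddata(xy, dep, (gx, gy), method="linear")  # NaN outside hull

      cell_m2 = 500.0**2
      total_tons = np.nansum(grid) * cell_m2 / 1000.0      # kg -> metric tons
      print(f"interpolated total deposition ~ {total_tons:.3g} t")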

  10. Three-dimensional analysis of charging events on days 87 and 114, 1979, from SCATHA

    NASA Technical Reports Server (NTRS)

    Saflekos, N. A.; Tautz, M. F.; Rubin, A. G.; Hardy, D. A.; Mizera, P. F.; Feynman, J.

    1980-01-01

Angular distributions of ions and electrons from the Spacecraft Charging at High Altitudes (SCATHA) satellite were investigated for the floating potential and the differential charging of the spacecraft, as deduced from Liouville's theorem. The following was found: (1) short-time charging events on the spacecraft are associated with short-time increases in the intensity of 10 keV to 1 MeV electrons; (2) short-time changes of the spacecraft differential potential are associated with simultaneous short-time changes of the spacecraft floating potential; (3) solar UV intensities in penumbra anticorrelate with the spacecraft floating potentials; (4) NASCAP predicts correct forms of sunshade asymmetric surface potentials; (5) certain enhancements of the intensity of energetic ions diminish the absolute value of the spacecraft surface potential; (6) spacecraft discharging events shorter than 20 sec showed no change in the spectrum of the energetic plasma; (7) partial discharging of the spacecraft occurred upon entry into a magnetically depleted region; and (8) steady-state potentials and transient potentials of duration less than 30 seconds are simulated by the NASCAP code.

  11. [Analysis of the cardiac side effects of antipsychotics: Japanese Adverse Drug Event Report Database (JADER)].

    PubMed

    Ikeno, Takashi; Okumara, Yasuyuki; Kugiyama, Kiyotaka; Ito, Hiroto

    2013-08-01

We analyzed cases of side effects due to antipsychotics reported to Japan's Pharmaceuticals and Medical Devices Agency (PMDA) from January 2004 to December 2012. We used the Japanese Adverse Drug Event Report Database (JADER) and analyzed 136 of 216,945 cases using the defined terms. We also checked the cardiac adverse effects listed in the package inserts of the antipsychotics involved. We found cases of IKr blockade resulting in sudden death (49 cases), electrocardiogram QT prolongation (29 cases), torsade de pointes (TdP, 19 cases), and ventricular fibrillation (VF, 10 cases). M2 receptor blockade was implicated in tachycardia (8 cases) and sinus tachycardia (3 cases). Calmodulin blockade was involved in reported cardiomyopathy (3 cases) and myocarditis (1 case). Multiple adverse events were reported simultaneously in 14 cases. Our search of package inserts revealed warnings regarding electrocardiogram QT prolongation (24 drugs), tachycardia (23), sudden death (18), TdP (14), VF (3), myocarditis (1) and cardiomyopathy (1). We suggest that when an antipsychotic is prescribed, the patient should be monitored regularly with ECG, blood tests, and/or biochemical tests to avoid adverse cardiac effects. PMID:25069255

  12. Climate change impact and uncertainty analysis of extreme rainfall events in the Apalachicola River basin, Florida

    NASA Astrophysics Data System (ADS)

    Wang, Dingbao; Hagen, Scott C.; Alizad, Karim

    2013-02-01

Climate change impact on rainfall intensity-duration-frequency (IDF) curves in the Apalachicola River basin (Florida Panhandle coast) is assessed using an ensemble of regional climate models (RCMs) obtained from the North American Regional Climate Change Assessment Program. The suitability of seven RCMs for simulating the temporal variation of rainfall at fine scales is assessed for the case study region. Two RCMs, HRM3-HADCM3 and RCM3-GFDL, are found to have good skill scores in generating high-intensity events in mid-afternoon (2:00-4:00 PM). These two RCMs are selected for assessing potential climate change impact on IDF curves. Two methods are used to bias-correct future rainfall IDF curves: a maximum-intensity percentile-based method, and sequential bias correction combined with the maximum-intensity percentile-based method. Based on the projection by HRM3-HADCM3, there is no significant change in rainfall intensity at the upstream and midstream stations but higher intensity at the downstream station. RCM3-GFDL projects increased rainfall intensity from upstream to downstream, particularly downstream. The potential temporal shift of extreme rainfall events, coupled with overall increased intensities, may exacerbate flood magnitudes and lead to increased sediment and nutrient loadings to the estuary, especially in light of sea level change.
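
    Percentile-based bias correction amounts to quantile mapping: each projected value is assigned the percentile it occupies in the historical model run and replaced by the observed value at that percentile. A minimal sketch (the distributions and sample sizes below are hypothetical and do not reproduce the study's two methods in detail):

      import numpy as np

      def quantile_map(model_hist, obs_hist, model_future):
          # Map future model values through the empirical quantiles of
          # the historical run onto the observed distribution.
          q = np.linspace(0.0, 1.0, 101)
          mq = np.quantile(model_hist, q)
          oq = np.quantile(obs_hist, q)
          p = np.interp(model_future, mq, q)   # percentile in model climate
          return np.interp(p, q, oq)           # observed value at that percentile

      # Hypothetical annual-maximum hourly rainfall samples (mm).
      rng = np.random.default_rng(4)
      obs = rng.gamma(4.0, 12.0, 300)
      mod = rng.gamma(4.0, 10.0, 300)   # model is too dry historically
      fut = rng.gamma(4.0, 11.5, 300)   # raw future projection
      print(np.quantile(quantile_map(mod, obs, fut), 0.99))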

  13. [Incidence rate of adverse reaction/event by Qingkailing injection: a Meta-analysis of single rate].

    PubMed

    Ai, Chun-ling; Xie, Yan-ming; Li, Ming-quan; Wang, Lian-xin; Liao, Xing

    2015-12-01

To systematically review the incidence of adverse drug reactions/events associated with Qingkailing injection, databases including PubMed, EMbase, the Cochrane Library, CNKI, VIP, WanFang Data and CBM were searched from inception to July 30, 2015. Two reviewers independently screened the literature according to the inclusion and exclusion criteria, extracted the data, and cross-checked it. Meta-analysis was then performed using the R 3.2.0 software, with subgroup sensitivity analyses based on age, mode of administration, observation time and study quality. Sixty-three studies involving 9,793 patients given Qingkailing injection were included, with 367 cases of adverse reactions/events reported in total. The incidence of adverse reactions affecting skin and mucosa was 2% [95% CI (0.02; 0.03)]; of digestive system adverse reactions, 6% [95% CI (0.05; 0.07)]; and of injection site adverse reactions, 4% [95% CI (0.02; 0.07)]. For digestive system adverse reactions/events, the main type reported, the incidences in children and adults were 4.6% [0.0211; 0.0977] and 6.9% [0.0535; 0.0898], respectively. For adverse reactions/events mainly presenting as skin and mucous membrane damage, the incidences for observation times > 7 days and ≤ 7 days were 3% [0.0129; 0.0683] and 1.9% [0.0078; 0.0461], respectively. Subgroup analysis showed that, across types of adverse reactions, the incidence with combination therapy was higher than with the single drug, and the difference was statistically significant (P < 0.05). This study suggests that factors such as drug combination and age influence the occurrence of adverse reactions/events and that these influencing factors vary across populations. Clinicians should therefore use Qingkailing injection rationally, exercise special care with children and the elderly, and implement individualized medication. PMID:27245021

  14. Perioperative outcomes and adverse events of minimally invasive versus open posterior lumbar fusion: meta-analysis and systematic review.

    PubMed

    Goldstein, Christina L; Macwan, Kevin; Sundararajan, Kala; Rampersaud, Y Raja

    2016-03-01

OBJECT: The objective of this study was to determine the clinical comparative effectiveness and adverse event rates of posterior minimally invasive surgery (MIS) compared with open transforaminal or posterior lumbar interbody fusion (TLIF/PLIF). METHODS: A systematic review of the Medline, EMBASE, PubMed, Web of Science, and Cochrane databases was performed. A hand search of reference lists was conducted. Studies were reviewed by 2 independent assessors to identify randomized controlled trials (RCTs) or comparative cohort studies including at least 10 patients undergoing MIS or open TLIF/PLIF for degenerative lumbar spinal disorders and reporting at least 1 of the following: clinical outcome measure, perioperative clinical or process measure, radiographic outcome, or adverse events. Study quality was assessed using the Grades of Recommendation, Assessment, Development, and Evaluation (GRADE) protocol. When appropriate, a meta-analysis of outcomes data was conducted. RESULTS: The systematic review and reference list search identified 3301 articles, with 26 meeting study inclusion criteria. All studies, including 1 RCT, were of low or very low quality. No significant difference regarding age, sex, surgical levels, or diagnosis was identified between the 2 cohorts (856 patients in the MIS cohort, 806 patients in the open cohort). The meta-analysis revealed changes in the perioperative outcomes of mean estimated blood loss, time to ambulation, and length of stay favoring an MIS approach by 260 ml (p < 0.00001), 3.5 days (p = 0.0006), and 2.9 days (p < 0.00001), respectively. Operative time was not significantly different between the surgical techniques