Science.gov

Sample records for event analysis atheana

  1. The action characterization matrix: A link between HERA (Human Events Reference for ATHEANA) and ATHEANA (a technique for human error analysis)

    SciTech Connect

    Hahn, H.A.

    1997-12-22

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. ATHEANA is being developed in the context of nuclear power plant (NPP) PRAs, and much of the language used to describe the method and provide examples of its application is specific to that industry. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. Los Alamos National Laboratory's (LANL) Human Factors Group has recently joined the ATHEANA project team; LANL is responsible for further developing the database structure and for analyzing additional exemplar operational events for entry into the database. The Action Characterization Matrix (ACM) is conceived as a bridge between the HERA database structure and ATHEANA. Specifically, the ACM allows each unsafe action or human failure event to be characterized according to its representation along each of six different dimensions: system status, initiator status, unsafe action mechanism, information processing stage, equipment/material conditions, and performance shaping factors. This report describes the development of the ACM and provides details on the structure and content of its dimensions.
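    The six ACM dimensions lend themselves to a simple record structure. The following is a hypothetical sketch (the field names paraphrase the report's dimensions; the category values are invented for illustration):

```python
from dataclasses import dataclass, asdict

# Hypothetical sketch of one Action Characterization Matrix entry: the six
# dimensions named in the report, with invented illustrative values.
@dataclass
class ACMEntry:
    system_status: str                 # e.g. "degraded"
    initiator_status: str              # e.g. "post-initiator"
    unsafe_action_mechanism: str       # e.g. "slip"
    info_processing_stage: str         # e.g. "situation assessment"
    equipment_conditions: str          # e.g. "misleading indication"
    performance_shaping_factors: str   # e.g. "time pressure"

entry = ACMEntry("degraded", "post-initiator", "slip",
                 "situation assessment", "misleading indication",
                 "time pressure")
print(sorted(asdict(entry)))  # the six dimension names
```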

  2. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    SciTech Connect

    Forester, John A.; Bley, Dennis C.; Cooper, Susan E.; Kolaczkowski, Alan M.; Thompson, Catherine; Ramey-Smith, Ann; Wreathall, John

    2000-07-18

    This paper describes the most recent version of a human reliability analysis (HRA) method called "A Technique for Human Event Analysis" (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided and critical and unique aspects of the revised method are discussed.

  3. Trial application of a technique for human error analysis (ATHEANA)

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Parry, G.W.

    1996-10-01

    The new method for HRA, ATHEANA, has been developed based on a study of the operating history of serious accidents and an understanding of the reasons why people make errors. Previous publications associated with the project have dealt with the theoretical framework under which errors occur and the retrospective analysis of operational events. This is the first attempt to use ATHEANA in a prospective way, to select and evaluate human errors within the PSA context.

  4. Human Events Reference for ATHEANA (HERA) Database Description and Preliminary User's Manual

    SciTech Connect

    Auflick, J.L.

    1999-08-12

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database of analyzed operational events, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  5. Human events reference for ATHEANA (HERA) database description and preliminary user's manual

    SciTech Connect

    Auflick, J.L.; Hahn, H.A.; Pond, D.J.

    1998-05-27

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  6. Results of a nuclear power plant application of A New Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Whitehead, D.W.; Forester, J.A.; Bley, D.C.

    1998-03-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the success of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator on shift until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project.

  7. Philosophy of ATHEANA

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Thompson, C.M.; Whitehead, D.W.; Wreathall, J.

    1999-03-24

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  8. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    SciTech Connect

    Taylor, J.H.; Luckas, W.J.; Wreathall, J.

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has long-recognized limitations in the analysis of human actions that constrain the use of PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  9. ATHEANA: "a technique for human error analysis" entering the implementation phase

    SciTech Connect

    Taylor, J.; O'Hara, J.; Luckas, W.

    1997-02-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has long-recognized limitations in the analysis of human actions that constrain the use of PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled "Improved HRA Method Based on Operating Experience", is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97 ongoing).

  10. Dynamic Analysis of Event Histories.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; And Others

    1979-01-01

    Demonstrates the value of dynamic analysis of event-history data for the sociological study of change in categorical variables. An event history records dates of events that occur for some unit of analysis (i.e., an individual's marital or employment status, or outbreaks of riots or wars). (Author/AV)

  11. Concepts of event-by-event analysis

    SciTech Connect

    Stroebele, H.

    1995-07-15

    The particles observed in the final state of nuclear collisions can be divided into two classes: those which are susceptible to strong interactions and those which are not, like leptons and the photon. The bulk properties of the "matter" in the reaction zone may be read off the kinematical characteristics of the particles observable in the final state. These characteristics are strongly dependent on the last interaction these particles have undergone. In a densely populated reaction zone, strongly interacting particles will experience many collisions after they have been formed and before they emerge into the asymptotic final state. For the particles which are not sensitive to strong interactions, their formation is also their last interaction. Thus photons and leptons probe the period during which they are produced, whereas hadrons reflect the so-called freeze-out processes, which occur during the late stage in the evolution of the reaction when the population density becomes small and the mean free paths long. The disadvantage of the leptons and photons is their small production cross section; they cannot be used in an analysis of the characteristics of individual collision events, because the number of particles produced per event is too small. The hadrons, on the other hand, stem from the freeze-out period. Information from earlier periods requires multiparticle observables in the most general sense. It is one of the challenges of present-day high energy nuclear physics to establish and understand global observables which differentiate between mere hadronic scenarios, i.e., a superposition of hadronic interactions, and the formation of a partonic (short duration) steady state which can be considered a new state of matter, the Quark-Gluon Plasma.

  12. EVENT PLANNING USING FUNCTION ANALYSIS

    SciTech Connect

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  13. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design (Reference 8.32)). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  14. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
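    The core idea of BUS, reinterpreting Bayesian updating as a rejection-sampling (reliability-style) problem, can be shown on a toy case. This is an illustrative sketch under assumed numbers, not the paper's implementation: a standard-normal prior on a parameter, one noisy observation, and a rare event defined as the parameter exceeding a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Prior: theta ~ N(0, 1).  Data: one measurement y = 2.0 with unit noise,
# so the (unnormalized) likelihood is exp(-0.5 * (y - theta)^2).
y, n = 2.0, 200_000
theta = rng.standard_normal(n)
like = np.exp(-0.5 * (y - theta) ** 2)
accept = rng.random(n) < like / like.max()  # classical rejection step
post = theta[accept]                        # samples from the posterior

# Rare event: theta > 3 (prior probability about 1.3e-3).
p_prior = np.mean(theta > 3)
p_post = np.mean(post > 3)
print(p_prior, p_post)  # the observation y = 2 raises the event probability
```

In realistic settings the acceptance rate is far too low for plain rejection sampling, which is why the paper couples this formulation with FORM, importance sampling, or Subset Simulation.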

  15. Collecting operational event data for statistical analysis

    SciTech Connect

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis.
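    A minimal sketch of the estimates such data supports, with invented counts: an event count over an exposure time yields a Poisson rate, and a failure count over demands yields a per-demand probability.

```python
# Hypothetical operating data (invented numbers for illustration).
events, exposure_hours = 4, 12_500.0   # events over observed exposure time
failures, demands = 2, 480             # failures over counted demands

rate = events / exposure_hours         # Poisson rate MLE, events per hour
p_demand = failures / demands          # binomial per-demand probability MLE
print(f"rate = {rate:.2e}/h, p = {p_demand:.2e}/demand")
```

The report's emphasis on carefully defining the unit, the counted events, and the demand or exposure time is exactly what makes these denominators meaningful.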

  16. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  17. Event Reports Promoting Root Cause Analysis.

    PubMed

    Pandit, Swananda; Gong, Yang

    2016-01-01

    Improving health is the sole objective of medical care. Unfortunately, mishaps or patient safety events happen during the care. If the safety events were collected effectively, they would help identify patterns, underlying causes, and ultimately generate proactive and remedial solutions for prevention of recurrence. Based on the AHRQ Common Formats, we examine the quality of patient safety incident reports and describe the initial data requirement that can support and accelerate effective root cause analysis. The ultimate goal is to develop a knowledge base of patient safety events and their common solutions which can be readily available for sharing and learning. PMID:27332241

  18. Top Event Matrix Analysis Code System.

    2000-06-19

    Version 00 TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates.
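    The quantification step described here can be sketched in a few lines. This is not TEMAC itself, and the probabilities and cut sets are invented; it shows the standard minimal-cut-set upper bound on a top event, assuming independent basic events.

```python
# Invented basic event probabilities and minimal cut sets: top = A*B + C.
basic = {"A": 1e-3, "B": 5e-4, "C": 2e-3}
cut_sets = [("A", "B"), ("C",)]

def cut_prob(cs):
    # Probability of a cut set, assuming independent basic events.
    p = 1.0
    for e in cs:
        p *= basic[e]
    return p

# Min-cut-set upper bound: 1 - product over cut sets of (1 - P(cut)).
p_top = 1.0
for cs in cut_sets:
    p_top *= 1.0 - cut_prob(cs)
p_top = 1.0 - p_top
print(p_top)  # dominated by the single-event cut set C
```

Sensitivity of this estimate to each base event probability, which TEMAC also reports, follows by perturbing one entry of `basic` and recomputing.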

  19. Dynamic Event Tree Analysis Through RAVEN

    SciTech Connect

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, developed by INL as well. RAVEN performs two main tasks: 1) control logic driver for the new Thermo-Hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which the set of control logic laws (user defined) monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining-based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability.
As an example, the Dynamic PRA analysis, using Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.
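    The defining feature of a DET, branching the simulation at the time a physical setpoint is crossed rather than at predefined event-tree headings, can be shown with a toy model. Everything here is invented (the heat-up dynamics, the setpoint, the branch probabilities); it is a sketch of the idea, not RAVEN or RELAP-7.

```python
def simulate(t, temp, heat_on, prob, path, results, branched=False):
    """Advance a toy heat-up transient; branch when the setpoint is crossed."""
    dt, setpoint, t_end = 1.0, 100.0, 50.0
    while t < t_end:
        temp += 2.0 * dt if heat_on else -1.0 * dt
        t += dt
        if heat_on and temp >= setpoint and not branched:
            # Branch point: the relief system either works (assumed p=0.99)
            # or fails (p=0.01); each branch continues the same physics.
            simulate(t, temp, False, prob * 0.99,
                     path + ["relief-ok"], results, True)
            simulate(t, temp, True, prob * 0.01,
                     path + ["relief-fail"], results, True)
            return
    results.append((path, prob, temp))

results = []
simulate(0.0, 20.0, True, 1.0, [], results)
for path, p, temp in results:
    print(path, p, round(temp, 1))
```

Each terminal entry carries its branch-probability product and end state, which is exactly the information a DET scheduler accumulates across (many more) branch points.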

  20. Infrasound Event Analysis into the IDC Operations

    NASA Astrophysics Data System (ADS)

    Mialle, Pierrick; Bittner, Paulina; Brachet, Nicolas; Brown, David; Given, Jeffrey; Le Bras, Ronan; Coyne, John

    2010-05-01

    The first atmospheric event built only from infrasound arrivals was reported in the Reviewed Event Bulletin (REB) of the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO) in 2003. In the last decade, 42 infrasound stations from the International Monitoring System (IMS) have been installed and are transmitting data to the IDC. The growing amount of infrasound data and detections produced by the automatic system challenged the station and network processing at the IDC, which required the Organization to redesign the way infrasound data are processed. Each infrasound array is processed separately for signal detection using a progressive multi-channel correlation method (DFX-PMCC). For each detection, signal features - onset time, amplitude, frequency, duration, azimuth, phase velocity, F-statistics - are measured and used to identify a detection as infrasonic, seismic, or noise (including clutter). Infrasonic signals along with seismic and hydroacoustic signals are subsequently associated with Global Association software (GA) between stations to locate events. During detection and association phases, criteria are applied to eliminate clutter, identify signals of interest, and keep the number of automatic events containing infrasound detections to a manageable level for analyst review. The IDC has developed analysis and visualization tools specifically for infrasound review (e.g. Geotool-PMCC). The IDC has continued to build the Infrasound Reference Event Database (IRED) from observations on the IMS network. This database assists both the routine IDC infrasound analysis and analyst training as it reflects the global detection capability of the network, illustrates the spatial and temporal variability of the observed phenomena, and demonstrates the various origins of infragenic sources. Since 2007, the IDC has introduced new analyst procedures to review and add selected infrasound events to the REB. 
In early 2010, the IDC

  1. Human Reliability Analysis in the U.S. Nuclear Power Industry: A Comparison of Atomistic and Holistic Methods

    SciTech Connect

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-09-01

    A variety of methods have been developed to generate human error probabilities for use in the US nuclear power industry. When actual operations data are not available, it is necessary for an analyst to estimate these probabilities. Most approaches, including THERP, ASEP, SLIM-MAUD, and SPAR-H, feature an atomistic approach to characterizing and estimating error. The atomistic approach is based on the notion that events and their causes can be decomposed and individually quantified. In contrast, in the holistic approach, such as found in ATHEANA, the analysis centers on the entire event, which is typically quantified as an indivisible whole. The distinction between atomistic and holistic approaches is important in understanding the nature of human reliability analysis quantification and the utility and shortcomings associated with each approach.
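    The atomistic style of quantification can be illustrated in a few lines. The numbers below are invented and only in the spirit of SPAR-H-like methods (they are not that method's actual tables): a nominal human error probability is decomposed into a base value adjusted by multiplicative performance shaping factors.

```python
# Hedged sketch of atomistic HEP quantification (invented values).
nominal_hep = 1e-3                 # assumed nominal error probability
psf = {"available_time": 10.0,     # barely adequate time
       "stress": 2.0,              # high stress
       "experience": 0.5}          # well-trained crew

hep = nominal_hep
for multiplier in psf.values():
    hep *= multiplier
hep = min(hep, 1.0)                # a probability cannot exceed 1
print(hep)  # 0.01
```

The holistic alternative described for ATHEANA resists this kind of decomposition: the analyst judges a probability for the whole human failure event in its error-forcing context, so there is no corresponding formula to factor.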

  2. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models can be simplified to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete event systems are thus a pragmatic tool for modelling industrial systems. To capture auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the robot's timing; from transport and transmission times measured on the spot, graphics are obtained showing the average time for the transport activity for given sets of finished products.
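    The Petri-net mechanics underlying such models are compact: places hold tokens, and a transition fires when every input place holds enough tokens. The net below is invented (it is not the paper's robot model), sketching a two-transition work cycle.

```python
# Minimal Petri-net sketch: an invented "start work / finish work" net.
places = {"idle": 1, "part_ready": 1, "busy": 0, "done": 0}
transitions = {
    "start": ({"idle": 1, "part_ready": 1}, {"busy": 1}),
    "finish": ({"busy": 1}, {"done": 1, "idle": 1}),
}

def fire(name):
    """Fire a transition if enabled; return whether it fired."""
    pre, post = transitions[name]
    if all(places[p] >= n for p, n in pre.items()):
        for p, n in pre.items():
            places[p] -= n      # consume input tokens
        for p, n in post.items():
            places[p] += n      # produce output tokens
        return True
    return False

fire("start")
fire("finish")
print(places)  # {'idle': 1, 'part_ready': 0, 'busy': 0, 'done': 1}
```

Timed and hierarchical extensions attach durations to transitions and nest nets like this one inside higher-level places.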

  3. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
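    The tiered-cascade idea can be sketched without a power-flow solver. The network, flows, capacities, and uniform-redistribution rule below are all invented simplifications, not the paper's 328-bus SERC model: an initiating outage sheds its flow onto surviving lines, any line pushed past capacity trips in the next tier, and the process repeats.

```python
# Toy cascading-outage tiers (invented numbers and redistribution rule).
flows = {"L1": 80.0, "L2": 70.0, "L3": 60.0}
capacity = {"L1": 100.0, "L2": 100.0, "L3": 75.0}

def cascade(initiating):
    tiers, tripped = [[initiating]], {initiating}
    while tiers[-1]:
        shed = sum(flows[l] for l in tiers[-1])
        alive = [l for l in flows if l not in tripped]
        if not alive:
            break
        for l in alive:                       # uniform redistribution (toy)
            flows[l] += shed / len(alive)
        next_tier = [l for l in alive if flows[l] > capacity[l]]
        tripped.update(next_tier)
        tiers.append(next_tier)
    return tiers if tiers[-1] else tiers[:-1]

tiers = cascade("L1")
print(tiers)  # [['L1'], ['L2', 'L3']]
```

Ranking initiating events by the depth and size of the tiers they produce is, in miniature, the screening the paper performs on the full network.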

  4. [Dealing with competing events in survival analysis].

    PubMed

    Béchade, Clémence; Lobbedez, Thierry

    2015-04-01

    Survival analyses focus on the occurrence of an event of interest, in order to determine risk factors and estimate a risk. Competing events prevent the event of interest from being observed, and their presence can bias the risk estimate. The aim of this article is to explain why the Cox model is not appropriate when there are competing events, and to present the Fine and Gray model, which is suited to dealing with competing risks.
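    The bias the article describes can be shown numerically. The data below are invented (six subjects, no censoring): treating the competing event as censoring and reporting 1 minus Kaplan-Meier overestimates the risk of the event of interest, whereas the cumulative incidence (here simply the empirical proportion) does not.

```python
# (time, cause): cause 1 = event of interest, cause 2 = competing event.
data = [(1, 2), (2, 1), (3, 2), (4, 1), (5, 2), (6, 2)]
n = len(data)

# Cumulative incidence of cause 1 by end of follow-up (no censoring).
cif = sum(1 for _, c in data if c == 1) / n      # 2/6

# Naive 1 - Kaplan-Meier, wrongly censoring the competing events.
surv, at_risk = 1.0, n
for t, c in sorted(data):
    if c == 1:
        surv *= 1 - 1 / at_risk
    at_risk -= 1           # subject leaves the risk set either way
naive = 1 - surv
print(cif, naive)          # the naive estimate exceeds the incidence
```

This gap is why the Fine and Gray model works on the subdistribution hazard, whose cumulative quantity is the incidence, rather than the cause-specific hazard modeled by Cox.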

  5. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. A. Wasiolek

    2003-07-21

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis report (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  6. Sisyphus - An Event Log Analysis Toolset

    2004-09-01

    Event logs are a ubiquitous source of system feedback from computer systems, but they vary widely in format and can be extremely numerous, particularly from systems with many logging components. Inspection of these logs is fundamental to system debugging; an increased capability to quickly extract meaningful information will impact MTTR (mean time to repair) and may impact MTBF (mean time between failures). Sisyphus is a machine-learning analysis system whose goal is to enable content-novice analysts to efficiently understand evolving trends, identify anomalies, and investigate cause-effect hypotheses in large multiple-source log sets. The toolkit comprises a framework for utilizing the third-party frequent-itemset data mining tools Teiresias and SLCT, software to cluster messages according to time statistics, and an interactive results viewer.
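The template-mining idea behind tools like SLCT can be illustrated with a minimal sketch (Sisyphus itself wraps the third-party tools; the `mine_templates` function, its `support` parameter, and the sample log lines below are invented for illustration):

```python
from collections import Counter

def mine_templates(messages, support=2):
    """Crude SLCT-style template mining: a word appearing at a given
    position in at least `support` messages is kept; rarer words become
    '*' wildcards.  Messages are then grouped by resulting template."""
    tokenized = [m.split() for m in messages]
    # Count (position, word) frequencies across all messages.
    freq = Counter((i, w) for toks in tokenized for i, w in enumerate(toks))
    clusters = {}
    for toks in tokenized:
        template = tuple(w if freq[(i, w)] >= support else "*"
                         for i, w in enumerate(toks))
        clusters.setdefault(template, []).append(" ".join(toks))
    return clusters

logs = [
    "sshd login failed for user alice",
    "sshd login failed for user bob",
    "kernel panic at 0x0042",
]
clusters = mine_templates(logs)
```

The two sshd lines collapse into one template with the user name wildcarded, which is how frequent-itemset miners reduce huge log sets to a handful of message types.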

  7. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  8. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by the eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, as is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.
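The probabilistic-realization approach can be sketched generically (GENII-S itself is not reproduced here; the toy model, the uniform distributions, and the `propagate` helper below are all invented to illustrate propagating input-parameter uncertainty into an output distribution by repeated sampling):

```python
import random

def propagate(sample_inputs, model, n_realizations=1000, seed=42):
    """Propagate input-parameter uncertainty into a model output by
    repeated sampling; returns the sorted output distribution."""
    rng = random.Random(seed)
    return sorted(model(sample_inputs(rng)) for _ in range(n_realizations))

# Toy 'BDCF-like' model: dose per unit deposition = intake * dose coefficient.
outputs = propagate(
    sample_inputs=lambda rng: (rng.uniform(0.5, 1.5), rng.uniform(2.0, 4.0)),
    model=lambda p: p[0] * p[1],
)
median = outputs[len(outputs) // 2]
```

Percentiles of `outputs` then summarize the BDCF uncertainty, which is the kind of distributional result handed on to the TSPA.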

  9. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    With terrorism events occurring more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, the correlations among event items, and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and their results prove the feasibility of the methods.

  10. Second-order analysis of semiparametric recurrent event processes.

    PubMed

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value.
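A crude first diagnostic for the Poisson hypothesis, far simpler than the second-order procedures the paper proposes, is the coefficient of variation of inter-event times: for a homogeneous Poisson process the gaps are exponential, so the CV is near 1. The `interevent_cv` helper below is illustrative only, not the authors' method:

```python
import math

def interevent_cv(event_times):
    """Coefficient of variation of inter-event times.  CV ~ 1 is
    consistent with a homogeneous Poisson process; CV >> 1 suggests
    clustering, CV << 1 suggests regularity."""
    gaps = [t1 - t0 for t0, t1 in zip(event_times, event_times[1:])]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return math.sqrt(var) / mean

# Perfectly regular events have CV = 0, clearly rejecting Poisson.
regular = list(range(0, 100, 5))
cv_regular = interevent_cv(regular)
```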

  11. [Analysis of Spontaneously Reported Adverse Events].

    PubMed

    Nakamura, Mitsuhiro

    2016-01-01

    Observational studies are necessary for the evaluation of drug effectiveness in clinical practice. In recent years, the use of spontaneous reporting systems (SRS) for adverse drug reactions has increased, and they have become an important resource for regulatory science. SRS, among the largest and most well-known databases worldwide, are one of the primary tools used for postmarketing surveillance and pharmacovigilance. To analyze SRS data, the US Food and Drug Administration Adverse Event Reporting System (FAERS) and the Japanese Adverse Drug Event Report Database (JADER) are reviewed. Established pharmacovigilance algorithms, including the reporting odds ratio, were used for signal detection. An SRS is a passive reporting database and is therefore subject to numerous sources of selection bias, including overreporting, underreporting, and the lack of a denominator. Despite these inherent limitations, SRS databases are a rich resource whose mining provides a powerful means of identifying potential associations between drugs and their adverse effects. Our results, based on the evaluation of SRS databases, provide essential knowledge that could improve our understanding of clinical issues.
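The reporting odds ratio mentioned above has a standard form for a 2x2 drug-event contingency table; a minimal sketch with the usual Wald 95% confidence interval (the counts below are invented):

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR for a drug-event pair from a 2x2 spontaneous-report table:
    a = reports with drug and event, b = drug without the event,
    c = event without the drug,     d = neither.
    Returns (ROR, lower 95% CI bound, upper 95% CI bound)."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

ror, lo, hi = reporting_odds_ratio(20, 80, 100, 9800)
```

A common screening convention treats the pair as a signal when the lower CI bound exceeds 1, as it does for these invented counts.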

  12. Peak event analysis: a novel empirical method for the evaluation of elevated particulate events

    PubMed Central

    2013-01-01

    Background We report on a novel approach to the analysis of suspended particulate data in a rural setting in southern Ontario. Analyses of suspended particulate matter and associated air quality standards have conventionally focussed on 24-hour mean levels of total suspended particulates (TSP) and particulate matter <10 microns, <2.5 microns and <1 micron in diameter (PM10, PM2.5, PM1, respectively). Less emphasis has been placed on brief peaks in suspended particulate levels, which may pose a substantial nuisance, irritant, or health hazard. These events may also represent a common cause of public complaint and concern regarding air quality. Methods Measurements of TSP, PM10, PM2.5, and PM1 levels were taken using an automated device following local complaints of dusty conditions in rural south-central Ontario, Canada. The data consisted of 126,051 by-minute TSP, PM10, PM2.5, and PM1 measurements between May and August 2012. Two analyses were performed and compared. First, conventional descriptive statistics were computed by month for TSP, PM10, PM2.5, and PM1, including mean values and percentiles (70th, 90th, and 95th). Second, a novel graphical analysis method, using density curves and line plots, was conducted to examine peak events occurring at or above the 99th percentile of per-minute TSP readings. We refer to this method as “peak event analysis”. Findings of the novel method were compared with findings from the conventional approach. Results Conventional analyses revealed that mean levels of all categories of suspended particulates and suspended particulate diameter ratios conformed to existing air quality standards. Our novel methodology revealed extreme outlier events above the 99th percentile of readings, with peak PM10 and TSP levels over 20 and 100 times higher than the respective mean values. Peak event analysis revealed and described rare and extreme peak dust events that would not have been detected using conventional descriptive statistics
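The thresholding idea behind peak event analysis can be sketched as follows (the `peak_events` helper and the data are invented; the paper's actual method additionally uses density curves and line plots of the by-minute readings):

```python
def peak_events(readings, pct=99):
    """Return the empirical pct-th percentile threshold of a series of
    by-minute readings and the indices at or above it (the 'peaks')."""
    ordered = sorted(readings)
    idx = min(pct * len(ordered) // 100, len(ordered) - 1)
    threshold = ordered[idx]
    return threshold, [i for i, r in enumerate(readings) if r >= threshold]

# 99 quiet minutes plus one dust spike 100x the background level:
series = [10.0] * 99 + [1000.0]
threshold, peaks = peak_events(series)
```

The monthly mean of this invented series is about 20, well within a typical standard, yet the 99th-percentile analysis isolates the one extreme minute, which is exactly the phenomenon the conventional 24-hour statistics miss.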

  13. PLANETARY AND OTHER SHORT BINARY MICROLENSING EVENTS FROM THE MOA SHORT-EVENT ANALYSIS

    SciTech Connect

    Bennett, D. P.; Sumi, T.; Bond, I. A.; Ling, C. H.; Kamiya, K.; Abe, F.; Fukui, A.; Furusawa, K.; Itow, Y.; Masuda, K.; Matsubara, Y.; Miyake, N.; Muraki, Y.; Botzler, C. S.; Rattenbury, N. J.; Korpela, A. V.; Sullivan, D. J.; Kilmartin, P. M.; Ohnishi, K.; Saito, To.; Collaboration: MOA Collaboration; and others

    2012-10-01

    We present the analysis of four candidate short-duration binary microlensing events from the 2006-2007 MOA Project short-event analysis. These events were discovered as a by-product of an analysis designed to find short-timescale single-lens events that may be due to free-floating planets. Three of these events are determined to be microlensing events, while the fourth is most likely caused by stellar variability. For each of the three microlensing events, the signal is almost entirely due to a brief caustic feature with little or no lensing attributable mainly to the lens primary. One of these events, MOA-bin-1, is due to a planet, and it is the first example of a planetary event in which the stellar host is only detected through binary microlensing effects. The mass ratio and separation are q = (4.9 ± 1.4) × 10^-3 and s = 2.10 ± 0.05, respectively. A Bayesian analysis based on a standard Galactic model indicates that the planet, MOA-bin-1Lb, has a mass of m_p = 3.7 ± 2.1 M_Jup and orbits a star of M_* = 0.75 +0.33/-0.41 M_Sun at a semimajor axis of a = 8.3 +4.5/-2.7 AU. This is one of the most massive and widest-separation planets found by microlensing. The scarcity of such wide-separation planets also has implications for interpretation of the isolated planetary mass objects found by this analysis. If we assume that we have been able to detect wide-separation planets with an efficiency at least as high as that for isolated planets, then we can set limits on the distribution of planets in wide orbits. In particular, if the entire isolated planet sample found by Sumi et al. consists of planets bound in wide orbits around stars, we find that it is likely that the median orbital semimajor axis is >30 AU.

  14. Glaciological parameters of disruptive event analysis

    SciTech Connect

    Bull, C.

    1980-04-01

    The possibility of complete glaciation of the earth is small and probably need not be considered in the consequence analysis by the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program. However, within a few thousand years an ice sheet may well cover proposed waste disposal sites in Michigan. Those in the Gulf Coast region and New Mexico are unlikely to be ice covered. The probability of ice cover at Hanford in the next million years is finite, perhaps about 0.5. Sea level will fluctuate as a result of climatic changes. As ice sheets grow, sea level will fall. Melting of ice sheets will be accompanied by a rise in sea level. Within the present interglacial period there is a definite chance that the West Antarctic ice sheet will melt. Ice sheets are agents of erosion, and some estimates of the amount of material they erode have been made. As an average over the area glaciated by late Quaternary ice sheets, only a few tens of meters of erosion is indicated. There were perhaps 3 meters of erosion per glaciation cycle. Under glacial conditions the surface boundary conditions for ground water recharge will be appreciably changed. In future glaciations melt-water rivers generally will follow pre-existing river courses. Some salt dome sites in the Gulf Coast region could be susceptible to changes in the course of the Mississippi River. The New Mexico site, which is on a high plateau, seems to be immune from this type of problem. The Hanford Site is only a few miles from the Columbia River, and in the future, lateral erosion by the Columbia River could cause changes in its course. A prudent assumption in the AEGIS study is that the present interglacial will continue for only a limited period and that subsequently an ice sheet will form over North America. Other factors being equal, it seems unwise to site a nuclear waste repository (even at great depth) in an area likely to be glaciated.

  15. A reference manual for the Event Progression Analysis Code (EVNTRE)

    SciTech Connect

    Griesmeyer, J.M.; Smith, L.N.

    1989-09-01

    This document is a reference guide for the Event Progression Analysis (EVNTRE) code developed at Sandia National Laboratories. EVNTRE is designed to process the large accident progression event trees and associated files used in probabilistic risk analyses for nuclear power plants. However, the general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. The EVNTRE code efficiently processes large, complex event trees. It has the capability to assign probabilities to event tree branch points in several different ways, to classify pathways or outcomes into user-specified groupings, and to sample input distributions of probabilities and parameters.
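The core of such event-tree processing can be sketched as exhaustive path enumeration (a simplification: EVNTRE also supports conditional branch probabilities, outcome classification, and sampling, and the two-question tree below is invented):

```python
from itertools import product

def enumerate_paths(questions):
    """Enumerate all pathways through a simple accident-progression
    event tree given as a list of {branch_name: probability} dicts,
    one dict per top event, assuming independent branch points.
    Returns {path_tuple: path_probability}."""
    paths = {}
    for combo in product(*(q.items() for q in questions)):
        names = tuple(name for name, _ in combo)
        prob = 1.0
        for _, p in combo:
            prob *= p
        paths[names] = prob
    return paths

tree = [
    {"power-ok": 0.9, "power-fail": 0.1},
    {"cooling-ok": 0.95, "cooling-fail": 0.05},
]
paths = enumerate_paths(tree)
```

Real accident progression trees have dozens of questions, so the path count explodes combinatorially, which is why EVNTRE's efficiency and its user-specified outcome groupings matter.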

  16. A Distributed Processing and Analysis System for Heliophysic Events

    NASA Astrophysics Data System (ADS)

    Hurlburt, N.; Cheung, M.; Bose, P.

    2008-12-01

    With several Virtual Observatories now under active development, the time is ripe to consider how they will interact to enable integrated studies that span the full range of Heliophysics. We present a solution that builds upon components of the Heliophysics Event Knowledgebase (HEK) being developed for the Solar Dynamics Observatory and the Heliophysics Event List Manager (HELMS), recently selected as part of the NASA VxO program. A Heliophysics Event Analysis and Processing System (HEAPS) could increase the scientific productivity of Heliophysics data by increasing the visibility of relevant events contained within them while decreasing the incremental costs of incorporating more events in research studies. Here we present the relevant precursors to such a system and show how it could operate within the Heliophysics Data Environment.

  17. Contingency Horizon: on Private Events and the Analysis of Behavior.

    PubMed

    Leigland, Sam

    2014-05-01

    Skinner's radical behaviorism incorporates private events as biologically based phenomena that may play a functional role with respect to other (overt) behavioral phenomena. Skinner proposed four types of contingencies, here collectively termed the contingency horizon, which enable certain functional relations between private events and verbal behavior. The adequacy and necessity of this position has met renewed challenges from Rachlin's teleological behaviorism and Baum's molar behaviorism, both of which argue that all "mental" phenomena and terminology may be explained by overt behavior and environment-behavior contingencies extended in time. A number of lines of evidence are presented in making a case for the functional characteristics of private events, including published research from behavior analysis and general experimental psychology, as well as verbal behavior from a participant in the debate. An integrated perspective is offered that involves a multiscaled analysis of interacting public behaviors and private events. PMID:27274956

  18. Nonlinear Analysis for Event Forewarning (NLAfEW)

    2013-05-23

    The NLAfEW computer code analyses noisy, experimental data to forewarn of adverse events. The functionality of the analysis is as follows: it removes artifacts from the data, converts the continuous data values to discrete values, constructs time-delay embedding vectors, compares the unique nodes and links in one graph, and determines event forewarning on the basis of several successive occurrences of one (or more) of the dissimilarity measures above a threshold.
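The time-delay embedding step can be sketched directly (the `delay_embed` helper and its parameter values are illustrative; NLAfEW's actual dimension and lag choices are not documented here):

```python
def delay_embed(series, dim, lag):
    """Build time-delay embedding vectors
    x_i = (s_i, s_{i+lag}, ..., s_{i+(dim-1)*lag})
    from a scalar time series, one of the steps listed above before
    graph construction."""
    last = len(series) - (dim - 1) * lag
    return [tuple(series[i + j * lag] for j in range(dim))
            for i in range(last)]

vectors = delay_embed([0, 1, 2, 3, 4, 5], dim=3, lag=2)
```

On discretized data, each such vector becomes a graph node and each consecutive pair a link, so graph dissimilarity between time windows can then serve as the forewarning measure.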

  19. Depth and source mechanism estimation for special event analysis, event screening, and regional calibration

    SciTech Connect

    Goldstein, P; Dodge, D; Ichinose, Rodgers, A; Bhattacharyya, B; Leach, R

    1999-07-23

    We have summarized the advantages and disadvantages of a variety of techniques for depth and mechanism estimation and suggest that significant work remains to be done for events with magnitudes of interest for test ban monitoring. We also describe a new, waveform modeling-based tool for fast, accurate, high-resolution depth and mechanism estimation. Significant features of this tool include its speed and accuracy and its applicability at relatively high frequencies. These features allow a user to rapidly determine accurate, high-resolution depth estimates and constraints on source mechanism for relatively small magnitude (mb ~ 4.5) events. Based on the accuracy of depth estimates obtained with this tool, we conclude it is useful both for the analysis of unusual or suspect events and for event screening. We also find that this tool provides significant constraints on source mechanism and have used it to develop "ground-truth" estimates of depth and mechanism for a set of events in the Middle East and North Africa. These "ground-truth" depths and mechanisms should be useful for regional calibration.

  20. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    NASA Astrophysics Data System (ADS)

    Bouilloud, Ludovic; Delrieu, Guy; Boudevillain, Brice; Kirstetter, Pierre-Emmanuel

    2010-11-01

    A method to estimate rainfall from radar data for post-event analysis of flash-flood events has been developed within the EC-funded HYDRATE project. It follows a pragmatic approach including careful analysis of the observation conditions for the radar system(s) available for the considered case. Clutter and beam blockage are characterised by dry-weather observations and simulations based on a digital terrain model of the region of interest. The vertical profile of reflectivity (VPR) is either inferred from radar data if volume scanning data are available or simply defined using basic meteorological parameters (idealised VPR). Such information is then used to produce correction factor maps for each elevation angle to correct for range-dependent errors. In a second step, an effective Z-R relationship is optimised to remove the bias over the hit region. Due to limited data availability, the optimisation is carried out with reference to raingauge rain amounts measured at the event time scale. Sensitivity tests performed with two well-documented rain events show that a number of Z = aR^b relationships, organised along hyperbolic curves in the (a, b) parameter space, lead to optimum assessment results in terms of the Nash coefficient between the radar and raingauge estimates. A refined analysis of these equifinality patterns shows that the “total additive conditional bias” can be used to discriminate between the Nash-coefficient equifinal solutions. We observe that the optimisation results are sensitive to the VPR description and also that the Z-R optimisation procedure can largely compensate for range-dependent errors, although this shifts the optimal coefficients in the parameter space. The time-scale dependency of the equifinality patterns is significant; however, near-optimal Z-R relationships can be obtained at all time scales from the event time step optimisation.
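The bias-removal step in the Z-R optimisation has a closed form once the exponent b is held fixed: choose the prefactor a so that the event-total radar rainfall matches the raingauge total. A minimal sketch (the function names, the reflectivity values, and the gauge total below are invented; the paper optimises over both a and b):

```python
def optimize_prefactor(Z_lin, gauge_total, b=1.6):
    """For fixed exponent b in Z = a * R**b, pick the prefactor a that
    removes the bias, i.e. makes the event-total radar rainfall
    sum((Z/a)**(1/b)) equal the raingauge total.  Z_lin holds linear
    reflectivities (mm^6 m^-3), one per time step."""
    s = sum(z ** (1.0 / b) for z in Z_lin)
    return (s / gauge_total) ** b

def radar_rainfall(Z_lin, a, b=1.6):
    """Event-total rainfall implied by Z = a * R**b."""
    return sum((z / a) ** (1.0 / b) for z in Z_lin)

Z = [200.0, 300.0, 2000.0]               # hypothetical per-step reflectivities
a_opt = optimize_prefactor(Z, gauge_total=30.0)
total = radar_rainfall(Z, a_opt)         # matches the gauge total by design
```

Because many (a, b) pairs reproduce the same event total, sweeping b and re-solving for a traces out exactly the hyperbolic equifinality curves the abstract describes.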

  1. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
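A minimal precursor-coincidence-rate computation in the spirit of event coincidence analysis (simplified to instantaneous events with a single tolerance window, and with invented data) might look like:

```python
def precursor_coincidence_rate(series_a, series_b, delta_t):
    """Fraction of events in series_b preceded by at least one event in
    series_a within the window [t - delta_t, t].  Values near 1 suggest
    that a-events may act as triggers of b-events; significance would
    be assessed against a null model such as a Poisson process."""
    hits = sum(any(0 <= tb - ta <= delta_t for ta in series_a)
               for tb in series_b)
    return hits / len(series_b)

floods = [1, 5, 12]        # invented event times (e.g. weeks)
outbreaks = [2, 6, 20]
rate = precursor_coincidence_rate(floods, outbreaks, delta_t=2)
```

Here two of the three outbreak times fall within two time units of a preceding flood, so the rate is 2/3; the directionality test of the paper compares precursor and trigger rates computed in both directions.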

  2. External events analysis for the Savannah River Site K reactor

    SciTech Connect

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires, and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10^-4 per year, of which seismic events are the major contributor (1.2 × 10^-4 per year). Fire-initiated events contribute 1.4 × 10^-7 per year, tornados 5.8 × 10^-7 per year, dam failures 1.5 × 10^-6 per year, and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs.

  3. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    SciTech Connect

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K.

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. (c) 1999. The American Astronomical Society.

  4. Heterogeneity and event dependence in the analysis of sickness absence

    PubMed Central

    2013-01-01

    Background Sickness absence (SA) is an important social, economic and public health issue. Identifying and understanding the determinants, whether biological, regulatory, or health services-related, of variability in SA duration is essential for better management of SA. The conditional frailty model (CFM) is useful when repeated SA events occur within the same individual, as it allows simultaneous analysis of event dependence and heterogeneity due to unknown, unmeasured, or unmeasurable factors. However, its use may encounter computational limitations when applied to very large data sets, as may frequently occur in the analysis of SA duration. Methods To overcome the computational issue, we propose a Poisson-based conditional frailty model (CFPM) for repeated SA events that accounts for both event dependence and heterogeneity. To demonstrate the usefulness of the proposed model in the SA duration context, we used data from all non-work-related SA episodes that occurred in Catalonia (Spain) in 2007, initiated by either a diagnosis of neoplasm or mental and behavioral disorders. Results As expected, the CFPM results were very similar to those of the CFM for both diagnosis groups. The CPU time for the CFPM was substantially shorter than for the CFM. Conclusions The CFPM is a suitable alternative to the CFM in survival analysis with recurrent events, especially with large databases. PMID:24040880

  5. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    NASA Astrophysics Data System (ADS)

    Delrieu, G.; Bouilloud, L.; Boudevillain, B.; Kirstetter, P.-E.; Borga, M.

    2009-09-01

    This communication describes a methodology for radar rainfall estimation in the context of post-event analysis of flash-flood events, developed within the HYDRATE project. For such extreme events, some raingauge observations (operational, amateur) are available at the event time scale, while few raingauge time series are generally available at the hydrologic time steps. Radar data are therefore the only way to access the rainfall space-time organization, but the quality of the radar data may be highly variable as a function of (1) the relative locations of the event and the radar(s) and (2) the radar operating protocol(s) and maintenance. A positive point: heavy rainfall is associated with convection, implying better visibility and less bright band contamination compared with more common situations. In parallel with the development of a regionalized and adaptive radar data processing system (TRADHy; Delrieu et al. 2009), a pragmatic approach is proposed here to make the best use of the available radar and raingauge data for a given flash-flood event by: (1) identifying and removing residual ground clutter; (2) applying the "hydrologic visibility" concept (Pellarin et al. 2002) to correct for range-dependent errors (screening and VPR effects for non-attenuating wavelengths); (3) estimating an effective Z-R relationship through a radar-raingauge optimization approach to remove the mean field bias (Dinku et al. 2002). A sensitivity study, based on the high-quality volume radar datasets collected during two intense rainfall events of the Bollène 2002 experiment (Delrieu et al. 2009), is first proposed. Then the method is implemented for two other historical events that occurred in France (Avène 1997 and Aude 1999) with datasets of lesser quality. References: Delrieu, G., B. Boudevillain, J. Nicol, B. Chapon, P.-E. Kirstetter, H. Andrieu, and D. Faure, 2009: Bollène 2002 experiment: radar rainfall estimation in the Cévennes-Vivarais region, France. Journal of Applied

  6. Physics analysis of the gang partial rod drive event

    SciTech Connect

    Boman, C.; Frost, R.L.

    1992-08-01

    During the routine positioning of partial-length control rods in Gang 3 on the afternoon of Monday, July 27, 1992, the partial-length rods continued to drive into the reactor even after the operator released the controlling toggle switch. In response to this occurrence, the Safety Analysis and Engineering Services Group (SAEG) requested that the Applied Physics Group (APG) analyze the gang partial rod drive event. Although similar accident scenarios were considered in analysis for Chapter 15 of the Safety Analysis Report (SAR), APG and SAEG conferred and agreed that this particular type of gang partial-length rod motion event was not included in the SAR. This report details this analysis.

  7. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    SciTech Connect

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-11-22

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility.

  8. Root Cause Analysis: Learning from Adverse Safety Events.

    PubMed

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. PMID:26466177

  9. Using the DOE Knowledge Base for Special Event Analysis

    SciTech Connect

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  10. Formal analysis of imprecise system requirements with Event-B.

    PubMed

    Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan

    2016-01-01

    Formal analysis of functional properties of system requirements needs precise descriptions. However, the stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. The Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for specification and verification of such requirements. First, we introduce a representation of imprecise requirements in the set theory. Then we make use of Event-B refinement providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of Crane Controller. PMID:27398276

  11. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.
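
For context, SPAR-H (the method the petroleum HFEs are intended to be compatible with) quantifies a human failure event as a nominal human error probability adjusted by performance shaping factor (PSF) multipliers. A minimal sketch under our reading of the published method; the adjustment rule for three or more negative PSFs follows the SPAR-H formula, and the function and variable names are ours:

```python
from math import prod

NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}  # SPAR-H nominal values

def spar_h_hep(task_type, psf_multipliers):
    """Human error probability = nominal HEP x product of PSF multipliers.
    With 3 or more negative PSFs (multiplier > 1), SPAR-H applies an
    adjustment factor so the result stays below 1."""
    nhep = NOMINAL_HEP[task_type]
    psf_c = prod(psf_multipliers)
    if sum(1 for m in psf_multipliers if m > 1) >= 3:
        return nhep * psf_c / (nhep * (psf_c - 1.0) + 1.0)
    return min(nhep * psf_c, 1.0)
```

With all PSFs nominal (multiplier 1), the HEP is simply the nominal value for the task type.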

  12. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
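
As a flavor of that guidance, for independent failure modes the moment formulas reduce to adding the means and variances of the per-mode rate estimates. A minimal sketch (function names and example numbers are our own, not the report's):

```python
def rate_estimate(n_events, exposure_time):
    """Poisson MLE of an occurrence rate and its estimated variance:
    lambda-hat = n / T, var(lambda-hat) = n / T**2."""
    return n_events / exposure_time, n_events / exposure_time ** 2

def combined_rate(modes):
    """Total over independent failure modes: means add, variances add."""
    return (sum(mean for mean, _ in modes),
            sum(var for _, var in modes))

# e.g. failure-to-start: 5 events in 10 years; failure-to-run: 2 in 10 years
total_rate, total_var = combined_rate([rate_estimate(5, 10.0),
                                       rate_estimate(2, 10.0)])
```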

  13. Topological Analysis of Emerging Bipole Clusters Producing Violent Solar Events

    NASA Astrophysics Data System (ADS)

    Mandrini, C. H.; Schmieder, B.; Démoulin, P.; Guo, Y.; Cristiani, G. D.

    2014-06-01

    During the rising phase of Solar Cycle 24, tremendous activity occurred on the Sun with rapid and compact emergence of magnetic flux leading to bursts of flares (C to M and even X-class). We investigate the violent events occurring in the cluster of two active regions (ARs), NOAA numbers 11121 and 11123, observed in November 2010 with instruments onboard the Solar Dynamics Observatory and from Earth. Within one day the total magnetic flux increased by 70 % with the emergence of new groups of bipoles in AR 11123. From all the events on 11 November, we study, in particular, the ones starting at around 07:16 UT in GOES soft X-ray data and the brightenings preceding them. A magnetic-field topological analysis indicates the presence of null points, associated separatrices, and quasi-separatrix layers (QSLs) where magnetic reconnection is prone to occur. The presence of null points is confirmed by a linear and a non-linear force-free magnetic-field model. Their locations and general characteristics are similar in both modelling approaches, which supports their robustness. However, in order to explain the full extension of the analysed event brightenings, which are not restricted to the photospheric traces of the null separatrices, we compute the locations of QSLs. Based on this more complete topological analysis, we propose a scenario to explain the origin of a low-energy event preceding a filament eruption, which is accompanied by a two-ribbon flare, and a consecutive confined flare in AR 11123. The results of our topology computation can also explain the locations of flare ribbons in two other events, one preceding and one following the ones at 07:16 UT. Finally, this study provides further examples where flare-ribbon locations can be explained when compared to QSLs and only partially when using separatrices.

  14. Analysis of large Danube flood events at Vienna since 1700

    NASA Astrophysics Data System (ADS)

    Kiss, Andrea; Blöschl, Günter; Hohensinner, Severin; Perdigao, Rui

    2014-05-01

    Whereas Danube water level measurements are available in Vienna from 1820 onwards, documentary evidence plays a significant role in the long-term understanding of Danube hydrological processes. Based on contemporary documentary evidence and early instrumental measurements, in the present paper we aim to provide an overview and a hydrological analysis of major Danube flood events, and of the changes that occurred in flood behaviour in Vienna over the last 300 years. Historical flood events are discussed and analysed according to type, seasonality, frequency and magnitude. For historical flood events we apply a five-scale index classification that considers height, magnitude, length and impacts. The rich data coverage in Vienna, both in terms of documentary evidence and early instrumental measurements, provides the possibility to create a relatively long overlap between documentary evidence and instrumental measurements. This makes it possible to evaluate and, to some extent, improve the index reconstruction. While detecting causes of changes in flood regime, we aim to provide an overview of the atmospheric background through some characteristic examples of selected great flood events (e.g. 1787). Moreover, we also address the questions of how early (pre-instrumental period) human impacts such as river regulation and urban development changed flood behaviour in the town, and how much they might have affected flood classification.

  15. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were included in the causal analysis of each event. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the ''management concerns'' reporting criteria, does not appear to have increased. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences were reported as either ''management concerns'' or ''near misses.'' In 2010, 29% of the occurrences have been reported as ''management concerns'' or ''near misses.'' This rate indicates that LLNL is now reporting fewer ''management concern'' and ''near miss'' occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009

  16. Empirical Green's function analysis of recent moderate events in California

    USGS Publications Warehouse

    Hough, S.E.

    2001-01-01

    I use seismic data from portable digital stations and the broadband Terrascope network in southern California to investigate radiated earthquake source spectra and discuss the results in light of previous studies on both static stress drop and apparent stress. Applying the empirical Green's function (EGF) method to two sets of M 4-6.1 events, I obtain deconvolved source-spectra estimates and corner frequencies. The results are consistent with an ω^-2 source model and constant Brune stress drop. However, consideration of the raw spectral shapes of the largest events provides evidence for a high-frequency decay more shallow than ω^-2. The intermediate (~f^-1) slope cannot be explained plausibly with attenuation or site effects and is qualitatively consistent with a model incorporating directivity effects and a fractional stress-drop rupture process, as suggested by Haddon (1996). However, the results obtained in this study are not consistent with the model of Haddon (1996) in that the intermediate slope is not revealed with EGF analysis. This could reflect either bandwidth limitations inherent in EGF analysis or perhaps a rupture process that is not self-similar. I show that a model with an intermediate spectral decay can also reconcile the apparent discrepancy between the scaling of static stress drop and that of apparent stress drop for moderate-to-large events.
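
The EGF step amounts to dividing the spectrum of the large event by that of a colocated small event. A minimal water-level-stabilized sketch of that spectral division (our own simplification; the study's actual processing is more involved):

```python
import numpy as np

def egf_deconvolve(mainshock, egf, water_level=0.01):
    """Water-level-stabilized spectral division of a large event's record
    by a colocated small event's record (the empirical Green's function).
    Returns the amplitude spectrum of the relative source function."""
    n = max(len(mainshock), len(egf))
    M = np.fft.rfft(mainshock, n)
    G = np.fft.rfft(egf, n)
    floor = water_level * np.max(np.abs(G))
    # Replace tiny denominators with the water level, keeping their phase.
    G_stab = np.where(np.abs(G) < floor, floor * np.exp(1j * np.angle(G)), G)
    return np.abs(M / G_stab)
```

A corner frequency would then be read off the resulting spectrum, e.g. by fitting a Brune-type model.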

  17. Mixed-effects Poisson regression analysis of adverse event reports

    PubMed Central

    Gibbons, Robert D.; Segawa, Eisuke; Karabatsos, George; Amatya, Anup K.; Bhaumik, Dulal K.; Brown, C. Hendricks; Kapur, Kush; Marcus, Sue M.; Hur, Kwan; Mann, J. John

    2008-01-01

    SUMMARY A new statistical methodology is developed for the analysis of spontaneous adverse event (AE) reports from post-marketing drug surveillance data. The method involves both empirical Bayes (EB) and fully Bayes estimation of rate multipliers for each drug within a class of drugs, for a particular AE, based on a mixed-effects Poisson regression model. Both parametric and semiparametric models for the random-effect distribution are examined. The method is applied to data from Food and Drug Administration (FDA)’s Adverse Event Reporting System (AERS) on the relationship between antidepressants and suicide. We obtain point estimates and 95 per cent confidence (posterior) intervals for the rate multiplier for each drug (e.g. antidepressants), which can be used to determine whether a particular drug has an increased risk of association with a particular AE (e.g. suicide). Confidence (posterior) intervals that do not include 1.0 provide evidence for either significant protective or harmful associations of the drug and the adverse effect. We also examine EB, parametric Bayes, and semiparametric Bayes estimators of the rate multipliers and associated confidence (posterior) intervals. Results of our analysis of the FDA AERS data revealed that newer antidepressants are associated with lower rates of suicide adverse event reports compared with older antidepressants. We recommend improvements to the existing AERS system, which are likely to improve its public health value as an early warning system. PMID:18404622
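
The paper's full model is a mixed-effects Poisson regression fit by empirical Bayes and fully Bayes methods; the conjugate gamma-Poisson posterior mean below illustrates only the core shrinkage idea behind an EB rate multiplier (the prior parameters here are illustrative, not the paper's):

```python
def eb_rate_multiplier(observed, expected, prior_shape=1.0, prior_rate=1.0):
    """Posterior mean of the rate multiplier lambda for one drug-AE pair,
    with observed ~ Poisson(lambda * expected) and a Gamma prior:
    shrinks the raw ratio observed/expected toward prior_shape/prior_rate."""
    return (observed + prior_shape) / (expected + prior_rate)
```

A multiplier credibly above 1 would flag a drug as having an elevated reporting rate for the adverse event.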

  18. A Dendrochronological Analysis of Mississippi River Flood Events

    NASA Astrophysics Data System (ADS)

    Therrell, M. D.; Bialecki, M. B.; Peters, C.

    2012-12-01

    We used a novel tree-ring record of anatomically anomalous "flood rings" preserved in oak (Quercus sp.) trees growing downstream of the Mississippi and Ohio River confluence to identify spring (MAM) flood events on the lower Mississippi River from C.E. 1694-2009. Our chronology includes virtually all of the observed high-magnitude spring floods of the 20th century as well as similar flood events in prior centuries occurring on the Mississippi River adjacent to the Birds Point-New Madrid Floodway. A response index analysis indicates that over half of the floods identified caused anatomical injury to well over 50% of the sampled trees, and many of the greatest flood events are recorded by more than 80% of the trees at the site, including 100% of the trees in the great flood of 1927. Twenty-five of the 40 floods identified as flood rings in the tree-ring record occur during the instrumental observation period at New Madrid, Missouri (1879-2009), and comparison of the response index with average daily river stage height values indicates that the flood ring record can explain significant portions of the variance in both stage height (30%) and number of days in flood (40%) during spring flood events. The flood ring record also suggests that high-magnitude spring flooding is episodic and linked to basin-scale pluvial events driven by decadal-scale variability of the Pacific/North American pattern (PNA). This relationship suggests that the tree-ring record of flooding may also be used as a proxy record of atmospheric variability related to the PNA and related large-scale forcing.

  19. Detection of abnormal events via optical flow feature analysis.

    PubMed

    Wang, Tian; Snoussi, Hichem

    2015-03-24

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.
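
A minimal sketch of a magnitude-weighted histogram of optical flow orientation, the kind of descriptor the paper builds on (the binning details here are our assumption; the one-class SVM and kernel PCA stages are omitted):

```python
import math

def hofo(flow_vectors, n_bins=8):
    """Histogram of optical flow orientation: each (dx, dy) vector votes
    into an angular bin weighted by its magnitude; L1-normalized."""
    hist = [0.0] * n_bins
    bin_width = 2.0 * math.pi / n_bins
    for dx, dy in flow_vectors:
        mag = math.hypot(dx, dy)
        if mag == 0.0:
            continue  # static pixels carry no orientation
        ang = math.atan2(dy, dx) % (2.0 * math.pi)
        hist[min(int(ang / bin_width), n_bins - 1)] += mag
    total = sum(hist)
    return [h / total for h in hist] if total else hist
```

The per-frame histograms would then be fed to a one-class classifier trained on normal behavior.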

  20. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227

  1. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    Dana L. Kelly; Dale M. Rasmuson

    2008-09-01

    This paper describes the approach taken by the U. S. Nuclear Regulatory Commission to the treatment of common-cause failure in probabilistic risk assessment of operational events. The approach is based upon the Basic Parameter Model for common-cause failure, and examples are illustrated using the alpha-factor parameterization, the approach adopted by the NRC in their Standardized Plant Analysis Risk (SPAR) models. The cases of a failed component (with and without shared common-cause failure potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g., failure to start and failure to run) is a new feature of this paper. These methods are being applied by the NRC in assessing the risk significance of operational events for the Significance Determination Process (SDP) and the Accident Sequence Precursor (ASP) program.
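
Under the alpha-factor parameterization mentioned above, the basic-event probability for a common-cause failure of exactly k of m components can be computed from the alpha factors and the total failure probability. A sketch of the non-staggered-testing formula as we understand it (variable names are ours):

```python
from math import comb

def alpha_factor_q(alphas, q_total):
    """Basic-event probabilities under the alpha-factor CCF model
    (non-staggered testing) for a group of m redundant components:
        Q_k = k / C(m-1, k-1) * (alpha_k / alpha_t) * Q_t
    where alphas[k-1] is the fraction of failure events involving exactly
    k components and alpha_t = sum(k * alpha_k)."""
    m = len(alphas)
    alpha_t = sum(k * a for k, a in enumerate(alphas, start=1))
    return [k / comb(m - 1, k - 1) * (a / alpha_t) * q_total
            for k, a in enumerate(alphas, start=1)]

# Illustrative two-component group: 95% independent, 5% common-cause events
qs = alpha_factor_q([0.95, 0.05], 0.01)
```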

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
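
A discrete event simulation of the kind described can be sketched with a simple event queue: Poisson arrivals compete for a fixed pool of servers. This toy model (all parameter values are illustrative, not from the paper) returns the mean time a request spends in the system:

```python
import heapq
import random

def simulate(n_requests, n_servers, arrival_rate, service_rate, seed=7):
    """Mean sojourn time (wait + service) for Poisson arrivals served by
    the first free server in a fixed pool."""
    random.seed(seed)
    t = 0.0
    free_at = [0.0] * n_servers      # min-heap of each server's next-free time
    heapq.heapify(free_at)
    total_sojourn = 0.0
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate)    # next arrival event
        start = max(t, heapq.heappop(free_at))   # queue if all servers busy
        finish = start + random.expovariate(service_rate)
        heapq.heappush(free_at, finish)
        total_sojourn += finish - t
    return total_sojourn / n_requests
```

Comparing this statistic across server-pool sizes and demand mixes is the essence of the provisioning analysis described.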

  3. Analysis of cosmic-ray events with ALICE at LHC

    NASA Astrophysics Data System (ADS)

    Rodríguez Cahuantzi, M.

    2015-08-01

    ALICE is one of the four main experiments of the LHC at CERN. Located 40 meters underground, with 30 m of overburden rock, it can also operate to detect muons produced by cosmic-ray interactions in the atmosphere. An analysis of the data collected with cosmic-ray triggers from 2010 to 2013, corresponding to about 31 days of live time, is presented. Making use of the ability of the Time Projection Chamber (TPC) to track large numbers of charged particles, a special emphasis is given to the study of muon bundles, and in particular to events with high muon density.

  4. Bisphosphonates and Risk of Cardiovascular Events: A Meta-Analysis

    PubMed Central

    Kim, Dae Hyun; Rogers, James R.; Fulchino, Lisa A.; Kim, Caroline A.; Solomon, Daniel H.; Kim, Seoyoung C.

    2015-01-01

    Background and Objectives Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. Methods A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials with longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Results Absolute risks over 25–36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84–1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92–1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69–1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82–1.19]; I2 = 5.8%), and CV death (14 trials; 0.88 [0.72–1.07]; I2 = 0.0%) with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96–1.61]; I2 = 0.0%), not for oral bisphosphonates (26 trials; 1.02 [0.83–1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Conclusions Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large
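
The Mantel-Haenszel fixed-effects pooling used above combines a 2x2 table from each trial. A minimal sketch of the pooled odds ratio (the table layout is our assumption):

```python
def mantel_haenszel_or(tables):
    """Fixed-effects pooled odds ratio over 2x2 trial tables, each given as
    (a, b, c, d) = (treated events, treated non-events,
                    control events, control non-events)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den
```

An OR whose confidence interval excludes 1.0 would indicate a protective or harmful association, as in the atrial-fibrillation subanalysis.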

  5. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    SciTech Connect

    Lambert, H; Johnson, G

    2001-07-20

    The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in table 4-2 of the HEAF SAR, ref 1. Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree. Bold downward branches indicate paths leading to the accident. The basic causes include conditions, failure of administrative controls (procedural or human error events) or failure of engineered controls (hardware, software or equipment failure) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events to cause an accident. Event trees can address statistical dependency of events such as a sequence of human error events conducted by the same operator. In this case, dependent probabilities are used. Probabilities/frequencies are assigned to each branch. Another example of dependency would be when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
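
The quantification rule described, multiplying the initiating-event frequency by the branch probabilities along a path, can be sketched as (the example numbers are illustrative, not from the SAR):

```python
def path_frequency(initiator_per_year, branch_probs):
    """Annual frequency of one event-tree end state: the initiating-event
    frequency multiplied by the probability of each branch on the path."""
    freq = initiator_per_year
    for p in branch_probs:
        freq *= p
    return freq

# e.g. an operation performed 10 times/year, with a procedural error
# (p = 0.1) followed by failure of an engineered control (p = 0.01):
accident_freq = path_frequency(10.0, [0.1, 0.01])  # per year
```

Dependent events (e.g. repeated errors by the same operator) would use conditional probabilities in place of independent ones.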

  6. An analysis of three nuclear events in P-Tunnel

    SciTech Connect

    Fourney, W.L.; Dick, R.D.; Taylor, S.R.; Weaver, T.A.

    1994-05-03

    This report examines experimental results obtained from three P Tunnel events -- Mission Cyber, Disko Elm, and Distant Zenith. The objective of the study was to determine if there were any differences in the explosive source coupling for the three events. It was felt that Mission Cyber might not have coupled well because the ground motions recorded for that event were much lower than expected based on experience from N Tunnel. Detailed examination of the physical and chemical properties of the tuff in the vicinity of each explosion indicated only minor differences. In general, the core samples are strong and competent out to at least 60 m from each working point. Qualitative measures of core sample strength indicate that the strength of the tuff near Mission Cyber may be greater than indicated by results of static testing. Slight differences in mineralogic content and saturation of the Mission Cyber tuff were noted relative to the other two tests, but probably would not result in large differences in ground motions. Examination of scaled free-field stress and acceleration records collected by Sandia National Laboratory (SNL) indicated that Disko Elm showed the least scatter and Distant Zenith the most scatter. Mission Cyber measurements tend to lie slightly below those of Distant Zenith, but still within two standard deviations. Analysis of regional seismic data from networks operated by Lawrence Livermore National Laboratory (LLNL) and SNL also show no evidence of Mission Cyber coupling low relative to the other two events. The overall conclusion drawn from the study is that there were no basic differences in the way that the explosions coupled to the rock.

  7. Multivariate cluster analysis of forest fire events in Portugal

    NASA Astrophysics Data System (ADS)

    Tonini, Marj; Pereira, Mario; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

    Portugal is one of the major fire-prone European countries, mainly due to its favourable climatic, topographic and vegetation conditions. Compared to the other Mediterranean countries, the number of events registered here from 1980 up to nowadays is the highest one; likewise, with respect to the burnt area, Portugal is the third most affected country. Portuguese mapped burnt areas are available from the website of the Institute for the Conservation of Nature and Forests (ICNF). This official geodatabase is the result of satellite measurements starting from the year 1990. The spatial information, delivered in shapefile format, provides a detailed description of the shape and the size of the area burnt by each fire, while the date/time information related to the fire ignition is restricted to the year of occurrence. In terms of a statistical formalism, wildfires can be associated with a stochastic point process, where events are analysed as a set of geographical coordinates corresponding, for example, to the centroid of each burnt area. The spatio-temporal pattern of stochastic point processes, including cluster analysis, is a basic procedure to discover predisposing factors as well as for prevention and forecasting purposes. These kinds of studies are primarily focused on investigating the spatial cluster behaviour of environmental data sequences and/or mapping their distribution at different times. To include both dimensions (space and time) a comprehensive spatio-temporal analysis is needed. In the present study the authors attempt to verify if, in the case of wildfires in Portugal, space and time act independently or if, conversely, neighbouring events are also closer in time. We present an application of the spatio-temporal K-function to a long dataset (1990-2012) of mapped burnt areas. Moreover, the multivariate K-function allowed us to check whether small and large fires are distributed differently. The final objective is to elaborate a 3D
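
The raw ingredient of the space-time K-function test for independence is the fraction of event pairs that are close in both space and time; comparing it with the product of the space-only and time-only fractions indicates whether neighbouring events are also closer in time. A naive sketch without edge corrections (our simplification):

```python
from math import hypot

def st_pair_fraction(events, h, t):
    """events: list of (x, y, time). Fraction of ordered pairs of distinct
    events within spatial distance h AND time lag t (no edge correction)."""
    n = len(events)
    close = 0
    for i in range(n):
        xi, yi, ti = events[i]
        for j in range(n):
            if i == j:
                continue
            xj, yj, tj = events[j]
            if hypot(xi - xj, yi - yj) <= h and abs(ti - tj) <= t:
                close += 1
    return close / (n * (n - 1))

# An excess of this joint fraction over the product of the space-only and
# time-only fractions suggests space-time clustering of fire events.
```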

  8. Mining for adverse drug events with formal concept analysis.

    PubMed

    Estacio-Moreno, Alexander; Toussaint, Yannick; Bousquet, Cédric

    2008-01-01

    The pharmacovigilance databases consist of several case reports involving drugs and adverse events (AEs). Some methods are applied consistently to highlight all signals, i.e. all statistically significant associations between a drug and an AE. These methods are appropriate for the verification of more complex relationships involving one or several drug(s) and AE(s) (e.g. syndromes or interactions) but do not address their identification. We propose a method for the extraction of these relationships based on Formal Concept Analysis (FCA) associated with disproportionality measures. This method identifies all sets of drugs and AEs which are potential signals, syndromes or interactions. Compared to a previous experience of disproportionality analysis without FCA, the addition of FCA was more efficient for identifying false positives related to concomitant drugs. PMID:18487830
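
The FCA machinery referred to above rests on two derivation operators: the extent of an attribute set (reports containing all those drugs/AEs) and the intent of a report set (attributes they all share); a closed pair of the two is a formal concept. A toy sketch (the attribute encoding is our own; the paper additionally scores each concept with disproportionality measures):

```python
def extent(reports, attrs):
    """All case reports whose attribute set contains every item in attrs."""
    return {rid for rid, a in reports.items() if attrs <= a}

def intent(reports, case_ids):
    """All attributes shared by every report in case_ids."""
    sets = [reports[rid] for rid in case_ids]
    return set.intersection(*sets) if sets else set()

def concept_from(reports, attrs):
    """Close an attribute set into a formal concept (extent, intent)."""
    ext = extent(reports, attrs)
    return ext, intent(reports, ext)

# Toy pharmacovigilance data: attributes mix drugs and adverse events
reports = {1: {"drug:A", "ae:nausea"},
           2: {"drug:A", "ae:rash"},
           3: {"drug:B", "ae:nausea"}}
```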

  9. Collective analysis of ORPS-reportable electrical events (June, 2005-August 2009)

    SciTech Connect

    Henins, Rita J; Hakonson - Hayes, Audrey C

    2010-01-01

    The analysis of LANL electrical events between June 30, 2005 and August 31, 2009 provides data that indicate some potential trends regarding ISM failure modes, activity types associated with reportable electrical events, and ORPS causal codes. This report discusses the identified potential trends for Shock events and compares attributes of the Shock events against Other Electrical events and overall ORPS-reportable events during the same time frame.

  10. Event-by-Event pseudorapidity fluctuation analysis: An outlook to multiplicity and phase space dependence

    NASA Astrophysics Data System (ADS)

    Bhoumik, Gopa; Bhattacharyya, Swarnapratim; Deb, Argha; Ghosh, Dipak

    2016-07-01

    A detailed study of event-by-event pseudorapidity fluctuation of the pions produced in 16O-AgBr interactions at 60A GeV and 32S-AgBr interactions at 200A GeV has been carried out in terms of φ, a variable defined as a measure of fluctuation. Non-zero φ values indicate the presence of strong correlation among the pions for both interactions. The multiplicity and rapidity dependence of the event-by-event pseudorapidity fluctuation has been investigated. A decrease of φ with average multiplicity and an increase of the same variable with pseudorapidity width are observed. The decrease of φ with average multiplicity is attributed to particle emission by several independent sources in higher-multiplicity events. The increase in φ values with pseudorapidity width, taken around central rapidity, might hint at the presence of long-range correlation and its dominance over the short-range one. We have compared our experimental results with a Monte Carlo simulation generated assuming independent particle emission. The comparison shows that the source of correlation and fluctuation is the dynamics of the pion production process. We have also compared our results with events generated by the FRITIOF code. Such events also show the presence of fluctuation and correlation; however, they fail to replicate the experimental findings.
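
The fluctuation variable φ is, as we read it, the Gazdzicki-Mrowczynski measure, which vanishes for fully independent particle emission; treat the exact definition used by the authors as an assumption on our part. A sketch of its estimator:

```python
from math import sqrt

def phi_measure(events):
    """Fluctuation measure phi for a single-particle quantity x
    (e.g. pseudorapidity). events: list of per-event x lists.
    z = x - inclusive mean; Z = sum of z over an event;
    phi = sqrt(<Z^2> / <N>) - sqrt(inclusive mean of z^2).
    phi = 0 for independent emission from a fixed single-particle spectrum."""
    all_x = [x for ev in events for x in ev]
    xbar = sum(all_x) / len(all_x)
    z2bar = sum((x - xbar) ** 2 for x in all_x) / len(all_x)
    mean_z2_event = sum(sum(x - xbar for x in ev) ** 2
                        for ev in events) / len(events)
    mean_n = len(all_x) / len(events)
    return sqrt(mean_z2_event / mean_n) - sqrt(z2bar)
```

Positively correlated emission inflates the event-wise term and drives φ above zero.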

  11. Systematic Analysis of Adverse Event Reports for Sex Differences in Adverse Drug Events.

    PubMed

    Yu, Yue; Chen, Jun; Li, Dingcheng; Wang, Liwei; Wang, Wei; Liu, Hongfang

    2016-04-22

    Increasing evidence has shown that sex differences exist in Adverse Drug Events (ADEs). Identifying those sex differences in ADEs could reduce the experience of ADEs for patients and could be conducive to the development of personalized medicine. In this study, we analyzed a normalized US Food and Drug Administration Adverse Event Reporting System (FAERS). A chi-squared test was conducted to discover which treatment regimens or drugs had sex differences in adverse events. Moreover, the reporting odds ratio (ROR) and P value were calculated to quantify the signals of sex differences for specific drug-event combinations. Logistic regression was applied to remove the confounding effect from the baseline sex difference of the events. Among the 668 drugs of the 20 most frequent treatment regimens in the United States, we detected 307 drugs with sex differences in ADEs. In addition, we identified 736 unique drug-event combinations with significant sex differences. After removing the confounding effect from the baseline sex difference of the events, 266 combinations remained. Some of these are verified by drug labels or previous studies, while others warrant further investigation.
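The reporting odds ratio for one drug-event pair can be sketched from a 2×2 contingency table of reports stratified by sex. A minimal illustration, assuming the standard ROR definition with its log-normal confidence interval; the function name and argument layout are ours, not the paper's:

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR for a drug-event pair, comparing two report strata (e.g. female vs male).

    a: female reports of the drug WITH the event
    b: female reports of the drug WITHOUT the event
    c: male reports of the drug WITH the event
    d: male reports of the drug WITHOUT the event
    Returns (ROR, 95% CI lower, 95% CI upper); the CI uses the usual
    log-normal approximation with SE = sqrt(1/a + 1/b + 1/c + 1/d).
    """
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi
```

A signal of sex difference is typically flagged when the CI excludes 1.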

  12. Systematic Analysis of Adverse Event Reports for Sex Differences in Adverse Drug Events

    PubMed Central

    Yu, Yue; Chen, Jun; Li, Dingcheng; Wang, Liwei; Wang, Wei; Liu, Hongfang

    2016-01-01

    Increasing evidence has shown that sex differences exist in Adverse Drug Events (ADEs). Identifying those sex differences in ADEs could reduce the experience of ADEs for patients and could be conducive to the development of personalized medicine. In this study, we analyzed a normalized US Food and Drug Administration Adverse Event Reporting System (FAERS). Chi-squared test was conducted to discover which treatment regimens or drugs had sex differences in adverse events. Moreover, reporting odds ratio (ROR) and P value were calculated to quantify the signals of sex differences for specific drug-event combinations. Logistic regression was applied to remove the confounding effect from the baseline sex difference of the events. We detected among 668 drugs of the most frequent 20 treatment regimens in the United States, 307 drugs have sex differences in ADEs. In addition, we identified 736 unique drug-event combinations with significant sex differences. After removing the confounding effect from the baseline sex difference of the events, there are 266 combinations remained. Drug labels or previous studies verified some of them while others warrant further investigation. PMID:27102014

  13. Bayesian analysis for extreme climatic events: A review

    NASA Astrophysics Data System (ADS)

    Chu, Pao-Shin; Zhao, Xin

    2011-11-01

    This article reviews Bayesian analysis methods applied to extreme climatic data. We particularly focus on applications to three different problems related to extreme climatic events: detection of abrupt regime shifts, clustering of tropical cyclone tracks, and statistical forecasting of seasonal tropical cyclone activity. For identifying potential change points in an extreme event count series, a hierarchical Bayesian framework involving three layers - data, parameter, and hypothesis - is formulated to demonstrate the posterior probability of the shifts through time. For the data layer, a Poisson process with a gamma-distributed rate is presumed. For the hypothesis layer, multiple candidate hypotheses with different change points are considered. To calculate the posterior probability for each hypothesis and its associated parameters, we developed an exact analytical formula, a Markov chain Monte Carlo (MCMC) algorithm, and a more sophisticated reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. The algorithms are applied to several rare event series: the annual tropical cyclone or typhoon counts over the central, eastern, and western North Pacific; the annual extremely heavy rainfall event counts at Manoa, Hawaii; and the annual heat wave frequency in France. Using an Expectation-Maximization (EM) algorithm, a Bayesian clustering method built on a mixture Gaussian model is applied to objectively classify historical, spaghetti-like tropical cyclone tracks (1945-2007) over the western North Pacific and the South China Sea into eight distinct track types. A regression-based approach to forecasting seasonal tropical cyclone frequency in a region is developed. Specifically, by adopting large-scale environmental conditions prior to the tropical cyclone season, a Poisson regression model is built for predicting seasonal tropical cyclone counts, and a probit regression model is alternatively developed for a binary classification problem.
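The data-layer model above (Poisson counts with a gamma-distributed rate) is conjugate, so within one regime the posterior for the rate has a closed form: a Gamma(α, β) prior updated by counts n₁…n_T gives Gamma(α + Σnᵢ, β + T). A sketch of that standard conjugate update, with illustrative names:

```python
def gamma_poisson_posterior(alpha, beta, counts):
    """Conjugate update for a Poisson rate with a Gamma(alpha, beta) prior.

    counts: per-period event counts (e.g. annual typhoon counts), each
    with one unit of exposure. Returns the posterior shape, posterior
    rate parameter, and posterior mean of the occurrence rate.
    """
    alpha_post = alpha + sum(counts)   # shape grows by total events
    beta_post = beta + len(counts)     # rate grows by total exposure
    return alpha_post, beta_post, alpha_post / beta_post
```

Change-point hypotheses then compare marginal likelihoods of splitting the series into segments, each updated this way.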

  14. Bootstrap analysis of the single subject with event related potentials.

    PubMed

    Oruç, Ipek; Krigolson, Olav; Dalrymple, Kirsten; Nagamatsu, Lindsay S; Handy, Todd C; Barton, Jason J S

    2011-07-01

    Neural correlates of cognitive states in event-related potentials (ERPs) serve as markers for related cerebral processes. Although these are usually evaluated in subject groups, the ability to evaluate such markers statistically in single subjects is essential for case studies in neuropsychology. Here we investigated the use of a simple test based on nonparametric bootstrap confidence intervals for this purpose, by evaluating three different ERP phenomena: the face-selectivity of the N170, error-related negativity, and the P3 component in a Posner cueing paradigm. In each case, we compare single-subject analysis with statistical significance determined using bootstrap to conventional group analysis using analysis of variance (ANOVA). We found that the proportion of subjects who show a significant effect at the individual level based on bootstrap varied, being greatest for the N170 and least for the P3. Furthermore, it correlated with significance at the group level. We conclude that the bootstrap methodology can be a viable option for interpreting single-case ERP amplitude effects in the right setting, probably with well-defined stereotyped peaks that show robust differences at the group level, which may be more characteristic of early sensory components than late cognitive effects.
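A nonparametric bootstrap percentile confidence interval of the kind described can be sketched as follows; the function name, defaults, and decision rule (CI excluding zero, or excluding the other condition's value) are illustrative rather than taken from the paper:

```python
import random

def bootstrap_ci(samples, stat=lambda xs: sum(xs) / len(xs),
                 n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap percentile CI for a single subject's statistic.

    samples: per-trial measurements (e.g. single-trial ERP amplitudes, or
    per-trial condition differences). Resamples with replacement, computes
    the statistic on each resample, and returns percentile bounds.
    """
    rng = random.Random(seed)
    n = len(samples)
    boots = sorted(stat([rng.choice(samples) for _ in range(n)])
                   for _ in range(n_boot))
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

For a condition difference, an effect would be called significant at the individual level when the interval excludes zero.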

  15. Event Listener Analysis and Symbolic Execution for Testing GUI Applications

    NASA Astrophysics Data System (ADS)

    Ganov, Svetoslav; Killmar, Chip; Khurshid, Sarfraz; Perry, Dewayne E.

    Graphical User Interfaces (GUIs) are composed of virtual objects, widgets, which respond to events triggered by user actions. Therefore, test inputs for GUIs are event sequences that mimic user interaction. The nature of these sequences and the values for certain widgets, such as textboxes, causes a two-dimensional combinatorial explosion. In this paper we present Barad, a GUI testing framework that uniformly addresses event-flow and data-flow in GUI applications generating tests in the form of event sequences and data inputs. Barad tackles the two-dimensional combinatorial explosion by pruning regions of the event and data input space. For event sequence generation we consider only events with registered event listeners, thus pruning regions of the event input space. We introduce symbolic widgets which allow us to obtain an executable symbolic version of the GUI. By symbolically executing the chain of listeners registered for the events in a generated event sequence we obtain data inputs, thus pruning regions in the data input space. Barad generates fewer tests and improves branch and statement coverage compared to traditional GUI testing techniques.
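The listener-based pruning idea, generating event sequences only over widgets that have registered listeners, can be illustrated with a toy enumerator (this is not Barad's actual implementation; the names are hypothetical):

```python
from itertools import product

def event_sequences(widgets, listeners, length):
    """Candidate GUI event sequences, pruning widgets with no listener.

    widgets: all widget names in the GUI; listeners: set of widgets that
    have a registered event listener. Only listened-to widgets contribute
    events, shrinking the sequence space before any execution happens.
    """
    active = [w for w in widgets if w in listeners]
    return list(product(active, repeat=length))
```

With three widgets of which two have listeners, length-2 sequences drop from 9 to 4; symbolic execution of the listener chain would then supply the data inputs for each surviving sequence.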

  16. The Tunguska event and Cheko lake origin: dendrochronological analysis

    NASA Astrophysics Data System (ADS)

    Rosanna, Fantucci; Romano, Serra; Gunther, Kletetschka; Mario, Di Martino

    2015-07-01

    Dendrochronological research was carried out on 23 tree samples (Larix sibirica and Picea obovata) collected during the 1999 expedition at two locations, close to the epicentre zone and near Cheko lake (N 60°57', E 101°51'). Basal Area Increment (BAI) analysis has shown a generally long growth suppression before 1908, the year of the Tunguska event (TE), followed by a sudden growth increase due to the diminished competition from trees that died in the event. In one group of trees, we detected a growth decrease for several years (due to damage to the trunk, branches and crown), followed by a growth increase during the following 4-14 years. We show that trees that germinated after the TE and grew in close proximity to Cheko lake (Cheko lake trees) had different behaviour patterns compared to trees living further from Cheko lake, inside the forest (Forest trees). Cheko lake trees have shown a vigorous, continuous growth increase. Forest trees have shown vigorous growth during the first 10-30 years of age, followed by a period of suppressed growth, which we interpret as re-established competition with the surrounding trees. The Cheko lake pattern, however, is consistent with the formation of the lake at the time of the TE. This observation supports the hypothesis that Cheko lake formed when a fragment originating during the TE created a small impact crater in the permafrost and soft alluvial deposits of the Kimku River plain. This is further supported by the fact that Cheko lake has an elliptical shape elongated towards the epicentre of the TE.

  17. Event Detection and Spatial Analysis for Characterizing Extreme Precipitation

    NASA Astrophysics Data System (ADS)

    Jeon, S.; Prabhat, M.; Byna, S.; Collins, W.; Wehner, M. F.

    2013-12-01

    Atmospheric Rivers (ARs) are large, spatially coherent weather systems with high concentrations of elevated water vapor that often cause severe downpours and flooding over the western coastal United States. With the availability of more atmospheric moisture in the future under global warming, we expect ARs to play an important role as a potential cause of extreme precipitation. We have recently developed TECA software for automatically identifying and tracking features in climate datasets. In particular, we are able to identify ARs that make landfall on the western coast of North America. This detection tool examines the integrated water vapor field above a certain threshold and performs geometric analysis. Based on the detection procedure, we investigate the impacts of ARs by exploring the spatial extent of AR precipitation for CMIP5 simulations, and characterize the spatial pattern of dependence for future projections under climate change within the framework of extreme value theory. The results show that AR events in the RCP8.5 scenario (2076-2100) tend to produce heavier rainfall with higher frequency and longer duration than events from the historical run (1981-2005). The range of spatial dependence between precipitation extremes is concentrated in a smaller localized area in California under the highest emission scenario than at present. Preliminary results are illustrated in Figures 1 and 2. Fig 1: Boxplot of annual maximum precipitation (left two) and maximum AR precipitation (right two) from GFDL-ESM2M during the 25-year time period, by station in California, US. Fig 2: Spatial dependence of maximum AR precipitation calculated from Station 4 (triangle) for the historical run (left) and for future projections of RCP8.5 (right) from GFDL-ESM2M. Green and orange colors represent complete dependence and independence between two stations, respectively.

  18. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    SciTech Connect

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error agrees better with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with the better scaling slope (β = 1, Selby et al.), while the improved standard error 'fails to reject' H0.
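Fisher's method, one of the two combination techniques named above, pools k independent p-values via X = −2 Σ ln pᵢ, which is chi-square distributed with 2k degrees of freedom under the joint null. A self-contained sketch using the closed-form survival function available for even degrees of freedom (the function name is ours):

```python
import math

def fisher_combined_pvalue(pvalues):
    """Fisher's method: combine independent single-phenomenology p-values.

    X = -2 * sum(ln p_i) ~ chi-square with 2k df under H0. For even df,
    P(Chi2_{2k} > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!,
    so no special-function library is needed.
    """
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    return math.exp(-x / 2) * sum((x / 2) ** i / math.factorial(i)
                                  for i in range(k))
```

With a single p-value the method is the identity; combining two marginal p-values of 0.1 yields a joint p-value of about 0.056, illustrating how consistent weak evidence across phenomenologies strengthens the screening decision.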

  19. Photographic Analysis Technique for Assessing External Tank Foam Loss Events

    NASA Technical Reports Server (NTRS)

    Rieckhoff, T. J.; Covan, M.; OFarrell, J. M.

    2001-01-01

    A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.
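The core of such a detection method, flagging pixels whose brightness changes between successive frames beyond a threshold, can be sketched as follows. This is a simplified stand-in for the actual video processing; the names and the threshold default are illustrative:

```python
def changed_regions(prev, curr, threshold=30):
    """Flag pixels whose brightness changed markedly between two frames.

    prev, curr: 2-D lists of grayscale values for the same field of view.
    Returns (row, col) coordinates where the absolute difference exceeds
    the threshold; persistent changes are candidate foam-loss sites.
    """
    return [(r, c)
            for r, row in enumerate(prev)
            for c, v in enumerate(row)
            if abs(curr[r][c] - v) > threshold]
```

Counting and clustering the flagged coordinates over time would give the count, category, and location of events.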

  20. Time to tenure in Spanish universities: an event history analysis.

    PubMed

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of the dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variation in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199
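Event-history analyses of time to tenure typically start from a survival estimate such as the Kaplan-Meier product-limit estimator, which handles spells censored before tenure is reached. A minimal sketch (not the authors' code; the names are illustrative):

```python
def kaplan_meier(durations, observed):
    """Product-limit survival estimate for time-to-tenure style data.

    durations: years from PhD to tenure (or to censoring);
    observed: True if tenure was reached, False if the spell was censored.
    Returns {event time: S(t)}, the estimated probability of still being
    untenured just after each event time.
    """
    s, out = 1.0, {}
    times = sorted({t for t, o in zip(durations, observed) if o})
    for t in times:
        at_risk = sum(1 for d in durations if d >= t)
        events = sum(1 for d, o in zip(durations, observed) if o and d == t)
        s *= 1.0 - events / at_risk   # survive this event time
        out[t] = s
    return out
```

Covariate effects of the kind the study reports (productivity, mobility, domain) would then enter through a regression model on the hazard rather than this raw estimator.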

  1. Time to Tenure in Spanish Universities: An Event History Analysis

    PubMed Central

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of the dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variation in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  2. Adverse Drug Event Ontology: Gap Analysis for Clinical Surveillance Application.

    PubMed

    Adam, Terrence J; Wang, Jin

    2015-01-01

    Adverse drug event identification and management are an important patient safety problem given the potential for event prevention. Previous efforts to provide structured data methods for population level identification of adverse drug events have been established, but important gaps in coverage remain. ADE identification gaps contribute to suboptimal and inefficient event identification. To address the ADE identification problem, a gap assessment was completed with the creation of a proposed comprehensive ontology using a Minimal Clinical Data Set framework incorporating existing identification approaches, clinical literature and a large set of inpatient clinical data. The new ontology was developed and tested using the National Inpatient Sample database with the validation results demonstrating expanded ADE identification capacity. In addition, the newly proposed ontology elements are noted to have significant inpatient mortality, above median inpatient costs and a longer length of stay when compared to existing ADE ontology elements and patients without ADE exposure.

  3. Adverse Drug Event Ontology: Gap Analysis for Clinical Surveillance Application

    PubMed Central

    Adam, Terrence J.; Wang, Jin

    2015-01-01

    Adverse drug event identification and management are an important patient safety problem given the potential for event prevention. Previous efforts to provide structured data methods for population level identification of adverse drug events have been established, but important gaps in coverage remain. ADE identification gaps contribute to suboptimal and inefficient event identification. To address the ADE identification problem, a gap assessment was completed with the creation of a proposed comprehensive ontology using a Minimal Clinical Data Set framework incorporating existing identification approaches, clinical literature and a large set of inpatient clinical data. The new ontology was developed and tested using the National Inpatient Sample database with the validation results demonstrating expanded ADE identification capacity. In addition, the newly proposed ontology elements are noted to have significant inpatient mortality, above median inpatient costs and a longer length of stay when compared to existing ADE ontology elements and patients without ADE exposure. PMID:26306223

  4. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  5. Fault Tree Analysis: An Operations Research Tool for Identifying and Reducing Undesired Events in Training.

    ERIC Educational Resources Information Center

    Barker, Bruce O.; Petersen, Paul D.

    This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault-tree analysis investigates potentially undesirable events and then looks for sequences of failures that would lead to their occurrence. Relationships among these events are symbolized by AND or OR logic gates, AND being used when single events must coexist to…
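The gate logic can be made concrete: for independent basic events, an AND gate's probability is the product of its children's probabilities, and an OR gate's is the complement of no child occurring. A small sketch with illustrative names:

```python
def gate_probability(gate, child_probs):
    """Probability that a fault-tree gate fires, given independent children.

    AND: every child event must occur (product of probabilities).
    OR: at least one child occurs (1 minus the product of complements).
    """
    p = 1.0
    if gate == "AND":
        for q in child_probs:
            p *= q
        return p
    if gate == "OR":
        for q in child_probs:
            p *= (1.0 - q)
        return 1.0 - p
    raise ValueError("unknown gate: " + gate)
```

Evaluating gates bottom-up from basic-event probabilities yields the probability of the undesired top event.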

  6. An integrated system for hydrological analysis of flood events

    NASA Astrophysics Data System (ADS)

    Katsafados, Petros; Chalkias, Christos; Karymbalis, Efthymios; Gaki-Papanastassiou, Kalliopi; Mavromatidis, Elias; Papadopoulos, Anastasios

    2010-05-01

    The significant increase of extreme flood events during recent decades has led to an urgent social and economic demand for improved prediction and sustainable prevention. Remedial actions require accurate estimation of the spatiotemporal variability of runoff volume and local peaks, which can be analyzed through integrated simulation tools. Such advanced modeling systems allow investigation of the dynamics controlling the behavior of these complex processes, and they can also be used as early warning systems. Moreover, simulation is assumed to be the appropriate method for deriving quantitative estimates of various atmospheric and hydrologic parameters, especially in the absence of reliable and accurate measurements of precipitation and flow rates. Such sophisticated techniques enable flood risk assessment and improve decision-making support for protection actions. This study presents an integrated system for the simulation of the essential atmospheric and soil parameters in the context of hydrological flood modeling. The system consists of two main cores: a numerical weather prediction model coupled with a geographical information system for the accurate simulation of groundwater advection and rainfall-runoff estimation. Synoptic and mesoscale atmospheric motions are simulated with a non-hydrostatic limited-area model on a very high resolution domain of integration. The model includes advanced schemes for the microphysics and the surface-layer physics description, as well as estimation of the longwave and shortwave radiation budget. It is also fully coupled with a land-surface model in order to resolve the surface heat fluxes and simulate the air-land energy exchange processes. Detailed atmospheric and soil parameters derived from the atmospheric model are used as input data for the GIS-based runoff modeling. Geographical information system (GIS) technology is used for further hydrological analysis and estimation of direct

  7. Discrete-Time Survival Factor Mixture Analysis for Low-Frequency Recurrent Event Histories

    PubMed Central

    Masyn, Katherine E.

    2013-01-01

    In this article, the latent class analysis framework for modeling single event discrete-time survival data is extended to low-frequency recurrent event histories. A partial gap time model, parameterized as a restricted factor mixture model, is presented and illustrated using juvenile offending data. This model accommodates event-specific baseline hazard probabilities and covariate effects; event recurrences within a single time period; and accounts for within- and between-subject correlations of event times. This approach expands the family of latent variable survival models in a way that allows researchers to explicitly address questions about unobserved heterogeneity in the timing of events across the lifespan. PMID:24489519
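A discrete-time hazard, the probability of an event in period t given survival to the start of t, can be estimated directly from event histories. The sketch below is a single-event simplification of the recurrent-event setting described (illustrative names, not the article's factor mixture model):

```python
def discrete_hazard(histories, period):
    """Discrete-time hazard: P(event in period t | at risk at start of t).

    histories: per-subject lists of 0/1 event indicators by period.
    In this single-event simplification, a subject is at risk in period t
    only if no event occurred in an earlier period.
    """
    at_risk = events = 0
    for h in histories:
        if any(h[:period]):          # already had the event earlier
            continue
        if period < len(h):
            at_risk += 1
            events += h[period]
    return events / at_risk if at_risk else None
```

The partial gap time model generalizes this by resetting risk after each recurrence and letting baseline hazards and covariate effects vary by event number.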

  8. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.

  9. Analysis of cumulus solar irradiance reflectance (CSIR) events

    NASA Astrophysics Data System (ADS)

    Laird, John L.; Harshvardhan

    Clouds are extremely important with regard to the transfer of solar radiation at Earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When Sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using UVA and UVB pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 Wm^-2 and 0.0169 Wm^-2 were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of Sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed. © 1997 Elsevier Science B.V.
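The two magnitudes defined above reduce to simple arithmetic on the observed and fitted clear-sky irradiances; a sketch with illustrative names:

```python
def csir_magnitudes(observed, clear_sky):
    """MAC and PAC for one sample during a cloud-reflection (CSIR) event.

    observed, clear_sky: irradiances in W m^-2 at the same time of day
    (clear_sky from the polynomial best-fit clear-sky curve).
    MAC = observed - clear_sky (absolute excess, W m^-2);
    PAC = 100 * MAC / clear_sky (relative excess, percent).
    """
    mac = observed - clear_sky
    pac = 100.0 * mac / clear_sky
    return mac, pac
```

A sample is part of a CSIR event whenever MAC is positive, i.e. the trace spikes above the clear-sky curve.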

  10. Analysis of Cumulus Solar Irradiance Reflectance (CSIR) Events

    NASA Technical Reports Server (NTRS)

    Laird, John L.; Harshvardhan

    1996-01-01

    Clouds are extremely important with regard to the transfer of solar radiation at the earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using Yankee Environmental Systems UVA-1 and UVB-1 pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 Wm^-2 and 0.069 Wm^-2 were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed.

  11. Links between Characteristics of Collaborative Peer Video Analysis Events and Literacy Teachers' Outcomes

    ERIC Educational Resources Information Center

    Arya, Poonam; Christ, Tanya; Chiu, Ming

    2015-01-01

    This study examined how characteristics of Collaborative Peer Video Analysis (CPVA) events are related to teachers' pedagogical outcomes. Data included 39 transcribed literacy video events, in which 14 in-service teachers engaged in discussions of their video clips. Emergent coding and Statistical Discourse Analysis were used to analyze the data.…

  12. Application of a temporal reasoning framework tool in analysis of medical device adverse events.

    PubMed

    Clark, Kimberly K; Sharma, Deepak K; Chute, Christopher G; Tao, Cui

    2011-01-01

    The Clinical Narrative Temporal Relation Ontology (CNTRO) project offers a semantic-web-based reasoning framework that represents temporal events and relationships within clinical narrative texts and infers new knowledge over them. In this paper, the CNTRO reasoning framework is applied to the temporal analysis of medical device adverse event files. One specific adverse event was used as a test case: late stent thrombosis. Adverse event narratives were obtained from the Food and Drug Administration's (FDA) Manufacturing and User Facility Device Experience (MAUDE) database. 15 adverse event files in which late stent thrombosis was confirmed were randomly selected across multiple drug-eluting stent devices. From these files, 81 events and 72 temporal relations were annotated. 73 temporal questions were generated, of which 65 were correctly answered by the CNTRO system, an overall accuracy of 89%. This system should be pursued further to continue assessing its potential benefits in the temporal analysis of medical device adverse events.

  13. Analysis and RHBD technique of single event transients in PLLs

    NASA Astrophysics Data System (ADS)

    Zhiwei, Han; Liang, Wang; Suge, Yue; Bing, Han; Shougang, Du

    2015-11-01

    Single-event transient susceptibility of phase-locked loops has been investigated. The charge pump is the most sensitive component of the PLL to SET, and it is hard to mitigate this effect at the transistor level. A test circuit was designed on a 65 nm process using a new system-level radiation-hardening-by-design technique. Heavy-ion testing was used to evaluate the radiation hardness. Analyses and discussion of the feasibility of this method are also presented.

  14. Events in time: Basic analysis of Poisson data

    SciTech Connect

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.
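
    The point estimate and confidence interval described above can be sketched in a few lines. This is a minimal stdlib sketch using the large-sample normal approximation rather than the report's exact intervals, and the event count and observation window are hypothetical.

```python
import math

def poisson_rate_ci(n_events, exposure_time, z=1.96):
    """Point estimate and approximate 95% CI for a Poisson occurrence rate.

    Large-sample normal approximation: lambda-hat = n/T,
    standard error sqrt(n)/T.
    """
    rate = n_events / exposure_time
    half_width = z * math.sqrt(n_events) / exposure_time
    return rate, max(0.0, rate - half_width), rate + half_width

# Hypothetical example: 12 events observed over 4 unit-years of exposure
rate, lo, hi = poisson_rate_ci(12, 4.0)
print(f"rate = {rate:.2f}/yr, 95% CI ({lo:.2f}, {hi:.2f})")
```

    For small counts the exact chi-square-based interval treated in the report would be preferred over this approximation.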

  15. Analysis of the September 2010 Los Angeles Extreme Heating Event

    NASA Astrophysics Data System (ADS)

    King, K. C.; Kaplan, M. L.; Smith, C.; Tilley, J.

    2015-12-01

    The Southern California coastal region has a temperate climate; however, there are days with extreme heating when temperatures may climb above 37°C, stressing the region's power grid, leading to health issues, and creating environments susceptible to fires. These extreme localized heating events occur over a short period, from a few hours to one or two days, and may or may not occur in conjunction with high winds. The Santa Ana winds are a well-studied example of this type of phenomenon. On September 27, 2010, Los Angeles, CA (LA), reached a record maximum temperature of 45°C during an extreme heating event that was not a Santa Ana event. We analyzed the event using observations, reanalysis data, and mesoscale simulations with the Weather Research and Forecasting (WRF) Model to understand the mechanisms of extreme heating and to provide guidance on forecasting similar events. On 26 September 2010, a large synoptic ridge overturned and broke over the midwestern United States (US), driving momentum and internal energy to the southwest. A large pool of hot air at mid-levels over the Four Corners region also shifted west, moving into southern California by 26 September. This hot air resided over the LA basin, just above the surface, by 00 GMT on 27 September. At this time, the pressure gradient at low levels was weak. Based on the WRF model and wind profiler/RASS observations, we propose that separate mountain-plains solenoids (MPS) occurred on both 26 and 27 September. The MPS on 26 September moved the hot air into place just above the surface over the LA basin. Overnight, the hot air was trapped near the surface by the action of gravity waves in conjunction with orographic density currents and remnant migrating solenoids that formed over the mountains surrounding LA. When the MPS formed during the late morning on the 27th, the descending return-branch flow plus surface sensible heating provided a mechanism to move the heat to the surface, leading to record temperatures.

  16. SEPServer Solar Energetic Particle event Catalogues at 1 AU based on STEREO recordings: selected solar cycle 24 SEP event analysis

    NASA Astrophysics Data System (ADS)

    Papaioannou, Athanasios; Malandraki, Olga E.; Dresing, Nina; Klein, Karl-Ludwig; Heber, Bernd; Vainio, Rami; Nindos, Alexander; Rodríguez-Gasén, Rosa; Klassen, Andreas; Gómez Herrero, Raúl; Vilmer, Nicole; Mewaldt, Richard A.

    2014-05-01

    STEREO (Solar TErrestrial RElations Observatory) recordings provide an unprecedented opportunity to identify the evolution of Solar Energetic Particles (SEPs) at different observing points in the heliosphere. In this work, two instruments onboard STEREO have been used in order to identify all SEP events observed within the declining phase of solar cycle 23 and the rising phase of solar cycle 24, from 2007 to 2012, namely: the Low Energy Telescope (LET) and the Solar Electron Proton Telescope (SEPT). A scan over STEREO/LET protons within the energy range 6-10 MeV has been performed for each of the two STEREO spacecraft. Furthermore, parallel scanning of the STEREO/SEPT electrons in the energy range of 55-85 keV has been performed for all of the aforementioned proton events included in our lists, in order to pinpoint the presence (or not) of an electron event. We provide the onset and peak times as well as the peak value of all events for both protons and electrons. Time-shifting analysis for near-relativistic electrons leads to the inferred solar release time and to the relevant solar associations, from radio spectrographs (Nançay Decametric Array; STEREO/WAVES) to GOES soft X-rays and hard X-rays from RHESSI. The aforementioned information materializes the STEREO SEPServer catalogues that have recently been released to the scientific community. In order to demonstrate the exploitation of the STEREO catalogues, we then focus on the series of SEP events that were recorded onboard STEREO A & B as well as at L1 (ACE, SOHO) from March 4-14, 2012. We track the activity of active region (AR) 1429 during its passage from east to west, which produced a number of intense solar flares and coronal mass ejections, and we compare the magnetic connectivity of each spacecraft in association with the corresponding SEP signatures. During this period the longitudinal separation of the STEREO spacecraft was > 220 degrees, yet both of them recorded SEP events.
These complex multi
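
    The time-shifting analysis mentioned above amounts to subtracting the particle travel time along the interplanetary path from the observed onset. A minimal sketch, assuming a nominal 1.2 AU spiral path length and 70 keV electrons (both illustrative values, not the catalogue's actual parameters):

```python
import math

AU_LIGHT_TIME_S = 499.005          # light travel time for 1 AU, in seconds
ELECTRON_REST_ENERGY_KEV = 511.0   # electron rest-mass energy

def electron_travel_time(kinetic_energy_kev, path_length_au):
    """Travel time (s) of a near-relativistic electron along a spiral path."""
    gamma = 1.0 + kinetic_energy_kev / ELECTRON_REST_ENERGY_KEV
    beta = math.sqrt(1.0 - 1.0 / gamma**2)   # v/c from relativistic kinematics
    return path_length_au * AU_LIGHT_TIME_S / beta

# Illustrative: 70 keV electrons over a 1.2 AU Parker-spiral path (~21 min);
# the inferred solar release time is then onset time minus this travel time
t_travel = electron_travel_time(70.0, 1.2)
print(f"travel time = {t_travel / 60:.1f} min")
```
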

  17. Analysis of Adverse Events in Identifying GPS Human Factors Issues

    NASA Technical Reports Server (NTRS)

    Adams, Catherine A.; Hwoschinsky, Peter V.; Adams, Richard J.

    2004-01-01

    The purpose of this study was to analyze GPS-related adverse events such as accidents and incidents (A/I), Aviation Safety Reporting System (ASRS) reports, and Pilot Deviations (PDs) to create a framework for developing a human factors risk awareness program. Although the number of directly GPS-related accidents is small, the frequency of PDs and ASRS reports indicates a growing problem with situational awareness in terminal airspace related to different types of GPS operational issues. This paper addresses the findings of the preliminary research and briefly discusses some of the literature on related GPS and automation issues.

  18. Analysis of event-related potentials (ERP) by damped sinusoids.

    PubMed

    Demiralp, T; Ademoglu, A; Istefanopulos, Y; Gülçür, H O

    1998-06-01

    Several researchers propose that event-related potentials (ERPs) can be explained by a superposition of transient oscillations at certain frequency bands in response to external or internal events. The transient nature of the ERP makes it more suitable to be modelled as a sum of damped sinusoids. These damped sinusoids can be completely characterized by four sets of parameters, namely the amplitude, the damping coefficient, the phase and the frequency. The Prony method is used to estimate these parameters. In this study, the long-latency auditory-evoked potentials (AEP) and the auditory oddball responses (P300) of 10 healthy subjects are analysed by this method. It is shown that the original waveforms can be reconstructed by summing a small number of damped sinusoids. This allows for a parsimonious representation of the ERPs. Furthermore, the method shows that the oddball target responses contain higher-amplitude, slower delta and slower damped theta components than those of the AEPs. With this technique, we show that the differentiation of sensory and cognitive potentials is not inherent in their overall frequency content but in their frequency components at certain bands.
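
    The estimation step can be sketched with a bare-bones least-squares Prony fit on synthetic data; this is only a sketch of the numerical core, not the study's AEP/P300 analysis pipeline.

```python
import numpy as np

def prony(x, p):
    """Fit p complex exponentials x[n] ~ sum_i h_i * z_i**n (LS Prony)."""
    N = len(x)
    # Linear-prediction step: x[n] = -sum_k a[k] * x[n-k], for n = p..N-1
    A = np.column_stack([x[p - 1 - k:N - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, -x[p:], rcond=None)
    z = np.roots(np.concatenate(([1.0], a)))   # signal poles
    V = z[None, :] ** np.arange(N)[:, None]    # pole Vandermonde matrix
    h, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
    return z, h  # poles encode frequency/damping; h encodes amplitude/phase

# Synthetic damped sinusoid: f = 10 Hz, damping = -5 1/s, dt = 10 ms
dt, n = 0.01, np.arange(200)
x = np.exp(-5 * n * dt) * np.cos(2 * np.pi * 10 * n * dt)
z, h = prony(x, 2)  # one real damped sinusoid = two conjugate poles
freqs = np.abs(np.angle(z)) / (2 * np.pi * dt)
dampings = np.log(np.abs(z)) / dt
```

    On noiseless data the fit recovers the 10 Hz frequency and -5 1/s damping exactly; real ERP work requires order selection and noise handling on top of this.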

  19. Analysis of broadband seismograms from selected IASPEI events

    USGS Publications Warehouse

    Choy, G.L.; Engdahl, E.R.

    1987-01-01

    Broadband seismograms of body waves that are flat to displacement and velocity in the frequency range from 0.01 to 5.0 Hz can now be routinely obtained for most earthquakes of magnitude greater than about 5.5. These records are obtained either directly or through multichannel deconvolution of waveforms from digitally recording seismograph stations. In contrast to data from conventional narrowband seismographs, broadband records have sufficient frequency content to define the source-time functions of body waves, even for shallow events for which the source functions of direct and surface-reflected phases may overlap. Broadband seismograms for selected IASPEI events are systematically analysed to identify depth phases and the presence of subevents. The procedure results in improved estimates of focal depth, identification of subevents in complex earthquakes, and better resolution of focal mechanisms. We propose that it is now possible for reporting agencies, such as the National Earthquake Information Center, to use broadband digital waveforms routinely in the processing of earthquake data. © 1987.

  20. Analysis of Continuous Microseismic Recordings: Resonance Frequencies and Unconventional Events

    NASA Astrophysics Data System (ADS)

    Tary, J.; van der Baan, M.

    2012-12-01

    Hydrofracture experiments, where fluids and proppant are injected into reservoirs to create fractures and enhance oil recovery, are often monitored using microseismic recordings. The total stimulated volume is then estimated from the size of the cloud of induced micro-earthquakes. This implies that only brittle failure should occur inside reservoirs during the fracturing. Yet this assumption may not be correct, as the total energy injected into the system is orders of magnitude larger than the total energy associated with brittle failure. It has been shown recently that, instead of using only triggered events, the frequency content of continuous recordings may also provide information on the deformations occurring inside reservoirs. Here, we use different kinds of time-frequency transforms to track the presence of resonance frequencies. We analyze different data sets using regular, long-period and broadband geophones. The resonance frequencies observed fall mainly within the 5-60 Hz band. We first systematically examine the possible causes of resonance frequencies, dividing them into source, path and receiver effects. We then conclude that some of the observed frequency bands likely result from source effects. The resonance frequencies could be produced either by interconnected fluid-filled fractures on the order of tens of meters, or by small repetitive events occurring with a characteristic periodicity. Still, other mechanisms may occur or be predominant during reservoir fracturing, depending on the lithology as well as the pressure and temperature conditions at depth. During one experiment, regular micro-earthquakes, long-period long-duration (LPLD) events and resonance frequencies were all observed. The lower part of the frequency band of these resonance frequencies (5-30 Hz) overlaps with the anticipated frequencies of LPLDs observed in other experiments (<50 Hz). The exact origin of both resonance frequencies and LPLDs is still under debate.
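
    One of the simpler time-frequency transforms used for this kind of tracking is a short-time Fourier transform. A minimal numpy sketch on a synthetic record; the 20 Hz "resonance", sampling rate, and window sizes are all illustrative, not values from the study.

```python
import numpy as np

def stft_peak_track(x, fs, win=250, hop=125):
    """For each sliding window, return the frequency (Hz) of the spectral peak."""
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    taper = np.hanning(win)  # reduce spectral leakage
    peaks = []
    for start in range(0, len(x) - win + 1, hop):
        spec = np.abs(np.fft.rfft(x[start:start + win] * taper))
        peaks.append(freqs[np.argmax(spec)])
    return np.array(peaks)

# Synthetic continuous record: a persistent 20 Hz resonance in weak noise
fs = 500.0
t = np.arange(0, 60, 1.0 / fs)                     # 60 s of data
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size)
track = stft_peak_track(x, fs)
# A resonance frequency appears as a stable ridge in the peak track
```
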

  1. Data integration and analysis using the Heliophysics Event Knowledgebase

    NASA Astrophysics Data System (ADS)

    Hurlburt, Neal; Reardon, Kevin

    The Heliophysics Event Knowledgebase (HEK) system provides an integrated framework for automated data mining using a variety of feature-detection methods; high-performance data systems to cope with over 1TB/day of multi-mission data; and web services and clients for searching the resulting metadata, reviewing results, and efficiently accessing the data products. We have recently enhanced the capabilities of the HEK to support the complex datasets being produced by the Interface Region Imaging Spectrograph (IRIS). We are also developing the mechanisms to incorporate descriptions of coordinated observations from ground-based facilities, including the NSO's Dunn Solar Telescope (DST). We will discuss the system and its recent evolution and demonstrate its ability to support coordinated science investigations.

  2. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
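
    The storage pattern described above can be sketched with stdlib tools. In this sketch sqlite3 stands in for MySQL, a single-level Haar transform stands in for the paper's wavelet analysis, and the heart-rate trend values are invented.

```python
import math
import sqlite3

def haar_detail(signal):
    """Single-level Haar detail coefficients; large values flag abrupt changes."""
    return [(signal[2 * k] - signal[2 * k + 1]) / math.sqrt(2)
            for k in range(len(signal) // 2)]

# Hypothetical heart-rate trend with a sudden hemodynamic event at sample 51
trend = [80.0] * 51 + [120.0] * 49

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE coeffs (k INTEGER, detail REAL)")
conn.executemany("INSERT INTO coeffs VALUES (?, ?)",
                 list(enumerate(haar_detail(trend))))

# Query: windows whose detail coefficient exceeds a magnitude threshold,
# i.e. candidate hemodynamic events, found without scanning the raw trend
events = conn.execute(
    "SELECT k FROM coeffs WHERE ABS(detail) > 10 ORDER BY k").fetchall()
```

    Storing coefficients rather than raw samples is what keeps such queries compact and avoids table joins over the full time series.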

  3. Transcriptome analysis reveals differential splicing events in IPF lung tissue.

    PubMed

    Nance, Tracy; Smith, Kevin S; Anaya, Vanessa; Richardson, Rhea; Ho, Lawrence; Pala, Mauro; Mostafavi, Sara; Battle, Alexis; Feghali-Bostwick, Carol; Rosen, Glenn; Montgomery, Stephen B

    2014-01-01

    Idiopathic pulmonary fibrosis (IPF) is a complex disease in which a multitude of proteins and networks are disrupted. Interrogation of the transcriptome through RNA sequencing (RNA-Seq) enables the determination of genes whose differential expression is most significant in IPF, as well as the detection of alternative splicing events which are not easily observed with traditional microarray experiments. We sequenced messenger RNA from 8 IPF lung samples and 7 healthy controls on an Illumina HiSeq 2000, and found evidence for substantial differential gene expression and differential splicing. 873 genes were differentially expressed in IPF (FDR<5%), and 440 unique genes had significant differential splicing events in at least one exonic region (FDR<5%). We used qPCR to validate the differential exon usage in the second and third most significant exonic regions, in the genes COL6A3 (RNA-Seq adjusted pval = 7.18e-10) and POSTN (RNA-Seq adjusted pval = 2.06e-09), which encode the extracellular matrix proteins collagen alpha-3(VI) and periostin. The increased gene-level expression of periostin has been associated with IPF and its clinical progression, but its differential splicing has not been studied in the context of this disease. Our results suggest that alternative splicing of these and other genes may be involved in the pathogenesis of IPF. We have developed an interactive web application which allows users to explore the results of our RNA-Seq experiment, as well as those of two previously published microarray experiments, and we hope that this will serve as a resource for future investigations of gene regulation in IPF.
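
    FDR thresholds like the 5% used above are commonly obtained with the Benjamini-Hochberg step-up procedure. A minimal sketch with invented p-values; the study's actual adjustment pipeline may differ.

```python
def benjamini_hochberg(pvals, q=0.05):
    """Return the set of p-values rejected at FDR level q (BH step-up)."""
    m = len(pvals)
    ranked = sorted(pvals)
    # Largest k (1-based) with p_(k) <= (k/m)*q; reject the k smallest p-values
    k = max((i + 1 for i, p in enumerate(ranked) if p <= (i + 1) / m * q),
            default=0)
    return set(ranked[:k])

# Invented per-exon p-values from a differential-splicing test
pvals = [0.001, 0.012, 0.021, 0.034, 0.20, 0.51, 0.74, 0.98]
rejected = benjamini_hochberg(pvals, q=0.05)
```
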

  5. Combined cardiotocographic and ST event analysis: A review.

    PubMed

    Amer-Wahlin, Isis; Kwee, Anneke

    2016-01-01

    ST-analysis of the fetal electrocardiogram (ECG) (STAN®) combined with cardiotocography (CTG) for intrapartum fetal monitoring has been developed following many years of animal research. Changes in the ST-segment of the fetal ECG correlate with fetal hypoxia occurring during labor. In 1993 the first randomized controlled trial (RCT) comparing CTG with CTG + ST-analysis was published. STAN® was introduced into daily practice in 2000. To date, six RCTs have been performed, of which five have been published. Furthermore, there are six published meta-analyses. The meta-analyses showed that CTG + ST-analysis reduced the risk of vaginal operative delivery by about 10% and of fetal blood sampling by 40%. There are conflicting results regarding the effect on metabolic acidosis, largely because of controversies about which RCTs should be included in a meta-analysis, and because of differences in the methodology, execution and quality of the meta-analyses. Several cohort studies have been published, some showing a significant decrease in metabolic acidosis after the introduction of ST-analysis. In this review, we discuss not only the scientific evidence from the RCTs and meta-analyses, but also the limitations of these studies. In conclusion, ST-analysis is effective in reducing operative vaginal deliveries and fetal blood sampling, but the effect on neonatal metabolic acidosis is still under debate. Further research is needed to determine the place of ST-analysis in the labor ward in daily practice.

  6. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    SciTech Connect

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data was also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information intensive tasks.
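
    The duration-based partition described above can be sketched directly from time-stamped events. The thresholds follow the abstract; the timestamps and the strict two-way split are illustrative.

```python
def classify_dwell_times(timestamps_ms):
    """Split inter-event dwell times (s) into putative cognitive activities.

    < 2 s  -> search/filtering;  > 10 s -> evaluation;  else unclassified.
    """
    dwell = [(b - a) / 1000.0 for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    search = [d for d in dwell if d < 2.0]
    evaluation = [d for d in dwell if d > 10.0]
    other = [d for d in dwell if 2.0 <= d <= 10.0]
    return search, evaluation, other

# Hypothetical millisecond-level workstation time stamps
stamps = [0, 400, 900, 1600, 14600, 15100, 40100]
search, evaluation, other = classify_dwell_times(stamps)
```
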

  7. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang

    2009-09-18

    Task report detailing low-probability tail event analysis and mitigation in the BPA control area. A tail event refers to a situation in a power system where unfavorable load and wind forecast errors are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, causing the imbalance between generation and load to become very significant.
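
    A tail event in this sense can be illustrated by Monte Carlo: superpose forecast-error draws on a ramp and look at the extreme quantile of the resulting imbalance. All distributions and magnitudes below are invented for illustration and are not BPA figures.

```python
import random

random.seed(7)
N = 100_000
ramp = 150.0  # MW, hypothetical fast load/wind ramp underway
imbalances = []
for _ in range(N):
    load_error = random.gauss(0, 300)   # MW, hypothetical load forecast error
    wind_error = random.gauss(0, 200)   # MW, hypothetical wind forecast error
    # Positive imbalance: generation falls short of load
    imbalances.append(load_error + wind_error + ramp)

imbalances.sort()
median = imbalances[N // 2]
p999 = imbalances[int(0.999 * N)]   # the low-probability tail of interest
```

    The gap between the median and the 99.9th-percentile imbalance is what makes tail events worth analyzing and mitigating separately from typical operating conditions.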

  8. Analysis of sequential events in intestinal absorption of folylpolyglutamate

    SciTech Connect

    Darcy-Vrillon, B.; Selhub, J.; Rosenberg, I.H.

    1988-09-01

    Although it is clear that the intestinal absorption of folylpolyglutamates is associated with hydrolysis to monoglutamyl folate, the precise sequence and relative velocity of the events involved in this absorption are not fully elucidated. In the present study, we used biosynthetic, radiolabeled folylpolyglutamates purified by affinity chromatography to analyze the relationship of hydrolysis and transport in rat jejunal loops in vivo. Absorption was best described by a series of first-order processes: luminal hydrolysis to monoglutamyl folate followed by tissue uptake of the product. The rate of hydrolysis in vivo was twice as high as the rate of transport. The latter value was identical to that measured for folic acid administered separately. The relevance of this sequential model was confirmed by data obtained using inhibitors of the individual steps in absorption of "natural" folate. Heparin and sulfasalazine were both effective in decreasing absorption. The former affected hydrolysis solely, whereas the latter acted as a competitive inhibitor of transport of monoglutamyl folate. These studies confirm that hydrolysis is obligatory and that the product is subsequently taken up by a transport process, common to monoglutamyl folates, that is the rate-determining step in transepithelial absorption.

  9. Genome-Wide Analysis of Polyadenylation Events in Schmidtea mediterranea

    PubMed Central

    Lakshmanan, Vairavan; Bansal, Dhiru; Kulkarni, Jahnavi; Poduval, Deepak; Krishna, Srikar; Sasidharan, Vidyanand; Anand, Praveen; Seshasayee, Aswin; Palakodeti, Dasaradhi

    2016-01-01

    In eukaryotes, 3′ untranslated regions (UTRs) play important roles in regulating posttranscriptional gene expression. The 3′UTR is defined by regulated cleavage/polyadenylation of the pre-mRNA. The advent of next-generation sequencing technology has now enabled us to identify these events on a genome-wide scale. In this study, we used poly(A)-position profiling by sequencing (3P-Seq) to capture all poly(A) sites across the genome of the freshwater planarian, Schmidtea mediterranea, an ideal model system for exploring the process of regeneration and stem cell function. We identified the 3′UTRs for ∼14,000 transcripts and thus improved the existing gene annotations. We found 97 transcripts, which are polyadenylated within an internal exon, resulting in the shrinking of the ORF and loss of a predicted protein domain. Around 40% of the transcripts in planaria were alternatively polyadenylated (ApA), resulting either in an altered 3′UTR or a change in coding sequence. We identified specific ApA transcript isoforms that were subjected to miRNA mediated gene regulation using degradome sequencing. In this study, we also confirmed a tissue-specific expression pattern for alternate polyadenylated transcripts. The insights from this study highlight the potential role of ApA in regulating the gene expression essential for planarian regeneration. PMID:27489207

  10. Statistical analysis of geodetic networks for detecting regional events

    NASA Technical Reports Server (NTRS)

    Granat, Robert

    2004-01-01

    We present an application of hidden Markov models (HMMs) to analysis of geodetic time series in Southern California. Our model fitting method uses a regularized version of the deterministic annealing expectation-maximization algorithm to ensure that model solutions are both robust and of high quality.
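
    A stripped-down version of the HMM idea can be sketched as Viterbi decoding of a regime change in a displacement-rate series with known parameters; the regularized deterministic annealing EM fitting used in the study is not reproduced here, and all numbers are invented.

```python
import math

def viterbi_gauss(obs, means, sd, p_stay):
    """Most likely state path for an n-state Gaussian HMM, sticky transitions."""
    n_states = len(means)
    log_trans = [[math.log(p_stay) if i == j else math.log(1 - p_stay)
                  for j in range(n_states)] for i in range(n_states)]

    def log_emit(s, y):  # Gaussian log-likelihood, constant terms dropped
        return -((y - means[s]) ** 2) / (2 * sd ** 2)

    score = [log_emit(s, obs[0]) for s in range(n_states)]
    back = []
    for y in obs[1:]:
        prev = [max(range(n_states), key=lambda i: score[i] + log_trans[i][s])
                for s in range(n_states)]
        score = [score[prev[s]] + log_trans[prev[s]][s] + log_emit(s, y)
                 for s in range(n_states)]
        back.append(prev)
    state = max(range(n_states), key=lambda s: score[s])
    path = [state]
    for prev in reversed(back):   # backtrack the best path
        state = prev[state]
        path.append(state)
    return path[::-1]

# Hypothetical daily displacement increments (mm): quiet, then an event
obs = [0.1, -0.2, 0.0, 0.3, 4.8, 5.1, 4.9, 5.2]
path = viterbi_gauss(obs, means=[0.0, 5.0], sd=1.0, p_stay=0.95)
```
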

  11. An analysis of fog events at Belgrade International Airport

    NASA Astrophysics Data System (ADS)

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.
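
    The complex fog criteria can be illustrated as a simple rule check one hour before a potential onset. The thresholds are taken from the abstract; the function name, the sample observation, and the numeric reading of "calm or weak" wind are invented for the sketch.

```python
def fog_likely(rel_humidity_pct, dewpoint_depression_c, cloud_base_m, wind_ms):
    """Crude screen for the highest-probability fog regime reported above:
    RH > 97%, dew-point depression of 0 degC, cloud base below 50 m,
    calm or weak wind (taken here as <= 2 m/s, an assumption)."""
    return (rel_humidity_pct > 97.0
            and dewpoint_depression_c == 0.0
            and cloud_base_m < 50.0
            and wind_ms <= 2.0)

# Invented observation an hour before a candidate fog onset
likely = fog_likely(98.0, 0.0, 30.0, 1.0)
```

    Even when all criteria are met, the reported probability of fog is about 51%, so such a rule is a screen, not a forecast.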

  12. Subjective well-being and adaptation to life events: a meta-analysis.

    PubMed

    Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E

    2012-03-01

    Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on affective and cognitive well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to 4 family events (marriage, divorce, bereavement, childbirth) and 4 work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being and that for most events the effects of life events on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given.

  13. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility at the Nevada Test Site for nuclear explosive operations called the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated into the understanding of those scenarios, and assuring that adequate positive measures are in place to control the risk associated with them.

  14. Automatic Analysis of Radio Meteor Events Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Roman, Victor Ştefan; Buiu, Cătălin

    2015-12-01

    Meteor Scanning Algorithms (MESCAL) is a software application for automatic meteor detection from radio recordings, which uses self-organizing maps and feedforward multi-layered perceptrons. This paper presents the theoretical concepts behind this application and the main features of MESCAL, showcasing how radio recordings are handled, prepared for analysis, and used to train the aforementioned neural networks. The neural networks trained using MESCAL achieve high correct-detection rates and low false-positive rates, and at the same time offer new possibilities for improving the results.

  15. Human Reliability Analysis for Small Modular Reactors

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2012-06-01

    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized for SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provide preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  16. Long-term Statistical Analysis of the Simultaneity of Forbush Decrease Events at Middle Latitudes

    NASA Astrophysics Data System (ADS)

    Lee, Seongsuk; Oh, Suyeon; Yi, Yu; Evenson, Paul; Jee, Geonhwa; Choi, Hwajin

    2015-03-01

    Forbush decreases (FDs) are transient, sudden reductions of cosmic ray (CR) intensity lasting a few days to a week. Such events are observed globally using ground neutron monitors (NMs). Most studies of FD events indicate that an FD event is observed simultaneously at NM stations located all over the Earth. However, using statistical analysis, previous researchers verified that while FD events could occur simultaneously, in some cases they could occur non-simultaneously. Previous studies confirmed the statistical reality of non-simultaneous FD events and the mechanism by which they occur, using data from high-latitude and middle-latitude NM stations. In this study, we used long-term data (1971-2006) from middle-latitude NM stations (Irkutsk, Climax, and Jungfraujoch) to enhance statistical reliability. According to the results from this analysis, the variation of cosmic ray intensity during the main phase is larger (statistically significant) for simultaneous FD events than for non-simultaneous ones. Moreover, the distribution of main-phase onset times shows differences that are statistically significant. While the onset times of simultaneous FDs are distributed evenly over 24-hour intervals (day and night), those of non-simultaneous FDs are mostly distributed over 12-hour intervals, in daytime. Thus, the existence of the two kinds of FD events, distinguished by differences in their statistical properties, was verified based on data from middle-latitude NM stations.

  17. Constraining shallow seismic event depth via synthetic modeling for Expert Technical Analysis at the IDC

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.

    2015-12-01

    Depth is an important criterion for seismic event screening at the International Data Centre (IDC) of the CTBTO. However, a thorough determination of event depth can mostly be conducted only through special analysis, because the IDC's Event Definition Criteria are based, in part, on depth estimation uncertainties. As a result, a large number of events in the Reviewed Event Bulletin have their depth constrained to the surface. When the true origin depth is greater than that reasonable for a nuclear test (3 km, based on existing observations), this creates a heavier workload in manually distinguishing shallow from deep events. Moreover, the IDC depth criterion is not applicable to events with a small t(pP-P) travel-time difference, which is the case for nuclear tests. Since the shape of the first few seconds of signal from very shallow events is very sensitive to the presence of the depth phase, cross-correlation between observed and synthetic seismograms can provide an estimate of event depth and thereby extend the screening process. We exercised this approach mostly with events at teleseismic and, in part, regional distances. We found that the approach can be very efficient for seismic event screening, with certain caveats related mostly to poorly defined crustal models at the source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and the recent DPRK nuclear tests. The teleseismic synthetics are based on the stationary phase approximation in Robert Herrmann's hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified to handle complex source topography.
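
    A highly simplified sketch of the depth-screening idea (assumed near-source velocity, wavelet, and reflection coefficient; the actual IDC analysis uses hudson96/generalized-ray synthetics): grid-search for the depth whose synthetic P + pP waveform best correlates with the observed trace.

```python
import numpy as np

dt = 0.05                        # sample interval (s)
t = np.arange(0.0, 20.0, dt)
v_p = 6.0                        # assumed near-source P velocity (km/s)

def wavelet(t, t0, f=1.5):
    """Ricker-like pulse standing in for the source wavelet."""
    a = (np.pi * f * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic(depth_km):
    """Direct P plus a polarity-reversed surface reflection pP, ~2h/v later."""
    delay = 2.0 * depth_km / v_p
    return wavelet(t, 5.0) - 0.8 * wavelet(t, 5.0 + delay)

# "Observed" trace: a 2 km deep source plus noise.
observed = synthetic(2.0) + 0.05 * np.random.default_rng(1).standard_normal(t.size)

def corr(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Grid search: the depth with the highest waveform correlation wins.
depths = np.arange(0.5, 10.0, 0.25)
best = depths[np.argmax([corr(observed, synthetic(h)) for h in depths])]
print(best)
```

    In practice the synthetic must account for source mechanism, attenuation (the adjustable t* mentioned above), and crustal structure, which is exactly where the caveats in the abstract arise.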

  18. Extreme rainfall analysis based on precipitation events classification in Northern Italy

    NASA Astrophysics Data System (ADS)

    Campo, Lorenzo; Fiori, Elisabetta; Molini, Luca

    2016-04-01

    Extreme rainfall statistical analysis comprises a consolidated family of techniques for studying the frequency and statistical properties of high-intensity meteorological events. These techniques are well established and include standard approaches such as fitting the GEV (Generalized Extreme Value) or TCEV (Two-Component Extreme Value) probability distribution to the data recorded by a given raingauge at a given location. Regionalization techniques, which extend the analysis spatially over medium-to-large regions, are also well established and operationally used. In this work a novel procedure is proposed for statistically characterizing rainfall extremes in a given region using an "event-based" approach. Given a temporal sequence of continuous rain maps, an "event" is defined as an aggregate of cells, continuous in time and space, whose rainfall height is above a certain threshold. Based on this definition it is possible to classify, for a given region and period, a population of events and to characterize each with a number of statistics, such as total volume, maximum spatial extent, duration, and average intensity. The population of events so obtained constitutes the input of a novel extreme-value characterization technique: given a certain spatial scale, a moving-window analysis is performed and all the events that fall in the window are analysed from an extreme-value point of view. For each window, the extreme annual events are considered: maximum total volume, maximum spatial extent, maximum intensity, and maximum duration are each subjected to an extreme-value analysis and the corresponding probability distributions are fitted. In this way the analysis statistically characterizes the most intense events and, at the same time, spatializes these rain characteristics, exploring their variability in space. This methodology was employed on rainfall fields obtained by interpolation of
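
    The per-window GEV fit mentioned above can be sketched with SciPy on synthetic annual maxima (hypothetical parameters, not the paper's data): fit the three GEV parameters by maximum likelihood, then read off a return level.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical annual maxima (one value per year) of some event statistic,
# e.g. maximum event total volume within a moving window.
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=50.0, scale=12.0,
                                     size=40, random_state=rng)

# Maximum-likelihood fit of the three GEV parameters.
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# 100-year return level: exceeded with probability 1/100 in any given year.
rl_100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(shape, loc, scale, rl_100)
```

    The same fit would be repeated per window and per statistic (volume, extent, intensity, duration) to map the spatial variability of the extremes.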

  19. Analysis of single events in ultrarelativistic nuclear collisions: A new method to search for critical fluctuations

    SciTech Connect

    Stock, R.

    1995-07-15

    The upcoming generation of experiments with ultrarelativistic heavy nuclear projectiles, at the CERN SPS and at RHIC and LHC, will confront researchers with several thousand identified hadrons per event, suitable detectors provided. Analysis of individual events becomes meaningful for a multitude of hadronic signals thought to reveal a transient deconfinement phase transition, or the related critical precursor fluctuations. Transverse momentum spectra, the kaon-to-pion ratio, and pionic Bose-Einstein correlations are examined, showing how to separate the extreme, probably rare candidate events from the bulk of average events. These observables can already be investigated with the Pb beam of the SPS. The author then discusses single-event signals that become accessible at RHIC and LHC energies: kaon interferometry, rapidity fluctuations, and jet and {gamma} production.

  20. Catchment process affecting drinking water quality, including the significance of rainfall events, using factor analysis and event mean concentrations.

    PubMed

    Cinque, Kathy; Jayasuriya, Niranjali

    2010-12-01

    To ensure the protection of drinking water, an understanding of the catchment processes which can affect water quality is important, as it enables targeted catchment management actions to be implemented. In this study, factor analysis (FA) and comparison of event mean concentrations (EMCs) with baseline values were used to assess the relationships between water quality parameters and to link those parameters to processes within an agricultural drinking water catchment. FA found that 55% of the variance in the water quality data could be explained by the first factor, which was dominated by parameters usually associated with erosion. Inclusion of pathogenic indicators in an additional FA showed that Enterococcus and Clostridium perfringens (C. perfringens) were also related to the erosion factor. Analysis of the EMCs found that most parameters were significantly higher during periods of rainfall runoff. This study shows that the most dominant processes in an agricultural catchment are surface runoff and erosion. It also shows that these processes mobilise pathogenic indicators and are therefore most likely to influence the transport of pathogens. Catchment management efforts need to focus on reducing the effect of these processes on water quality.
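
    For reference, an event mean concentration is simply the flow-weighted average concentration over a runoff event. A minimal sketch with made-up sample values (not this catchment's data):

```python
import numpy as np

def event_mean_concentration(conc_mg_l, flow_l_s, dt_s):
    """EMC: total pollutant mass over the event divided by total volume."""
    mass = np.sum(conc_mg_l * flow_l_s * dt_s)    # mg
    volume = np.sum(flow_l_s * dt_s)              # L
    return mass / volume                          # mg/L

# Hypothetical concentration/flow samples over an event (15-minute steps).
conc = np.array([5.0, 40.0, 120.0, 60.0, 15.0])      # mg/L
flow = np.array([20.0, 180.0, 400.0, 150.0, 40.0])   # L/s
emc = event_mean_concentration(conc, flow, dt_s=900.0)

baseline = 8.0  # mg/L, hypothetical dry-weather (baseline) concentration
print(round(emc, 1), emc > baseline)
```

    Because the EMC weights concentration by flow, it captures the first-flush and erosion-driven spikes that a simple time average would dilute, which is why comparing EMCs to baseline values isolates the runoff-driven processes.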

  1. Addressing misallocation of variance in principal components analysis of event-related potentials.

    PubMed

    Dien, J

    1998-01-01

    Interpretation of evoked response potentials is complicated by the extensive superposition of multiple electrical events. The most common approach to disentangling these features is principal components analysis (PCA). Critics have demonstrated a number of caveats that complicate interpretation, notably misallocation of variance and latency jitter. This paper describes some further caveats to PCA as well as using simulations to evaluate three potential methods for addressing them: parallel analysis, oblique rotations, and spatial PCA. An improved simulation model is introduced for examining these issues. It is concluded that PCA is an essential statistical tool for event-related potential analysis, but only if applied appropriately.
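
    A minimal temporal-PCA sketch on simulated ERP-like data (two overlapping components plus noise; illustrative only, not Dien's simulation model): with superposed components, most of the variance concentrates in the first few principal components, and how that variance is allocated between them is exactly where the misallocation issue arises.

```python
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_time = 200, 120             # observations x time points
t = np.linspace(0, 1, n_time)

# Two overlapping "components" with amplitudes varying per observation.
c1 = np.exp(-((t - 0.3) / 0.05) ** 2)
c2 = np.exp(-((t - 0.5) / 0.10) ** 2)
data = (rng.normal(1, 0.3, (n_obs, 1)) * c1
        + rng.normal(1, 0.3, (n_obs, 1)) * c2
        + 0.05 * rng.standard_normal((n_obs, n_time)))

# Temporal PCA via SVD of the centered data matrix.
centered = data - data.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print(explained[:3])  # the first two factors dominate
```

    The rows of `vt` are the component loadings over time; rotations (Varimax, oblique) would then be applied to these loadings, which is the step the paper evaluates.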

  2. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    PubMed

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd.
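
    The censoring problem described above is easy to illustrate with hypothetical counts: a naive proportion and an exposure-adjusted incidence rate can rank two drugs in opposite order when average follow-up differs between arms.

```python
# Hypothetical counts: patients on drug A were followed four times as long
# (in person-years) as patients on drug B.
events_a, n_a, years_a = 30, 200, 400.0   # ~2 years average follow-up
events_b, n_b, years_b = 18, 200, 100.0   # ~0.5 years average follow-up

prop_a, prop_b = events_a / n_a, events_b / n_b          # 0.15 vs 0.09
rate_a, rate_b = events_a / years_a, events_b / years_b  # 0.075 vs 0.18 /person-year

print(prop_a > prop_b)   # True: drug A looks worse by naive proportion
print(rate_a < rate_b)   # True: per unit exposure, drug B is worse
```

    As the abstract notes, even incidence rates mislead for recurrent events and time-dependent hazards, which is why the authors call for survival-time methods (e.g. Kaplan-Meier or competing-risk estimates) for adverse events as well.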

  3. Coarse-grained event tree analysis for quantifying Hodgkin-Huxley neuronal network dynamics.

    PubMed

    Sun, Yi; Rangan, Aaditya V; Zhou, Douglas; Cai, David

    2012-02-01

    We present an event tree analysis for studying the dynamics of Hodgkin-Huxley (HH) neuronal networks. Our study relies on a coarse-grained projection to event trees, and to the event chains that comprise these trees, using a statistical collection of spatio-temporal sequences of relevant physiological observables (such as spike sequences of multiple neurons). This projection retains information about network dynamics covering multiple features, swiftly and robustly. We demonstrate that, even for small differences in inputs, some dynamical regimes of HH networks contain sufficiently high-order statistics, as reflected in the event chains within the event tree analysis; the analysis is therefore effective in discriminating small differences in inputs. Moreover, we use event trees to analyze results computed with an efficient library-based numerical method proposed in our previous work, in which a pre-computed high-resolution library of typical neuronal trajectories during an action potential (spike) allows us to avoid resolving the spikes in detail. In this way, we can evolve the HH networks using time steps one order of magnitude larger than those typically needed to resolve the trajectories without the library, while achieving comparable statistical accuracy in terms of average firing rate and power spectra of voltage traces. Our numerical simulations show that the library method is efficient in the sense that results generated with much larger time steps contain high-order statistical structure of firing events similar to that obtained with a regular HH solver; we use the event tree analysis to demonstrate these statistical similarities.

  4. Root-cause analysis of a potentially sentinel transfusion event: lessons for improvement of patient safety.

    PubMed

    Adibi, Hossein; Khalesi, Nader; Ravaghi, Hamid; Jafari, Mahdi; Jeddian, Ali Reza

    2012-01-01

    Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step of transfusion, and evaluation of their root causes can inform preventive measures. Root cause analysis, as a structured and systematic approach, can be used to identify the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a transfusion error in an emergency ward that could have been fatal. After the event was reported, the details of the incident were elaborated through record review and interviews with the responsible personnel. An expert panel meeting was then held to define the event timeline, identify the care and service delivery problems, and discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the event demonstrated that certain defects of the system, and the ensuing errors, were the main causes of the event; it also pointed to systematic corrective actions. It can be concluded that health care organizations should provide opportunities to discuss errors and adverse events, introduce preventive measures, and identify areas where resources need to be allocated to improve patient safety. PMID:23165813

  5. Seasonality analysis of hydrological characteristics and flash flood events in Greece

    NASA Astrophysics Data System (ADS)

    Koutroulis, A. G.; Tsanis, I. K.

    2009-04-01

    The seasonality of flash flood occurrence is strongly connected to the climate forcing mechanisms of each region. Hydrological characteristics such as precipitation and stream flow reflect the regional climate mechanisms, and comparison of the daily and mean monthly seasonality of selected precipitation and runoff characteristics reveals valuable information in the context of flood occurrence. This study presents preliminary findings of a seasonality analysis of flash flood events that occurred in Greece during the 1925-2007 period, combined with a seasonality analysis of their hydrological characteristics. A two-level approach, at the national (Greece) and regional (island of Crete) levels, was followed using a total of 206 flood events. Twenty-two of these flood events enriched the European Flash Flood database, which is being developed in the HYDRATE project. The analysis of hydrological characteristics through seasonality indices was based on a dataset of 83 monthly and daily precipitation stations, together with 22 monthly and 15 daily flow stations. The analysis concludes that on the island of Crete the flood-event seasonality coincides with the seasonality of the daily precipitation maxima during December and January. The seasonality of the three largest long-term daily precipitation maxima indicates that 50% of the maximum precipitation events occur during the November-December-January (NDJ) period. The event-based seasonality analysis for Greece indicated that 57% of the events occur during the NDJ period. For Crete, the annual maximum daily precipitation lags the maximum annual stream flows by approximately one month, due to the snowmelt process, the low soil percolation rates of the winter period, and the high baseflow of the local karstic aquifers that contribute to the maximum flows. The results will be compared with six different hydrometeorological regions within Europe in the frame of the HYDRATE project, in order to

  6. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
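
    The core of the MFA detection step, template correlation against the continuous record, can be sketched for a single channel with a synthetic template and trace (illustrative only; the published algorithm adds multichannel stacking, waveform rotation, and relative relocation):

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 100.0                                  # sampling rate (Hz)
trace = 0.3 * rng.standard_normal(6000)     # continuous noisy record

# A short "parent" template; two hidden "child" events at known samples.
tt = np.arange(50) / fs
template = np.sin(2 * np.pi * 8 * tt) * np.hanning(50)
for onset in (1500, 4200):
    trace[onset:onset + 50] += 2.0 * template

def normalized_cc(trace, template):
    """Sliding normalized (Pearson) cross-correlation of template vs trace."""
    m = len(template)
    tpl = (template - template.mean()) / template.std()
    out = np.empty(trace.size - m + 1)
    for i in range(out.size):
        w = trace[i:i + m]
        out[i] = np.dot(tpl, w - w.mean()) / (m * w.std())
    return out

cc = normalized_cc(trace, template)
detections = np.where(cc > 0.7)[0]
print(detections)  # samples clustered at/near 1500 and 4200
```

    The threshold trades missed child events against false detections; in the real workflow the correlation is stacked over many channels before thresholding, which sharply improves this tradeoff.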

  7. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, M.; Meier, T. M.; Becker, D.; Brüstle, A.

    2015-12-01

    In order to gain a better understanding of geodynamic processes in the Hellenic subduction zone (HSZ), in particular in its eastern part, we analyze a cluster of intermediate-depth events in the region of Nisyros volcano. The events were recorded by the temporary seismic network EGELADOS, deployed from September 2005 to March 2007. The network covered the entire Hellenic subduction zone and consisted of 23 offshore and 56 onshore broadband stations, completed by 19 permanent stations from NOA, GEOFON and MedNet. The cluster of intermediate-depth seismicity consists of 159 events with local magnitudes ranging from 0.2 to 4.1 at depths from 80 to 200 km. The events occur close to the top of the slab in an approximately 30 km thick zone. The spatio-temporal clustering is studied using three-component similarity analysis. Single-event locations obtained using the nonlinear location tool NonLinLoc are compared to relative relocations calculated using the double-difference earthquake relocation software HypoDD. The relocation is performed both with manual readings of onset times and with differential travel times obtained by separate cross-correlation of P- and S-waveforms. The three-component waveform cross-correlation was performed for all the events using data from 45 stations. The results of the similarity analysis are shown as a function of frequency for individual stations and averaged over the network. Average similarities between waveforms of all event pairs reveal a low number of highly similar events but a large number of moderate similarities. Interestingly, the single-station similarities between event pairs show (1) in general decreasing similarity with increasing epicentral distance, (2) reduced similarities for paths crossing boundaries of slab segments, and (3) the influence of strong local heterogeneity leading to a considerable reduction of waveform similarities, e.g. in the center of the Santorini volcano.

  8. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides and volcanic eruptions. It lies in the Intertropical Convergence Zone (ITCZ) and has a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and precipitation is observed on 70% of the days in a year. This rain, which is favored by the formation of large cloud masses and the presence of macroclimatic phenomena such as the El Niño-Southern Oscillation, has historically caused great impacts in the region (Vélez et al., 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data for up to eight climate variables, and some of these stations hold 10 years of historical data. Until now, however, this information has not been used for space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical rain events in the urban area of Manizales and to investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events; in particular, it helped to identify the influence of different meteorological variables in triggering rainfall events in hazardous areas such as the city of Manizales.
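
    As we read the cited index, the "n" index characterizes how cumulative precipitation grows with time within an event, roughly P(t)/P_total = (t/T)^n (n ≈ 1 for steady rain; larger n for rain concentrated late in the event). A sketch of fitting it by log-log regression, with an assumed functional form and synthetic hyetographs:

```python
import numpy as np

def n_index(cum_precip, times):
    """Fit P(t)/P_total = (t/T)^n in log-log space (Monjo-style index)."""
    p = cum_precip / cum_precip[-1]
    tau = times / times[-1]
    mask = (p > 0) & (p < 1) & (tau > 0) & (tau < 1)
    n, _ = np.polyfit(np.log(tau[mask]), np.log(p[mask]), 1)
    return n

t = np.linspace(0, 2.0, 25)[1:]        # hours into a hypothetical event
steady = 10.0 * (t / 2.0)              # constant-intensity event
bursty = 10.0 * (t / 2.0) ** 3         # intensity concentrated near the end
print(n_index(steady, t), n_index(bursty, t))  # ≈ 1.0 and ≈ 3.0
```

    The exact definition and estimation procedure in Monjo (2009) may differ in detail; this only illustrates the kind of single-parameter event classification the abstract describes.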

  9. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2016-04-01

    In order to gain a better understanding of geodynamic processes in the Hellenic subduction zone (HSZ), in particular in its eastern part, we analyze a cluster of intermediate-depth events in the region of Nisyros volcano. The cluster, recorded during the deployment of the temporary seismic network EGELADOS, consists of 159 events at 80 to 200 km depth with local magnitudes ranging from 0.2 to 4.1. The network itself consisted of 56 onshore and 23 offshore broadband stations, completed by 19 permanent stations from NOA, GEOFON and MedNet. It was deployed from September 2005 to March 2007 and covered the entire HSZ. Here, both spatial and temporal clustering of the recorded events are studied using three-component similarity analysis. The waveform cross-correlation was performed for all event combinations using data recorded at 45 onshore stations. The results are shown as a function of frequency for individual stations and as averaged values over the network. The cross-correlation coefficients at the single stations show a decreasing similarity with increasing epicentral distance, as well as the effect of local heterogeneities at particular stations, causing noticeable differences in waveform similarities. Event relocation was performed using the double-difference earthquake relocation software HypoDD, and the results are compared with previously obtained single-event locations calculated using the nonlinear location tool NonLinLoc and station corrections. For the relocation, both differential travel times obtained by separate cross-correlation of P- and S-waveforms and manual readings of onset times are used. It is shown that after the relocation the inter-event distance for highly similar events is reduced. By comparing the results of the cluster analysis with results obtained from synthetic catalogs, where the event rate and the portion and occurrence time of the aftershocks are varied, it is shown that the event

  10. Statistical analysis of solar energetic particle events and related solar activity

    NASA Astrophysics Data System (ADS)

    Dierckxsens, Mark; Patsou, Ioanna; Tziotziou, Kostas; Marsh, Michael; Lygeros, Nik; Crosby, Norma; Dalla, Silvia; Malandraki, Olga

    2013-04-01

    The FP7 COMESEP (COronal Mass Ejections and Solar Energetic Particles: forecasting the space weather impact) project is developing tools for forecasting geomagnetic storms and solar energetic particle (SEP) radiation storms. Here we present preliminary results of a statistical analysis of SEP events and their parent solar activity during Solar Cycle 23. The work aims to identify correlations between solar events and SEP events relevant for space weather, and to quantify SEP event probabilities for use within the COMESEP alert system. The data sample covers the SOHO era and is based on the SEPEM reference event list [http://dev.sepem.oma.be/]. Events are subdivided if separate enhancements are observed in higher energy channels, as defined for the list of Cane et al. (2010). Energetic storm particle (ESP) enhancements during these events are identified by associating ESP-like increases in the proton channels with shocks detected in ACE and WIND data; their contribution has been estimated and subtracted from the proton fluxes. Relationships are investigated between solar flare parameters, such as X-ray intensity and heliographic location, on the one hand, and the probability of occurrence and strength of energetic proton flux increases on the other. The same exercise is performed using the velocity and width of coronal mass ejections to examine their effectiveness in producing SEPs. Relationships between solar event characteristics and SEP event spectral indices and fluences are also studied, as are enhancements in heavy ion fluxes measured by the SIS instrument on board the ACE spacecraft during the same event periods. This work has received funding from the European Commission FP7 Project COMESEP (263252).

  11. Analysis of infrasonic and seismic events related to the 1998 Vulcanian eruption at Sakurajima

    NASA Astrophysics Data System (ADS)

    Morrissey, M.; Garces, M.; Ishihara, K.; Iguchi, M.

    2008-08-01

    We present results from a detailed analysis of seismic and infrasonic data recorded over a four-day period prior to the Vulcanian eruptive event at Sakurajima volcano on May 19, 1998. Nearly one hundred seismic and infrasonic events were recorded on at least one of the nine seismic-infrasonic stations located within 3 km of the crater. Four distinct seismic event types are recognized based on the spectral features of the seismograms. The first is weak seismic tremor characterized by a 5-6 Hz peak mode that later shifted to 4-5 Hz. Long-period events, the second type, are characterized by a short-duration, wide-spectral-band signal with an emergent, high-frequency onset followed by a wave coda lasting 15-20 s and a fundamental mode of 4.2-4.4 Hz; values of Q for long-period events range between 10 and 22, suggesting that a gas-rich fluid was involved. Explosive events are the third seismic type, characterized by a narrow-spectral-band signal with an impulsive high-frequency onset followed by a 20-30 s wave coda and a peak mode of 4.0-4.4 Hz. Volcano-tectonic earthquakes are the fourth type. Prior to May 19, 1998, only the tremor and explosion seismic events were found to have an infrasonic component. Like seismic tremor, infrasonic tremor is typically observed as a weak background signal. Explosive infrasonic events were recorded 10-15 s after the explosive seismic events, and with audible explosions, prior to May 19. On May 19, high-frequency impulsive infrasonic events occurred sporadically and in swarms within hours of the eruption; these infrasonic events coincided with swarms of long-period seismic events. Video coverage during the seismic-infrasonic experiment recorded intermittent releases of gas and ash at times when seismic and acoustic events were recorded. The sequence of seismic and infrasonic events is interpreted as a gas-rich fluid moving through a series of cracks and conduits beneath the active summit crater.

  12. EVENT TREE ANALYSIS AT THE SAVANNAH RIVER SITE: A CASE HISTORY

    SciTech Connect

    Williams, R

    2009-05-25

    At the Savannah River Site (SRS), a Department of Energy (DOE) installation in west-central South Carolina, there is a unique geologic stratum at depth that has the potential to cause surface settlement resulting from a seismic event. In the past the stratum in question has been remediated via pressure grouting; however, the benefits of remediation have always been debatable. Recently the SRS has attempted to frame the issue in terms of risk via an event tree, or logic tree, analysis. This paper describes that analysis, including the input data required.
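
    A minimal event-tree sketch in code (hypothetical probabilities, not SRS values): branch probabilities at each node multiply along a path, so each end state gets a frequency equal to the initiator frequency times the product of its branch probabilities.

```python
# Initiating seismic event, then two branch points:
# does the stratum liquefy, and does settlement exceed the limit?
p_initiator = 1e-3        # assumed annual frequency of the seismic event

branches = {
    # (liquefies?, settlement exceeds limit?) -> path probability
    (True,  True):  0.2 * 0.5,
    (True,  False): 0.2 * 0.5,
    (False, True):  0.0,       # no liquefaction -> no settlement failure
    (False, False): 0.8,
}

# The branch probabilities must be exhaustive at every node.
assert abs(sum(branches.values()) - 1.0) < 1e-12

# Annual frequency of the damaging end state (liquefaction + settlement).
p_damage = p_initiator * branches[(True, True)]
print(p_damage)  # 1e-4 per year under these assumed numbers
```

    In a real analysis each branch probability would come from fragility curves or expert elicitation, and the end-state frequencies would be compared against a risk criterion to decide whether grouting is worthwhile.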

  13. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    SciTech Connect

    Lisbeth A. Mitchell

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and non-reportable, for the previous twelve months. This report analyzes the occurrence reports and deficiency reports (including non-reportable events) identified at INL during the period October 2012 through September 2013.

  14. Complete dose analysis of the November 12, 1960 solar cosmic ray event.

    PubMed

    Masley, A J; Goedeke, A D

    1963-01-01

    A detailed analysis of the November 12, 1960 solar cosmic ray event is presented as an integrated space flux and dose. This event is probably the most interesting solar cosmic ray event studied to date. Direct measurements were made of solar protons from 10 MeV to 6 GeV. During the double-peaked high-energy part of the event, evidence is presented for the trapping of relativistic particles in a magnetic cloud. The proton energy spectrum is divided into three energy intervals, with separate power-law exponents and time profiles carried through for each. Also included in the analysis are the results of rocket measurements, which determined the spectrum down to 10 MeV twice during the event; balloon results from Fort Churchill and Minneapolis; earth satellite measurements; neutron monitors in New Hampshire and at both the North and South Pole; and riometer results from Alaska and Kiruna, Sweden. The results are given in Table 1 [see text]. The results of our analyses of other solar cosmic ray events are also included, with a general discussion of solar flare hazards in space.

  15. Low time resolution analysis of polar ice cores cannot detect impulsive nitrate events

    NASA Astrophysics Data System (ADS)

    Smart, D. F.; Shea, M. A.; Melott, A. L.; Laird, C. M.

    2014-12-01

    Ice cores are archives of climate change and, possibly, of large solar proton events (SPEs). Wolff et al. (2012) used a single event, a nitrate peak in the GISP2-H core that McCracken et al. (2001a) had associated in time with the poorly quantified 1859 Carrington event, to discredit SPE-produced impulsive nitrate deposition in polar ice. This is not the ideal test case. We critique the Wolff et al. analysis and demonstrate that the data they used cannot detect impulsive nitrate events because of resolution limitations. We suggest re-examination of the top of the Greenland ice sheet at key intervals over the last two millennia, with attention to fine resolution and replicate sampling of multiple species. This would allow further insight into polar depositional processes on a subseasonal scale, including atmospheric sources, transport mechanisms to the ice sheet, postdepositional interactions, and a potential SPE association.

  16. Measurements and data analysis of suburban development impacts on runoff event characteristics and unit hydrographs

    NASA Astrophysics Data System (ADS)

    Sillanpää, Nora; Koivusalo, Harri

    2014-05-01

    Urbanisation strongly changes the catchment hydrological response to rainfall. Monitoring data on hydrological variables are most commonly available from rural and large areas, but less so from urban areas, and rarely from small catchments undergoing hydrological changes during the construction processes associated with urban development. Moreover, changes caused by urbanisation in the catchment hydrological response to snowmelt have not been widely studied. In this study, the changes occurring in runoff generation were monitored in a developing catchment under construction and in two urban control catchments. The developing catchment experienced extreme change from forest to a suburban residential area. The data used included rainfall and runoff observations from a five-year period (the years 2001-2006) at 2- to 10-minute temporal resolution. In total, 636 and 239 individual runoff events were investigated for summer and winter conditions, respectively. The changes occurring in runoff event characteristics such as event runoff volumes, peak flow rates, mean runoff intensities, and volumetric runoff coefficients were identified by means of exploratory data analysis and nonparametric comparison tests (the Kruskal-Wallis and the Mann-Whitney tests). The effect of urbanisation on event runoff dynamics was investigated using instantaneous unit hydrographs (IUH) based on a two-parameter gamma distribution. The measurements and data analyses demonstrated how the impact of urbanisation on runoff was best detected based on peak flow rates, volumetric runoff coefficients, and mean runoff intensities. Control catchments were essential to distinguish the hydrological impacts caused by catchment characteristics from those caused by changes in the meteorological conditions or season. As the imperviousness of the developing catchment increased from 1.5% to 37%, significant increases were observed in event runoff depths and peak flows during rainfall-runoff events. At the
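    A two-parameter gamma IUH of the kind described can be sketched as follows. This is a minimal illustration, not the study's calibration: the shape and time-scale values, the single 10 mm rainfall pulse, and the "pre/post development" labels are hypothetical, chosen only to show the faster, peakier response expected as imperviousness grows:

    ```python
    import math

    def gamma_iuh(t, n, k):
        """Two-parameter gamma instantaneous unit hydrograph (IUH).
        n: shape parameter (-), k: time scale (h), t: time (h)."""
        return (t / k) ** (n - 1) * math.exp(-t / k) / (k * math.gamma(n))

    def runoff(rain, n, k, dt=0.1, horizon=200):
        """Convolve an effective-rainfall series (depth per step) with the IUH."""
        iuh = [gamma_iuh((i + 0.5) * dt, n, k) for i in range(horizon)]
        q = [0.0] * (len(rain) + horizon)
        for i, p in enumerate(rain):
            for j, h in enumerate(iuh):
                q[i + j] += p * h * dt
        return q

    # Hypothetical responses to the same 10 mm pulse: urbanisation is often
    # summarized as a shorter time scale k, hence a higher, earlier peak.
    pre_dev = runoff([10.0], n=3.0, k=1.5)   # forested catchment (assumed values)
    post_dev = runoff([10.0], n=3.0, k=0.4)  # 37% impervious (assumed values)
    ```

    Because the IUH integrates to one, both hydrographs carry the same volume; only the timing and peak change, which matches the finding that peak flow rates were among the best indicators of urbanisation.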

  17. Two Point Autocorrelation Analysis of Auger Highest Energy Events Backtracked in Galactic Magnetic Field

    NASA Astrophysics Data System (ADS)

    Petrov, Yevgeniy

    2009-10-01

    Searches for sources of the highest-energy cosmic rays traditionally have included looking for clusters of event arrival directions on the sky. The smallest cluster is a pair of events falling within some angular window. In contrast to the standard two point (2-pt) autocorrelation analysis, this work takes into account the influence of the galactic magnetic field (GMF). The highest energy events, those above 50 EeV, collected by the surface detector of the Pierre Auger Observatory between January 1, 2004 and May 31, 2009 are used in the analysis. Assuming protons as primaries, events are backtracked through the BSS/S, BSS/A, ASS/S and ASS/A versions of the Harari-Mollerach-Roulet (HMR) model of the GMF. For each version of the model, a 2-pt autocorrelation analysis is applied to the backtracked events and to 10^5 isotropic Monte Carlo realizations weighted by the Auger exposure. Scans in energy, angular separation window and different model parameters reveal clustering at different angular scales. Small-angle clustering at 2-3 deg is particularly interesting, and it is compared between the different field scenarios. The strength of the autocorrelation signal at those angular scales differs between the BSS and ASS versions of the HMR model. The BSS versions of the model tend to defocus protons as they arrive at Earth, whereas the ASS versions, on the contrary, tend to focus them.
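    The core of a 2-pt autocorrelation test of this kind can be sketched as below. This is a minimal illustration, not the Auger analysis: it assumes a uniform sky rather than weighting the Monte Carlo realizations by the Auger exposure, and it omits the backtracking through the GMF models entirely:

    ```python
    import math, random

    def ang_sep(a, b):
        """Angular separation in degrees between two unit vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

    def rand_dir(rng):
        """Isotropic random direction on the unit sphere."""
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(1.0 - z * z)
        return (r * math.cos(phi), r * math.sin(phi), z)

    def n_pairs(dirs, psi):
        """Number of event pairs separated by less than psi degrees."""
        return sum(1 for i in range(len(dirs)) for j in range(i + 1, len(dirs))
                   if ang_sep(dirs[i], dirs[j]) < psi)

    def chance_prob(dirs, psi, n_mc=200, seed=1):
        """Fraction of isotropic Monte Carlo skies with at least as many pairs."""
        rng = random.Random(seed)
        observed = n_pairs(dirs, psi)
        hits = sum(1 for _ in range(n_mc)
                   if n_pairs([rand_dir(rng) for _ in dirs], psi) >= observed)
        return hits / n_mc
    ```

    In the full analysis this pair count is scanned over energy threshold and angular window, and the chance probability is evaluated separately for the events backtracked through each field model.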

  18. A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.

    PubMed

    Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C

    2003-12-01

    The present study investigated processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event associated positively with higher strain on the same day and associated negatively with strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. PMID:14640813
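    Two ingredients of the approach, factoring out serial dependency before correlating and reading same-day versus next-day relations, can be sketched with AR(1) prewhitening and lagged correlations. This is a simplification of full transfer function analysis, and the synthetic series below encodes the contrast effect by construction (the 0.8 coefficient and event rate are invented):

    ```python
    import math, random

    def ar1_prewhiten(x):
        """Fit an AR(1) coefficient by least squares and return residuals,
        removing the lag-1 serial dependency in a daily series."""
        num = sum(a * b for a, b in zip(x[1:], x[:-1]))
        den = sum(a * a for a in x[:-1]) or 1.0
        phi = num / den
        return [x[t] - phi * x[t - 1] for t in range(1, len(x))]

    def xcorr(x, y, lag):
        """Pearson correlation of x[t] with y[t + lag]."""
        if lag > 0:
            x, y = x[:-lag], y[lag:]
        mx, my = sum(x) / len(x), sum(y) / len(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        dx = math.sqrt(sum((a - mx) ** 2 for a in x))
        dy = math.sqrt(sum((b - my) ** 2 for b in y))
        return num / (dx * dy) if dx and dy else 0.0

    # Synthetic daily data with the contrast effect built in: strain rises
    # with today's stressor event and dips the day after.
    rng = random.Random(7)
    events = [1.0 if rng.random() < 0.3 else 0.0 for _ in range(300)]
    strain = [events[t] - 0.8 * (events[t - 1] if t else 0.0) + rng.gauss(0.0, 0.1)
              for t in range(300)]
    same_day = xcorr(ar1_prewhiten(events), ar1_prewhiten(strain), 0)
    next_day = xcorr(ar1_prewhiten(events), ar1_prewhiten(strain), 1)
    ```

    On such data the same-day correlation comes out positive and the next-day correlation negative, mirroring the contrast effect reported in the study.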

  19. Effects of Interventions on Relapse to Narcotics Addiction: An Event-History Analysis.

    ERIC Educational Resources Information Center

    Hser, Yih-Ing; And Others

    1995-01-01

    Event-history analysis was applied to the life history data of 581 male narcotics addicts to specify the concurrent, postintervention, and durational effects of social interventions on relapse to narcotics use. Results indicate the advisability of supporting methadone maintenance with other prevention strategies. (SLD)

  20. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    ERIC Educational Resources Information Center

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  1. Uniting Secondary and Postsecondary Education: An Event History Analysis of State Adoption of Dual Enrollment Policies

    ERIC Educational Resources Information Center

    Mokher, Christine G.; McLendon, Michael K.

    2009-01-01

    This study, as the first empirical test of P-16 policy antecedents, reports the findings from an event history analysis of the origins of state dual enrollment policies adopted between 1976 and 2005. First, what characteristics of states are associated with the adoption of these policies? Second, to what extent do conventional theories on policy…

  2. Rare events analysis of temperature chaos in the Sherrington-Kirkpatrick model

    NASA Astrophysics Data System (ADS)

    Billoire, Alain

    2014-04-01

    We investigate the question of temperature chaos in the Sherrington-Kirkpatrick spin glass model, applying a recently proposed rare events based data analysis method to existing Monte Carlo data. Thanks to this new method, temperature chaos is now observable for this model, even with the limited size systems that can currently be simulated.

  3. Systematic Quantitative Analysis of NO2 Long-Range Transport Events and Comparison to Model Data

    NASA Astrophysics Data System (ADS)

    Zien, A. W.; Richter, A.; Hilboll, A.; Burrows, J. P.; Inness, A.

    2012-12-01

    Atmospheric long-range transport (LRT) events relocate trace gases from emission to downwind regions on an intercontinental scale, drastically altering the atmospheric chemistry in remote regions. Tropospheric NO2 is a very short-lived, mainly anthropogenic trace gas with strong impact on the ozone chemistry. Emissions are very localized and allow identification of individual LRT events. In this study, we use non-cloud-filtered remote sensing observations from the GOME-2 satellite instrument to identify trans-oceanic NO2 LRT events. The LRT analysis is performed by a specialized algorithm, spotting anomalies in the vertical slant column data. LRT routes are obtained via Lagrangian back-tracing with the HYSPLIT model. We also implement a radiance cloud-fraction and model the NO2 air-mass factor in LRTs to allow the quantification of NO2 content under cloudy conditions which frequently accompany LRTs. Sample LRT events over the North Atlantic and the Southern Ocean illustrate the process and results of this analysis. The spatial and temporal coverage of the used observations also allows for a statistical analysis, showing regional and seasonal features in both LRT occurrence and properties in a 5-year data set (2007 to 2011), giving typical routes of transport. We compare these to results from a similar analysis of data from a global chemical transport model, the MACC reanalysis data.

  4. An Event History Analysis of Teacher Attrition: Salary, Teacher Tracking, and Socially Disadvantaged Schools

    ERIC Educational Resources Information Center

    Kelly, Sean

    2004-01-01

    In this event history analysis of the 1990-1991 Schools and Staffing Survey and the 1992 Teacher Follow-up Survey, a retrospective person-year database was constructed to examine teacher attrition over the course of the teaching career. Consistent with prior research, higher teacher salaries reduced attrition, but only slightly so. Teacher…

  5. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.

    2009-04-01

    This is a report for task one of the tail event analysis project for BPA. A tail event refers to the situation in a power system in which unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, so that the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; it is therefore referred to as a tail event. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) Large mismatch between generation and load can be caused by load forecast error, wind forecast error and generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements. The resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and wind can both become the cause leading to serious issues; (4) A look-ahead tool evaluating the system balancing requirement during real-time operations and comparing it with available system resources should be very helpful to system operators in anticipating similar events and planning ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.
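    Lesson (2), evaluating balancing resources both in capacity (MW) and in ramp rate (MW/min), can be sketched as a simple adequacy check. The function, the nearest-rank percentile, and the reserve figures are hypothetical, not from the BPA study:

    ```python
    def percentile(xs, p):
        """Empirical percentile (nearest-rank; a simple illustration)."""
        xs = sorted(xs)
        return xs[min(len(xs) - 1, int(p / 100.0 * len(xs)))]

    def balancing_adequacy(imbalance_mw, dt_min, reserve_mw, reserve_ramp, pct=99.7):
        """Check whether balancing reserves cover both the magnitude (MW) and
        the ramp (MW/min) of system imbalance up to a tail percentile."""
        ramps = [abs(b - a) / dt_min for a, b in zip(imbalance_mw, imbalance_mw[1:])]
        need_cap = percentile([abs(x) for x in imbalance_mw], pct)
        need_ramp = percentile(ramps, pct)
        return {"capacity_ok": need_cap <= reserve_mw,
                "ramp_ok": need_ramp <= reserve_ramp,
                "capacity_needed_mw": need_cap,
                "ramp_needed_mw_per_min": need_ramp}
    ```

    A look-ahead tool in the spirit of lesson (4) would run such a check on forecast imbalance rather than on historical samples.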

  6. Analysis of the longitudinal dependence of the downstream fluence of large solar energetic proton events

    NASA Astrophysics Data System (ADS)

    Pacheco, Daniel; Sanahuja, Blai; Aran, Angels; Agueda, Neus; Jiggens, Piers

    2016-07-01

    Simulations of the solar energetic particle (SEP) intensity-time profiles are needed to estimate the radiation environment for interplanetary missions. At present, the physics-based models applied for such a purpose, and including a moving source of particles, are not able to model the portion of the SEP intensity enhancement occurring after the coronal/interplanetary shock crossing by the observer (a.k.a. the downstream region). This is the case, for example, of the shock-and-particle model used to build the SOLPENCO2 code. SOLPENCO2 provides the statistical modelling tool developed in the ESA/SEPEM project for interplanetary missions, with synthetic SEP event simulations for virtual spacecraft located at heliocentric distances between 0.2 AU and 1.6 AU (http://dev.sepem.oma.be/). In this work we present an analysis of 168 individual SEP events observed at 1 AU from 1988 to 2013. We identify the solar eruptive phenomena associated with these SEP events, as well as the in-situ passage of interplanetary shocks. For each event, we quantify the amount of fluence accumulated in the downstream region, i.e. after the passage of the shock, in the 11 SEPEM reference energy channels (from 5 to 300 MeV protons). First, from the subset of SEP events simultaneously detected by near-Earth spacecraft (using SEPEM reference data) and by one of the STEREO spacecraft, we select those events for which the downstream region can be clearly determined. From the 8 selected multi-spacecraft events, we find that the western observations of each event have a smaller downstream contribution than their eastern counterparts, and that the downstream-to-total fluence ratio of these events decreases as a function of the energy. Hence, there is a variation of the downstream fluence with heliolongitude in SEP events. Based on this result, we study the variation of the downstream-to-total fluence ratios of the total set of individual events. We confirm the eastern-to-western decrease of the
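    The downstream-to-total fluence ratio quantified in the study can be sketched per energy channel as below; the intensity series and shock index are placeholders for observed data:

    ```python
    def downstream_fluence_ratio(intensity, dt, shock_index):
        """Fraction of the total event fluence (time-integrated intensity)
        accumulated after the in-situ shock passage, for one energy channel.
        intensity: proton intensity samples; dt: sample spacing (s)."""
        total = sum(intensity) * dt
        downstream = sum(intensity[shock_index:]) * dt
        return downstream / total if total else 0.0
    ```

    Repeating this over the 11 SEPEM reference channels gives the ratio-versus-energy behavior discussed above.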

  7. Characterization and Analysis of Networked Array of Sensors for Event Detection (CANARY-EDS)

    2011-05-27

    CANARY-EDS provides probabilistic event detection based on analysis of time-series data from water quality or other sensors. CANARY can also compare patterns against a library of previously seen data to indicate that a certain pattern has reoccurred, suppressing what would otherwise be considered an event. CANARY can be configured to analyze previously recorded data from files or databases, or it can be configured to run in real-time mode directly from a database or through the US EPA EDDIES software.

  8. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  9. Early Events in Helix Unfolding Under External Forces: A Milestoning Analysis

    PubMed Central

    Kreuzer, Steven M; Elber, Ron; Moon, Tess J

    2012-01-01

    Initial events of helix breakage as a function of load are considered using Molecular Dynamics simulations and Milestoning analysis. A helix length of ~100 amino acids is considered as a model for typical helices found in molecular machines and as a model that minimizes end effects for early events of unfolding. Transitions of individual amino acids (averaged over the helix's interior residues) are examined and the surrounding hydrogen bonds are considered. Dense kinetic networks are constructed that, with Milestoning analysis, provide the overall kinetics of early breakage events. Network analysis and selection of MaxFlux pathways illustrate that load impacts unfolding mechanisms in addition to time scales. At relatively high (100 pN) load levels, the principal intermediate is the 3₁₀-helix, while at relatively low (10 pN) levels the π-helix is significantly populated, albeit not as an unfolding intermediate. Coarse variables are examined at different levels of resolution; the rate of unfolding illustrates remarkable stability under changes in the coarsening. Consistent predictions of ~5 ns for the time of a single amino-acid unfolding event are obtained. Hydrogen bonds are much faster coarse variables (by about two orders of magnitude) than the backbone torsional transition, which gates unfolding and thereby provides the appropriate coarse variable for the initiation of unfolding. PMID:22471347

  10. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
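    The simulation loop described, executing events from a queue until it is emptied, can be sketched with a minimal discrete event kernel. The event names and effects below are illustrative only, not taken from the disclosed tool:

    ```python
    import heapq

    def run_simulation(initial_events):
        """Minimal discrete event loop: pop the earliest event, record it,
        let its effect schedule follow-on events after a time delay, and
        stop when the event queue is emptied."""
        queue = list(initial_events)          # entries: (time, name, effect)
        heapq.heapify(queue)
        log = []
        while queue:
            t, name, effect = heapq.heappop(queue)
            log.append((t, name))
            for delay, next_name, next_effect in effect(t):
                heapq.heappush(queue, (t + delay, next_name, next_effect))
        return log

    def no_followups(t):
        return []

    def pump_start(t):
        # Continuous behavior made discrete via time delays (hypothetical):
        # flow stabilizes after 1.0 s, a sensor reading fires after 2.0 s.
        return [(1.0, "flow_stable", no_followups), (2.0, "sensor_read", no_followups)]

    log = run_simulation([(0.0, "pump_start", pump_start)])
    ```

    The resulting log plays the role of the tool's log files for the experimentation and analysis module.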

  11. EEGIFT: Group Independent Component Analysis for Event-Related EEG Data

    PubMed Central

    Eichele, Tom; Rachakonda, Srinivas; Brakedal, Brage; Eikeland, Rune; Calhoun, Vince D.

    2011-01-01

    Independent component analysis (ICA) is a powerful method for source separation and has been used for decomposition of EEG, MRI, and concurrent EEG-fMRI data. ICA is not naturally suited to draw group inferences since it is a non-trivial problem to identify and order components across individuals. One solution to this problem is to create aggregate data containing observations from all subjects, estimate a single set of components and then back-reconstruct this in the individual data. Here, we describe such a group-level temporal ICA model for event related EEG. When used for EEG time series analysis, the accuracy of component detection and back-reconstruction with a group model is dependent on the degree of intra- and interindividual time and phase-locking of event related EEG processes. We illustrate this dependency in a group analysis of hybrid data consisting of three simulated event-related sources with varying degrees of latency jitter and variable topographies. Reconstruction accuracy was tested for temporal jitter 1, 2 and 3 times the FWHM of the sources for a number of algorithms. The results indicate that group ICA is adequate for decomposition of single trials with physiological jitter, and reconstructs event related sources with high accuracy. PMID:21747835

  12. Signature Based Detection of User Events for Post-mortem Forensic Analysis

    NASA Astrophysics Data System (ADS)

    James, Joshua Isaac; Gladyshev, Pavel; Zhu, Yuandong

    This paper introduces a novel approach to user event reconstruction by showing the practicality of generating and implementing signature-based analysis methods to reconstruct high-level user actions from a collection of low-level traces found during a post-mortem forensic analysis of a system. Traditional forensic analysis and the inferences an investigator normally makes when given digital evidence, are examined. It is then demonstrated that this natural process of inferring high-level events from low-level traces may be encoded using signature-matching techniques. Simple signatures using the defined method are created and applied for three popular Windows-based programs as a proof of concept.
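    The signature-matching idea, encoding a high-level action as a set of low-level traces that must all be present, can be sketched as below. The signatures shown are invented examples, not those defined in the paper:

    ```python
    # Hypothetical signatures: each high-level user action maps to the set
    # of low-level traces that must all be present.
    SIGNATURES = {
        "document_printed": {"spool_file_created", "printer_registry_key_updated"},
        "usb_device_attached": {"setupapi_log_entry", "usbstor_registry_key"},
    }

    def reconstruct_actions(traces):
        """Return high-level actions whose full signature occurs among the
        low-level traces recovered during post-mortem analysis."""
        found = set(traces)
        return sorted(action for action, sig in SIGNATURES.items() if sig <= found)
    ```

    Real signatures would also constrain ordering and timestamps of the traces rather than treating them as an unordered set.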

  13. Scientific analysis within SEPServer: the 13 July 2005 SEP event case study

    NASA Astrophysics Data System (ADS)

    Malandraki, O. E.; Valtonen, E.; Agueda, N.; Papaioannou, A.; Klein, K.-L.; Heber, B.; Droege, W.; Aurass, H.; Nindos, A.; Vilmer, N.; Sanahuja, B.; Kouloumvakos, A.; Braune, S.; Preka-Papadema, P.; Tziotziou, K.; Hamadache, C.; Kiener, J.; Tatischeff, V.; Kartavykh, J.; Vainio, R.

    2012-04-01

    SEPServer sets out to build the first database of particle and corresponding EM observations of solar energetic particle (SEP) events over roughly three solar cycles. It will also provide users with results from the scientific analysis of multiple datasets using different observational and simulation-based methods. Therefore, SEPServer will lead to new perspectives of scientific analysis and will serve as a new asset valuable for SEP and Space Weather research. In this contribution, the event of 13 July 2005 has been used as a case study, which is a proxy for the overall information that SEPServer will include and at the same time reveals the capabilities offered to the future users of SEPServer. The analysis of the 13 July 2005 event - focusing on the data-driven analysis, i.e., onset and release time determination from SOHO/ERNE, SOHO/EPHIN and ACE/EPAM together with pitch angle distributions from ACE/EPAM, simulations based on WIND/3DP and ACE/EPAM electrons, as well as direct comparison of the observed SEP fluxes with the associated electromagnetic emissions - is performed. The physical interpretation and the interconnection of the experimental and the simulation-based results are discussed in detail. The 13 July 2005 case study exemplifies the future usage of SEPServer, which will provide a comprehensive and up-to-date SEP analysis service. Acknowledgements: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement No 262773 (SEPServer).

  14. Central Aortic Reservoir-Wave Analysis Improves Prediction of Cardiovascular Events in Elderly Hypertensives

    PubMed Central

    Narayan, Om; Davies, Justin E.; Hughes, Alun D.; Dart, Anthony M.; Parker, Kim H.; Reid, Christopher; Cameron, James D.

    2016-01-01

    Several morphological parameters based on the central aortic pressure waveform are proposed as cardiovascular risk markers, yet no study has definitively demonstrated the incremental value of any waveform parameter in addition to currently accepted biomarkers in elderly, hypertensive patients. The reservoir-wave concept combines elements of wave transmission and Windkessel models of arterial pressure generation, defining an excess pressure superimposed on a background reservoir pressure. The utility of pressure rate constants derived from reservoir-wave analysis in prediction of cardiovascular events is unknown. Carotid blood pressure waveforms were measured prerandomization in a subset of 838 patients in the Second Australian National Blood Pressure Study. Reservoir-wave analysis was performed and indices of arterial function, including the systolic and diastolic rate constants, were derived. Survival analysis was performed to determine the association between reservoir-wave parameters and cardiovascular events. The incremental utility of reservoir-wave parameters in addition to the Framingham Risk Score was assessed. Baseline values of the systolic rate constant were independently predictive of clinical outcome (hazard ratio, 0.33; 95% confidence interval, 0.13–0.82; P=0.016 for fatal and nonfatal stroke and myocardial infarction and hazard ratio, 0.38; 95% confidence interval, 0.20–0.74; P=0.004 for the composite end point, including all cardiovascular events). Addition of this parameter to the Framingham Risk Score was associated with an improvement in predictive accuracy for cardiovascular events as assessed by the integrated discrimination improvement and net reclassification improvement indices. This analysis demonstrates that baseline values of the systolic rate constant predict clinical outcomes in elderly patients with hypertension and incrementally improve prognostication of cardiovascular events. PMID:25534707
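    Of the two rate constants derived from reservoir-wave analysis, the diastolic one is the simplest to illustrate: during diastole the reservoir pressure decays roughly exponentially toward an asymptote, so its rate constant can be recovered with a log-linear fit. The pressure values below are synthetic and the asymptote is assumed known; the paper's estimation method may differ:

    ```python
    import math

    def diastolic_rate_constant(p, dt, p_inf):
        """Estimate the diastolic rate constant k_d (s^-1) by a log-linear
        least-squares fit of P(t) ~ P_inf + A*exp(-k_d*t), with the
        asymptotic pressure P_inf assumed known (requires p > p_inf)."""
        t = [i * dt for i in range(len(p))]
        y = [math.log(v - p_inf) for v in p]
        n = len(p)
        mt, my = sum(t) / n, sum(y) / n
        slope = (sum((a - mt) * (b - my) for a, b in zip(t, y))
                 / sum((a - mt) ** 2 for a in t))
        return -slope

    # Synthetic diastolic decay: P_inf = 30 mmHg, A = 60 mmHg, k_d = 2.5 s^-1
    pressure = [30.0 + 60.0 * math.exp(-2.5 * i * 0.01) for i in range(50)]
    k_d = diastolic_rate_constant(pressure, 0.01, 30.0)
    ```

    The systolic rate constant, which carried the prognostic signal in this study, is obtained from the rise of the reservoir pressure during systole rather than from this decay.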

  15. Central aortic reservoir-wave analysis improves prediction of cardiovascular events in elderly hypertensives.

    PubMed

    Narayan, Om; Davies, Justin E; Hughes, Alun D; Dart, Anthony M; Parker, Kim H; Reid, Christopher; Cameron, James D

    2015-03-01

    Several morphological parameters based on the central aortic pressure waveform are proposed as cardiovascular risk markers, yet no study has definitively demonstrated the incremental value of any waveform parameter in addition to currently accepted biomarkers in elderly, hypertensive patients. The reservoir-wave concept combines elements of wave transmission and Windkessel models of arterial pressure generation, defining an excess pressure superimposed on a background reservoir pressure. The utility of pressure rate constants derived from reservoir-wave analysis in prediction of cardiovascular events is unknown. Carotid blood pressure waveforms were measured prerandomization in a subset of 838 patients in the Second Australian National Blood Pressure Study. Reservoir-wave analysis was performed and indices of arterial function, including the systolic and diastolic rate constants, were derived. Survival analysis was performed to determine the association between reservoir-wave parameters and cardiovascular events. The incremental utility of reservoir-wave parameters in addition to the Framingham Risk Score was assessed. Baseline values of the systolic rate constant were independently predictive of clinical outcome (hazard ratio, 0.33; 95% confidence interval, 0.13-0.82; P=0.016 for fatal and nonfatal stroke and myocardial infarction and hazard ratio, 0.38; 95% confidence interval, 0.20-0.74; P=0.004 for the composite end point, including all cardiovascular events). Addition of this parameter to the Framingham Risk Score was associated with an improvement in predictive accuracy for cardiovascular events as assessed by the integrated discrimination improvement and net reclassification improvement indices. This analysis demonstrates that baseline values of the systolic rate constant predict clinical outcomes in elderly patients with hypertension and incrementally improve prognostication of cardiovascular events.

  16. The use of significant event analysis and personal development plans in developing CPD: a pilot study.

    PubMed

    Wright, P D; Franklin, C D

    2007-07-14

    This paper describes the work undertaken by the Postgraduate Primary Care Trust (PCT) Dental Tutor for South Yorkshire and East Midlands Regional Postgraduate Dental Education Office during the first year of a two-year pilot. The tutor has special responsibility for facilitating the writing of Personal Development Plans (PDPs) and the introduction of Significant Event Analysis to the 202 general dental practitioners in the four Sheffield PCTs. Data were collected on significant events and the educational needs highlighted as a result. A hands-on workshop format was used in small practice groups and 45% of Sheffield general dental practitioners now have written PDPs compared with a 16% national average. A library of significant events has also been collated from the data collected. PMID:17632491

  17. A canonical correlation analysis based method for contamination event detection in water sources.

    PubMed

    Li, Ruonan; Liu, Shuming; Smith, Kate; Che, Han

    2016-06-15

    In this study, a general framework integrating a data-driven estimation model is employed for contamination event detection in water sources. Sequential canonical correlation coefficients are updated in the model using multivariate water quality time series. The proposed method utilizes canonical correlation analysis for studying the interplay between two sets of water quality parameters. The model is assessed by precision, recall and F-measure. The proposed method is tested using data from a laboratory contaminant injection experiment. The proposed method could detect a contamination event 1 minute after the introduction of 1.600 mg l(-1) acrylamide solution. With optimized parameter values, the proposed method can correctly detect 97.50% of all contamination events with no false alarms. The robustness of the proposed method can be explained using the Bauer-Fike theorem. PMID:27264637
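    The precision, recall and F-measure used to assess the detection model can be computed as below, treating the detected and true events as sets of time indices (an illustrative simplification):

    ```python
    def precision_recall_f(detected, true_events):
        """Precision, recall and F-measure for event detection."""
        detected, true_events = set(detected), set(true_events)
        tp = len(detected & true_events)   # correctly detected events
        fp = len(detected - true_events)   # false alarms
        fn = len(true_events - detected)   # missed events
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f = (2 * precision * recall / (precision + recall)
             if precision + recall else 0.0)
        return precision, recall, f
    ```

    The reported 97.50% detection with no false alarms corresponds to recall 0.975 at precision 1.0 under these definitions.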

  18. Detecting deception in children: event familiarity affects criterion-based content analysis ratings.

    PubMed

    Pezdek, Kathy; Morrow, Anne; Blandon-Gitlin, Iris; Goodman, Gail S; Quas, Jodi A; Saywitz, Karen J; Bidrose, Sue; Pipe, Margaret-Ellen; Rogers, Martha; Brodie, Laura

    2004-02-01

    Statement Validity Assessment (SVA) is a comprehensive credibility assessment system, with the Criterion-Based Content Analysis (CBCA) as a core component. Worldwide, the CBCA is reported to be the most widely used veracity assessment instrument. We tested and confirmed the hypothesis that CBCA scores are affected by event familiarity; descriptions of familiar events are more likely to be judged true than are descriptions of unfamiliar events. CBCA scores were applied to transcripts of 114 children who recalled a routine medical procedure (control) or a traumatic medical procedure that they had experienced one time (relatively unfamiliar) or multiple times (relatively familiar). CBCA scores were higher for children in the relatively familiar than the relatively unfamiliar condition, and CBCA scores were significantly correlated with age. Results raise serious questions regarding the forensic suitability of the CBCA for assessing the veracity of children's accounts.

  19. The Logic of Surveillance Guidelines: An Analysis of Vaccine Adverse Event Reports from an Ontological Perspective

    PubMed Central

    Courtot, Mélanie; Brinkman, Ryan R.; Ruttenberg, Alan

    2014-01-01

    Background When increased rates of adverse events following immunization are detected, regulatory action can be taken by public health agencies. However, to be interpreted, reports of adverse events must be encoded in a consistent way. Regulatory agencies rely on guidelines to help determine the diagnosis of the adverse events. Manual application of these guidelines is expensive, time consuming, and open to logical errors. Representing these guidelines in a format amenable to automated processing can make this process more efficient. Methods and Findings Using the Brighton anaphylaxis case definition, we show that existing clinical guidelines used as standards in pharmacovigilance can be logically encoded using a formal representation such as the Adverse Event Reporting Ontology we developed. We validated the classification of vaccine adverse event reports using the ontology against existing rule-based systems and a manually curated subset of the Vaccine Adverse Event Reporting System. However, we encountered a number of critical issues in the formulation and application of the clinical guidelines. We report these issues and the steps being taken to address them in current surveillance systems, and in the terminological standards in use. Conclusions By standardizing and improving the reporting process, we were able to automate diagnosis confirmation. By allowing medical experts to prioritize reports such a system can accelerate the identification of adverse reactions to vaccines and the response of regulatory agencies. This approach of combining ontology and semantic technologies can be used to improve other areas of vaccine adverse event reports analysis and should inform both the design of clinical guidelines and how they are used in the future. Availability Sufficient material to reproduce our results is available, including documentation, ontology, code and datasets, at http://purl.obolibrary.org/obo/aero. PMID:24667848

  20. Adverse events in older patients undergoing ERCP: a systematic review and meta-analysis

    PubMed Central

    Day, Lukejohn W.; Lin, Lisa; Somsouk, Ma

    2014-01-01

    Background and study aims: Biliary and pancreatic diseases are common in the elderly; however, few studies have addressed the occurrence of adverse events in elderly patients undergoing endoscopic retrograde cholangiopancreatography (ERCP). Our objective was to determine the incidence rates of specific adverse events in this group and calculate incidence rate ratios (IRRs) for selected comparison groups. Patients and methods: Bibliographical searches were conducted in Medline, EMBASE, and Cochrane library databases. The studies included documented the incidence of adverse events (perforation, pancreatitis, bleeding, cholangitis, cardiopulmonary adverse events, mortality) in patients aged ≥ 65 who underwent ERCP. Pooled incidence rates were calculated for each reported adverse event and IRRs were determined for available comparison groups. A parallel analysis was performed in patients aged ≥ 80 and ≥ 90. Results: Our literature search yielded 7429 articles, of which 69 studies met our inclusion criteria. Pooled incidence rates for adverse events (per 1000 ERCPs) in patients aged ≥ 65 were as follows: perforation 3.8 (95 %CI 1.8 – 7.0), pancreatitis 13.1 (95 %CI 11.0 – 15.5), bleeding 7.7 (95 %CI 5.7 – 10.1), cholangitis 16.1 (95 %CI 11.7 – 21.7), cardiopulmonary events 3.7 (95 %CI 1.5 – 7.6), and death 7.1 (95 %CI 5.2 – 9.4). Patients ≥ 65 had lower rates of pancreatitis (IRR 0.3, 95 %CI 0.3 – 0.4) compared with younger patients. Octogenarians had higher rates of death (IRR 2.4, 95 %CI 1.3 – 4.5) compared with younger patients, whereas nonagenarians had increased rates of bleeding (IRR 2.4, 95 %CI 1.1 – 5.2), cardiopulmonary events (IRR 3.7, 95 %CI 1.0 – 13.9), and death (IRR 3.8, 95 %CI 1.0 – 14.4). Conclusions: ERCP appears to be safe in elderly patients, except in the very elderly, who are at higher risk of some adverse events. These data on adverse

  1. Analysis and visualization of single-trial event-related potentials

    NASA Technical Reports Server (NTRS)

    Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.

    2001-01-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image
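    The linear decomposition at the heart of this approach can be illustrated with a toy two-channel example. The sketch below uses made-up "ERP" and "blink" waveforms and a known mixing matrix; it shows only the generative model X = A·S that ICA assumes and the exact recovery given the true unmixing matrix, not the blind estimation of W (e.g., by Infomax or FastICA) that ICA actually performs:

```python
import numpy as np

n_samples = 500
t = np.linspace(0, 1, n_samples)

# Two "sources": an event-locked response and an eye-blink-like artifact
# (hypothetical signals, not real EEG).
erp = np.exp(-((t - 0.3) ** 2) / 0.002)                # stimulus-locked peak
blink = (np.abs(np.sin(40 * t)) > 0.95).astype(float)  # spiky artifact
S = np.vstack([erp, blink])                            # sources, shape (2, n)

# Spatially fixed mixing: each channel records a weighted sum of sources.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S                                              # observed channels

# With the true unmixing matrix W = A^-1, the channels separate exactly;
# ICA's job is to estimate W blindly from X alone.
W = np.linalg.inv(A)
S_rec = W @ X

print(np.allclose(S_rec, S))  # -> True
```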

  2. Analysis and visualization of single-trial event-related potentials.

    PubMed

    Jung, T P; Makeig, S; Westerfield, M; Townsend, J; Courchesne, E; Sejnowski, T J

    2001-11-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image

  3. FusionDB: a database for in-depth analysis of prokaryotic gene fusion events.

    PubMed

    Suhre, Karsten; Claverie, Jean-Michel

    2004-01-01

    FusionDB (http://igs-server.cnrs-mrs.fr/FusionDB/) constitutes a resource dedicated to in-depth analysis of bacterial and archaeal gene fusion events. Such events can provide the 'Rosetta stone' in the search for potential protein-protein interactions, as well as metabolic and regulatory networks. However, the false positive rate of this approach may be quite high, prompting a detailed scrutiny of putative gene fusion events. FusionDB readily provides much of the information required for that task. Moreover, FusionDB extends the notion of gene fusion from that of a single gene to that of a family of genes by assembling pairs of genes from different genomes that belong to the same Cluster of Orthologous Groups (COG). Multiple sequence alignments and phylogenetic tree reconstruction for the N- and C-terminal parts of these 'COG fusion' events are provided to distinguish single and multiple fusion events from cases of gene fission, pseudogenes and other false positives. Finally, gene fusion events with matches to known structures of heterodimers in the Protein Data Bank (PDB) are identified and may be visualized. FusionDB is fully searchable with access to sequence and alignment data at all levels. A number of different scores are provided to easily differentiate 'real' from 'questionable' cases, especially when larger database searches are performed. FusionDB is cross-linked with the 'Phylogenomic Display of Bacterial Genes' (PhydBac) online web server. Together, these servers provide the complete set of information required for in-depth analysis of non-homology-based gene function attribution. PMID:14681411

  4. FusionDB: a database for in-depth analysis of prokaryotic gene fusion events.

    PubMed

    Suhre, Karsten; Claverie, Jean-Michel

    2004-01-01

    FusionDB (http://igs-server.cnrs-mrs.fr/FusionDB/) constitutes a resource dedicated to in-depth analysis of bacterial and archaeal gene fusion events. Such events can provide the 'Rosetta stone' in the search for potential protein-protein interactions, as well as metabolic and regulatory networks. However, the false positive rate of this approach may be quite high, prompting a detailed scrutiny of putative gene fusion events. FusionDB readily provides much of the information required for that task. Moreover, FusionDB extends the notion of gene fusion from that of a single gene to that of a family of genes by assembling pairs of genes from different genomes that belong to the same Cluster of Orthologous Groups (COG). Multiple sequence alignments and phylogenetic tree reconstruction for the N- and C-terminal parts of these 'COG fusion' events are provided to distinguish single and multiple fusion events from cases of gene fission, pseudogenes and other false positives. Finally, gene fusion events with matches to known structures of heterodimers in the Protein Data Bank (PDB) are identified and may be visualized. FusionDB is fully searchable with access to sequence and alignment data at all levels. A number of different scores are provided to easily differentiate 'real' from 'questionable' cases, especially when larger database searches are performed. FusionDB is cross-linked with the 'Phylogenomic Display of Bacterial Genes' (PhydBac) online web server. Together, these servers provide the complete set of information required for in-depth analysis of non-homology-based gene function attribution.

  5. MACHO Project Analysis of the Galactic Bulge Microlensing Events with Clump Giants as Sources

    SciTech Connect

    Popowski, P; Vandehei, T; Griest, K; Alcock, C; Alves, D R; Allsman, R A; Axelrod, T S; Becker, A; Bennett, D P; Cook, K H; Freeman, K C; Geha, M; Lehner, M J; Marshall, S L; Minniti, D; Nelson, C; Peterson, B A; Quinn, P J; Stubbs, C W; Sutherland, W; Welch, D L

    2002-03-06

    We present preliminary results of the analysis of 5 years of MACHO data on the Galactic bulge microlensing events with clump giants as sources. This class of events allows one to obtain robust conclusions because relatively bright clump stars are not strongly affected by blending. We discuss: (1) the selection of 'giant' events, (2) the distribution of event durations, (3) the anomalous character of event durations and optical depth in the MACHO field 104 centered on (l,b) = (3°.1, -3°.0). We report the preliminary average optical depth of τ = (2.0 ± 0.4) × 10⁻⁶ (internal) at (l,b) = (3°.9, -3°.8), and present a map of the spatial distribution of the optical depth. When field 104 is removed from the sample, the optical depth drops to τ = (1.4 ± 0.3) × 10⁻⁶, which is in excellent agreement with infrared-based models of the central Galactic region.

  6. Accuracy analysis of measurements on a stable power-law distributed series of events

    NASA Astrophysics Data System (ADS)

    Matthews, J. O.; Hopcraft, K. I.; Jakeman, E.; Siviour, G. B.

    2006-11-01

    We investigate how finite measurement time limits the accuracy with which the parameters of a stably distributed random series of events can be determined. The model process is generated by timing the emigration of individuals from a population that is subject to deaths and a particular choice of multiple immigration events. This leads to a scale-free discrete random process where customary measures, such as mean value and variance, do not exist. However, converting the number of events occurring in fixed time intervals to a 1-bit 'clipped' process allows the construction of well-behaved statistics that still retain vestiges of the original power-law and fluctuation properties. These statistics include the clipped mean and correlation function, from measurements of which both the power-law index of the distribution of events and the time constant of its fluctuations can be deduced. We report here a theoretical analysis of the accuracy of measurements of the mean of the clipped process. This indicates that, for a fixed experiment time, the error on measurements of the sample mean is minimized by an optimum choice of the number of samples. It is shown furthermore that this choice is sensitive to the power-law index and that the approach to Poisson statistics is dominated by rare events or 'outliers'. Our results are supported by numerical simulation.
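    The 1-bit clipping construction can be sketched as follows. A Poisson count series stands in for the paper's scale-free multiple-immigration process (which is not reproduced here), so the sketch only illustrates the clipping step and the resulting well-behaved clipped mean, which for Poisson counts should approach 1 - exp(-λ):

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in event stream: Poisson counts of events per fixed time interval.
lam = 0.7
counts = rng.poisson(lam, size=200_000)

# 1-bit "clipping": an interval registers 1 if at least one event occurred,
# 0 otherwise.  The clipped process has finite moments even when the raw
# count distribution does not.
clipped = (counts > 0).astype(int)

# Sample clipped mean vs. the Poisson prediction 1 - exp(-lam).
clipped_mean = clipped.mean()
print(round(clipped_mean, 3), round(1 - np.exp(-lam), 3))
```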

  7. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    PubMed

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events.

  8. Final Report for Dynamic Models for Causal Analysis of Panel Data. Dynamic Analysis of Event Histories. Part III, Chapter 1.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; Hannan, Michael T.

    The document, part of a series of chapters described in SO 011 759, examines sociological research methods for the study of change. The advantages and procedures for dynamic analysis of event-history data (data giving the number, timing, and sequence of changes in a categorical dependent variable) are considered. The authors argue for grounding…

  9. Using Simple Statistical Analysis of Historical Data to Understand Wind Ramp Events

    SciTech Connect

    Kamath, C

    2010-01-29

    As renewable resources start providing an increasingly larger percentage of our energy needs, we need to improve our understanding of these intermittent resources so we can manage them better. In the case of wind resources, large unscheduled changes in the energy output, called ramp events, make it challenging to keep the load and the generation balanced. In this report, we show that simple statistical analysis of the historical data on wind energy generation can provide insights into these ramp events. In particular, this analysis can help answer questions such as the time period during the day when these events are likely to occur, the relative severity of positive and negative ramps, and the frequency of their occurrence. As there are several ways in which ramp events can be defined and counted, we also conduct a detailed study comparing different options. Our results indicate that the statistics are relatively insensitive to these choices, but depend on utility-specific factors, such as the magnitude of the ramp and the time interval over which this change occurs. These factors reflect the challenges faced by schedulers and operators in keeping the load and generation balanced and can change over the years. We conduct our analysis using data from wind farms in the Tehachapi Pass region in Southern California and the Columbia Basin region in Northern Oregon; while the results for other regions are likely to be different, the report describes the benefits of conducting simple statistical analysis on wind generation data and the insights that can be gained through such analysis.
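    A minimal version of the ramp-counting step, run on synthetic generation data; the magnitude and window parameters are exactly the utility-specific choices the report discusses, and the values below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly wind-farm output (MW); a random walk clipped to
# plant limits stands in for real generation data.
power = np.clip(np.cumsum(rng.normal(0, 8, 500)) + 100, 0, 200)

def count_ramps(p, window=4, threshold=30.0):
    """Count positive/negative ramps: |change over `window` steps| >= threshold.

    Both the ramp magnitude and the time window are utility-specific choices.
    """
    delta = p[window:] - p[:-window]
    up = int(np.sum(delta >= threshold))
    down = int(np.sum(delta <= -threshold))
    return up, down

up, down = count_ramps(power)
print(up, down)
```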

  10. The association of hypertriglyceridemia with cardiovascular events and pancreatitis: a systematic review and meta-analysis

    PubMed Central

    2012-01-01

    Background Hypertriglyceridemia may be associated with important complications. The aim of this study is to estimate the magnitude of association and quality of supporting evidence linking hypertriglyceridemia to cardiovascular events and pancreatitis. Methods We conducted a systematic review of multiple electronic bibliographic databases and subsequent meta-analysis using a random effects model. Studies eligible for this review followed patients longitudinally and evaluated quantitatively the association of fasting hypertriglyceridemia with the outcomes of interest. Reviewers working independently and in duplicate reviewed studies and extracted data. Results 35 studies provided data sufficient for meta-analysis. The quality of these observational studies was moderate to low with fair level of multivariable adjustments and adequate exposure and outcome ascertainment. Fasting hypertriglyceridemia was significantly associated with cardiovascular death (odds ratios (OR) 1.80; 95% confidence interval (CI) 1.31-2.49), cardiovascular events (OR, 1.37; 95% CI, 1.23-1.53), myocardial infarction (OR, 1.31; 95% CI, 1.15-1.49), and pancreatitis (OR, 3.96; 95% CI, 1.27-12.34, in one study only). The association with all-cause mortality was not statistically significant. Conclusions The current evidence suggests that fasting hypertriglyceridemia is associated with increased risk of cardiovascular death, MI, cardiovascular events, and possibly acute pancreatitis. Précis: hypertriglyceridemia is associated with increased risk of cardiovascular death, MI, cardiovascular events, and possibly acute pancreatitis PMID:22463676
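    The random-effects pooling used in meta-analyses like this one can be sketched with DerSimonian-Laird estimation on hypothetical per-study log odds ratios and variances (made-up numbers, not the review's 35 studies):

```python
import math

# Hypothetical per-study log odds ratios and their variances.
log_or = [0.7, 0.1, 0.9, 0.2, 0.5]
var    = [0.04, 0.02, 0.09, 0.03, 0.05]

w = [1 / v for v in var]                       # fixed-effect (inverse-variance) weights
ybar = sum(wi * yi for wi, yi in zip(w, log_or)) / sum(w)
q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, log_or))  # Cochran's Q
df = len(log_or) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)                  # DerSimonian-Laird between-study variance

w_star = [1 / (v + tau2) for v in var]         # random-effects weights
pooled = sum(wi * yi for wi, yi in zip(w_star, log_or)) / sum(w_star)
se = math.sqrt(1 / sum(w_star))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se

print(f"pooled OR = {math.exp(pooled):.2f} (95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```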

  11. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    NASA Astrophysics Data System (ADS)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events and droughts or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal forecast prediction systems such as the UK MetOffice GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in the world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations.
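    A toy version of the two-ensemble comparison: given "as observed" and "world that might have been" ensembles, the change in probability of exceeding the observed event magnitude is commonly summarized by the probability ratio and the fraction of attributable risk. The Gaussian ensembles below are placeholders for real regional-model output:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-ins for two large model ensembles of, say, seasonal-maximum
# temperature (degrees C): factual ("as observed" climate) and
# counterfactual (anthropogenic forcing removed).
factual = rng.normal(31.0, 1.5, 10_000)
counterfactual = rng.normal(30.0, 1.5, 10_000)

threshold = 33.0  # the observed extreme event magnitude

p1 = np.mean(factual >= threshold)         # exceedance probability with forcing
p0 = np.mean(counterfactual >= threshold)  # exceedance probability without

pr = p1 / p0          # probability ratio
far = 1.0 - p0 / p1   # fraction of attributable risk
print(f"PR = {pr:.2f}, FAR = {far:.2f}")
```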

  12. Ontology-based combinatorial comparative analysis of adverse events associated with killed and live influenza vaccines.

    PubMed

    Sarntivijai, Sirarat; Xiang, Zuoshuang; Shedden, Kerby A; Markel, Howard; Omenn, Gilbert S; Athey, Brian D; He, Yongqun

    2012-01-01

    Vaccine adverse events (VAEs) are adverse bodily changes occurring after vaccination. Understanding the adverse event (AE) profiles is a crucial step to identify serious AEs. Two different types of seasonal influenza vaccines have been used on the market: trivalent (killed) inactivated influenza vaccine (TIV) and trivalent live attenuated influenza vaccine (LAIV). Different adverse event profiles induced by these two groups of seasonal influenza vaccines were studied based on the data drawn from the CDC Vaccine Adverse Event Report System (VAERS). Extracted from VAERS were 37,621 AE reports for four TIVs (Afluria, Fluarix, Fluvirin, and Fluzone) and 3,707 AE reports for the only LAIV (FluMist). The AE report data were analyzed by a novel combinatorial, ontology-based detection of AE method (CODAE). CODAE detects AEs using Proportional Reporting Ratio (PRR), Chi-square significance test, and base level filtration, and groups identified AEs by ontology-based hierarchical classification. In total, 48 TIV-enriched and 68 LAIV-enriched AEs were identified (PRR>2, Chi-square score >4, and the number of cases >0.2% of total reports). These AE terms were classified using the Ontology of Adverse Events (OAE), MedDRA, and SNOMED-CT. The OAE method provided better classification results than the two other methods. Thirteen out of 48 TIV-enriched AEs were related to neurological and muscular processing such as paralysis, movement disorders, and muscular weakness. In contrast, 15 out of 68 LAIV-enriched AEs were associated with inflammatory response and respiratory system disorders. There was evidence of two severe adverse events (Guillain-Barré Syndrome and paralysis) in TIV reports. Although these severe adverse events were at low incidence rate, they were found to be more significantly enriched in TIV-vaccinated patients than LAIV-vaccinated patients. Therefore, our novel combinatorial bioinformatics analysis discovered that LAIV had lower chance of inducing these two
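    The PRR and chi-square screening step of an approach like CODAE can be sketched on a hypothetical 2x2 table; the counts below are made up (not actual VAERS data), and the ontology-based classification stage is not reproduced:

```python
# Hypothetical 2x2 report counts for one adverse event (AE) term:
#                  this AE   all other AEs
# vaccine X:         a             b
# comparator set:    c             d
a, b = 40, 9960
c, d = 55, 49945

# Proportional Reporting Ratio: how over-represented the AE is among
# reports for vaccine X relative to the comparator reports.
prr = (a / (a + b)) / (c / (c + d))

# Chi-square statistic (with Yates continuity correction) on the same table.
n = a + b + c + d
cells = [
    (a, (a + b) * (a + c) / n),
    (b, (a + b) * (b + d) / n),
    (c, (c + d) * (a + c) / n),
    (d, (c + d) * (b + d) / n),
]
chi2 = sum((abs(obs - exp) - 0.5) ** 2 / exp for obs, exp in cells)

# Thresholds quoted in the abstract: PRR > 2 and chi-square score > 4.
flagged = prr > 2 and chi2 > 4
print(f"PRR = {prr:.2f}, chi2 = {chi2:.1f}, flagged = {flagged}")
```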

  13. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617
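    The confounding problem that motivates IV analysis can be illustrated with a deliberately simplified example: uncensored continuous outcomes and the classical Wald (ratio) estimator rather than the paper's Bayesian two-stage model. All coefficients below are made up:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Simulated data with an unobserved confounder u.
z = rng.normal(size=n)                        # instrument: affects x, not y directly
u = rng.normal(size=n)                        # unobserved confounder
x = 0.8 * z + u + rng.normal(size=n)          # exposure
y = 2.0 * x + 3.0 * u + rng.normal(size=n)    # outcome; true causal effect = 2.0

# The naive regression slope cov(x, y)/var(x) is biased upward by u ...
naive = np.cov(x, y)[0, 1] / np.var(x)
# ... while the IV (Wald) estimator cov(z, y)/cov(z, x) recovers the effect,
# because z is independent of u.
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(f"naive = {naive:.2f}, IV = {iv:.2f}")
```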

  14. Accounting for unintended binding events in the analysis of quartz crystal microbalance kinetic data.

    PubMed

    Heller, Gabriella T; Zwang, Theodore J; Sarapata, Elizabeth A; Haber, Michael A; Sazinsky, Matthew H; Radunskaya, Ami E; Johal, Malkiat S

    2014-05-01

    Previous methods for analyzing protein-ligand binding events using the quartz crystal microbalance with dissipation monitoring (QCM-D) fail to account for unintended binding that inevitably occurs during surface measurements and obscures kinetic information. In this article, we present a system of differential equations that accounts for both reversible and irreversible unintended interactions. This model is tested on three protein-ligand systems, each of which has different features, to establish the feasibility of using the QCM-D for protein binding analysis. Based on this analysis, we were able to obtain kinetic information for the intended interaction that is consistent with those obtained in literature via bulk-phase methods. In the appendix, we include a method for decoupling these unintended interactions from the intended binding events and extracting relevant affinity information.
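    A minimal sketch of the kind of ODE system described (not the authors' exact equations), with made-up rate constants and a simple Euler integrator: intended reversible binding plus unintended reversible and irreversible channels, all competing for free surface sites:

```python
# All rate constants and the concentration below are hypothetical.
k_on, k_off = 1.0e5, 1.0e-2      # intended reversible binding, M^-1 s^-1 and s^-1
ks_on, ks_off = 2.0e4, 5.0e-3    # unintended reversible binding
k_irr = 1.0e3                    # unintended irreversible binding, M^-1 s^-1
conc = 1.0e-6                    # bulk ligand concentration, M

def step(theta, theta_u, theta_i, dt):
    """One forward-Euler step for the three surface-coverage fractions."""
    free = 1.0 - theta - theta_u - theta_i   # unoccupied surface fraction
    d_theta = k_on * conc * free - k_off * theta       # intended, reversible
    d_u = ks_on * conc * free - ks_off * theta_u       # unintended, reversible
    d_i = k_irr * conc * free                          # unintended, irreversible
    return theta + dt * d_theta, theta_u + dt * d_u, theta_i + dt * d_i

theta = theta_u = theta_i = 0.0
dt, t_end = 0.01, 600.0
for _ in range(int(t_end / dt)):
    theta, theta_u, theta_i = step(theta, theta_u, theta_i, dt)

print(f"coverage: intended={theta:.3f}, unintended={theta_u + theta_i:.3f}")
```

Fitting such a model to QCM-D traces, rather than a single-channel Langmuir model, is what lets the unintended contributions be separated from the intended kinetics.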

  15. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2016-06-01

    Mine systems such as ventilation systems, strata support systems, and flame-proof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, temperature, etc., and safety improvement of such systems is preferably done during the planning and design stage. However, the existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine system design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system either at component or system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
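    The coupled ET/FT idea can be sketched with made-up probabilities (not the paper's methane case-study numbers): a fault tree yields a barrier failure probability, an event tree chains it with the initiating event, and redundancy allocation lowers the resulting top hazard probability:

```python
# Independence of basic events is assumed throughout this sketch.

def and_gate(*p):
    """All inputs must fail."""
    out = 1.0
    for x in p:
        out *= x
    return out

def or_gate(*p):
    """At least one input fails."""
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

# Fault tree: ventilation fails if the main fan fails OR (backup fan
# AND its power supply both fail).  Probabilities are hypothetical.
p_vent_fail = or_gate(1e-3, and_gate(5e-2, 2e-2))

# Event tree: initiating event (methane release) followed by failure of
# the ventilation barrier and of an ignition-control barrier.
p_initiator = 1e-2
p_ignition_ctrl_fail = 5e-3
p_explosion = p_initiator * p_vent_fail * p_ignition_ctrl_fail

# Redundancy allocation at system level: a second, independent ventilation
# train turns the barrier into an AND of two trains, lowering the top hazard.
p_vent_fail_redundant = and_gate(p_vent_fail, p_vent_fail)
p_explosion_redundant = p_initiator * p_vent_fail_redundant * p_ignition_ctrl_fail

print(p_explosion, p_explosion_redundant)
```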

  16. Identification and Analysis of Storm Tracks Associated with Extreme Flood Events in Southeast and South Brazil

    NASA Astrophysics Data System (ADS)

    Lima, Carlos; Lopes, Camila

    2015-04-01

    Flood is the main natural disaster in Brazil, affecting practically all regions of the country and causing substantial economic damage and loss of life. In traditional hydrology, the study of floods is focused on a frequency analysis of the extreme events and on the fit of statistical models to define flood quantiles associated with pre-specified return periods or exceedance probabilities. The basic assumptions are randomness and temporal stationarity of the streamflow data. In this paper we seek to advance the traditional flood frequency studies by using the ideas developed in the area of flood hydroclimatology, which is defined as the study of climate in the flood framework, i.e., the understanding of long term changes in the frequency, magnitude, duration, location and seasonality of floods as driven by the interaction of regional and global patterns of the ocean and atmospheric circulation. That being said, flood events are not treated as random and stationary but as resulting from a causal chain, where exceptional floods in water basins of different sizes are related to large scale anomalies in the atmospheric and ocean circulation patterns. Hence, such studies enrich the classical assumption of stationary flood hazard adopted in most flood frequency studies through a formal consideration of the physical mechanisms responsible for the generation of extreme floods, which implies recognizing the natural climate variability due to persistent and oscillatory regimes (e.g. ENSO, NAO, PDO) in many temporal scales (interannual, decadal, etc), and climate fluctuations in response to anthropogenic changes in the atmosphere, soil use and vegetation cover. Under this framework and based on streamflow gauge and reanalysis data, we identify and analyze here the storm tracks that preceded extreme events of floods in key flood-prone regions of the country (e.g. Parana and Rio Doce River basins) with such events defined based on the magnitude, duration and volume of the

  17. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    PubMed Central

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H.; Madigan, David; Ryan, Patrick; Friedman, Carol

    2013-01-01

    Introduction Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis. PMID:22549283

  18. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the

  19. 77 FR 5857 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...: On November 2, 2011 (76 FR 67764), the U.S. Nuclear Regulatory Commission (NRC) published for public comment Draft NUREG, ``Common- Cause Failure Analysis in Event and Condition Assessment: Guidance...

  20. 76 FR 67764 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...-xxxx, Revision 0, ``Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and...-0254 in the subject line of your comments. For additional instructions on submitting comments...

  1. Analysis of the Impact of Climate Change on Extreme Hydrological Events in California

    NASA Astrophysics Data System (ADS)

    Ashraf Vaghefi, Saeid; Abbaspour, Karim C.

    2016-04-01

    Estimating the magnitude and occurrence frequency of extreme hydrological events is required for taking preventive remedial actions against the impact of climate change on the management of water resources. Examples include: characterization of extreme rainfall events to predict urban runoff, determination of river flows, and the likely severity of drought events during the design life of a water project. In recent years California has experienced its most severe drought in recorded history, causing water stress, economic loss, and an increase in wildfires. In this paper we describe the development of a Climate Change Toolkit (CCT) and demonstrate its use in the analysis of dry and wet periods in California for the years 2020-2050, comparing the results with the historic period 1975-2005. CCT provides four modules to: i) manage big databases such as those of Global Climate Models (GCMs), ii) perform bias correction using observed local climate data, iii) interpolate gridded climate data to finer resolution, and iv) calculate continuous dry- and wet-day periods based on rainfall, temperature, and soil moisture for analysis of drought and flooding risks. We used bias-corrected meteorological data of five GCMs for the extreme CO2 emission scenario RCP8.5 for California to analyze the trend of extreme hydrological events. The findings indicate that the frequency of dry periods will increase in the central and southern parts of California. The assessment of the number of wet days and the frequency of wet periods suggests an increased risk of flooding in the northern and north-western parts of California, especially in the coastal strip. Keywords: Climate Change Toolkit (CCT), Extreme Hydrological Events, California
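The continuous dry-day counting of module iv can be illustrated with a short sketch; the 1 mm/day dry threshold and the series below are hypothetical, not CCT's actual configuration:

```python
# Illustrative sketch (not the CCT code): counting continuous dry-day periods
# in a daily rainfall series, using a hypothetical 1 mm/day dry threshold.

def dry_spells(rainfall_mm, threshold=1.0):
    """Return the lengths of all runs of consecutive dry days."""
    spells, run = [], 0
    for r in rainfall_mm:
        if r < threshold:
            run += 1
        elif run:
            spells.append(run)
            run = 0
    if run:
        spells.append(run)
    return spells

series = [0.0, 0.2, 5.1, 0.0, 0.0, 0.4, 12.3, 0.0]
print(dry_spells(series))  # [2, 3, 1]
```

The same run-length logic applies to wet-day periods by inverting the threshold test.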

  2. Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes

    NASA Astrophysics Data System (ADS)

    Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana

    2015-04-01

    The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, a usual tool in Statistical Process Control (SPC) but unusual in climate studies. Here, series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to determine whether or not there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme and normal observed rainfall days. The autocorrelation among maximum precipitation is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate the attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes along the rest of the years under study can then be monitored by such attributes control charts. The results of the application of this methodology show evidence of a change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
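As a simplified illustration of an attributes control chart for the annual fraction of extreme rainfall days, the sketch below computes classical 3-sigma binomial (p-chart) limits; unlike the paper's New Binomial Markov Extended Process, it ignores autocorrelation, and the baseline fraction is hypothetical:

```python
import math

# Sketch of a classical p-chart for the annual fraction of extreme rainfall
# days. The paper's New Binomial Markov Extended Process additionally models
# autocorrelation between extreme days, which this plain binomial chart ignores.

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a binomial fraction with subgroup size n."""
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    lcl = max(0.0, p_bar - 3.0 * sigma)
    ucl = min(1.0, p_bar + 3.0 * sigma)
    return lcl, ucl

# Hypothetical baseline: 5% extreme days, 365-day years.
lcl, ucl = p_chart_limits(0.05, 365)
print(round(lcl, 4), round(ucl, 4))
```

An annual fraction falling outside these limits would signal a change in the extreme rainfall model.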

  3. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    NASA Astrophysics Data System (ADS)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents, both for extreme and ordinary flood events, is a fundamental step in understanding the process, but also in driving proper decisions on protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with its associated sediment-graph, that on the 14th of October, 2014 caused the formation of a small debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to notice that many of the equations developed for the computation of bedload transport capacity, such as those of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events heavily affecting the river-bed armour. These formulas do not provide reliable results when applied to events, like the one under analysis, not far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations to assess the sediment transport capacity overestimate the total volume of the event by several orders of magnitude. As a consequence, the following experimental bedload transport equation has been applied (D'Agostino and Lenzi, 1999), which is valid for ordinary flood events (q: unit water discharge; qc: unit discharge of bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs = 0.04 (q - qc) S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above-cited equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
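The D'Agostino and Lenzi (1999) relation quoted in the abstract transcribes directly; the discharge and slope values below are hypothetical illustrations, not the Rio Vanti data:

```python
# Direct transcription of the quoted relation qs = 0.04 * (q - qc) * S**(3/2),
# with all discharges per unit channel width. Values below are hypothetical.

def unit_bedload_rate(q, qc, slope):
    """Unit bedload rate qs; zero below the initiation discharge qc."""
    if q <= qc:
        return 0.0
    return 0.04 * (q - qc) * slope ** 1.5

# Hypothetical: q = 2.0 m2/s, qc = 0.5 m2/s, thalweg slope S = 0.1
print(unit_bedload_rate(2.0, 0.5, 0.1))
```

Integrating qs over the reconstructed hydrograph yields the event's total bedload volume.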

  4. Cardiovascular Events Following Smoke-Free Legislations: An Updated Systematic Review and Meta-Analysis

    PubMed Central

    Jones, Miranda R.; Barnoya, Joaquin; Stranges, Saverio; Losonczy, Lia; Navas-Acien, Ana

    2014-01-01

    Background Legislations banning smoking in indoor public places and workplaces are being implemented worldwide to protect the population from secondhand smoke exposure. Several studies have reported reductions in hospitalizations for acute coronary events following the enactment of smoke-free laws. Objective We set out to conduct a systematic review and meta-analysis of epidemiologic studies examining how legislations that ban smoking in indoor public places impact the risk of acute coronary events. Methods We searched MEDLINE, EMBASE, and relevant bibliographies including previous systematic reviews for studies that evaluated changes in acute coronary events, following implementation of smoke-free legislations. Studies were identified through December 2013. We pooled relative risk (RR) estimates for acute coronary events comparing post- vs. pre-legislation using inverse-variance weighted random-effects models. Results Thirty-one studies providing estimates for 47 locations were included. The legislations were implemented between 1991 and 2010. Following the enactment of smoke-free legislations, there was a 12% reduction in hospitalizations for acute coronary events (pooled RR: 0.88, 95% CI: 0.85–0.90). Reductions were 14% in locations that implemented comprehensive legislations compared to an 8% reduction in locations that only had partial restrictions. In locations with reductions in smoking prevalence post-legislation above the mean (2.1% reduction) there was a 14% reduction in events compared to 10% in locations below the mean. The RRs for acute coronary events associated with enacting smoke-free legislation were 0.87 vs. 0.89 in locations with smoking prevalence pre-legislation above and below the mean (23.1%), and 0.87 vs. 0.89 in studies from the Americas vs. other regions. Conclusion The implementation of smoke-free legislations was related to reductions in acute coronary event hospitalizations in most populations evaluated. Benefits are greater
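The pooling step named in the Methods (inverse-variance weighted random-effects) can be sketched as a DerSimonian-Laird estimator on the log-RR scale; the study RRs and confidence intervals below are hypothetical, not the review's data:

```python
import math

# Sketch of inverse-variance random-effects pooling of relative risks
# (DerSimonian-Laird), the approach named in the abstract. The three
# study RRs and 95% CIs below are hypothetical.

def pool_random_effects(rrs, ci_los, ci_his):
    y = [math.log(r) for r in rrs]                        # log relative risks
    se = [(math.log(h) - math.log(l)) / (2 * 1.96)        # SE from the 95% CI
          for l, h in zip(ci_los, ci_his)]
    w = [1.0 / s ** 2 for s in se]                        # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)               # between-study variance
    wr = [1.0 / (s ** 2 + tau2) for s in se]              # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    return math.exp(pooled)

rr = pool_random_effects([0.85, 0.90, 0.92],
                         [0.78, 0.82, 0.85],
                         [0.93, 0.99, 0.99])
print(round(rr, 3))
```

With identical inputs the estimator reduces to the common RR, since the between-study variance collapses to zero.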

  5. Detailed chronological analysis of microevolution events in herds infected persistently by Mycobacterium bovis.

    PubMed

    Navarro, Yurena; Romero, Beatriz; Bouza, Emilio; Domínguez, Lucas; de Juan, Lucía; García-de-Viedma, Darío

    2016-02-01

    Various studies have analyzed microevolution events leading to the emergence of clonal variants in human infections by Mycobacterium tuberculosis. However, microevolution events in animal tuberculosis remain unknown. We performed a systematic analysis of microevolution events in eight herds that were chronically infected by Mycobacterium bovis for more than 12 months. We analyzed 88 animals using a systematic screening procedure based on discriminatory MIRU-VNTR genotyping at sequential time points during the infection. Microevolution was detected in half of the herds studied. Emergence of clonal variants did not require long infection periods or a high number of infected animals in the herd. Microevolution was not restricted to strains from specific spoligotypes, and the subtle variations detected involved different MIRU loci. The genetic locations of the subtle genotypic variations recorded in the clonal variants indicated potential functional significance. This finding was consistent with the dynamics of some clonal variants, which outcompeted the original strains, suggesting an advantageous phenotype. Our data constitute a first step in defining the thresholds of variability to be tolerated in molecular epidemiology studies of M. bovis. We could therefore ensure that related clonal variants emerging as a result of microevolution events are not going to be misinterpreted as unrelated isolates.

  6. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data

    PubMed Central

    2014-01-01

    Background Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. Methods This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Results Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. Conclusions The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest. PMID:25209121
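The Weibull assumption for individual patient-level time-to-event data with right-censoring can be sketched as a direct maximum-likelihood fit. This is a much-reduced, single-arm version of the paper's joint network model, and the healing times below are hypothetical:

```python
import math
from scipy.optimize import minimize

# Sketch: maximum-likelihood fit of a Weibull time-to-healing distribution to
# individual patient times with right-censoring. Hypothetical data; the paper
# embeds this distributional assumption in a full network meta-analysis model.

def weibull_neg_loglik(params, times, events):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return float("inf")
    ll = 0.0
    for t, d in zip(times, events):
        z = t / scale
        if d:   # observed healing: log-density contribution
            ll += math.log(shape / scale) + (shape - 1) * math.log(z) - z ** shape
        else:   # right-censored: log-survival contribution
            ll += -z ** shape
    return -ll

# Hypothetical healing times (weeks); event=0 marks a censored observation.
times = [3.0, 5.0, 6.0, 8.0, 10.0, 12.0]
events = [1, 1, 1, 1, 0, 0]
fit = minimize(weibull_neg_loglik, x0=[1.0, 8.0], args=(times, events),
               method="Nelder-Mead")
shape_hat, scale_hat = fit.x
print(round(shape_hat, 2), round(scale_hat, 2))
```

With only aggregate event counts, the shape parameter would be poorly identified, which is the flexibility argument the Conclusions make for individual patient-level data.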

  7. Urbanization and Fertility: An Event-History Analysis of Coastal Ghana

    PubMed Central

    WHITE, MICHAEL J.; MUHIDIN, SALUT; ANDRZEJEWSKI, CATHERINE; TAGOE, EVA; KNIGHT, RODNEY; REED, HOLLY

    2008-01-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field. PMID:19110898

  8. Characterization of Cell Lysis Events on a Microfluidic Device for High-Throughput Single Cell Analysis

    PubMed Central

    Hargis, Amy D; Alarie, JP; Ramsey, J.M.

    2012-01-01

    A microfluidic device capable of rapidly analyzing cells in a high-throughput fashion using electrical cell lysis is further characterized. In the experiments performed, cell lysis events were studied using an EMCCD camera with high frame rate (>100 fps) data collection. It was found that, with this microfluidic design, the path that a cell follows through the electric field affects the amount of lysate injected into the analysis channel. Elimination of variable flow paths through the electric field was achieved by coating the analysis channel with a polyamine compound to reverse the electroosmotic flow (EOF). EOF reversal forced the cells to take the same path through the electric field. The improved control of the cell trajectory reduces device-imposed bias on the analysis and maximizes the amount of lysate injected into the analysis channel for each cell, resulting in improved analyte detection capabilities. PMID:22025127

  9. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    PubMed

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive or intensified passive surveillance from 2007 to 2012, descriptive analyses, national and regional rates of vaccine-associated neurotropic and neurological autoimmune disease, and reporting rate ratios with their respective 95% confidence intervals were calculated for first-time vaccinees, stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND analyzed to date. Rates are similar to other recent studies, but in this study the age group from 5 to 9 years had the highest risk. As neurological adverse events generally have a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of risk of infection by yellow fever virus.

  10. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis of hydrological data has been developing recently. In particular, the copula model has been used as an effective method, as it places no limitation on the choice of marginal distributions. Time-series rainfall data can be partitioned into rainfall events using an inter-event time definition, and each rainfall event has a rainfall depth and duration. In addition, changes in rainfall depth due to climate change have been studied recently. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed and their performance investigated in many studies. In the current study, bivariate frequency analysis was performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data to consider the effect of climate change. The parameter of the copula model is estimated by the inference function for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used as marginal distributions. As a result, the level curve of the copula model is obtained, and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.
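A Gumbel copula, a common Archimedean choice for depth-duration pairs, can be sketched as follows. For brevity the dependence parameter is estimated here by Kendall's-tau inversion rather than the IFM method used in the study, and the tau value is hypothetical:

```python
import math

# Sketch of a bivariate Gumbel (Archimedean) copula for rainfall depth and
# duration. Dependence parameter via Kendall's-tau inversion, theta = 1/(1-tau),
# instead of the study's IFM estimation; tau below is hypothetical.

def gumbel_copula(u, v, theta):
    """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))"""
    a = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-a ** (1.0 / theta))

def theta_from_tau(tau):
    return 1.0 / (1.0 - tau)

theta = theta_from_tau(0.4)   # hypothetical depth-duration concordance
print(round(gumbel_copula(0.9, 0.8, theta), 4))
```

The level curve mentioned in the abstract is the set of (u, v) pairs with constant C(u, v), traced out once theta and the marginals are fixed.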

  11. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    NASA Astrophysics Data System (ADS)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
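The conversion from raw ETA probabilities to linguistic risk classes can be sketched as a threshold lookup; the class boundaries and branch probabilities below are hypothetical order-of-magnitude values, not Meloy's published scale:

```python
# Sketch of converting raw annual individual-risk probabilities from an event
# tree into linguistic classes. The boundaries below are hypothetical
# order-of-magnitude bands, not the established scale used in the study.

RISK_CLASSES = [
    (1e-3, "VERY HIGH"),
    (1e-4, "HIGH"),
    (1e-5, "MODERATE"),
    (1e-6, "LOW"),
    (0.0,  "VERY LOW"),
]

def classify_risk(annual_probability):
    for threshold, label in RISK_CLASSES:
        if annual_probability >= threshold:
            return label
    return "VERY LOW"

# Hypothetical branch: P(collapse) * P(ATPF | collapse) * P(exposure)
risk = 0.05 * 0.2 * 0.05   # about 5e-4
print(classify_risk(risk))  # HIGH
```

Mapping such classes over locations around the volcano is what produces the semi-quantitative risk maps described above.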

  12. Ensemble-based analysis of extreme precipitation events from 2007-2011

    NASA Astrophysics Data System (ADS)

    Lynch, Samantha

    From 2007 to 2011, 22 widespread, multiday rain events occurred across the United States. This study makes use of the European Centre for Medium-Range Weather Forecasts (ECMWF), the National Centers for Environmental Prediction (NCEP), and the United Kingdom Met Office (UKMET) ensemble prediction systems (EPS) in order to assess their forecast skill for these 22 widespread precipitation events. Overall, the ECMWF had a skillful forecast for almost every event, with the exception of the 25-30 June 2007 event, a mesoscale convective vortex (MCV) over the southern plains of the United States. Additionally, the ECMWF EPS generally outperformed both the NCEP and UKMET EPS. To better evaluate the ECMWF, two widespread, multiday precipitation events were selected for closer examination: 29 April-4 May 2010 and 23-28 April 2011. The 29 April-4 May 2010 case study used ECMWF ensemble forecasts to explore the processes responsible for the development and maintenance of a multiday precipitation event that occurred in early May 2010, due to two successive quasi-stationary mesoscale convective systems. Locations in central Tennessee accumulated more than 483 millimeters (19 inches) of rain, and the city of Nashville experienced a historic flash flood. Differences between ensemble members that correctly predicted heavy precipitation and those that did not were examined in order to determine the processes that were favorable or detrimental to the system's development. Statistical analysis was used to determine how synoptic-scale flows were correlated with area-averaged precipitation. For this particular case, the distribution of precipitation was found to be closely related to the strength of an upper-level trough in the central United States and an associated surface cyclone, with a weaker trough and cyclone being associated with more precipitation in the area of interest. The 23-28 April 2011 case study also used ECMWF ensemble forecasts to explore the processes

  13. Transcriptome Bioinformatical Analysis of Vertebrate Stages of Schistosoma japonicum Reveals Alternative Splicing Events

    PubMed Central

    Wang, Xinye; Xu, Xindong; Lu, Xingyu; Zhang, Yuanbin; Pan, Weiqing

    2015-01-01

    Alternative splicing is a molecular process that contributes greatly to the diversification of the proteome and to gene function. Understanding the mechanisms of stage-specific alternative splicing can provide a better understanding of the development of eukaryotes and the functions of different genes. Schistosoma japonicum is an infectious blood-dwelling trematode with a complex lifecycle that causes the tropical disease schistosomiasis. In this study, we analyzed the transcriptome of Schistosoma japonicum to discover alternative splicing events in this parasite, by applying RNA-seq to cDNA libraries of adults and schistosomula. Results were validated by RT-PCR and sequencing. We found 11,623 alternative splicing events among 7,099 protein-encoding genes, and the average proportion of alternative splicing events per gene was 42.14%. We showed that exon skipping is the most common type of alternative splicing event, as found in higher eukaryotes, whereas intron retention is the least common. According to intron boundary analysis, the parasite possesses the same intron boundaries as other organisms, namely the classic “GT-AG” rule, although in alternatively spliced introns or exons this rule is less strict. We also detected alternative splicing events in genes encoding proteins with signal peptides and transmembrane helices, suggesting that alternative splicing could change the subcellular locations of specific gene products. Our results indicate that alternative splicing is prevalent in this parasitic worm, and that in this respect the worm is close to its hosts. The revealed secretome involved in alternative splicing implies a new perspective on the interaction between the parasite and its host. PMID:26407301

  14. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. Only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement, to segment moving objects. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of theft of object(s) from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.

  15. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables

    PubMed Central

    Yin, Yu; Yao, Dezhong

    2016-01-01

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an “event of relation” with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals. PMID:27389921
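The binomial test at the heart of CER can be sketched as follows; this is a simplified reading of the method, with hypothetical delay samples:

```python
from scipy.stats import binomtest

# Simplified sketch of the CER-style test: among n "events of relation",
# count how often the candidate cause precedes the effect with a non-zero
# delay, and test that count against the 50/50 split expected when the
# delay carries no directional information. Delays below are hypothetical.

delays = [2, 1, 3, 2, 0, 1, 2, 4, 1, 3, 2, 0]   # hypothetical delay samples
nonzero = sum(1 for d in delays if d > 0)
result = binomtest(nonzero, n=len(delays), p=0.5, alternative="greater")
print(nonzero, round(result.pvalue, 4))
```

Because the test operates only on event counts, no parametric model of the (possibly non-stationary) signals themselves is required, which is the method's central advantage.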
  17. The May 17, 2012 solar event: back-tracing analysis and flux reconstruction with PAMELA

    NASA Astrophysics Data System (ADS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bravar, U.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Christian, E. C.; De Donato, C.; de Nolfo, G. A.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A. M.; Karelin, A. V.; Koldashov, S. V.; Koldobskiy, S.; Krutkov, S. Y.; Kvashnin, A. N.; Lee, M.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A. G.; Menn, W.; Mergè, M.; Mikhailov, V. V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S. B.; Ryan, J. M.; Sarkar, R.; Scotti, V.; Simon, M.; Sparvoli, R.; Spillantini, P.; Stochaj, S.; Stozhkov, Y. I.; Vacchi, A.; Vannuccini, E.; Vasilyev, G. I.; Voronov, S. A.; Yurkin, Y. T.; Zampa, G.; Zampa, N.; Zverev, V. G.

    2016-02-01

    The PAMELA space experiment is providing the first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data from the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  18. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    PubMed

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  19. Final Report for Dynamic Models for Causal Analysis of Panel Data. Approaches to the Censoring Problem in Analysis of Event Histories. Part III, Chapter 2.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; Hannan, Michael T.

    The document, part of a series of chapters described in SO 011 759, considers the problem of censoring in the analysis of event-histories (data on dated events, including dates of change from one qualitative state to another). Censoring refers to the lack of information on events that occur before or after the period for which data are available.…

  20. Non-parametric frequency analysis of extreme values for integrated disaster management considering probable maximum events

    NASA Astrophysics Data System (ADS)

    Takara, K. T.

    2015-12-01

    This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with computer-intensive statistics (CIS) resampling such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than the design level for flood control. Traditional parametric frequency analysis of extreme values includes the following steps: Step 1: collecting and checking extreme-value data; Step 2: enumerating probability distributions that would fit the data well; Step 3: parameter estimation; Step 4: testing goodness of fit; Step 5: checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: selection of the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years; overestimation examples are also demonstrated. Bootstrap resampling can correct the bias of the NPM and can also quantify the estimation accuracy as the bootstrap standard error. The NPM thus avoids various difficulties encountered in the above-mentioned steps of the traditional PM. Probable maximum events are incorporated into the NPM as an upper bound of the hydrological variable; probable maximum precipitation (PMP) and probable maximum flood (PMF) can serve as a new parameter value combined with the NPM. An idea for how to incorporate these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea stimulates a more integrated approach by geoscientists and statisticians and encourages practitioners to consider the worst cases of disasters in their disaster-management planning and practices.
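The NPM's core loop (empirical T-year quantile plus bootstrap standard error and bias correction) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the Gumbel-distributed sample, function name, and bootstrap settings are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def np_quantile_bootstrap(sample, return_period, n_boot=2000, rng=rng):
    """T-year event as the empirical quantile at p = 1 - 1/T, with
    bootstrap standard error and a simple bootstrap bias correction."""
    p = 1.0 - 1.0 / return_period
    estimate = np.quantile(sample, p)
    boot = np.array([np.quantile(rng.choice(sample, size=sample.size), p)
                     for _ in range(n_boot)])
    se = boot.std(ddof=1)                      # bootstrap standard error
    bias_corrected = 2.0 * estimate - boot.mean()
    return estimate, se, bias_corrected

# Synthetic annual-maximum rainfall record (n > 100, as the method requires)
sample = rng.gumbel(loc=100.0, scale=30.0, size=120)
q100, se, q100_bc = np_quantile_bootstrap(sample, return_period=100)
```

The bootstrap standard error plays the role the jackknife plays in Step 5 of the parametric procedure, while Steps 2-4 and 6 are skipped entirely.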

  1. Sensitivity to censored-at-random assumption in the analysis of time-to-event endpoints.

    PubMed

    Lipkovich, Ilya; Ratitch, Bohdana; O'Kelly, Michael

    2016-05-01

    Over the past years, significant progress has been made in developing statistically rigorous methods to implement clinically interpretable sensitivity analyses for assumptions about the missingness mechanism in clinical trials for continuous and (to a lesser extent) for binary or categorical endpoints. Studies with time-to-event outcomes have received much less attention. However, such studies can be similarly challenged with respect to the robustness and integrity of primary analysis conclusions when a substantial number of subjects withdraw from treatment prematurely prior to experiencing an event of interest. We discuss how the methods that are widely used for primary analyses of time-to-event outcomes could be extended in a clinically meaningful and interpretable way to stress-test the assumption of ignorable censoring. We focus on a 'tipping point' approach, the objective of which is to postulate sensitivity parameters with a clear clinical interpretation and to identify a setting of these parameters unfavorable enough towards the experimental treatment to nullify a conclusion that was favorable to that treatment. Robustness of primary analysis results can then be assessed based on clinical plausibility of the scenario represented by the tipping point. We study several approaches for conducting such analyses based on multiple imputation using parametric, semi-parametric, and non-parametric imputation models and evaluate their operating characteristics via simulation. We argue that these methods are valuable tools for sensitivity analyses of time-to-event data and conclude that the method based on piecewise exponential imputation model of survival has some advantages over other methods studied here. Copyright © 2016 John Wiley & Sons, Ltd.
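The tipping-point idea can be sketched with a toy scan under an exponential (constant-hazard) working model; this is a simplified stand-in for the paper's piecewise exponential multiple-imputation procedure, and the simulated trial, function names, and delta grid are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def exp_rate(times, events):
    # MLE of a constant hazard under censoring: total events / total exposure
    return events.sum() / times.sum()

def tipping_point_scan(times, events, arm, deltas, n_imp=50, rng=rng):
    """Toy tipping-point scan: censored subjects on the experimental arm
    (arm == 1) get event times imputed from an exponential whose hazard is
    the arm's fitted hazard inflated by delta (unfavorable to the
    treatment); the imputation-averaged hazard ratio is recorded per delta."""
    lam1 = exp_rate(times[arm == 1], events[arm == 1])
    hrs = []
    for delta in deltas:
        hr = []
        for _ in range(n_imp):
            t, e = times.copy(), events.copy()
            idx = (arm == 1) & (events == 0)
            # Memoryless residual lifetime under the inflated hazard
            t[idx] += rng.exponential(1.0 / (delta * lam1), idx.sum())
            e[idx] = 1
            hr.append(exp_rate(t[arm == 1], e[arm == 1]) /
                      exp_rate(t[arm == 0], e[arm == 0]))
        hrs.append(np.mean(hr))
    return np.array(hrs)

# Simulated trial: experimental arm halves the hazard; censoring at t = 1
n = 200
arm = np.repeat([0, 1], n)
true_t = np.concatenate([rng.exponential(1.0, n), rng.exponential(2.0, n)])
times = np.minimum(true_t, 1.0)
events = (true_t <= 1.0).astype(int)
hrs = tipping_point_scan(times, events, arm, deltas=np.array([1.0, 2.0, 4.0, 8.0]))
```

The tipping point is the smallest delta at which the hazard ratio crosses 1; its clinical plausibility is then what the robustness judgment rests on.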

  2. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    SciTech Connect

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J.

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses: both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transients without scram (ATWS). The steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment-bypass nature. After being benchmarked against the APETs in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.

  4. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    NASA Astrophysics Data System (ADS)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study was to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. A time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time-series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths were recorded for natural mortality, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association for children less than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and among elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had immediate and delayed effects on mortality.

  5. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.; Sandia National Labs., Albuquerque, NM

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report is directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs.

  6. An event-related analysis of P300 by simultaneous EEG/fMRI

    NASA Astrophysics Data System (ADS)

    Wang, Li-qun; Wang, Mingshi; Mizuhara, Hiroaki

    2006-09-01

    In this study, the P300 induced by visual stimuli was examined with simultaneous EEG/fMRI. Event-related analysis contributed to this methodological trial, the purpose of which is to combine the best temporal resolution with the best spatial resolution in estimating brain function. A 64-channel MR-compatible EEG amplifier (BrainAmp, Brain Products GmbH, Germany) was used for measurement simultaneously with fMRI scanning. The reference channel was between Fz, Cz and Pz. The sampling rate of the raw EEG was 5 kHz, and MRI noise reduction was performed. EEG recording was synchronized with the MRI scan by our original stimulus system, and an oddball paradigm (four-oriented Landolt ring presentation) was performed in the standard manner. After P300 segmentation, the timing of P300 was exported to an event-related analysis of the fMRI data with the SPM99 software. In a single-subject study, significant activations appeared in the left superior frontal cortex, Broca's area, and both sides of the parietal lobule when P300 occurred. This suggests that P300 may reflect an integration carried out by a top-down signal from frontal cortex to the parietal lobule, which regulates an attention-logical judgment process. Compared with other current methods, event-related analysis by simultaneous EEG/fMRI is distinguished by its ability to describe cognitive processes with reality, unifying temporal and spatial information. Examination and demonstration of the obtained results are expected to promote this powerful method.

  7. Time-frequency analysis of event-related potentials: a brief tutorial.

    PubMed

    Herrmann, Christoph S; Rach, Stefan; Vosskuhl, Johannes; Strüber, Daniel

    2014-07-01

    Event-related potentials (ERPs) reflect cognitive processes and are usually analyzed in the so-called time domain. Additional information on cognitive functions can be assessed when analyzing ERPs in the frequency domain and treating them as event-related oscillations (EROs). This procedure results in frequency spectra but lacks information about the temporal dynamics of EROs. Here, we describe a method-called time-frequency analysis-that allows analyzing both the frequency of an ERO and its evolution over time. In a brief tutorial, the reader will learn how to use wavelet analysis in order to compute time-frequency transforms of ERP data. Basic steps as well as potential artifacts are described. Rather than in terms of formulas, descriptions are in textual form (written text) with numerous figures illustrating the topics. Recommendations on how to present frequency and time-frequency data in journal articles are provided. Finally, we briefly review studies that have applied time-frequency analysis to mismatch negativity paradigms. The deviant stimulus of such a paradigm evokes an ERO in the theta frequency band that is stronger than for the standard stimulus. Conversely, the standard stimulus evokes a stronger gamma-band response than does the deviant. This is interpreted in the context of the so-called match-and-utilization model. PMID:24194116
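The tutorial's core operation — convolving the signal with complex Morlet wavelets to obtain power as a function of both time and frequency — can be sketched with numpy alone. The synthetic two-burst "ERP", the 6-cycle wavelet width, and the normalization are illustrative assumptions, not the tutorial's exact recipe.

```python
import numpy as np

def morlet_tfr(signal, fs, freqs, n_cycles=6.0):
    """Time-frequency power via convolution with complex Morlet wavelets:
    one wavelet per target frequency, power = |complex result|^2."""
    n = signal.size
    tfr = np.empty((freqs.size, n))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2.0 * np.pi * f)          # temporal width
        tw = np.arange(-3.0 * sigma_t, 3.0 * sigma_t, 1.0 / fs)
        wavelet = np.exp(2j * np.pi * f * tw) * np.exp(-tw**2 / (2.0 * sigma_t**2))
        wavelet /= np.abs(wavelet).sum()                # crude normalization
        conv = np.convolve(signal, wavelet, mode="full")
        start = (wavelet.size - 1) // 2                 # centre the output
        tfr[i] = np.abs(conv[start:start + n]) ** 2
    return tfr

# Synthetic "ERP": an early 5 Hz theta burst and a later 40 Hz gamma burst
fs = 500
t = np.arange(0, 1.0, 1.0 / fs)
sig = (np.exp(-((t - 0.2) ** 2) / 0.005) * np.sin(2 * np.pi * 5 * t)
       + np.exp(-((t - 0.7) ** 2) / 0.002) * np.sin(2 * np.pi * 40 * t))
tfr = morlet_tfr(sig, fs, freqs=np.array([5.0, 40.0]))
```

Each row of `tfr` is the power time course of one ERO frequency; the theta row peaks at the early burst and the gamma row at the late one, which is exactly the temporal information a plain frequency spectrum discards.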

  8. DNA sequence analysis of conserved genes reveals hybridization events that increase genetic diversity in Verticillium dahliae.

    PubMed

    Collado-Romero, Melania; Jiménez-Díaz, Rafael M; Mercado-Blanco, Jesús

    2010-01-01

    The hybrid origin of a Verticillium dahliae isolate belonging to vegetative compatibility group (VCG) 3 is reported in this work. Moreover, new data supporting the hybrid origin of two V. dahliae var. longisporum (VDLSP) isolates are provided, as well as information about the putative parentals. Isolates of VDLSP and V. dahliae VCG3 were found to harbor multiple sequences of the actin (Act), β-tubulin (β-tub), calmodulin (Cal) and histone 3 (H3) genes. Phylogenetic analysis of these sequences, of the internal transcribed spacers (ITS-1 and ITS-2) of the rRNA genes, and of a V. dahliae-specific sequence provided molecular evidence for the interspecific hybrid origin of those isolates. Sequence analysis suggests that some VDLSP isolates may have resulted from hybridization events between a V. dahliae isolate of VCG1 and/or VCG4A and, probably, a taxon closely related to (but distinct from) Verticillium albo-atrum. Similarly, phylogenetic analysis and PCR markers indicated that a V. dahliae VCG3 isolate might have arisen from a hybridization event between a V. dahliae VCG1B isolate and an as yet unidentified parent. This second parental probably does not belong to the genus Verticillium, given the gene sequence dissimilarities found between the VCG3 isolate and Verticillium spp. These results suggest an important role for parasexuality in diversity and evolution within the genus Verticillium and show that interspecific hybrids in this genus may not be rare in nature.

  9. Rare event computation in deterministic chaotic systems using genealogical particle analysis

    NASA Astrophysics Data System (ADS)

    Wouters, J.; Bouchet, F.

    2016-09-01

    In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein–Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function.
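The genealogical particle idea can be sketched on the Ornstein-Uhlenbeck test case the paper uses for illustration: particles are reweighted each step by an exponential of the increment of a potential and resampled, which pushes the ensemble toward the rare region while the accumulated normalizers keep the estimator unbiased. The linear potential, tilt strength, and all numerical settings below are assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(2)

def genealogical_estimate(n_part=5000, n_steps=100, dt=0.01, sigma=1.0,
                          threshold=2.5, tilt=2.0, rng=rng):
    """Genealogical particle estimate of the small probability
    P(X_T > threshold) for an Ornstein-Uhlenbeck process
    dX = -X dt + sigma dW started at X_0 = 0.

    Each step, particles are weighted by exp(tilt * (X_new - X_old)) and
    resampled, biasing the ensemble toward large values; the product of
    mean weights and the final exp(-tilt * X_T) correction undo the bias."""
    x = np.zeros(n_part)
    log_norm = 0.0
    for _ in range(n_steps):
        x_new = x - x * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_part)
        w = np.exp(tilt * (x_new - x))
        log_norm += np.log(w.mean())
        # Multinomial resampling: the "selection" step of the genealogy
        x = x_new[rng.choice(n_part, size=n_part, p=w / w.sum())]
    return np.exp(log_norm) * np.mean((x > threshold) * np.exp(-tilt * x))

p_rare = genealogical_estimate()
```

A crude Monte Carlo run of the same size would expect fewer than one over-threshold sample here (the target probability is of order 1e-4), which is why the selection mechanism is needed at all.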

  10. Topology analysis of emerging bipole clusters producing violent solar events observed by SDO

    NASA Astrophysics Data System (ADS)

    Schmieder, Brigitte; Demoulin, Pascal; Mandrini, Cristina H.; Guo, Yang

    2012-07-01

    During the rising phase of Solar Cycle 24, tremendous magnetic activity occurred on the Sun, with fast and compact emergence of magnetic flux leading to bursts of flares (C to M and even X class). We have investigated the violent events occurring in the cluster of the two active regions AR 11121 and AR 11123, observed in November by SDO. In less than two days the magnetic field increased by a factor of 10 with the emergence of groups of bipoles. A topology analysis demonstrates the formation of multiple separatrices and quasi-separatrix layers, explaining possible mechanisms for the destabilization of magnetic structures such as filaments and coronal loops.

  11. Scientific Content Analysis (SCAN) Cannot Distinguish Between Truthful and Fabricated Accounts of a Negative Event.

    PubMed

    Bogaard, Glynis; Meijer, Ewout H; Vrij, Aldert; Merckelbach, Harald

    2016-01-01

    The Scientific Content Analysis (SCAN) is a verbal veracity assessment method that is currently used worldwide by investigative authorities. Yet, research investigating the accuracy of SCAN is scarce. The present study tested whether SCAN was able to accurately discriminate between true and fabricated statements. To this end, 117 participants were asked to write down one true and one fabricated statement about a recent negative event that happened in their lives. All statements were analyzed using 11 criteria derived from SCAN. Results indicated that SCAN was not able to correctly classify true and fabricated statements. Lacking empirical support, the application of SCAN in its current form should be discouraged. PMID:26941694

  13. High-fold computer disk storage DATABASE for fast extended analysis of γ-ray events

    NASA Astrophysics Data System (ADS)

    Stézowski, O.; Finck, Ch.; Prévost, D.

    1999-03-01

    Recently, spectacular technical developments have increased the resolving power of large γ-ray spectrometers. With these new eyes, physicists are able to study the intricate nature of atomic nuclei. Concurrently, more and more complex multidimensional analyses are needed to investigate very weak phenomena. In this article, we first present a software package (DATABASE) allowing high-fold coincidence γ-ray events to be stored on hard disk. Then a non-conventional method of analysis, the anti-gating procedure, is described. Two physical examples are given to explain how it can be used, and Monte Carlo simulations have been performed to test the validity of this method.

  14. Recurrent event data analysis with intermittently observed time-varying covariates.

    PubMed

    Li, Shanshan; Sun, Yifei; Huang, Chiung-Yu; Follmann, Dean A; Krause, Richard

    2016-08-15

    Although recurrent event data analysis is a rapidly evolving area of research, rigorous studies on estimating the effects of intermittently observed time-varying covariates on the risk of recurrent events have been lacking. Existing methods for analyzing recurrent event data usually require that the covariate processes be observed throughout the entire follow-up period. However, covariates are often observed periodically rather than continuously. We propose a novel semiparametric estimator for the regression parameters in the popular proportional rate model. The proposed estimator is based on an estimated score function in which we kernel-smooth the mean covariate process. We show that the proposed semiparametric estimator is asymptotically unbiased and normally distributed, and we derive its asymptotic variance. Simulation studies are conducted to compare the performance of the proposed estimator with that of simple methods that carry forward the last observed covariates. The different methods are applied to an observational study designed to assess the effect of group A streptococcus on pharyngitis among school children in India. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26887664
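The kernel-smoothing step can be illustrated with a plain Nadaraya-Watson smoother applied to a sparsely observed covariate; this is a generic stand-in for the paper's smoothed mean covariate process, and the sine covariate, bandwidth, and sample sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def nw_smooth(obs_times, obs_values, eval_times, bandwidth):
    """Nadaraya-Watson smoother with a Gaussian kernel: each evaluation
    time gets a kernel-weighted average of the sparse measurements."""
    u = (eval_times[:, None] - obs_times[None, :]) / bandwidth
    w = np.exp(-0.5 * u ** 2)
    return (w * obs_values).sum(axis=1) / w.sum(axis=1)

# Sparse, noisy observations of a smooth underlying covariate sin(t)
obs_t = np.sort(rng.uniform(0.0, 2.0 * np.pi, 40))
obs_x = np.sin(obs_t) + rng.normal(0.0, 0.1, 40)
grid = np.linspace(0.5, 2.0 * np.pi - 0.5, 50)
smoothed = nw_smooth(obs_t, obs_x, grid, bandwidth=0.4)
```

In the estimator described above, a smoother of this kind supplies covariate values at every event time, replacing the cruder last-value-carried-forward convention.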

  15. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005: 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008: 87,652 deaths), and Haiti (magnitude 7.0, 2010: 222,000 deaths). The Haiti event also ranks among the ten most fatal earthquakes. The 21st century has seen the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, making it the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but because of their remote locations, fatalities and direct economic effects are uncommon. Despite this, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia eruption resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  16. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    NASA Astrophysics Data System (ADS)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, one possible originating cause of tsunamis, and propose to analyze Twitter messages for capturing testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as a lingua franca and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets and thus being able to observe correlations of events across languages. One way to overcome this deficit is to identify geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the Forecast Points of the tsunami scenario. We also intend to use Twitter analysis for situation picture

  17. Mechanical Engineering Safety Note: Analysis and Control of Hazards Associated with NIF Capacitor Module Events

    SciTech Connect

    Brereton, S

    2001-08-01

    the total free oil available in a capacitor (approximately 10,900 g), on the order of 5% or less. The estimates of module pressure were used to estimate the potential overpressure in the capacitor bays after an event. It was shown that the expected capacitor bay overpressure would be less than the structural tolerance of the walls. Thus, it does not appear necessary to provide any pressure relief for the capacitor bays. The ray-tracing analysis showed the new module concept to be 100% effective at containing fragments generated during the events. The analysis demonstrated that all fragments would impact an energy-absorbing surface on the way out of the module. Thus, there is high confidence that energetic fragments will not escape the module. However, since the module was not tested, it was recommended that a form of secondary containment on the walls of the capacitor bays (e.g., 1.0 inch of fire-retardant plywood) be provided. Any doors to the exterior of the capacitor bays should be of an equivalent thickness of steel or suitably armored with an equivalent thickness of plywood. Penetrations in the ceiling of the interior bays (leading to the mechanical equipment room) do not require additional protection to form a secondary barrier; the mezzanine and the air handling units (penetrations lead directly to the air handling units) provide a sufficient second layer of protection.

  18. Analysis of the variation of the 0°C isothermal altitude during rainfall events

    NASA Astrophysics Data System (ADS)

    Zeimetz, Fränz; Garcìa, Javier; Schaefli, Bettina; Schleiss, Anton J.

    2016-04-01

    In numerous countries of the world (USA, Canada, Sweden, Switzerland, ...), dam safety verifications for extreme floods are carried out with reference to the so-called Probable Maximum Flood (PMF). According to the World Meteorological Organization (WMO), this PMF is determined on the basis of the PMP (Probable Maximum Precipitation). The PMF estimation is performed with a hydrological simulation model by routing the PMP. The PMP-PMF simulation is normally event-based; therefore, if no further information is known, the simulation requires assumptions concerning the initial soil conditions, such as saturation or snow cover. In addition, temperature series are of interest for PMP-PMF simulations. Temperature values can be deduced not only from direct measurement: using the temperature-gradient method, the 0°C isothermal altitude can also provide temperature estimates at the ground. For practitioners, using the isothermal altitude to characterize temperature is convenient and simpler, because a single value can give information over a large region under the assumption of a certain temperature gradient. The analysis of the evolution of the 0°C isothermal altitude during rainfall events is the aim here, based on meteorological soundings from the two sounding stations Payerne (CH) and Milan (I). Furthermore, hourly rainfall and temperature data are available from 110 pluviometers spread over the Swiss territory. The analysis of the evolution of the 0°C isothermal altitude is undertaken for different precipitation durations, based on the meteorological measurements mentioned above. The results show that, on average, the isothermal altitude tends to decrease during rainfall events and that a correlation exists between the duration of the altitude loss and the duration of the rainfall. A significant difference in altitude loss appears when the soundings from Payerne and Milan are compared.

  19. Post-Tyrrhenian deformation analysis on the Sahel coast (Eastern Tunisia): implications of seismotectonic events

    NASA Astrophysics Data System (ADS)

    Mejrei, H.; Ghribi, R.; Bouaziz, S.; Balescu, S.

    2012-04-01

    The eastern coast of Tunisia is characterized by Pleistocene coastal deposits considered a reference for interglacial high sea levels. In this region, the stratigraphy of the Tunisian Pleistocene deposits was first established on the basis of geomorphological, lithostratigraphic and biostratigraphic criteria and U/Th data. They have been subdivided into three superimposed formations, from the oldest to the most recent: Douira, Rejiche and Chebba, including coastal marine (Strombus bubonius), lagoonal and eolian sediments. These marine formations are organized into bars parallel to the present shoreline, unconformably overlying the Mio-Pliocene and "Villafranchian" deposits. A luminescence dating method (IRSL) applied to alkali feldspar grains from the two sandy marine units of the Douira formation demonstrates for the first time the presence of two successive interglacial high-sea-level events, correlative with MIS 7 and MIS 9. These sandy marine units are separated by a major erosional surface and by a continental pedogenized loamy deposit related to a low-sea-level event which might be assigned to MIS 8. Variations in the height of these marine units (+13 to +32 m) along the Sahel coast reflect significant tectonic deformation and provide valuable geomorphological and tectonic markers. An extensive analysis of brittle deformation has been carried out at several sites. The detailed analysis of fracturing is based on studies of fault-slip data populations and of joint sets. It allows reconstruction of the post-Tyrrhenian stress regimes, which are characterized by N170-016 compression and N095-100 extension. In this paper, the combination of IRSL data from these raised marine deposits and a reconstruction of the tectonic evolution in terms of stress-pattern evolution since the Tyrrhenian allowed us to establish an accurate recent tectonic calendar. These reconstructed events will be placed and discussed in the regional setting of the seismotectonic activity of the north

  20. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    NASA Astrophysics Data System (ADS)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

    Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relationship in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts in various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and may aid in identifying hot-spot regions for potential climate-triggered violent social conflicts.
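The abstract does not give the authors' exact statistical procedure, but an event coincidence analysis of this general kind can be sketched as follows: count how often a conflict onset follows an extreme event within a tolerance window, then assess significance against randomly placed surrogate conflict dates. All function names and the surrogate scheme here are illustrative assumptions, not the paper's implementation.

```python
import random

def coincidence_rate(event_times, conflict_times, delta_t):
    """Fraction of extreme events followed by a conflict onset within delta_t."""
    hits = sum(
        any(0 <= c - e <= delta_t for c in conflict_times)
        for e in event_times
    )
    return hits / len(event_times)

def coincidence_p_value(event_times, conflict_times, delta_t,
                        t_max, n_surrogates=2000, seed=0):
    """One-sided surrogate test: how often do uniformly random conflict
    dates coincide with the events at least as strongly as observed?"""
    rng = random.Random(seed)
    observed = coincidence_rate(event_times, conflict_times, delta_t)
    exceed = 0
    for _ in range(n_surrogates):
        surrogate = [rng.uniform(0, t_max) for _ in conflict_times]
        if coincidence_rate(event_times, surrogate, delta_t) >= observed:
            exceed += 1
    return observed, exceed / n_surrogates
```

A low surrogate p-value in a given region would correspond to the "statistically significant synchronicity" reported above.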

  1. Detecting event-related recurrences by symbolic analysis: applications to human language processing

    PubMed Central

    beim Graben, Peter; Hutt, Axel

    2015-01-01

    Quasi-stationarity is ubiquitous in complex dynamical systems. In brain dynamics, there is ample evidence that event-related potentials (ERPs) reflect such quasi-stationary states. In order to detect them from time series, several segmentation techniques have been proposed. In this study, we elaborate a recent approach for detecting quasi-stationary states as recurrence domains by means of recurrence analysis and subsequent symbolization methods. We address two pertinent problems of contemporary recurrence analysis: optimizing the size of recurrence neighbourhoods and identifying symbols from different realizations for sequence alignment. As possible solutions for these problems, we suggest a maximum entropy criterion and a Hausdorff clustering algorithm. The resulting recurrence domains for single-subject ERPs are obtained as partition cells reflecting quasi-stationary brain states. PMID:25548270
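A loose sketch of the recurrence-based symbolization idea: each sample is labelled with the index of the earliest sample it recurs with (within a neighbourhood size eps), and the Shannon entropy of the resulting symbol distribution can then be used when scanning candidate eps values. This is a simplification for illustration only; it is not the authors' exact recurrence-grammar algorithm or their precise maximum entropy utility function.

```python
import math

def symbolize(x, eps):
    """Label each sample with the index of the earliest sample within eps
    (a simplified recurrence-domain labelling for a scalar series)."""
    symbols = []
    for i, xi in enumerate(x):
        for j in range(i + 1):
            if abs(x[j] - xi) <= eps:
                symbols.append(j)
                break
    return symbols

def symbol_entropy(symbols):
    """Shannon entropy (nats) of the symbol frequency distribution."""
    n = len(symbols)
    counts = {}
    for s in symbols:
        counts[s] = counts.get(s, 0) + 1
    return -sum(c / n * math.log(c / n) for c in counts.values())
```

For a multichannel ERP, `x` would be vectors and `abs` a norm; segments sharing one symbol play the role of the quasi-stationary recurrence domains described above.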

  2. Detection and analysis of high-temperature events in the BIRD mission

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2005-01-01

    The primary mission objective of the new small Bi-spectral InfraRed Detection (BIRD) satellite is the detection and quantitative analysis of high-temperature events such as fires and volcanoes. The absence of saturation in the BIRD infrared channels makes it possible to improve false-alarm rejection and to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area and radiative energy release. Examples are given of the detection and analysis of wildfires and coal seam fires, of volcanic activity, and of oil fires in Iraq. The smallest fires detected by BIRD and verified on the ground had an area of 12 m2 in the daytime and 4 m2 at night.
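Retrieving an effective fire temperature and area from two unsaturated infrared channels is classically done with a Dozier-style bi-spectral inversion: the pixel radiance in each channel is modelled as a mixture of fire and background blackbody radiances, and the two equations are solved for fire temperature and fractional area. The sketch below uses a crude grid search and assumed channel wavelengths (3.8 µm and 8.9 µm, roughly BIRD's MIR/TIR bands); it is an illustration of the principle, not the BIRD processing chain.

```python
import math

H, C, K = 6.626e-34, 3.0e8, 1.381e-23  # Planck, light speed, Boltzmann (SI)

def planck(wavelength, temp):
    """Blackbody spectral radiance (W m^-2 sr^-1 m^-1)."""
    a = 2 * H * C ** 2 / wavelength ** 5
    return a / (math.exp(H * C / (wavelength * K * temp)) - 1)

def bispectral_retrieval(l_mir, l_tir, t_bg, w_mir=3.8e-6, w_tir=8.9e-6):
    """Solve L_ch = p*B(ch, T_f) + (1 - p)*B(ch, t_bg) for fire
    temperature T_f and fire pixel fraction p by grid search over T_f."""
    best = None
    for t_f in range(400, 1501):  # candidate fire temperatures (K)
        denom = planck(w_mir, t_f) - planck(w_mir, t_bg)
        p = (l_mir - planck(w_mir, t_bg)) / denom
        if not 0.0 < p <= 1.0:
            continue
        resid = abs(p * planck(w_tir, t_f)
                    + (1 - p) * planck(w_tir, t_bg) - l_tir)
        if best is None or resid < best[0]:
            best = (resid, t_f, p)
    return best[1], best[2]
```

Effective fire area then follows from the retrieved fraction `p` times the pixel footprint, and radiative energy release from Stefan-Boltzmann scaling of the retrieved temperature.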

  3. Numerical analysis of seismic events distributions on the planetary scale and celestial bodies astrometrical parameters

    NASA Astrophysics Data System (ADS)

    Bulatova, Dr.

    2012-04-01

    Modern research in the Earth sciences is developing from descriptions of individual natural phenomena toward systematic, complex research in interdisciplinary areas. For numerical analysis of three-dimensional (3D) systems of this kind, the author proposes a space-time technology (STT), based on a Ptolemaic geocentric system and consisting of two modules, each with its own coordinate system: (1) a 3D model of the Earth, whose coordinates are supplied by databases of the Earth's events (here, seismic events), and (2) a compact model of the relative motion of celestial bodies in space-time as seen from the Earth, known as the "Method of a moving source" (MDS), which was developed (Bulatova, 1998-2000) for 3D space. Module (2) was developed as a continuation of the geocentric Ptolemaic system of the world, built on the astronomical parameters of heavenly bodies. By aggregating, systematizing and jointly analyzing data from the space and Earth sciences, this work attempts to establish a cause-effect relationship between the positions of celestial bodies (Moon, Sun) and the Earth's seismic events.

  4. Analysis of Loss-of-Offsite-Power Events 1998–2013

    SciTech Connect

    Schroeder, John Alton

    2015-02-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. Risk analyses suggest that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP events and the subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience during calendar years 1997 through 2013. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator failure modes considered are failure to start, failure to load and run, and failure to run more than 1 hour. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates for reliability are provided for the entire active period. No statistically significant trends in LOOP frequencies over the 1997–2013 period are identified. There is a possibility that a significant trend in grid-related LOOP frequency exists that is not easily detected by a simple analysis. Statistically significant increases in recovery times after grid- and switchyard-related LOOPs are identified.
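The core quantities in such a study are occurrence frequencies per unit of plant exposure and a trend check on the yearly estimates. A minimal sketch, assuming exposure is measured in reactor-critical-years and using a crude ordinary least-squares slope in place of the report's formal trend-significance testing:

```python
def loop_frequency(n_events, reactor_years):
    """Point estimate of LOOP frequency (events per reactor-critical-year)."""
    return n_events / reactor_years

def yearly_trend(years, rates):
    """Ordinary least-squares slope of yearly frequency vs. year.
    A slope indistinguishable from zero suggests no trend (crude check;
    the report itself uses formal statistical trend tests)."""
    n = len(years)
    my = sum(years) / n
    mr = sum(rates) / n
    num = sum((y - my) * (r - mr) for y, r in zip(years, rates))
    den = sum((y - my) ** 2 for y in years)
    return num / den
```

In practice each of the four event categories (plant-centered, switchyard-centered, grid-related, weather-related) would get its own frequency estimate and trend check.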

  5. Modeling propensity to move after job change using event history analysis and temporal GIS

    NASA Astrophysics Data System (ADS)

    Vandersmissen, Marie-Hélène; Séguin, Anne-Marie; Thériault, Marius; Claramunt, Christophe

    2009-03-01

    The research presented in this paper analyzes the emergent residential behaviors of individual actors in a context of profound social changes in the work sphere, incorporating a long-term view of the relationships between those social changes and residential behavior. The general hypothesis is that social changes produce complex changes in the long-term dynamics of residential location behavior. More precisely, the objective of this paper is to estimate the propensity of professional workers to move house after a change of workplace. Our analysis draws on data from a biographical survey using a retrospective questionnaire that enables a posteriori reconstitution of the familial, professional and residential lifelines of professional workers since their departure from their parents’ home. The survey was conducted in 1996 in the Quebec City Metropolitan Area, which, much like other Canadian cities, has experienced a substantial increase in “unstable” work, even for professionals. The approach is based on event history analysis, a Temporal Geographic Information System and exploratory spatial analysis of the model's residuals. Results indicate that 48.9% of respondents moved after a job change and that the most important factors influencing the propensity to move house after a job change are home tenure (for lone adults as well as couples) and number of children (for couples only). We also found that moving is associated with changing neighborhood for owners, while tenants or co-tenants tend to stay in the same neighborhood. The probability of moving 1 year after a job change is 0.10 for both lone adults and couples, while after 2 years the household structure seems to have an impact: the probability increases to 0.23 for lone adults and to 0.21 for couples. The outcome of this research contributes to furthering our understanding of a familial decision (to move) following a professional event (change of job), controlling for household structure

  6. Emission Event Investigation With a Detailed Plume Tracking Process Analysis Tool

    NASA Astrophysics Data System (ADS)

    Henderson, B.; Vizuete, W.; Kim, B.; Jeffries, H.

    2006-12-01

    Houston, Texas is the 4th largest city in the United States and consequently has emission sources typical of a large urban area. What makes Houston unique, however, is that in addition to these sources the city also hosts the highest concentration of petrochemical and refinery companies in the United States. Recently compiled data from these companies indicate that industrial emissions of volatile organic compounds (VOCs) originating from these facilities can have significant temporal variability. These facilities have reported releases, lasting from hours to days, whose magnitudes can be a factor of 10-1000 over annual average emission rates. The majority of the reported events are one to a few hours in duration and are composed mainly of ethylene, propylene, butylenes or 1,3-butadiene. Ambient monitors have shown that these releases result in rapid ozone (O3) formation, causing exceedances of the National Ambient Air Quality Standard (NAAQS). In previous work conducted at UNC-Chapel Hill and UT-Austin, we showed that 1-km resolution simulations were necessary to simulate the O3 production from these events: a simulated emission event at 1-km resolution increased the O3 peak by 70 ppb relative to a 4-km resolution run using the same meteorology (interpolated to 1 km). Additionally, adding 69 tons of VOC to the emission inventory as a series of discrete emission events achieved the same O3 predictive accuracy as the TCEQ model, which added more than 1,000 additional tons. This investigation will use a Python-based modeling process analysis tool, pyPA, to investigate the cause of the 70 ppb O3 increase in the higher-resolution case. pyPA quantifies radical budgets, the sources and fates of O3 precursors, and the physical and chemical processes that affect each species; these data are used to compare the effects of chemical reactions and physical transport rates on O3 production. The pyPA tool is especially well suited for this investigation as it also allows

  7. Impact of including or excluding both-armed zero-event studies on using standard meta-analysis methods for rare event outcome: a simulation study

    PubMed Central

    Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana

    2016-01-01

    Objectives There is no consensus on whether studies with no observed events in either the treatment or control arm, so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handle them differently depending on the choice of effect measure and the authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Method We simulated 2500 data sets for different scenarios, varying the parameters of baseline event rate, treatment effect, number of patients per trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis—namely, Peto, Mantel-Haenszel with fixed-effects and random-effects models, and inverse variance with fixed-effects and random-effects models—using bias, root mean square error, length of the 95% CI and coverage. Results The overall performance of including or excluding BA0E studies in meta-analysis varied with the magnitude of the true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased coverage when no true treatment effect existed. However, when a true treatment effect existed, excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment
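The Peto method mentioned above pools per-study observed-minus-expected event counts weighted by a hypergeometric variance. A minimal sketch shows why both-armed zero-event studies are inert under this estimator: with zero total events, a study contributes O − E = 0 and V = 0, so it drops out of both sums (the function name and input layout are illustrative).

```python
import math

def peto_pooled_or(studies):
    """Pooled Peto odds ratio.
    studies: list of (events_trt, n_trt, events_ctl, n_ctl) tuples."""
    sum_oe, sum_v = 0.0, 0.0
    for a, n1, c, n2 in studies:
        m1 = a + c              # total events across both arms
        N = n1 + n2
        if m1 == 0 or m1 == N:  # no events (or all events): no information
            continue
        E = n1 * m1 / N                                   # expected under H0
        V = n1 * n2 * m1 * (N - m1) / (N ** 2 * (N - 1))  # hypergeometric var
        sum_oe += a - E
        sum_v += V
    return math.exp(sum_oe / sum_v)  # exp of pooled log odds ratio
```

This makes the paper's framing concrete: for the Peto estimator, "including" BA0E studies changes nothing mechanically, whereas other estimators require continuity corrections that do shift the pooled result.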

  8. BOLIVAR-tool for analysis and simulation of metocean extreme events

    NASA Astrophysics Data System (ADS)

    Lopatoukhin, Leonid; Boukhanovsky, Alexander

    2015-04-01

    Metocean extreme events are caused by combinations of multivariate and multiscale processes that depend on each other at different scales (short-term, synoptic, annual and year-to-year variability). There is no single method for estimating them with controllable tolerance; in practice, extreme-value analysis is often reduced to exploring various methods and models with the aim of reducing the uncertainty of the estimates. A researcher therefore needs multifaceted computational tools that cover the various branches of extreme-value analysis. BOLIVAR is multi-functional computational software for researchers and engineers who study extreme environmental conditions in order to design and build offshore structures and floating objects. It contains a set of computational modules implementing various methods of extreme-value analysis (the IDM, AMS, POT, MENU and SINTEF methods) and a set of modules for the stochastic and hydrodynamic simulation of metocean processes at various scales; in this sense BOLIVAR is a Problem Solving Environment (PSE). BOLIVAR is a tool that simplifies the resource-consuming computational experiments needed to explore metocean extremes in univariate and multivariate cases. It includes field ARMA models for short-term variability, a spatial-temporal random-pulse model for synoptic variability (alternation of storms and calms), and a cyclostationary model of annual and year-to-year variability. The combination of the above-mentioned modules and data sources allows the estimation of: omnidirectional and directional extremes (with T-year return periods); multivariate extremes (sets of parameters) and evaluation of their impact on marine structures and floating objects; and extremes of spatial-temporal fields (including the trajectories of T-year storms). An employment of concurrent methods for

  9. Quantitative fibrosis estimation by image analysis predicts development of decompensation, composite events and defines event-free survival in chronic hepatitis B patients.

    PubMed

    Bihari, Chhagan; Rastogi, Archana; Sen, Bijoya; Bhadoria, Ajeet Singh; Maiwall, Rakhi; Sarin, Shiv K

    2016-09-01

    The extent of fibrosis is a major determinant of the clinical outcome in patients with chronic liver diseases. We undertook this study to explore whether the degree of fibrosis in baseline liver biopsies predicts clinical outcomes in chronic hepatitis B (CHB) patients. Fibrosis quantification was performed by image analysis on Masson's trichrome-stained sections and correlated with clinical and biochemical parameters, liver stiffness and hepatic vein pressure gradient (n = 96). Follow-up information on clinical outcomes was collected. A total of 964 cases were analyzed. Median quantitative fibrosis (QF) was 3.7% (interquartile range, 1.6%-9.7%), with substantial variation across stages: F0, 1% (0.7%-1.65%); F1, 3.03% (2.07%-4.0%); F2, 7.1% (5.6%-8.7%); F3, 12.7% (10.15%-16.7%); F4, 26.9% (20.3%-36.4%). QF correlated positively with METAVIR staging, liver stiffness measurement, and hepatic vein pressure gradient. Eighty-nine cases developed liver-related events: decompensation, hepatocellular carcinoma, liver transplantation and death. In Cox regression analysis adjusted for METAVIR staging, QF, albumin and AST were significant predictors of composite events; QF and albumin of decompensation; and QF alone of hepatocellular carcinoma. QF was categorized into five stages: QF1, 0%-5%; QF2, 5.1%-10%; QF3, 10.1%-15%; QF4, 15.1%-20%; QF5, >20.1%. In patients with advanced QF stages, the probability of event-free survival was found to be low. Quantitative fibrosis in the baseline liver biopsy predicts disease progression and outcome in CHB patients, and QF defines the probability of event-free survival in CHB cases.
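The abstract does not describe the image-analysis pipeline, but fibrosis quantification on trichrome sections generally reduces to the percentage of collagen-stained (blue) pixels among tissue pixels. A crude sketch under assumed colour thresholds (the specific cutoffs and the dominance rule below are illustrative, not the study's algorithm):

```python
def quantitative_fibrosis(pixels):
    """Percent of tissue pixels classified as collagen.
    pixels: iterable of (r, g, b) tuples from a Masson's trichrome section.
    Near-white pixels are treated as background and excluded; a pixel
    counts as collagen when blue clearly dominates red (assumed rule)."""
    tissue = collagen = 0
    for r, g, b in pixels:
        if r > 220 and g > 220 and b > 220:   # background / empty slide
            continue
        tissue += 1
        if b > r * 1.3:                        # blue-stained collagen
            collagen += 1
    return 100.0 * collagen / tissue if tissue else 0.0
```

Binning the returned percentage into the study's QF1–QF5 ranges (0–5%, 5.1–10%, 10.1–15%, 15.1–20%, >20.1%) then yields the quantitative stage.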

  10. Reinvestigation and analysis a landslide dam event in 2012 using UAV

    NASA Astrophysics Data System (ADS)

    Wang, Kuo-Lung; Huang, Zji-Jie; Lin, Jun-Tin

    2015-04-01

    Taiwan's geology is highly fractured, and the island lies in the Pacific Rim seismic zone. Typhoons usually strike during summer, and the steep, highly weathered mountains are prone to landslides; such events have become more frequent in recent years owing to climate change. Most landslides occur far from residential areas. Field investigation is time-consuming, expensive and dangerous, and yields limited data, while investigation with satellite images suffers from poor resolution and an incomplete picture of the actual situation. This research therefore proposes and discusses slope investigation with unmanned aerial vehicles (UAVs). UAVs have been adopted for hazard investigation and monitoring in recent years: they are lightweight, compact, highly mobile, safe, easy to maintain and inexpensive, and they can operate in high-risk areas. Using mature aerial photogrammetry, aerial photos are combined with ground control points to produce digital surface models (DSMs) and orthophotos. The resolution can be better than 5 cm, so the products can be used for temporal creep monitoring before a landslide occurs. A large landslide site at the 75 km mark of Road No. 14 was investigated in this research. The landslide occurred in June 2012 during heavy rainfall, and a landslide dam formed quickly afterwards. The failure and its mechanism were analyzed using DEMs produced from aerial photos taken before the event and from UAV imagery acquired after it. A residual slope stability analysis was then carried out using strength parameters obtained from the analysis described above, allowing advice to be provided on potential subsequent landslide conditions.

  11. Social network changes and life events across the life span: a meta-analysis.

    PubMed

    Wrzus, Cornelia; Hänel, Martha; Wagner, Jenny; Neyer, Franz J

    2013-01-01

    For researchers and practitioners interested in social relationships, the question remains as to how large social networks typically are, and how their size and composition change across adulthood. On the basis of predictions of socioemotional selectivity theory and social convoy theory, we conducted a meta-analysis on age-related social network changes and the effects of life events on social networks using 277 studies with 177,635 participants from adolescence to old age. Cross-sectional as well as longitudinal studies consistently showed that (a) the global social network increased up until young adulthood and then decreased steadily, (b) both the personal network and the friendship network decreased throughout adulthood, (c) the family network was stable in size from adolescence to old age, and (d) other networks with coworkers or neighbors were important only in specific age ranges. Studies focusing on life events that occur at specific ages, such as transition to parenthood, job entry, or widowhood, demonstrated network changes similar to such age-related network changes. Moderator analyses detected that the type of network assessment affected the reported size of global, personal, and family networks. Period effects on network sizes occurred for personal and friendship networks, which have decreased in size over the last 35 years. Together the findings are consistent with the view that a portion of normative, age-related social network changes are due to normative, age-related life events. We discuss how these patterns of normative social network development inform research in social, evolutionary, cultural, and personality psychology.

  12. Analysis of a snowfall event produced by mountains waves in Guadarrama Mountains (Spain)

    NASA Astrophysics Data System (ADS)

    Gascón, Estíbaliz; Sánchez, José Luis; Fernández-González, Sergio; Merino, Andrés; López, Laura; García-Ortega, Eduardo

    2014-05-01

    Heavy snowfall events are fairly uncommon precipitation processes in the Iberian Peninsula. When large amounts of snow accumulate in large cities whose populations are unaccustomed to or unprepared for heavy snow, these events have a major impact on daily activities. On 16 January 2013, an extreme snowstorm occurred in the Guadarrama Mountains (Madrid, Spain) during an experimental winter campaign that was part of the TECOAGUA Project. Strong northwesterly winds, high precipitation and temperatures close to 0°C were observed throughout the day. During this episode, it was possible to continuously measure the different variables involved in the development of the convection using a multichannel microwave radiometer (MMWR). The significant increase in cloud thickness observed vertically by the MMWR, together with the 43 mm of precipitation registered in 24 hours at the Navacerrada station (Madrid), led us to conclude that we were facing an episode of strong winter convection. Images from the Meteosat Second Generation (MSG) satellite suggested that the main source of the convection was the formation of mountain waves on the south face of the Guadarrama Mountains. The event was simulated at high resolution using the WRF mesoscale model, and the analysis is based on these simulations and the observational data. Finally, the continuous measurements obtained with the MMWR allowed us to monitor the vertical structure above the Guadarrama Mountains with a temporal resolution of 2 minutes. This instrument has a clear advantage for monitoring short-term episodes of this kind compared to radiosondes, which usually provide data only at 0000 and 1200 UTC. Acknowledgements This study was supported by the following grants: GRANIMETRO (CGL2010-15930); MICROMETEO (IPT-310000-2010-22). The authors would like to thank the Regional Government of Castile-León for its financial support through the project LE220A11-2.

  13. Monitoring As A Helpful Means In Forensic Analysis Of Dams Static Instability Events

    NASA Astrophysics Data System (ADS)

    Solimene, Pellegrino

    2013-04-01

    Monitoring is a means of controlling the behavior of a structure that, during its operational life, is subject to external actions ranging from ordinary loading conditions to disturbing ones; these factors overlap in the random manner described by the statistical parameter of the return period. Analysis of monitoring data is crucial for forming a reasoned opinion on the reliability of the structure and its components, and it also makes it possible to identify, within the overall operational scenario, the right time to prepare interventions aimed at maintaining optimum levels of functionality and safety. The preventive role of monitoring is coupled with the activity of the forensic engineer who, appointed by the judiciary after an accident, turns his experience (the "scientific knowledge") into an "inverse analysis", summing up the results of a survey that also draws on the data sets collected during continuous monitoring of causes and effects, so as to determine the correlations between these factors. His activity aims to contribute to identifying the typicality of an event, which, together with the causal link between conduct and event and its unlawfulness, is among the factors for judging whether a hypothesis of crime exists and is therefore actionable under the law. In Italy there are about 10,000 dams of varying sizes, but only a small portion of them are considered "large dams" and subjected to a rigorous program of regular inspections and monitoring under specific rules. The rest ("small" dams, conventionally so defined by the standard, though not small in their impact on the territory) receives a heterogeneous response from the local authorities entrusted with this task: there is therefore a high potential-risk scenario, determined by the presence of incompletely controlled structures, some of which sit above heavily populated areas. Risk could be brought back to acceptable levels if they were implemented with the

  14. Analysis of Loss-of-Offsite-Power Events 1998–2012

    SciTech Connect

    T. E. Wierman

    2013-10-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. Risk analyses suggest that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP events and the subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience from fiscal year 1998 through 2012. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator (EDG) failure modes considered are failure to start, failure to load and run, and failure to run more than 1 hour. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates for reliability are provided for the entire active period. A statistically significant increase in industry performance was identified for plant-centered and switchyard-centered LOOP frequencies. There is no statistically significant trend in LOOP durations.

  15. Kickoff to Conflict: A Sequence Analysis of Intra-State Conflict-Preceding Event Structures

    PubMed Central

    D'Orazio, Vito; Yonamine, James E.

    2015-01-01

    While many studies have suggested or assumed that the periods preceding the onset of intra-state conflict are similar across time and space, few have empirically tested this proposition. Using the Integrated Crisis Early Warning System's domestic event data in Asia from 1998–2010, we subject this proposition to empirical analysis. We code the similarity of government-rebel interactions in sequences preceding the onset of intra-state conflict to those preceding further periods of peace using three different metrics: Euclidean, Levenshtein, and mutual information. These scores are then used as predictors in a bivariate logistic regression to forecast whether we are likely to observe conflict in neither, one, or both of the states. We find that our model accurately classifies cases where both sequences precede peace, but struggles to distinguish between cases in which one sequence escalates to conflict and where both sequences escalate to conflict. These findings empirically suggest that generalizable patterns exist between event sequences that precede peace. PMID:25951105
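Of the three sequence-similarity metrics named above, the Levenshtein (edit) distance is the most algorithmically involved: the minimum number of insertions, deletions and substitutions turning one event sequence into another, computed by dynamic programming. A minimal sketch (standard algorithm, not the authors' code):

```python
def levenshtein(a, b):
    """Edit distance between two event sequences (strings or lists):
    minimum insertions + deletions + substitutions to turn a into b."""
    prev = list(range(len(b) + 1))          # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        curr = [i]                           # distance from prefix of a to ""
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # delete ca
                            curr[j - 1] + 1,          # insert cb
                            prev[j - 1] + (ca != cb)))  # substitute
        prev = curr
    return prev[-1]
```

Applied to coded government-rebel interaction sequences, a small distance between a candidate sequence and known conflict-preceding sequences would feed the logistic regression as a similarity score.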

  16. Comparing evapotranspiration partitioning after different types of rain events using stable isotopes and Lagrangian dispersion analysis

    NASA Astrophysics Data System (ADS)

    Hogan, Patrick; Parajka, Juraj

    2016-04-01

    The eddy covariance (EC) method has become one of the standard methods for measuring evapotranspiration (ET) at the field scale; however, it cannot separate transpiration from evaporation, and it is also limited within plant canopies owing to distortion of the turbulent wind fields. Possible solutions to these limitations include combining EC measurements made above the canopy with either source/sink distribution models or stable-isotope ET partitioning models. During the summer of 2014, the concentration and isotopic ratio of water vapour within the canopy of a growing maize field at the Hydrological Open Air Laboratory (HOAL) catchment were measured using a Picarro field sampling device, and a tripod-mounted eddy covariance device was used to calculate the ET value for the field. The first objective of this experiment is to compare ET partitioning results obtained with the stable-isotope Keeling plot method within the canopy against two Lagrangian dispersion analysis methods: the localised near-field theory of Raupach (1989a) and the Warland and Thurtell (2000) dispersion model. Preliminary results show good agreement during dry conditions, with the dispersion methods overestimating the fraction of transpiration directly after a rain event. The second objective is to analyse and compare the soil evaporation response for two different kinds of rain events using the stable-isotope results.
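The Keeling plot method referenced above regresses the measured isotope ratio of canopy air against the inverse of its water-vapour concentration; the intercept estimates the isotopic signature of the ET flux, which a two-member mixing model then splits into evaporation and transpiration. A minimal sketch (standard technique; the function names and end-member handling are illustrative, not the study's code):

```python
def keeling_intercept(conc, delta):
    """Keeling plot: OLS regression of isotope ratio (delta, permil) on
    1/concentration; the intercept is the isotopic signature of the
    evapotranspiration flux, delta_ET."""
    x = [1.0 / c for c in conc]
    n = len(x)
    mx = sum(x) / n
    md = sum(delta) / n
    slope = (sum((xi - mx) * (di - md) for xi, di in zip(x, delta))
             / sum((xi - mx) ** 2 for xi in x))
    return md - slope * mx

def transpiration_fraction(delta_et, delta_e, delta_t):
    """Two-member isotopic mixing: fraction of ET due to transpiration,
    given end-member signatures for evaporation (delta_e) and
    transpiration (delta_t)."""
    return (delta_et - delta_e) / (delta_t - delta_e)
```

The overestimation after rain noted in the abstract would show up here as a dispersion-model transpiration fraction exceeding the isotope-based `transpiration_fraction`.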

  17. Forecasting and nowcasting process: A case study analysis of severe precipitation event in Athens, Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, Ioannis; Nastos, Panagiotis; Avgoustoglou, Euripides; Gofa, Flora; Pytharoulis, Ioannis; Kamberakis, Nikolaos

    2016-04-01

    An early warning process is the result of the interplay between forecasting and nowcasting. Therefore, (1) accurate measurement and prediction of the spatial and temporal distribution of rainfall over an area and (2) an efficient and appropriate description of the catchment properties are important issues for atmospheric hazards (severe precipitation, floods, flash floods, etc.). In this paper, a forecasting and nowcasting analysis is presented for a severe precipitation event that took place on September 21, 2015 in Athens, Greece. The severe precipitation caused a flash flood in the suburbs of Athens, with significant impacts on the local community. Quantitative precipitation forecasts from the European Centre for Medium-Range Weather Forecasts and from the COSMO.GR atmospheric model, including ensemble forecasts of precipitation and probabilistic approaches, are analyzed as tools in the forecasting process. Satellite remote sensing data from close to and six hours prior to the flash flood are presented, accompanied by radar products from the Hellenic National Meteorological Service, illustrating the ability to depict the convection process.

  18. Observations and Analysis of Mutual Events between the Uranus Main Satellites

    NASA Astrophysics Data System (ADS)

    Assafin, M.; Vieira-Martins, R.; Braga-Ribas, F.; Camargo, J. I. B.; da Silva Neto, D. N.; Andrei, A. H.

    2009-04-01

    Every 42 years, the Earth and the Sun pass through the plane of the orbits of the main satellites of Uranus. In these occasions, mutual occultations and eclipses between these bodies can be seen from the Earth. The current Uranus equinox from 2007 to 2009 offers a precious opportunity to observe these events. Here, we present the analysis of five occultations and two eclipses observed from Brazil during 2007. For the reduction of the CCD images, we developed a digital coronagraphic method that removed the planet's scattered light around the satellites. A simple geometric model of the occultation/eclipse was used to fit the observed light curves. Dynamical quantities such as the impact parameter, the relative speed, and the central time of the event were then obtained with precisions of 7.6 km, 0.18 km s-1, and 2.9 s, respectively. These results can be further used to improve the parameters of the dynamical theories of the main Uranus satellites. Based on observations made at Laboratório Nacional de Astrofísica (LNA), Itajubá-MG, Brazil.

  19. Large solar energetic particle event that occurred on 2012 March 7 and its VDA analysis

    NASA Astrophysics Data System (ADS)

    Ding, Liu-Guan; Cao, Xin-Xin; Wang, Zhi-Wei; Le, Gui-Ming

    2016-08-01

    On 2012 March 7, the STEREO Ahead and Behind spacecraft, along with near-Earth spacecraft (e.g. SOHO, Wind) situated between the two STEREO spacecraft, observed an extremely large global solar energetic particle (SEP) event in Solar Cycle 24. Two successive coronal mass ejections (CMEs) were detected close in time. From the multi-point in-situ observations, it can be found that this SEP event was caused by the first CME, while the second one was not involved. Using velocity dispersion analysis (VDA), we find that, for a well magnetically connected point, the energetic protons and electrons are released at nearly the same time. The path lengths to STEREO-B (STB) for protons and electrons differ distinctly and deviate remarkably from the nominal Parker spiral path length, which is likely due to the presence of interplanetary magnetic structures between the source and STB. Also, the VDA method seems to yield reasonable results only at well-connected locations, where the inferred release times of energetic particles in different energy channels are similar. We suggest that good connection is crucial for obtaining both an accurate release time and path length simultaneously, agreeing with the modeling result of Wang & Qin (2015).
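
    Velocity dispersion analysis rests on a simple linear relation: a particle of speed v(E) released at t_release arrives after travelling a path length L, so t_onset(E) = t_release + L/v(E), and a straight-line fit of onset time against 1/v recovers both quantities. A minimal sketch, with entirely hypothetical numbers (not taken from this event):

```python
import numpy as np

def vda_fit(onset_times_s, speeds_au_per_s):
    """Velocity dispersion analysis: fit onset time vs inverse speed.

    t_onset(E) = t_release + L / v(E), so a linear fit of onset time
    against 1/v yields the path length L (slope) and the particle
    release time (intercept).
    """
    inv_v = 1.0 / np.asarray(speeds_au_per_s, dtype=float)
    slope, intercept = np.polyfit(inv_v, np.asarray(onset_times_s, dtype=float), 1)
    return slope, intercept  # path length [AU], release time [s]

# Synthetic check: particles released at t = 1000 s travelling 1.2 AU
v = np.array([0.3, 0.5, 0.8]) * 2.0e-3   # hypothetical speeds in AU/s
t = 1000.0 + 1.2 / v                     # exact onset times for that geometry
L, t0 = vda_fit(t, v)                    # recovers L = 1.2 AU, t0 = 1000 s
```

    In practice the onset times come from the observed intensity rise in each energy channel, and scatter about the line (or a connection-dependent bias, as the abstract notes) degrades the fit.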

  20. OBSERVATIONS AND ANALYSIS OF MUTUAL EVENTS BETWEEN THE URANUS MAIN SATELLITES

    SciTech Connect

    Assafin, M.; Vieira-Martins, R.; Braga-Ribas, F.; Camargo, J. I. B.; Da Silva Neto, D. N.; Andrei, A. H. E-mail: rvm@on.br

    2009-04-15

    Every 42 years, the Earth and the Sun pass through the plane of the orbits of the main satellites of Uranus. On these occasions, mutual occultations and eclipses between these bodies can be seen from the Earth. The current Uranus equinox, from 2007 to 2009, offers a precious opportunity to observe these events. Here, we present the analysis of five occultations and two eclipses observed from Brazil during 2007. For the reduction of the CCD images, we developed a digital coronagraphic method that removed the planet's scattered light around the satellites. A simple geometric model of the occultation/eclipse was used to fit the observed light curves. Dynamical quantities such as the impact parameter, the relative speed, and the central time of the event were then obtained with precisions of 7.6 km, 0.18 km s^-1, and 2.9 s, respectively. These results can be further used to improve the parameters of the dynamical theories of the main Uranus satellites.

  1. Voluntary electronic reporting of laboratory errors: an analysis of 37,532 laboratory event reports from 30 health care organizations.

    PubMed

    Snydman, Laura K; Harubin, Beth; Kumar, Sanjaya; Chen, Jack; Lopez, Robert E; Salem, Deeb N

    2012-01-01

    Laboratory testing is essential for diagnosis, evaluation, and management. The objective was to describe the types of laboratory events reported in hospitals using a voluntary electronic error-reporting system (e-ERS), via a cross-sectional analysis of reported laboratory events from 30 health care organizations throughout the United States (January 1, 2000, to December 31, 2005). A total of 37,532 laboratory-related events were reported, accounting for 14.1% of all reported quality events. Preanalytic laboratory events were the most common (81.1%); the top 3 were specimen not labeled (18.7%), specimen mislabeled (16.3%), and improper collection (13.2%). A small number (0.08%) of laboratory events caused permanent harm or death; 8% caused temporary harm. Most laboratory events (55%) did not cause harm. Laboratory errors constitute 1 of every 7 quality events. Laboratory errors are often caused by events that precede specimen arrival in the lab and should be preventable with better labeling processes and education. Most laboratory errors do not lead to patient harm.

  2. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    events dating back several decades is analysed. Precipitation thresholds varying in space and time are established using highly resolved INCA data from the Austrian weather service. Parameters possibly controlling the basic susceptibility of catchments are evaluated in a regional GIS analysis (vegetation, geology, topography, stream network, proxies for sediment availability). Similarity measures are then used to group catchments into sensitivity classes. Applying different climate scenarios, the spatiotemporal distribution of catchments sensitive to heavier and more frequent precipitation can be determined, giving valuable advice for planning and managing mountain protection zones.

  3. An analysis of extreme flood events during the past 400 years at Taihu Lake, China

    NASA Astrophysics Data System (ADS)

    Li, Yongfei; Guo, Ya; Yu, Ge

    2013-09-01

    Considerable attention has been paid to the extreme floods affecting Taihu Lake and the lower reaches of the Yangtze River (China), since they have caused serious socio-economic problems past and present. To fully understand these low-probability events, it is necessary to build long time series of flood occurrence using flood-level proxies from archaeology and sedimentology that extend the period of observation beyond that of instrumental data. Using historical stele flood-marker relicts and lacustrine sediment records from Taihu Lake, this paper attempts to reconstruct the historical flood events, to compare the stele-measured flood levels with those recorded by modern gauges, and to identify the extreme flood signals among the flood events of the past 400 years. Results indicate that the lowest lake level of the 15 extreme floods in the stele records for the period 1600-1954 AD was 4.07 m a.s.l., equivalent to the 80th percentile of gauged lake levels during 1921-2004 AD; this comparison provides a quantitative analogue for the floods reconstructed from lake sediments. Flood signals from coarse silt-sand sediments and low-frequency magnetic susceptibility profiles captured 85% of the flood years in the historical period, and five extreme flood years that were missed in the stele flood records were found. Three flood years were identified from the historical documents, in 1766, 1875 and 1882 AD, when lake flood levels were estimated at 4.0-4.1 m, 4.1-4.2 m and 4.13-4.23 m, respectively. Spectral analysis was used to identify return periods from the three time series of the stele flood records, the grain flood index and the magnetic flood index; these analyses indicated some synchronous patterns and showed common return periods of 90-102 years, 60-62 years and 42-44 years. 
To test if the historical extreme floods have statistical relationships with climate variability, a two-variables-conditional test following

  4. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, after statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate for identifying ozone extremes and describing the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, of chemical factors such as cold Arctic vortex ozone losses, and of the major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. The findings described above could also be proven for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle) showing that strong influence of atmospheric

  5. Dynamics of the 1054 UT March 22, 1979, substorm event - CDAW 6. [Coordinated Data Analysis Workshop

    NASA Technical Reports Server (NTRS)

    Mcpherron, R. L.; Manka, R. H.

    1985-01-01

    The Coordinated Data Analysis Workshop (CDAW 6) had the primary objective of tracing the flow of energy from the solar wind through the magnetosphere to its ultimate dissipation in the ionosphere. An essential role in this energy transfer is played by magnetospheric substorms; however, the details are not yet completely understood. The International Magnetospheric Study (IMS) provided an ideal data base for the study conducted by CDAW 6. The present investigation is concerned with the 1054 UT March 22, 1979, substorm event, which had been selected for detailed examination in connection with the studies performed by CDAW 6. The observations of this substorm are discussed, taking into account solar wind conditions, ground magnetic activity on March 22, 1979, observations at synchronous orbit, observations in the near geomagnetic tail, and the onset of the 1054 UT expansion phase. Substorm development and magnetospheric dynamics are discussed on the basis of a synthesis of the observations.

  6. Retrospective Analysis of Communication Events - Understanding the Dynamics of Collaborative Multi-Party Discourse

    SciTech Connect

    Cowell, Andrew J.; Haack, Jereme N.; McColgin, Dave W.

    2006-06-08

    This research is aimed at understanding the dynamics of collaborative multi-party discourse across multiple communication modalities. Before we can truly make significant strides in devising collaborative communication systems, there is a need to understand how typical users utilize computationally supported communications mechanisms such as email, instant messaging, video conferencing, chat rooms, etc., both singularly and in conjunction with traditional means of communication such as face-to-face meetings, telephone calls and postal mail. Attempting to understand an individual's communications profile with access to only a single modality is challenging at best and often futile. Here, we discuss the development of RACE (Retrospective Analysis of Communication Events), a test-bed prototype to investigate issues relating to multi-modal multi-party discourse.

  7. Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer

    SciTech Connect

    Anderson, Johan; Halpern, Federico D.; Ricci, Paolo; Furno, Ivo; Xanthopoulos, Pavlos

    2014-12-15

    The turbulence observed in the scrape-off layer of a tokamak is often characterized by intermittent events of a bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It thus appears necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of the tails of the probability distribution functions. The method followed here is to generate statistical information from time traces of the plasma density stemming from Braginskii-type fluid simulations and to check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.
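
    A standard way to check for an exponential tail in such a time trace is a log-linear fit to the empirical survival function above a high quantile: for P(X > x) ~ exp(-x/s) the log-survival curve is a straight line of slope -1/s. A sketch with synthetic data standing in for the simulation time traces (the estimator and quantile choice are illustrative, not the paper's procedure):

```python
import math
import random

def tail_slope(samples, q=0.8):
    """Log-linear fit to the empirical survival function above the
    q-quantile; for an exponential tail P(X > x) ~ exp(-x / s) the
    returned slope estimates -1/s."""
    xs = sorted(samples)
    n = len(xs)
    start = int(q * n)
    # (value, log empirical survival probability) pairs in the tail
    pts = [(xs[i], math.log((n - i) / n)) for i in range(start, n - 1)]
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return num / den

random.seed(1)
data = [random.expovariate(2.0) for _ in range(20000)]  # exponential, rate 2
slope = tail_slope(data)  # close to -2 for this synthetic sample
```

    A heavier-than-exponential tail (e.g. power law) would show as systematic upward curvature of the log-survival points rather than a straight line.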

  8. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    NASA Astrophysics Data System (ADS)

    Trigo, Ricardo; Varino, Filipa; Ramos, Alexandre; Valente, Maria; Zêzere, José; Vaquero, José; Gouveia, Célia; Russo, Ana

    2014-04-01

    The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the precipitation registered between 28 November and 7 December was so remarkable that the 1876 episode still holds the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of the time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intensely negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper-air fields in order to provide some insight into the evolution of the synoptic conditions in the week prior to the floods. 
These events resulted from the continuous precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by an atmospheric river of tropical moisture over the central Atlantic Ocean.

  9. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    PubMed Central

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
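
    The intersection matrix at the core of the method is simple to sketch: with binary binned spike trains, entry (i, j) counts the neurons active in both bin i and bin j, so a repeated SSE appears as an off-diagonal stripe of high overlap. The toy data below are hypothetical, and the statistical assessment and clustering steps of the full ASSET procedure are omitted:

```python
import numpy as np

def intersection_matrix(binned_spikes):
    """binned_spikes: (n_neurons, n_bins) binary array.
    Entry (i, j) counts neurons active in both bin i and bin j."""
    b = np.asarray(binned_spikes, dtype=int)
    return b.T @ b  # (n_bins, n_bins) overlap counts

# Toy example: a 3-neuron firing sequence occurring in bins 0-2 and again in 4-6
spikes = np.zeros((3, 8), dtype=int)
for rep_start in (0, 4):
    for k in range(3):
        spikes[k, rep_start + k] = 1
M = intersection_matrix(spikes)
# The repeat shows up as the diagonal stripe M[0,4], M[1,5], M[2,6]
```

    In the real method the raw overlap counts are converted to probabilities under a null model before the diagonal structures are extracted.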

  10. Wave Climate and Extreme Events Analysis in the Central Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Morucci, S.; Inghilesi, R.; Orasi, A.; Nardone, G.

    2012-04-01

    Wind wave time series are serially correlated, have variable autocorrelation depending on the geographic position, and exhibit different properties on different time scales. While there is evidence of daily and seasonal periodicity, results for longer time scales are not yet conclusive, given the length of the available series. In fact, since accelerometric buoys came into use in relatively recent times, wave time series are generally no longer than 30 years. In this study, the statistical analysis of more than two decades of wave data, collected at 15 locations in the Central Mediterranean Sea all around the Italian coasts, is presented. Wave recordings have been taken from the archive of the Italian National Wind Wave Measurement Network (RON), run by ISPRA since 1989. An effort has been made to provide a common level of homogeneity and quality control across the series. The statistics considered are mainly the joint frequency functions of significant wave heights with respect to directions, peak periods and mean periods. The distribution of significant wave heights and directions, known as the 'wave climate', is shown in the form of two-entry tables and wind roses. In order to determine the relative importance of the historical storms in terms of return times, and to estimate the expected values of the wave heights over several decades, the Peak Over Threshold method is applied to sets of independent events extracted from each series. Attention has been focused on the determination of independent events by introducing a specific threshold on the autocorrelation function. Even though the series are limited to a 22-year period, the analysis gives valuable information about the spatial distribution of the storms and their variability on a decadal time scale in the Central Mediterranean Sea.
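
    The Peak Over Threshold extraction can be sketched as follows. Here a fixed minimum separation between exceedances stands in for the study's autocorrelation-based independence criterion, and the wave-height numbers are invented:

```python
import numpy as np

def pot_declustered(series, threshold, min_separation):
    """Peak-over-threshold extraction with simple runs declustering:
    exceedances closer than min_separation samples are merged into one
    storm, keeping the cluster maximum. Returns (index, value) pairs."""
    idx = np.flatnonzero(np.asarray(series) > threshold)
    events = []
    cluster = [idx[0]] if idx.size else []
    for i in idx[1:]:
        if i - cluster[-1] <= min_separation:
            cluster.append(i)          # same storm, extend the cluster
        else:
            events.append(max(cluster, key=lambda j: series[j]))
            cluster = [i]              # gap long enough: new storm
    if cluster:
        events.append(max(cluster, key=lambda j: series[j]))
    return [(j, series[j]) for j in events]

# Hypothetical significant-wave-height record (metres)
hs = np.array([1.0, 2.5, 4.2, 3.9, 1.1, 1.0, 5.1, 4.8, 1.2, 3.6])
storms = pot_declustered(hs, threshold=3.0, min_separation=2)
# Two independent storms survive; their peaks would then feed a GPD fit
```

    The declustered peaks are what one would fit with a generalized Pareto distribution to estimate return levels.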

  11. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains.

    PubMed

    Torre, Emiliano; Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz; Grün, Sonja

    2016-07-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734

  12. Uncertainty Analysis for a De-pressurised Loss of Forced Cooling Event of the PBMR Reactor

    SciTech Connect

    Jansen van Rensburg, Pieter A.; Sage, Martin G.

    2006-07-01

    This paper presents an uncertainty analysis for a De-pressurised Loss of Forced Cooling (DLOFC) event, performed with the systems CFD (Computational Fluid Dynamics) code Flownex for the PBMR reactor. The uncertainty analysis was performed to determine the variation in the maximum fuel, core barrel and reactor pressure vessel (RPV) temperatures due to variations in model input parameters. Among the input parameters varied were: thermo-physical properties of helium and the various solid materials, decay heat, neutron and gamma heating, pebble bed pressure loss, pebble bed Nusselt number and pebble bed bypass flows. The Flownex model of the PBMR reactor is a 2-dimensional axisymmetric model, simplified in terms of geometry and some other input values. However, it is believed that the model adequately indicates the effect of changes in certain input parameters on the fuel temperature and other components during a DLOFC event. First, a sensitivity study was performed in which input variables were varied individually according to predefined uncertainty ranges and the results were sorted according to their effect on the maximum fuel temperature. In the sensitivity study, only seven variables had a significant effect on the maximum fuel temperature (greater than 5 deg. C). The most significant are the power distribution profile, decay heat, reflector properties and effective pebble bed conductivity. Second, Monte Carlo analyses were performed in which twenty variables were varied simultaneously within predefined uncertainty ranges. For a one-tailed 95% confidence level, the conservatism that should be added to the best-estimate calculation of the maximum fuel temperature for a DLOFC was determined to be 53 deg. C. This value will probably increase after some model refinements in the future. Flownex was found to be a valuable tool for uncertainty analyses, facilitating both sensitivity studies and Monte Carlo analyses. (authors)
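
    The Monte Carlo step has a simple structure: sample the uncertain inputs within their ranges, evaluate the model, and read the one-tailed 95th percentile off the resulting distribution. The response function and uncertainty ranges below are invented placeholders, not the Flownex model or PBMR data:

```python
import random

def max_fuel_temp(params):
    """Stand-in for the thermal-hydraulic calculation: a toy linear
    response in three normalized uncertain factors (illustrative only)."""
    decay, cond, profile = params
    return 1500.0 + 400.0 * decay - 250.0 * cond + 150.0 * profile

random.seed(42)
best_estimate = max_fuel_temp((0.0, 0.0, 0.0))  # nominal inputs

samples = []
for _ in range(5000):
    # each factor varied uniformly within a +-1 normalized uncertainty range
    params = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
    samples.append(max_fuel_temp(params))

samples.sort()
p95 = samples[int(0.95 * len(samples))]       # one-tailed 95% level
conservatism = p95 - best_estimate            # margin to add to best estimate
```

    With a real systems code each sample is a full transient run, so the sample count is a trade-off against computing cost.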

  13. Analysis and Prediction of West African Moist Events during the Boreal Spring of 2009

    NASA Astrophysics Data System (ADS)

    Mera, Roberto Javier

    Weather and climate in Sahelian West Africa are dominated by two major wind systems, the southwesterly West African Monsoon (WAM) and the northeasterly (Harmattan) trade winds. In addition to the agricultural benefit of the WAM, the public health sector is affected given the relationship between the onset of moisture and end of meningitis outbreaks. Knowledge and prediction of moisture distribution during the boreal spring is vital to the mitigation of meningitis by providing guidance for vaccine dissemination. The goal of the present study is to (a) develop a climatology and conceptual model of the moisture regime during the boreal spring, (b) investigate the role of extra-tropical and Convectively-coupled Equatorial Waves (CCEWs) on the modulation of westward moving synoptic waves and (c) determine the efficacy of a regional model as a tool for predicting moisture variability. Medical reports during 2009, along with continuous meteorological observations at Kano, Nigeria, showed that the advent of high humidity correlated with cessation of the disease. Further analysis of the 2009 boreal spring elucidated the presence of short-term moist events that modulated surface moisture on temporal scales relevant to the health sector. The May moist event (MME) provided insight into interplays among climate anomalies, extra-tropical systems, equatorially trapped waves and westward-propagating synoptic disturbances. The synoptic disturbance initiated 7 May and traveled westward to the coast by 12 May. There was a marked, semi-stationary moist anomaly in the precipitable water field (kg m-2) east of 10°E through late April and early May, that moved westward at the time of the MME. Further inspection revealed a mid-latitude system may have played a role in increasing the latitudinal amplitude of the MME. CCEWs were also found to have an impact on the MME. A coherent Kelvin wave propagated through the region, providing increased monsoonal flow and heightened convection. A

  14. Top-down and bottom-up definitions of human failure events in human reliability analysis

    SciTech Connect

    Boring, Ronald Laurids

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down—defined as a subset of the PRA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  15. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.; Sandia National Labs., Albuquerque, NM )

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6, with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency, with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs.

  16. Frequency analysis and its spatiotemporal characteristics of precipitation extreme events in China during 1951-2010

    NASA Astrophysics Data System (ADS)

    Shao, Yuehong; Wu, Junmei; Ye, Jinyin; Liu, Yonghe

    2015-08-01

    This study investigates the frequency and spatiotemporal characteristics of precipitation extremes based on annual maximum daily precipitation (AMP) data from 753 observation stations in China during the period 1951-2010. Several statistical methods, including L-moments, the Mann-Kendall test (MK test), Student's t test (t test) and analysis of variance (F test), are used to study different statistical properties related to the frequency and spatiotemporal characteristics of precipitation extremes. The results indicate that the AMP series of most sites have no linear trends at the 90% confidence level, but there is a distinctive decreasing trend in the Beijing-Tianjin-Tangshan region. The analysis of abrupt changes shows that there are no significant changes at most sites, and no distinctive regional patterns among the mutation sites either. An important innovation relative to previous studies is that shifts in the mean and the variance are also examined, in order to further analyze changes in strong and weak precipitation extreme events. The shift analysis shows that more attention should be paid to drought in North China and to flood control and drought in South China, especially in those regions that have no clear trend but a significant shift in the variance. More importantly, this study conducts a comprehensive analysis of a complete set of quantile estimates and their spatiotemporal characteristics in China. The spatial distribution of quantile estimates based on the AMP series demonstrates that the values gradually increase from the Northwest to the Southeast with increasing duration and return period, while the rate of increase is smooth in the arid and semiarid regions and rapid in humid regions. Frequency estimates for the 50-year return period are in agreement with the maximum observations of the AMP series at most stations, which can provide a more quantitative and scientific basis for decision making.
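
    Of the methods listed, the Mann-Kendall trend test is easy to sketch: the statistic S counts concordant minus discordant pairs, and (without tie correction) has variance n(n-1)(2n+5)/18 under the no-trend null. A minimal version with a toy series (not the AMP data):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (tie-free form): returns the S statistic
    and the continuity-corrected normal-approximation Z score."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Strictly increasing toy series: every pair is concordant, so S = n(n-1)/2
s_up, z_up = mann_kendall([10, 12, 15, 18, 21, 25])  # S = 15, Z ~ 2.63
```

    A |Z| above 1.645 corresponds to a trend at the two-sided 90% confidence level used in the study; real AMP series also need the tie correction to var_s.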

  17. An analysis of large Forbush decrease events using phase diagrams of view channels of the Nagoya multidirectional muon telescope

    NASA Astrophysics Data System (ADS)

    Kalugin, G.; Kabin, K.

    2015-02-01

    Large Forbush decrease (FD) events are analysed using data recorded by the ground-based Nagoya multidirectional muon telescope in Japan. As part of the analysis we introduce a phase diagram for the channels of the telescope, which provides more robust information about the characteristics of events. Specifically, the slope of the regression line in the phase diagram represents the FD amplitude, which can be computed for different channels. This allows us to analyze the dependence of the FD amplitude on the rigidity of cosmic ray particles. Two models for this dependence are considered, a power law and an exponential, and the former is found to be more suitable for the events considered. In terms of the power-law index and the FD amplitude, the events split into two groups. It is shown that the larger events are characterized by a smaller power-law index than the smaller ones.
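
    The power-law rigidity dependence A(R) = A0 * R^(-gamma) becomes a straight line in log-log space, so it can be fitted by ordinary linear regression. A sketch with invented per-channel values (in the paper the amplitudes come from the phase-diagram regression slopes):

```python
import math

def fit_power_law(rigidities, amplitudes):
    """Least-squares fit of A(R) = A0 * R**(-gamma) in log-log space;
    returns (A0, gamma)."""
    xs = [math.log(r) for r in rigidities]
    ys = [math.log(a) for a in amplitudes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope

# Hypothetical channel amplitudes (%) at hypothetical median rigidities (GV)
R = [60.0, 80.0, 105.0, 140.0]
A = [5.0 * r ** -0.6 for r in R]      # synthetic data with gamma = 0.6
A0, gamma = fit_power_law(R, A)       # recovers A0 = 5.0, gamma = 0.6
```

    An exponential model would instead be linear in a log-linear plot (log A vs R), which is how the two candidate dependences can be compared.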

  18. A Behavior Genetic Analysis of Pleasant Events, Depressive Symptoms, and Their Covariation

    PubMed Central

    Whisman, Mark A.; Johnson, Daniel P.; Rhee, Soo Hyun

    2014-01-01

    Although pleasant events figure prominently in behavioral models of depression, little is known regarding characteristics that may predispose people to engage in pleasant events and derive pleasure from these events. The present study was conducted to evaluate genetic and environmental influences on the experience of pleasant events, depressive symptoms, and their covariation in a sample of 148 twin pairs. A multivariate twin modeling approach was used to examine the genetic and environmental covariance of pleasant events and depressive symptoms. Results indicated that the experience of pleasant events was moderately heritable and that the same genetic factors influence both the experience of pleasant events and depressive symptoms. These findings suggest that genetic factors may give rise to dispositional tendencies to experience both pleasant events and depression. PMID:25506045

  19. Climate change impacts on extreme events in the United States: an uncertainty analysis

    EPA Science Inventory

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  20. Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor

    SciTech Connect

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L. ); Baxter, J.T. ); Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P. ); Brosseau, D.A. )

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-05 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  1. Rain-on-snow Events in Southwestern British Columbia: A Long-term Analysis of Meteorological Conditions and Snowpack Response

    NASA Astrophysics Data System (ADS)

    Trubilowicz, J. W.; Moore, D.

    2015-12-01

    Snowpack dynamics and runoff generation in coastal mountain regions are complicated by rain-on-snow (ROS) events. During major ROS events associated with warm, moist air and strong winds, turbulent heat fluxes can produce substantial melt to supplement rainfall, but previous studies suggest this may not be true for smaller, more frequent events. The internal temperature and water content of the snowpack are also expected to influence runoff generation during ROS events: a cold snowpack with no liquid water content will have the ability to store significant amounts of rainfall, whereas a 'ripe' snowpack may begin to melt and generate outflow with little rain input. However, it is not well understood how antecedent snowpack conditions and energy fluxes differ between ROS events that cause large runoff events and those that do not, in large part because major flood-producing ROS events occur infrequently, and thus are often not sampled during short-term research projects. To generate greater understanding of runoff generation over the spectrum of ROS magnitudes and frequencies, we analyzed data from Automated Snow Pillow (ASP) sites, which record hourly air temperature, precipitation and snowpack water equivalent and offer up to several decades of data at each site. We supplemented the ASP data with output from the North American Regional Reanalysis (NARR) product to support point scale snow modeling for 335 ROS event records from six ASP sites in southwestern BC from 2003 to 2013. Our analysis reconstructed the weather conditions, surface energy exchanges, internal mass and energy states of the snowpack, and generation of snow melt and water available for runoff (WAR) for each ROS event. Results indicate that WAR generation during large events is largely independent of the snowpack conditions, but for smaller events, the antecedent snow conditions play a significant role in either damping or enhancing WAR generation.

  2. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    SciTech Connect

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  3. Loss Modeling with a Data-Driven Approach in Event-Based Rainfall-Runoff Analysis

    NASA Astrophysics Data System (ADS)

    Chua, L. H. C.

    2012-04-01

    is completely impervious and the losses are small. Thus, the good agreement between the ANN and the KW model results demonstrates the applicability of the ANN model in modeling the loss rate. Comparing the modeled runoff with the measured runoff for the Upper Bukit Timah catchment, it was found that the KW model was not able to reproduce the runoff from the catchment accurately owing to the improper prescription of the loss rate. This is because the loss rate varies over a wide range of values in a real catchment, and using the loss rate for an average event did not provide truly representative values for the catchment. Although the same dataset was used to train the ANN model, the ANN model produced hydrographs with significantly higher Nash-Sutcliffe coefficients than the KW model. This analysis demonstrates that the ANN model is better able to model the highly variable loss rate during storm events, especially when the data available for calibration are limited. ACKNOWLEDGEMENT Funding received from the DHI-NTU Water & Environment Research Centre and Education Hub is gratefully acknowledged.
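    The Nash-Sutcliffe coefficient used here to score the ANN and KW hydrographs is a one-line computation; a minimal sketch:

```python
import numpy as np

def nash_sutcliffe(observed, modeled):
    """Nash-Sutcliffe efficiency of a modeled series against observations.

    1 is a perfect fit; values at or below 0 mean the model predicts no
    better than the mean of the observations.
    """
    obs = np.asarray(observed, dtype=float)
    mod = np.asarray(modeled, dtype=float)
    return 1.0 - np.sum((obs - mod) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

    A model that reproduces the observed hydrograph exactly scores 1.0, while a constant prediction equal to the observed mean scores 0.0.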

  4. Retrospective Analysis of Recent Flood Events With Persistent High Surface Runoff From Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Joshi, S.; Hakeem, K. Abdul; Raju, P. V.; Rao, V. V.; Yadav, A.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    /locations with probable flooding conditions. These thresholds were refined through an iterative process by comparing them with satellite-derived flood maps of the 2013 and 2014 monsoon seasons over India. India encountered many cyclonic flood events during Oct-Dec 2013, among which Phailin, Lehar, and Madi were rated very severe cyclonic storms. The path and intensity of these cyclonic events were captured well by the model, and areas were marked with persistent coverage of high runoff risk/flooded area. These thresholds were used to monitor floods in Jammu and Kashmir during 4-5 Sep and in Odisha during 8-9 Aug 2014. The analysis indicated the need to vary the thresholds across space considering terrain and geographical conditions. Accordingly, a sub-basin-wise study was made based on terrain characteristics (slope, elevation) using the ASTER DEM. It was found that basins at higher elevation required higher thresholds than basins at lower elevation. The results show a very promising correlation with the satellite-derived flood maps. Further refinement and optimization of the thresholds, varying them spatially to account for topographic/terrain conditions, would lead to estimation of high-runoff/flood-risk areas for both riverine and drainage-congested areas. Use of weather forecast data (NCMWRF, GEFS/R) would further enhance the scope to develop early warning systems.

  5. The use of geoinformatic data and spatial analysis to predict faecal pollution during extreme precipitation events

    NASA Astrophysics Data System (ADS)

    Ward, Ray; Purnell, Sarah; Ebdon, James; Nnane, Daniel; Taylor, Huw

    2013-04-01

    be a major factor contributing to increased levels of FIO. This study identifies areas within the catchment that are likely to demonstrate elevated erosion rates during extreme precipitation events, which are likely to result in raised levels of FIO. The results also demonstrate that increases in the human faecal marker were associated with the discharge points of wastewater treatment works, and that levels of the marker increased whenever the works discharged untreated wastewaters during extreme precipitation. Spatial analysis also highlighted locations where human faecal pollution was present in areas away from wastewater treatment plants, highlighting the potential significance of inputs from septic tanks and other un-sewered domestic wastewater systems. Increases in the frequency of extreme precipitation events in many parts of Europe are likely to result in increased levels of water pollution from both point- and diffuse-sources, increasing the input of pathogens into surface waters, and elevating the health risks to downstream consumers of abstracted drinking water. This study suggests an approach that integrates water microbiology and geoinformatic data to support a 'prediction and prevention' approach, in place of the traditional focus on water quality monitoring. This work may therefore make a significant contribution to future European water resource management and health protection.

  6. How does leaving home affect marital timing? An event-history analysis of migration and marriage in Nang Rong, Thailand.

    PubMed

    Jampaklay, Aree

    2006-11-01

    This study examines the effects of migration on marital timing in Thailand between 1984 and 2000 using prospective and retrospective survey data from Nang Rong. In contrast to previous results in the literature, event-history analysis of the longitudinal data reveals a positive, not a negative, effect of lagged migration experience on the likelihood of marriage. The findings also indicate gender differences. Migration's positive impact is independent of other life events for women but is completely "explained" by employment for men.

  7. Full Moment Tensor Analysis of Western US Explosions, Earthquakes, Collapses, and Volcanic Events Using a Regional Waveform Inversion

    NASA Astrophysics Data System (ADS)

    Ford, S. R.; Dreger, D. S.; Walter, W. R.

    2006-12-01

    Seismic moment tensor analysis at regional distances commonly involves solving for the deviatoric moment tensor and decomposing it to characterize the tectonic earthquake source. The full seismic moment tensor solution can also recover the isotropic component of the seismic source, which is theoretically dominant in explosions and collapses, and present in volcanic events. Analysis of events with demonstrably significant isotropic energy can aid in understanding the source processes of volcanic and geothermal seismic events and the monitoring of nuclear explosions. Using a regional time-domain waveform inversion for the complete moment tensor we calculate the deviatoric and isotropic source components for several explosions at the Nevada Test Site (NTS) and earthquakes, collapses, and volcanic events in the surrounding region of the NTS (Western US). The events separate into specific populations according to their deviation from a pure double-couple and ratio of isotropic to deviatoric energy. The separation allows for anomalous event identification and discrimination of explosions, earthquakes, and collapses. Analysis of the source principal axes can characterize the regional stress field, and tectonic release due to explosions. Error in the moment tensor solutions and source parameters is also calculated. We investigate the sensitivity of the moment tensor solutions to Green's functions calculated with imperfect Earth models, inaccurate event locations, and data with a low signal-to-noise ratio. We also test the performance of the method under a range of recording conditions from excellent azimuthal coverage to cases of sparse coverage as might be expected for smaller events. This analysis will be used to determine the magnitude range where well-constrained solutions can be obtained.
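    The isotropic/deviatoric separation underlying this discrimination is a trace decomposition of the moment tensor; a sketch with an illustrative tensor, not one of the study's solutions:

```python
import numpy as np

def decompose(m):
    """Split a 3x3 symmetric moment tensor into isotropic + deviatoric parts."""
    m = np.asarray(m, dtype=float)
    iso = np.trace(m) / 3.0 * np.eye(3)   # explosive/implosive component
    dev = m - iso                         # trace-free (tectonic) component
    return iso, dev

# Illustrative tensor with a positive isotropic (explosion-like) component.
m = np.array([[2.0, 0.3, 0.0],
              [0.3, 1.5, 0.1],
              [0.0, 0.1, 1.0]])
iso, dev = decompose(m)
```

    The relative sizes of `iso` and `dev` (e.g. a ratio of their norms) give one simple measure of the isotropic-to-deviatoric partition used to separate explosion, earthquake, and collapse populations.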

  8. SYSTEMS SAFETY ANALYSIS FOR FIRE EVENTS ASSOCIATED WITH THE ECRB CROSS DRIFT

    SciTech Connect

    R. J. Garrett

    2001-12-12

    The purpose of this analysis is to systematically identify and evaluate fire hazards related to the Yucca Mountain Site Characterization Project (YMP) Enhanced Characterization of the Repository Block (ECRB) East-West Cross Drift (commonly referred to as the ECRB Cross-Drift). This analysis builds upon prior Exploratory Studies Facility (ESF) System Safety Analyses and incorporates Topopah Springs (TS) Main Drift fire scenarios and ECRB Cross-Drift fire scenarios. Accident scenarios involving the fires in the Main Drift and the ECRB Cross-Drift were previously evaluated in ''Topopah Springs Main Drift System Safety Analysis'' (CRWMS M&O 1995) and the ''Yucca Mountain Site Characterization Project East-West Drift System Safety Analysis'' (CRWMS M&O 1998). In addition to listing required mitigation/control features, this analysis identifies the potential need for procedures and training as part of defense-in-depth mitigation/control features. The inclusion of this information in the System Safety Analysis (SSA) is intended to assist the organization(s) (e.g., Construction, Environmental Safety and Health, Design) responsible for these aspects of the ECRB Cross-Drift in developing mitigation/control features for fire events, including Emergency Refuge Station(s). This SSA was prepared, in part, in response to Condition/Issue Identification and Reporting/Resolution System (CIRS) item 1966. The SSA is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach is used which incorporates operating experiences and recommendations from vendors, the constructor and the operating contractor. The risk assessment in this analysis characterizes the scenarios associated with fires in terms of relative risk and includes recommendations for mitigating all identified hazards. 
The priority for recommending and implementing mitigation control features is: (1) Incorporate measures

  9. Causal effects of body mass index on cardiometabolic traits and events: a Mendelian randomization analysis.

    PubMed

    Holmes, Michael V; Lange, Leslie A; Palmer, Tom; Lanktree, Matthew B; North, Kari E; Almoguera, Berta; Buxbaum, Sarah; Chandrupatla, Hareesh R; Elbers, Clara C; Guo, Yiran; Hoogeveen, Ron C; Li, Jin; Li, Yun R; Swerdlow, Daniel I; Cushman, Mary; Price, Tom S; Curtis, Sean P; Fornage, Myriam; Hakonarson, Hakon; Patel, Sanjay R; Redline, Susan; Siscovick, David S; Tsai, Michael Y; Wilson, James G; van der Schouw, Yvonne T; FitzGerald, Garret A; Hingorani, Aroon D; Casas, Juan P; de Bakker, Paul I W; Rich, Stephen S; Schadt, Eric E; Asselbergs, Folkert W; Reiner, Alex P; Keating, Brendan J

    2014-02-01

    Elevated body mass index (BMI) associates with cardiometabolic traits on observational analysis, yet the underlying causal relationships remain unclear. We conducted Mendelian randomization analyses by using a genetic score (GS) comprising 14 BMI-associated SNPs from a recent discovery analysis to investigate the causal role of BMI in cardiometabolic traits and events. We used eight population-based cohorts, including 34,538 European-descent individuals (4,407 type 2 diabetes (T2D), 6,073 coronary heart disease (CHD), and 3,813 stroke cases). A 1 kg/m(2) genetically elevated BMI increased fasting glucose (0.18 mmol/l; 95% confidence interval (CI) = 0.12-0.24), fasting insulin (8.5%; 95% CI = 5.9-11.1), interleukin-6 (7.0%; 95% CI = 4.0-10.1), and systolic blood pressure (0.70 mmHg; 95% CI = 0.24-1.16) and reduced high-density lipoprotein cholesterol (-0.02 mmol/l; 95% CI = -0.03 to -0.01) and low-density lipoprotein cholesterol (LDL-C; -0.04 mmol/l; 95% CI = -0.07 to -0.01). Observational and causal estimates were directionally concordant, except for LDL-C. A 1 kg/m(2) genetically elevated BMI increased the odds of T2D (odds ratio [OR] = 1.27; 95% CI = 1.18-1.36) but did not alter risk of CHD (OR 1.01; 95% CI = 0.94-1.08) or stroke (OR = 1.03; 95% CI = 0.95-1.12). A meta-analysis incorporating published studies reporting 27,465 CHD events in 219,423 individuals yielded a pooled OR of 1.04 (95% CI = 0.97-1.12) per 1 kg/m(2) increase in BMI. In conclusion, we identified causal effects of BMI on several cardiometabolic traits; however, whether BMI causally impacts CHD risk requires further evidence. PMID:24462370
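    The instrumental-variable logic of the genetic-score analysis can be reduced to a Wald ratio; the individual betas below are illustrative (chosen so the ratio reproduces the 0.18 mmol/l per kg/m2 fasting-glucose effect quoted above), not values reported by the study:

```python
# Wald ratio: the causal effect of the exposure (BMI) on an outcome is
# estimated as the gene-outcome association divided by the gene-exposure
# association. Both betas are hypothetical illustration values.
beta_gs_bmi = 0.30       # GS -> BMI (kg/m^2 per score unit), hypothetical
beta_gs_glucose = 0.054  # GS -> fasting glucose (mmol/l per score unit), hypothetical

wald_ratio = beta_gs_glucose / beta_gs_bmi  # mmol/l per kg/m^2 of BMI
```

    Because the genetic score is assigned at conception, the ratio is protected from the confounding and reverse causation that affect the observational estimates.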

  10. Spatio-Temporal Information Analysis of Event-Related BOLD Responses

    PubMed Central

    Alpert, Galit Fuhrmann; Handwerker, Dan; Sun, Felice T.; D’Esposito, Mark; Knight, Robert T.

    2009-01-01

    A new approach for analysis of event related fMRI (BOLD) signals is proposed. The technique is based on measures from information theory and is used both for spatial localization of task related activity, as well as for extracting temporal information regarding the task dependent propagation of activation across different brain regions. This approach enables whole brain visualization of voxels (areas) most involved in coding of a specific task condition, the time at which they are most informative about the condition, as well as their average amplitude at that preferred time. The approach does not require prior assumptions about the shape of the hemodynamic response function (HRF), nor about linear relations between BOLD response and presented stimuli (or task conditions). We show that relative delays between different brain regions can also be computed without prior knowledge of the experimental design, suggesting a general method that could be applied for analysis of differential time delays that occur during natural, uncontrolled conditions. Here we analyze BOLD signals recorded during performance of a motor learning task. We show that during motor learning, the BOLD response of unimodal motor cortical areas precedes the response in higher-order multimodal association areas, including posterior parietal cortex. Brain areas found to be associated with reduced activity during motor learning, predominantly in prefrontal brain regions, are informative about the task typically at significantly later times. PMID:17188515
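    The core quantity in this approach is the mutual information between an experimental variable and a voxel's response; a minimal plug-in (histogram) estimator, which is upward-biased for small samples, can be sketched as:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in bits from paired samples.

    Histogram-based sketch of the idea, not the paper's estimator.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal over y
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal over x
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))
```

    When `y` is an exact copy of a uniform 8-level `x`, the estimate equals the full entropy, log2(8) = 3 bits; an uninformative voxel would score near zero (after bias correction).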

  11. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    SciTech Connect

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.

  12. Using Wavelet Analysis To Assist in Identification of Significant Events in Molecular Dynamics Simulations.

    PubMed

    Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E

    2016-07-25

    Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA on well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions, but by changing the maximum time scale of WA a more complete characterization of these motions can be obtained. This allows motions of different time scales to be identified or ignored as desired. PMID:27286268
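    A continuous wavelet transform of the general kind described can be sketched directly in NumPy (Morlet wavelet, direct convolution); this simple 'same'-mode version assumes each wavelet support of 8·s samples fits inside the signal:

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Magnitude of a Morlet continuous wavelet transform, one row per scale.

    Sketch via direct convolution; requires 8 * max(scales) < len(signal)
    so that numpy.convolve(mode="same") keeps the signal's length.
    """
    sig = np.asarray(signal, dtype=float)
    out = np.empty((len(scales), len(sig)))
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        # Complex oscillation under a Gaussian envelope of width ~ s
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        wavelet /= np.sqrt(s)
        out[i] = np.abs(np.convolve(sig, wavelet, mode="same"))
    return out
```

    For a pure sinusoid of 32-sample period, the scale matched to that period (s ≈ w0·32/2π ≈ 30) carries far more power than strongly mismatched scales, which is how locations and time scales of significant motion stand out.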

  13. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

    Geostationary Coastal and Air Pollution Events (GEO-CAPE) is a NASA decadal survey mission designed to provide surface reflectance at high spectral, spatial, and temporal resolutions from a geostationary orbit, as needed for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to global and regional NOx emissions and to improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from it for seven observation scenarios and three instrument systems.

  14. BIRD detection and analysis of high-temperature events: first results

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2003-03-01

    The primary mission objective of the new small Bi-spectral InfraRed Detection (BIRD) satellite, which was put into a 570 km circular sun-synchronous orbit on 22 October 2001, is the detection and quantitative analysis of high-temperature events (HTE) such as fires and volcanoes. A unique feature of the BIRD mid- and thermal-infrared channels is a real-time adjustment of their integration time, which allows HTE observation without sensor saturation while preserving a good radiometric resolution of 0.1-0.2 K for pixels at normal temperatures. This makes it possible (a) to improve false-alarm rejection capability and (b) to estimate HTE temperature, area, and radiative energy release. Owing to its higher spatial resolution, BIRD can detect an order of magnitude smaller HTE than AVHRR and MODIS. The smallest verified fire detected in the BIRD data had an area of ~12 m². The first BIRD HTE detection and analysis results are presented, including bush fires in Australia, forest fires in Russia, coal seam fires in China, and time-varying thermal activity at Etna.

  15. Parametric studies of penetration events : a design and analysis of experiments approach.

    SciTech Connect

    Chiesa, Michael L.; Marin, Esteban B.; Booker, Paul M.

    2005-02-01

    A numerical screening study of the interaction between a penetrator and a geological target with a preformed hole has been carried out to identify the main parameters affecting the penetration event. The planning of the numerical experiment was based on the orthogonal array OA(18,7,3,2), which allows 18 simulation runs with 7 parameters at 3 levels each. The array's strength of 2 also allows for two-factor interaction studies. The seven parameters chosen for this study are: penetrator offset, hole diameter, hole taper, vertical and horizontal velocity of the penetrator, angle of attack of the penetrator, and target material. The analysis of the simulation results has been based on main-effects plots and analysis of variance (ANOVA), and it has been performed using three metrics: the maximum values of the penetration depth, the penetrator deceleration, and the plastic strain in the penetrator case. This screening study shows that target material has a major influence on penetration depth and penetrator deceleration, while penetrator offset has the strongest effect on the maximum plastic strain.
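    Main-effects screening with an orthogonal array amounts to averaging the response at each level of each factor; the sketch below uses a hypothetical 9-run, 2-factor, 3-level design with made-up responses, not the paper's OA(18,7,3,2) array or data:

```python
import numpy as np

# Full 3x3 design for two hypothetical factors; the response is an
# illustrative penetration depth (arbitrary units).
levels = np.array([[a, b] for a in range(3) for b in range(3)])
depth = np.array([1.0, 1.1, 0.9, 1.8, 2.0, 1.7, 2.9, 3.1, 2.8])

# Main effect of a factor: mean response at each of its levels. A wide
# spread between level means signals an influential factor.
effects = {}
for f, name in enumerate(["target material", "penetrator offset"]):
    effects[name] = np.array(
        [depth[levels[:, f] == lvl].mean() for lvl in range(3)]
    )
```

    In this toy data the spread of level means for the first factor dwarfs the second's, the same qualitative pattern the study reports for target material's influence on penetration depth.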

  16. Full genomic analysis of new variant rabbit hemorrhagic disease virus revealed multiple recombination events.

    PubMed

    Lopes, Ana M; Dalton, Kevin P; Magalhães, Maria J; Parra, Francisco; Esteves, Pedro J; Holmes, Edward C; Abrantes, Joana

    2015-06-01

    Rabbit hemorrhagic disease virus (RHDV), a Lagovirus of the family Caliciviridae, causes rabbit hemorrhagic disease (RHD) in the European rabbit (Oryctolagus cuniculus). The disease was first documented in 1984 in China and rapidly spread worldwide. In 2010, a new RHDV variant emerged, tentatively classified as 'RHDVb'. RHDVb is characterized by affecting vaccinated rabbits and those <2 months old, and is genetically distinct (~20 %) from older strains. To determine the evolution of RHDV, including the new variant, we generated 28 full-genome sequences from samples collected between 1994 and 2014. Phylogenetic analysis of the gene encoding the major capsid protein, VP60, indicated that all viruses sampled from 2012 to 2014 were RHDVb. Multiple recombination events were detected in the more recent RHDVb genomes, with a single major breakpoint located in the 5' region of VP60. This breakpoint divides the genome into two regions: one that encodes the non-structural proteins and another that encodes the major and minor structural proteins, VP60 and VP10, respectively. Additional phylogenetic analysis of each region revealed two types of recombinants with distinct genomic backgrounds. Recombinants always include the structural proteins of RHDVb, with non-structural proteins from non-pathogenic lagoviruses or from pathogenic genogroup 1 strains. Our results show that in contrast to the evolutionary history of older RHDV strains, recombination plays an important role in generating diversity in the newly emerged RHDVb.

  17. ANTARES: The Arizona-NOAO Temporal Analysis and Response to Events System

    NASA Astrophysics Data System (ADS)

    Matheson, T.; Saha, A.; Snodgrass, R.; Kececioglu, J.

    The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. The goal is to build the software infrastructure necessary to process and filter alerts produced by time-domain surveys, with the ultimate source of such alerts being the Large Synoptic Survey Telescope (LSST). ANTARES will add value to alerts by annotating them with information from external sources such as previous surveys from across the electromagnetic spectrum. In addition, the temporal history of annotated alerts will provide further annotation for analysis. These alerts will go through a cascade of filters to select interesting candidates. For the prototype, 'interesting' is defined as the rarest or most unusual alert, but future systems will accommodate multiple filtering goals. The system is designed to be flexible, allowing users to access the stream at multiple points throughout the process, and to insert custom filters where necessary. We will describe the basic architecture of ANTARES and the principles that will guide development and implementation.

  18. Life-threatening adverse events following therapeutic opioid administration in adults: Is pharmacogenetic analysis useful?

    PubMed Central

    Madadi, Parvaz; Sistonen, Johanna; Silverman, Gregory; Gladdy, Rebecca; Ross, Colin J; Carleton, Bruce C; Carvalho, Jose C; Hayden, Michael R; Koren, Gideon

    2013-01-01

    BACKGROUND: Systemic approaches are needed to understand how variations in the genes associated with opioid pharmacokinetics and response can be used to predict patient outcome. The application of pharmacogenetic analysis to two cases of life-threatening opioid-induced respiratory depression is presented. The usefulness of genotyping in the context of these cases is discussed. METHODS: A panel of 20 functional candidate polymorphisms in genes involved in the opioid biotransformation pathway (CYP2D6, UGT2B7, ABCB1, OPRM1, COMT) were genotyped in these two patients using commercially available genotyping assays. RESULTS: In case 1, the patient experienced adverse outcomes when administered codeine and morphine, but not hydromorphone. Genetic test results suggested that this differential response may be due to an inherent propensity to generate active metabolites from both codeine and morphine. These active metabolites are not generated with hydromorphone. In case 2, the patient experienced severe respiratory depression during postoperative recovery following standard doses of morphine. The patient was found to carry genetic variations that result in decreased morphine efflux transporter activity at the blood-brain barrier and increased sensitivity to opioids. CONCLUSIONS: Knowledge of the relative contribution of pharmacogenetic biomarkers and their influence on opioid response are continually evolving. Pharmacogenetic analysis, together with clinical history, has the potential to provide mechanistic insight into severe respiratory depressive events in patients who receive opioids at therapeutic doses. PMID:23748253

  19. Regularized Deterministic Annealing Hidden Markov Models for Identification and Analysis of Seismic and Aseismic Events.

    NASA Astrophysics Data System (ADS)

    Granat, R. A.; Clayton, R.; Kedar, S.; Kaneko, Y.

    2003-12-01

    We employ a robust hidden Markov model (HMM) based technique to perform statistical pattern analysis of suspected seismic and aseismic events in the poorly explored period band of minutes to hours. The technique allows us to classify known events and provides a statistical basis for finding and cataloging similar events represented elsewhere in the observations. In this work, we focus on data collected by the Southern California TriNet system. The hidden Markov model (HMM) approach assumes that the observed data has been generated by an unobservable dynamical statistical process. The process is of a particular form such that each observation is coincident with the system being in a particular discrete state. The dynamics of the model are constructed so that the next state is directly dependent only on the current state -- it is a first order Markov process. The model is completely described by a set of parameters: the initial state probabilities, the first order Markov chain state-to-state transition probabilities, and the probability distribution of observable outputs associated with each state. Application of the model to data involves optimizing these model parameters with respect to some function of the observations, typically the likelihood of the observations given the model. Our work focused on the fact that this objective function has a number of local maxima that is exponential in the model size (the number of states). This means that not only is it very difficult to discover the global maximum, but also that results can vary widely between applications of the model. For some domains which employ HMMs for such purposes, such as speech processing, sufficient a priori information about the system is available to avoid this problem. However, for seismic data in general such a priori information is not available. Our approach involves analytical location of sub-optimal local maxima; once the locations of these maxima have been found, then we can employ a
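The parameter set named in the abstract (initial state probabilities, transition probabilities, per-state output distributions) and the likelihood objective can be made concrete with a toy forward-algorithm sketch. The parameters below are illustrative only, not the study's seismic models:

```python
import numpy as np

# Toy first-order HMM with 2 hidden states and 3 discrete observation symbols
# (illustrative parameters only, not the study's seismic models).
pi = np.array([0.6, 0.4])            # initial state probabilities
A = np.array([[0.7, 0.3],            # state-to-state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],       # per-state output (emission) distributions
              [0.1, 0.3, 0.6]])

def log_likelihood(obs):
    """Forward algorithm: log P(observations | model), rescaled for stability.
    This is the objective whose local maxima are at issue in the abstract."""
    alpha = pi * B[:, obs[0]]
    log_l = 0.0
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # propagate one step, weight by emission
        s = alpha.sum()
        log_l += np.log(s)               # accumulate the scaling factors
        alpha /= s
    return log_l + np.log(alpha.sum())

print(log_likelihood([0, 1, 2, 2, 1]))
```

Training (e.g., Baum-Welch) climbs this likelihood surface, which is where the exponentially many local maxima the authors discuss come from.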

  20. Analysis of Scaling Parameters of Event Magnitudes by Fluid Injections in Reservoirs

    NASA Astrophysics Data System (ADS)

    Dinske, Carsten; Krüger, Oliver; Shapiro, Serge

    2014-05-01

    We continue to elaborate on the scaling parameters of observed frequency-magnitude distributions of injection-induced seismicity. In addition to pumped fluid mass, b-value and seismogenic index (Shapiro et al., 2010, Dinske and Shapiro, 2013), one more scaling was recognised by the analysis of the induced event magnitudes. A frequently observed under-representation of events with larger magnitudes in comparison with the Gutenberg-Richter relation is explained by the geometry and the dimensions of the hydraulically stimulated rock volume (Shapiro et al., 2011, 2013). This under-representation, however, introduces a bias in b-value estimations, which then should be considered as an apparent and transient b-value depending on the size of the perturbed rock volume. We study in detail in which way the seismogenic index estimate is affected by the apparent b-value. For this purpose, we compare b-value and seismogenic index estimates using two different approaches. First, we perform standard Gutenberg-Richter power-law fitting and second, we apply frequency-magnitude lower bound probability fitting as proposed by Shapiro et al. (2013). The latter takes into account the finite size of the perturbed rock volume. Our results reveal that the smaller the perturbed rock volume, the larger the deviations between the two sets of derived parameters. This means that the magnitude statistics of the induced events are most affected for low injection volumes and/or short injection times. At sufficiently large stimulated volumes both fitting approaches provide comparable b-value and seismogenic index estimates. In particular, the b-value is then in the range of b-values universally obtained for tectonic earthquakes (i.e., 0.8 - 1.2). Based on our findings, we introduce the specific magnitude which is a seismotectonic characteristic for a reservoir location. Defined as the ratio of seismogenic index and b-value, the specific magnitude is found to be a magnitude scaling parameter which is
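For reference, the b-value entering the Gutenberg-Richter relation log10 N(≥M) = a - bM is commonly estimated by maximum likelihood. The sketch below is the generic Aki/Utsu estimator applied to a synthetic catalogue, not the authors' finite-volume lower-bound fitting:

```python
import numpy as np

def b_value_mle(mags, m_c, dm=0.1):
    """Aki maximum-likelihood b-value above completeness magnitude m_c.
    dm is the catalogue's magnitude bin width (Utsu correction; 0 if unbinned)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]                       # use only the complete part of the catalogue
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalogue with true b = 1.0:
# magnitudes above m_c are exponentially distributed with scale log10(e)/b.
rng = np.random.default_rng(42)
mags = 2.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=20000)
b_hat = b_value_mle(mags, m_c=2.0, dm=0.0)
print(f"estimated b-value: {b_hat:.2f}")
```

An under-representation of large magnitudes of the kind the abstract describes raises the sample mean deficit and thus biases such an estimate upward, which is the "apparent b-value" effect being studied.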

  1. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and those whose times are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
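A piece-wise constant hazard of the kind used here maps to a survival probability through the cumulative hazard, S(t) = exp(-Σ_j λ_j Δ_j), where Δ_j is the time spent at risk in interval j. A minimal sketch with made-up interval rates (the paper estimates such rates within its Bayesian model):

```python
import numpy as np

# Piece-wise constant hazard: rate lam[j] applies on [cuts[j], cuts[j+1]).
# Rates below are made up for illustration; the paper estimates them in a
# Bayesian model for plover chick mortality.
cuts = np.array([0.0, 5.0, 15.0, np.inf])   # age intervals in days
lam = np.array([0.10, 0.04, 0.01])          # daily mortality hazard per interval

def survival(t):
    """S(t) = exp(-H(t)), with H(t) the hazard integrated up to time t."""
    exposure = np.clip(t - cuts[:-1], 0.0, None)     # time at risk in each interval...
    exposure = np.minimum(exposure, np.diff(cuts))   # ...capped at the interval width
    return float(np.exp(-(lam * exposure).sum()))

print(survival(10.0))   # survival probability to day 10
```

With the highest rate on the first interval, survival drops fastest before day 5, mirroring the abstract's finding that hazard is highest for chicks under 5 days old.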

  2. 'HESPERIA' HORIZON 2020 project: High Energy Solar Particle Events foRecastIng and Analysis

    NASA Astrophysics Data System (ADS)

    Malandraki, Olga; Klein, Karl-Ludwig; Vainio, Rami; Agueda, Neus; Nunez, Marlon; Heber, Bernd; Buetikofer, Rolf; Sarlanis, Christos; Crosby, Norma; Bindi, Veronica; Murphy, Ronald; Tyka, Allan J.; Rodriguez, Juan

    2016-04-01

    Solar energetic particles (SEPs) are of prime interest for fundamental astrophysics. However, due to their high energies they are a space weather concern for technology in space as well as for human space exploration, calling for reliable tools with predictive capabilities. The two-year EU HORIZON 2020 project HESPERIA (High Energy Solar Particle Events foRecastIng and Analysis, http://www.hesperia-space.eu/) will produce two novel operational SEP forecasting tools based upon proven concepts (UMASEP, REleASE). At the same time, the project will advance our understanding of the physical mechanisms that result in high-energy SEP events through the systematic exploitation of the high-energy gamma-ray observations of the FERMI mission and other novel published datasets (PAMELA, AMS), together with in situ SEP measurements near 1 AU. By using multi-frequency observations and performing simulations, the project will address the chain of processes from particle acceleration in the corona, through particle transport in the magnetically complex corona and interplanetary space, to detection near 1 AU. Furthermore, HESPERIA will explore the possibility of incorporating the derived results into future innovative space weather services. Publicly available software to invert neutron monitor observations of relativistic SEPs to physical parameters, giving information on the high-energy processes occurring at or near the Sun during solar eruptions, will be provided for the first time. The results of this inversion software will complement the space-borne measurements at adjacent higher energies. In order to achieve these goals, HESPERIA will exploit already existing large datasets that are stored in databases built under the EU FP7 projects NMDB and SEPServer. The structure of the HESPERIA project, its main objectives and forecasting operational tools, as well as the added value to SEP research will be presented and discussed. Acknowledgement: This project has received funding from the

  3. Integrated survival analysis using an event-time approach in a Bayesian framework

    PubMed Central

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  4. Integrated survival analysis using an event-time approach in a Bayesian framework.

    PubMed

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  5. Metamizole-Associated Adverse Events: A Systematic Review and Meta-Analysis

    PubMed Central

    Fässler, Margrit; Blozik, Eva; Linde, Klaus; Jüni, Peter; Reichenbach, Stephan; Scherer, Martin

    2015-01-01

    Background Metamizole is used to treat pain in many parts of the world. Information on the safety profile of metamizole is scarce; no conclusive summary of the literature exists. Objective To determine whether metamizole is clinically safe compared to placebo and other analgesics. Methods We searched CENTRAL, MEDLINE, EMBASE, CINAHL, and several clinical trial registries. We screened the reference lists of included trials and previous systematic reviews. We included randomized controlled trials that compared the effects of metamizole, administered to adults in any form and for any indication, to other analgesics or to placebo. Two authors extracted data regarding trial design and size, indications for pain medication, patient characteristics, treatment regimens, and methodological characteristics. Adverse events (AEs), serious adverse events (SAEs), and dropouts were assessed. We conducted separate meta-analyses for each metamizole comparator, using standard inverse-variance random effects meta-analysis to pool the estimates across trials, reported as risk ratios (RRs). We calculated the DerSimonian and Laird variance estimate τ2 to measure heterogeneity between trials. The pre-specified primary end point was any AE during the trial period. Results Of the 696 potentially eligible trials, 79 trials including almost 4000 patients with short-term metamizole use of less than two weeks met our inclusion criteria. Fewer AEs were reported for metamizole compared to opioids, RR = 0.79 (confidence interval 0.79 to 0.96). We found no differences between metamizole and placebo, paracetamol and NSAIDs. Only a few SAEs were reported, with no difference between metamizole and other analgesics. No agranulocytosis or deaths were reported. Our results were limited by the mediocre overall quality of the reports. Conclusion For short-term use in the hospital setting, metamizole seems to be a safe choice when compared to other widely used analgesics. High-quality, adequately sized
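The inverse-variance random-effects pooling named in the methods can be sketched generically. The trial data below are hypothetical, and the function is a textbook DerSimonian-Laird estimator on the log risk-ratio scale, not the review's actual computation:

```python
import numpy as np

def dersimonian_laird(log_rr, var):
    """Pool per-trial log risk ratios with DerSimonian-Laird random-effects weights."""
    y, v = np.asarray(log_rr, float), np.asarray(var, float)
    w = 1.0 / v                                   # inverse-variance (fixed-effect) weights
    mu_fe = (w * y).sum() / w.sum()               # fixed-effect pooled mean
    q = (w * (y - mu_fe) ** 2).sum()              # Cochran's Q heterogeneity statistic
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    mu = (w_re * y).sum() / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

# Three hypothetical trials (log risk ratio, within-trial variance):
rr, lo, hi = dersimonian_laird([-0.6, 0.1, -0.3], [0.01, 0.01, 0.02])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Pooling on the log scale keeps the ratio estimates approximately normal; exponentiating at the end returns the pooled RR and its confidence interval.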

  6. Evidence for potential and inductive convection during intense geomagnetic events using normalized superposed epoch analysis

    NASA Astrophysics Data System (ADS)

    Katus, Roxanne M.; Liemohn, Michael W.; Gallagher, Dennis L.; Ridley, Aaron; Zou, Shasha

    2013-01-01

    Abstract<p label="1">The relative contribution of storm-time ring current development by convection driven by either potential or inductive electric fields has remained an unresolved question in geospace research. Studies have been published supporting each side of this debate, including views that ring current buildup is entirely one or the other. This study presents new insights into the relative roles of these storm main phase processes. We perform a superposed epoch study of 97 intense (DstMin < -100 nT) and 91 moderate (-50 nT > DstMin > -100 nT) storms using OMNI solar wind and ground-based data. Instead of using a single reference time for the superpositioning of the <span class="hlt">events</span>, we choose four reference times and expand or contract each phase of every <span class="hlt">event</span> to the average length of this phase, creating a normalized timeline for the superposed epoch <span class="hlt">analysis</span>. Using the bootstrap method, we statistically demonstrate that timeline normalization results in better reproduction of average storm dynamics than conventional methods. Examination of the Dst reveals an inflection point in the intense storm group consistent with two-step main phase development, which is supported by results for the southward interplanetary magnetic field and various ground-based magnetic indices. This two-step main-phase process is not seen in the moderate storm timeline and data sets. It is determined that the first step of Dst development is due to potential convective drift, during which an initial ring current is formed. The negative feedback of this hot ion population begins to limit further ring current growth. The second step of the main phase, however, is found to be a more even mix of potential and inductive convection. 
It is hypothesized that this is necessary to achieve intense storm Dst levels because the substorm dipolarizations are effective at breaking through the negative feedback barrier of the existing inner magnetospheric hot ion pressure peak.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2015AGUFM.A23D0344J&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2015AGUFM.A23D0344J&link_type=ABSTRACT"><span id="translatedtitle">Momentum Budget <span class="hlt">Analysis</span> of Westerly Wind <span class="hlt">Events</span> Associated with the Madden-Julian Oscillation during DYNAMO</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Jiang, X.; Oh, J. H.; Waliser, D. E.; Moncrieff, M. W.; Johnson, R. H.; Ciesielski, P. E.</p> <p>2015-12-01</p> <p>The Dynamics of the Madden-Julian Oscillation (DYNAMO) field campaign was conducted over the Indian Ocean (IO) from October 2011 to February 2012 to investigate the initiation of the Madden-Julian Oscillation (MJO). Three MJOs accompanying one or more westerly wind <span class="hlt">events</span> (WWEs) occurred in late October, late November, and late December 2011, respectively. Momentum budget <span class="hlt">analysis</span> is conducted in this study to understand the contributions of the dynamical processes involved in the wind evolution associated with the MJO active phases over the IO during DYNAMO using European Centre for Medium-Range Weather Forecasts (ECMWF) <span class="hlt">analysis</span>. This <span class="hlt">analysis</span> shows that westerly acceleration at lower levels associated with the MJO active phase generally appears to be maintained by the pressure gradient force (PGF), which is partly canceled by meridional advection of the zonal wind. 
Westerly acceleration in the mid-troposphere is mostly attributable to vertical advection. In addition, the MJO in late November (MJO2), accompanied by two different WWEs (WWE1, WWE2) spaced a few days apart, is further diagnosed. Unlike other WWEs during DYNAMO, horizontal advection is more responsible for the westerly acceleration in the lower troposphere for the WWE2 than the PGF. Interactions between the MJO2 convective envelope and convectively coupled waves (CCWs) have been further analyzed to illuminate the dynamical contribution of these synoptic scale equatorial waves to the WWEs during MJO2. We suggest that differences in the developing processes among WWEs can be attributed to the different types of CCWs.The Dynamics of the Madden-Julian Oscillation (DYNAMO) field campaign was conducted over the Indian Ocean (IO) from October 2011 to February 2012 to investigate the initiation of the Madden-Julian Oscillation (MJO). Three MJOs accompanying one or more westerly wind <span class="hlt">events</span> (WWEs) occurred in late October, late November, and late December 2011, respectively. Momentum budget <span class="hlt">analysis</span> is</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015PhDT........60V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015PhDT........60V"><span id="translatedtitle">An <span class="hlt">analysis</span> of high-impact, low-predictive skill severe weather <span class="hlt">events</span> in the northeast U.S</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Vaughan, Matthew T.</p> <p></p> <p>An objective evaluation of Storm Prediction Center slight risk convective outlooks, as well as a method to identify high-impact severe weather <span class="hlt">events</span> with poor-predictive skill are presented in this study. 
The objectives are to assess severe weather forecast skill over the northeast U.S. relative to the continental U.S., build a climatology of high-impact, low-predictive skill <span class="hlt">events</span> between 1980--2013, and investigate the dynamic and thermodynamic differences between severe weather <span class="hlt">events</span> with low-predictive skill and high-predictive skill over the northeast U.S. Severe storm reports of hail, wind, and tornadoes are used to calculate skill scores including probability of detection (POD), false alarm ratio (FAR) and threat scores (TS) for each convective outlook. Low predictive skill <span class="hlt">events</span> are binned into low POD (type 1) and high FAR (type 2) categories to assess temporal variability of low-predictive skill <span class="hlt">events</span>. Type 1 <span class="hlt">events</span> were found to occur in every year of the dataset with an average of 6 <span class="hlt">events</span> per year. Type 2 <span class="hlt">events</span> occur less frequently and are more common in the earlier half of the study period. An <span class="hlt">event</span>-centered composite <span class="hlt">analysis</span> is performed on the low-predictive skill database using the National Centers for Environmental Prediction Climate Forecast System Reanalysis 0.5° gridded dataset to analyze the dynamic and thermodynamic conditions prior to high-impact severe weather <span class="hlt">events</span> with varying predictive skill. Deep-layer vertical shear between 1000--500 hPa is found to be a significant discriminator in slight risk forecast skill where high-impact <span class="hlt">events</span> with less than 31-kt shear have lower threat scores than high-impact <span class="hlt">events</span> with higher shear values. 
Case study <span class="hlt">analysis</span> of type 1 <span class="hlt">events</span> suggests the environment over which severe weather occurs is characterized by high downdraft convective available potential energy, steep low-level lapse rates, and high lifting condensation level heights that contribute to an elevated risk of severe wind.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25188753','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25188753"><span id="translatedtitle">Conceptualizing the impact of special <span class="hlt">events</span> on community health service levels: an operational <span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Lund, Adam; Turris, Sheila A; Bowles, Ron</p> <p>2014-10-01</p> <p>Mass gatherings (MG) impact their host and surrounding communities and with inadequate planning, may impair baseline emergency health services. Mass gatherings do not occur in a vacuum; they have both consumptive and disruptive effects that extend beyond the <span class="hlt">event</span> itself. Mass gatherings occur in real geographic locations that include not only the <span class="hlt">event</span> site, but also the surrounding neighborhoods and communities. In addition, the impact of small, medium, or large special <span class="hlt">events</span> may be felt for days, or even months, prior to and following the actual <span class="hlt">events</span>. Current MG reports tend to focus on the <span class="hlt">events</span> themselves during published <span class="hlt">event</span> dates and may underestimate the full impact of a given MG on its host community. In order to account for, and mitigate, the full effects of MGs on community health services, researchers would benefit from a common model of community impact. 
Using an operations lens, two concepts are presented, the "vortex" and the "ripple," as metaphors and a theoretical model for exploring the broader impact of MGs on host communities. Special <span class="hlt">events</span> and MGs impact host communities by drawing upon resources (vortex) and by disrupting normal, baseline services (ripple). These effects are felt with diminishing impact as one moves geographically further from the <span class="hlt">event</span> center, and can be felt before, during, and after the <span class="hlt">event</span> dates. Well executed medical and safety plans for <span class="hlt">events</span> with appropriate, comprehensive risk assessments and stakeholder engagement have the best chance of ameliorating the potential negative impact of MGs on communities.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://files.eric.ed.gov/fulltext/EJ993257.pdf','ERIC'); return false;" href="http://files.eric.ed.gov/fulltext/EJ993257.pdf"><span id="translatedtitle">An Internal Evaluation of the National FFA Agricultural Mechanics Career Development <span class="hlt">Event</span> through <span class="hlt">Analysis</span> of Individual and Team Scores from 1996-2006</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Franklin, Edward A.; Armbruster, James</p> <p>2012-01-01</p> <p>The purpose of this study was to conduct an internal evaluation of the National FFA Agricultural Mechanics Career Development <span class="hlt">Event</span> (CDE) through <span class="hlt">analysis</span> of individual and team scores from 1996-2006. Data were analyzed by overall and sub-<span class="hlt">event</span> areas scores for individual contestants and team <span class="hlt">event</span>. 
To facilitate the <span class="hlt">analysis</span> process scores were…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/11241','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/11241"><span id="translatedtitle"><span class="hlt">Analysis</span> of the effects of corrosion probe on riser 241-AN-102-WST-16 during seismic <span class="hlt">event</span></span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>ZIADA, H.H.</p> <p>1998-11-06</p> <p>This <span class="hlt">analysis</span> supports the installation activity of the corrosion probe in Tank 241-AN-102. The probe is scheduled to be installed in Riser 241-AN-102-WST-16 (formerly known as Riser 15B). The purpose of this <span class="hlt">analysis</span> is to evaluate the potential effect of the corrosion probe on the riser during a credible seismic <span class="hlt">event</span>. The previous <span class="hlt">analysis</span> (HNF 1997a) considered only pump jet impingement loading.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17.5496R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17.5496R"><span id="translatedtitle">Blind Source Separation of Seismic <span class="hlt">Events</span> with Independent Component <span class="hlt">Analysis</span>: CTBT related exercise</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rozhkov, Mikhail; Kitov, Ivan</p> <p>2015-04-01</p> <p>Blind Source Separation (BSS) methods used in signal recovery applications are attractive for they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used in certain extent in CTBT applications that can be attributed to the given branch of technology. 
However Expert Technical <span class="hlt">Analysis</span> (ETA) conducted in CTBTO to improve the estimated values for the standard signal and <span class="hlt">event</span> parameters according to the Protocol to the CTBT may face problems which cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem to be considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of seconds, and (2) extraction of explosion signals merged with wavetrains from strong earthquake. The importance of resolving the problem related to case 1 is connected with the correct explosion yield estimation. Case 2 is a well-known scenario of conducting clandestine nuclear tests. While the first case can be approached somehow with the means of cepstral methods, the second case can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component <span class="hlt">Analysis</span> (in its FastICA implementation) implying non-Gaussianity of the underlying processes signal's mixture is a blind source separation method that we apply to resolve the mentioned above problems. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within East-European platform as well as with signals from strong teleseismic <span class="hlt">events</span> (Sumatra, April 2012 Mw=8.6, and Tohoku, March 2011 Mw=9.0 earthquakes). 
The data was recorded by seismic arrays of the</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/26148993','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/26148993"><span id="translatedtitle"><span class="hlt">Analysis</span> of interval-censored recurrent <span class="hlt">event</span> processes subject to resolution.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Shen, Hua; Cook, Richard J</p> <p>2015-09-01</p> <p>Interval-censored recurrent <span class="hlt">event</span> data arise when the <span class="hlt">event</span> of interest is not readily observed but the cumulative <span class="hlt">event</span> count can be recorded at periodic assessment times. In some settings, chronic disease processes may resolve, and individuals will cease to be at risk of <span class="hlt">events</span> at the time of disease resolution. We develop an expectation-maximization algorithm for fitting a dynamic mover-stayer model to interval-censored recurrent <span class="hlt">event</span> data under a Markov model with a piecewise-constant baseline rate function given a latent process. The model is motivated by settings in which the <span class="hlt">event</span> times and the resolution time of the disease process are unobserved. The likelihood and algorithm are shown to yield estimators with small empirical bias in simulation studies. Data are analyzed on the cumulative number of damaged joints in patients with psoriatic arthritis where individuals experience disease remission. 
PMID:26148993</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4111818','PMC'); return false;" href="http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4111818"><span id="translatedtitle">Incidence and pattern of 12 years of reported transfusion adverse <span class="hlt">events</span> in Zimbabwe: a retrospective <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Mafirakureva, Nyashadzaishe; Khoza, Star; Mvere, David A.; Chitiyo, McLeod E.; Postma, Maarten J.; van Hulst, Marinus</p> <p>2014-01-01</p> <p>Background Haemovigilance hinges on a systematically structured reporting system, which unfortunately does not always exist in resource-limited settings. We determined the incidence and pattern of transfusion-related adverse <span class="hlt">events</span> reported to the National Blood Service Zimbabwe. Materials and methods A retrospective review of the transfusion-<span class="hlt">event</span> records of the National Blood Service Zimbabwe was conducted covering the period from 1 January 1999 to 31 December 2011. All transfusion-related <span class="hlt">event</span> reports received during the period were analysed. Results A total of 308 transfusion adverse <span class="hlt">events</span> (0.046%) were reported for 670,625 blood components distributed. The majority (61.6%) of the patients who experienced an adverse <span class="hlt">event</span> were female. The median age was 36 years (range, 1–89 years). The majority (68.8%) of the adverse <span class="hlt">events</span> were acute transfusion reactions consisting of febrile non-haemolytic transfusion reactions (58.5%), minor allergies (31.6%), haemolytic reactions (5.2%), severe allergic reactions (2.4%), anaphylaxis (1.4%) and hypotension (0.9%). 
Two-thirds (66.6%) of the adverse <span class="hlt">events</span> occurred following administration of whole blood, although only 10.6% of the blood was distributed as whole blood. Packed cells, which accounted for 75% of blood components distributed, were associated with 20.1% of the <span class="hlt">events</span>. Discussion The incidence of suspected transfusion adverse <span class="hlt">events</span> was generally lower than the incidences reported globally in countries with well-established haemovigilance systems. The administration of whole blood was disproportionately associated with transfusion adverse <span class="hlt">events</span>. The pattern of the transfusion adverse <span class="hlt">events</span> reported here highlights the probable differences in practice between different settings. Under-reporting of transfusion <span class="hlt">events</span> is rife in passive reporting systems. PMID:24887217</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016BlgAJ..25...53N','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016BlgAJ..25...53N"><span id="translatedtitle">The <span class="hlt">analysis</span> of the <span class="hlt">events</span> of stellar visibility in Pliny's "Natural History"</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Nickiforov, M. G.</p> <p>2016-07-01</p> <p>Book XVIII of Pliny's "Natural History" contains about a hundred descriptions of the <span class="hlt">events</span> of stellar visibility, which were used for the needs of the agricultural calendar. The comparison between the calculated date of each <span class="hlt">event</span> and the date given by Pliny shows that actual <span class="hlt">events</span> of stellar visibility occurred systematically about 10 days later than the specified time. 
This discrepancy cannot be explained by errors of the calendar.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/27294990','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/27294990"><span id="translatedtitle">Dealing With Major Life <span class="hlt">Events</span> and Transitions: A Systematic Literature Review on and Occupational <span class="hlt">Analysis</span> of Spirituality.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Maley, Christine M; Pagana, Nicole K; Velenger, Christa A; Humbert, Tamera Keiter</p> <p>2016-01-01</p> <p>This systematic literature review analyzed the construct of spirituality as perceived by people who have experienced or are experiencing a major life <span class="hlt">event</span> or transition. The researchers investigated studies that used narrative <span class="hlt">analysis</span> or a phenomenological methodology related to the topic. Thematic <span class="hlt">analysis</span> resulted in three major themes: (1) avenues to and through spirituality, (2) the experience of spirituality, and (3) the meaning of spirituality. The results provide insights into the intersection of spirituality, meaning, and occupational engagement as understood by people experiencing a major life <span class="hlt">event</span> or transition and suggest further research that addresses spirituality in occupational therapy and interdisciplinary intervention. 
PMID:27294990</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.6683C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.6683C"><span id="translatedtitle"><span class="hlt">Analysis</span> of post-blasting source mechanisms of mining-induced seismic <span class="hlt">events</span> in Rudna copper mine, Poland.</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Caputa, Alicja; Rudzinski, Lukasz; Talaga, Adam</p> <p>2016-04-01</p> <p>Copper ore exploitation in the Lower Silesian Copper District, Poland (LSCD), is connected with many specific hazards. The most hazardous are induced seismicity and the rockbursts that follow strong mining seismic <span class="hlt">events</span>. One of the most effective methods of reducing seismic activity is blasting in potentially hazardous mining panels. In this way, small to moderate tremors are provoked and stress accumulation is substantially reduced. This work presents an <span class="hlt">analysis</span> of post-blasting <span class="hlt">events</span> at the Rudna mine, Poland, using Full Moment Tensor (MT) inversion applied to a signal dataset recorded by the underground seismic network. We show that focal mechanisms for <span class="hlt">events</span> that occurred after blasts exhibit common features in the MT solution. The strong isotropic and small Double Couple (DC) components of the MT indicate that these <span class="hlt">events</span> were provoked by detonations. On the other hand, the post-blasting MT is considerably different from the MT obtained for common strong mining <span class="hlt">events</span>. 
We believe that seismological <span class="hlt">analysis</span> of provoked and unprovoked <span class="hlt">events</span> can be a very useful tool in confirming the effectiveness of blasting in seismic hazard reduction in mining areas.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25746390','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25746390"><span id="translatedtitle">Prediction of clinical risks by <span class="hlt">analysis</span> of preclinical and clinical adverse <span class="hlt">events</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Clark, Matthew</p> <p>2015-04-01</p> <p>This study examines the ability of nonclinical adverse <span class="hlt">event</span> observations to predict human clinical adverse <span class="hlt">events</span> observed in drug development programs. In addition, it examines the relationship of nonclinical and clinical adverse <span class="hlt">event</span> observations to drug withdrawal and proposes a model to predict drug withdrawal based on these observations. These analyses provide risk assessments useful both for planning patient safety programs and as a statistical framework for assessing the future success of drug programs based on nonclinical and clinical observations. Bayesian analyses were undertaken to investigate the connection between nonclinical adverse <span class="hlt">event</span> observations and observations of that same <span class="hlt">event</span> in clinical trials for a large set of approved drugs. We employed the same statistical methods used to evaluate the efficacy of diagnostic tests to evaluate the ability of nonclinical studies to predict adverse <span class="hlt">events</span> in clinical studies, and adverse <span class="hlt">events</span> in both to predict drug withdrawal. 
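The diagnostic-test framing described above can be sketched as follows. The 2×2 counts and the prior risk below are hypothetical numbers chosen for illustration, not figures from the study:

```python
# Illustrative sketch: treating a nonclinical finding as a "diagnostic
# test" for the same adverse event appearing in clinical studies.
# All counts and the prior are hypothetical.

def diagnostic_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)        # P(preclinical finding | clinical AE occurs)
    spec = tn / (tn + fp)        # P(no preclinical finding | no clinical AE)
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    return sens, spec, lr_pos

def posterior_risk(prior, lr):
    """Bayes update on the odds scale: posterior odds = prior odds * LR."""
    odds = prior / (1 - prior) * lr
    return odds / (1 + odds)

# Hypothetical 2x2 table for one adverse-event type across drug programs
sens, spec, lr = diagnostic_metrics(tp=30, fp=50, fn=20, tn=400)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} LR+={lr:.2f}")
print(f"risk given a positive preclinical finding: {posterior_risk(0.10, lr):.2f}")
```

Combining several independent findings would, as the abstract suggests, multiply their individual likelihood ratios (or risks) to form an overall risk profile.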
We find that some nonclinical observations suggest higher risk for observing the same adverse <span class="hlt">event</span> in clinical studies, particularly arrhythmias, QT prolongation, and abnormal hepatic function. However the lack of these <span class="hlt">events</span> in nonclinical studies is found to not be a good predictor of safety in humans. Some nonclinical and clinical observations appear to be associated with high risk of drug withdrawal from market, especially arrhythmia and hepatic necrosis. We use the method to estimate the overall risk of drug withdrawal from market using the product of the risks from each nonclinical and clinical observation to create a risk profile. PMID:25746390</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4987052','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4987052"><span id="translatedtitle">Assessment of Adverse <span class="hlt">Events</span> in Protocols, Clinical Study Reports, and Published Papers of Trials of Orlistat: A Document <span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Schroll, Jeppe Bennekou; Penninga, Elisabeth I.; Gøtzsche, Peter C.</p> <p>2016-01-01</p> <p> filters, though six of seven papers stated that “all adverse <span class="hlt">events</span> were recorded.” For one trial, we identified an additional 1,318 adverse <span class="hlt">events</span> that were not listed or mentioned in the CSR itself but could be identified through manually counting individual adverse <span class="hlt">events</span> reported in an appendix. We discovered that the majority of patients had multiple episodes of the same adverse <span class="hlt">event</span> that were only counted once, though this was not described in the CSRs. We also discovered that participants treated with orlistat experienced twice as many days with adverse <span class="hlt">events</span> as participants treated with placebo (22.7 d versus 14.9 d, p-value < 0.0001, Student’s t test). Furthermore, compared with the placebo group, adverse <span class="hlt">events</span> in the orlistat group were more severe. None of this was stated in the CSR or in the published paper. Our <span class="hlt">analysis</span> was restricted to one drug tested in the mid-1990s; our results might therefore not be applicable for newer drugs. 
Conclusions In the orlistat trials, we identified important disparities in the reporting of adverse <span class="hlt">events</span> between protocols, clinical study reports, and published papers. Reports of these trials seemed to have systematically understated adverse <span class="hlt">events</span>. Based on these findings, systematic reviews of drugs might be improved by including protocols and CSRs in addition to published articles. PMID:27529343</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_15 --> <div id="page_16" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="301"> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24760910','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24760910"><span id="translatedtitle">Spatial filtering based on canonical correlation <span class="hlt">analysis</span> for classification 
of evoked or <span class="hlt">event</span>-related potentials in EEG data.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Spüler, Martin; Walter, Armin; Rosenstiel, Wolfgang; Bogdan, Martin</p> <p>2014-11-01</p> <p>Classification of evoked or <span class="hlt">event</span>-related potentials is an important prerequisite for many types of brain-computer interfaces (BCIs). To increase classification accuracy, spatial filters are used to improve the signal-to-noise ratio of the brain signals and thereby facilitate the detection and classification of evoked or <span class="hlt">event</span>-related potentials. While canonical correlation <span class="hlt">analysis</span> (CCA) has previously been used to construct spatial filters that increase classification accuracy for BCIs based on visual evoked potentials, we show in this paper how CCA can also be used for spatial filtering of <span class="hlt">event</span>-related potentials like P300. We also evaluate the use of CCA for spatial filtering on other data with evoked and <span class="hlt">event</span>-related potentials and show that CCA performs consistently better than other standard spatial filtering methods.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2016NIMPA.821..142R&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2016NIMPA.821..142R&link_type=ABSTRACT"><span id="translatedtitle">Statistical-noise reduction in correlation <span class="hlt">analysis</span> of high-energy nuclear collisions with <span class="hlt">event</span>-mixing</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ray, R. 
L.; Bhattarai, P.</p> <p>2016-06-01</p> <p>The error propagation and statistical-noise reduction method of Reid and Trainor for two-point correlation applications in high-energy collisions is extended to include particle-pair references constructed by mixing two particles from all <span class="hlt">event</span>-pair combinations within <span class="hlt">event</span> subsets of arbitrary size. The Reid-Trainor method is also applied to other particle-pair mixing algorithms commonly used in correlation <span class="hlt">analysis</span> of particle production from high-energy nuclear collisions. The statistical-noise reduction, inherent in the Reid-Trainor <span class="hlt">event</span>-mixing procedure, is shown to occur for these other <span class="hlt">event</span>-mixing algorithms as well. Monte Carlo simulation results are presented which verify the predicted degree of noise reduction. In each case the final errors are determined by the bin-wise particle-pair number, rather than by the bin-wise single-particle count.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26794515','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26794515"><span id="translatedtitle">Biochemical <span class="hlt">analysis</span> of axon-specific phosphorylation <span class="hlt">events</span> using isolated squid axoplasms.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kang, Minsu; Baker, Lisa; Song, Yuyu; Brady, Scott T; Morfini, Gerardo</p> <p>2016-01-01</p> <p>Appropriate functionality of nodes of Ranvier, presynaptic terminals, and other axonal subdomains depends on efficient and timely delivery of proteins synthesized and packaged into membrane-bound organelles (MBOs) within the neuronal cell body. MBOs are transported and delivered to their final sites of utilization within axons by a cellular process known as fast axonal transport (FAT). 
Conventional kinesin, the most abundant multisubunit motor protein expressed in mature neurons, is responsible for FAT of a large variety of MBOs and plays a major role in the maintenance of appropriate axonal connectivity. Consistent with the variety and large number of discrete subdomains within axons, experimental evidence revealed the identity of several protein kinases that modulate specific functional activities of conventional kinesin. Thus, methods for the <span class="hlt">analysis</span> of kinase activity and conventional kinesin phosphorylation facilitate the study of FAT regulation in health and disease conditions. Axonal degeneration, abnormal patterns of protein phosphorylation, and deficits in FAT represent early pathological features characteristic of neurological diseases caused by unrelated neuropathogenic proteins. Interestingly, some of these proteins were shown to produce deficits in FAT by modulating the activity of specific protein kinases involved in conventional kinesin phosphorylation. However, experimental systems that facilitate an evaluation of molecular <span class="hlt">events</span> within axons remain scarce. Using the isolated squid axoplasm preparation, we describe methods for evaluating axon-autonomous effects of neuropathogenic proteins on the activity of protein kinases. 
Protocols are also provided to evaluate the effect of such proteins on the phosphorylation of endogenous axonal substrates, including conventional kinesin and neurofilaments.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3843947','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3843947"><span id="translatedtitle">Men’s and women’s migration in coastal Ghana: An <span class="hlt">event</span> history <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Reed, Holly E.; Andrzejewski, Catherine S.; White, Michael J.</p> <p>2013-01-01</p> <p>This article uses life history calendar (LHC) data from coastal Ghana and <span class="hlt">event</span> history statistical methods to examine inter-regional migration for men and women, focusing on four specific migration types: rural-urban, rural-rural, urban-urban, and urban-rural. Our <span class="hlt">analysis</span> is unique because it examines how key determinants of migration— including education, employment, marital status, and childbearing—differ by sex for these four types of migration. We find that women are significantly less mobile than men overall, but that more educated women are more likely to move (particularly to urban areas) than their male counterparts. Moreover, employment in the prior year is less of a deterrent to migration among women. While childbearing has a negative effect on migration, this impact is surprisingly stronger for men than for women, perhaps because women’s search for assistance in childcare promotes migration. Meanwhile, being married or in union appears to have little effect on migration probabilities for either men or women. 
These results demonstrate the benefits of a LHC approach and suggest that migration research should further examine men’s and women’s mobility as it relates to both human capital and household and family dynamics, particularly in developing settings. PMID:24298203</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010EGUGA..1211739D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010EGUGA..1211739D"><span id="translatedtitle">Input sensitivity <span class="hlt">analysis</span> of neural network models for flood <span class="hlt">event</span> prediction in ungauged catchments</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Dawson, Christian W.; Abrahart, Robert J.</p> <p>2010-05-01</p> <p>Artificial neural networks have now been applied to problems within hydrology for nearly twenty years - primarily in rainfall-runoff modelling and flood forecasting. In recent years the scope of this research has expanded to encompass more theoretical issues and address some of the earlier criticisms of such models - including the internal behaviour of neural networks and the link with physically-based models. While there has been some work on the application of neural network models to predicting flood <span class="hlt">events</span> in ungauged catchments, such research is limited to only a few studies in a handful of regions worldwide. In this paper neural network models are developed using the UK Environment Agency's HiFlows-UK dataset released in 2008. This dataset provides catchment descriptors and annual maximum series for over 900 sites across the UK. The neural network models predict the index flood (median flood) based on four catchment characteristics: area, standard average annual rainfall, index of flood attenuation due to reservoirs and lakes, and baseflow index. 
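A model of the kind described above, mapping four catchment descriptors to an index flood, can be sketched as follows. The training data here are synthetic (standing in for HiFlows-UK, which is not reproduced), and the QMED-like generating relation, the network size, and all parameter ranges are assumptions for illustration:

```python
# Hypothetical sketch: a small neural network predicting an index flood
# from four catchment descriptors, trained on synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 800
area = rng.uniform(10, 1000, n)   # catchment area, km^2
saar = rng.uniform(500, 2500, n)  # standard average annual rainfall, mm
farl = rng.uniform(0.8, 1.0, n)   # flood attenuation (reservoirs/lakes) index
bfi = rng.uniform(0.2, 0.9, n)    # baseflow index

# Synthetic "true" relation, loosely QMED-like in form (an assumption)
qmed = 0.01 * area**0.85 * (saar / 1000) ** 1.5 * farl**2 * (1 - bfi) ** 1.2
qmed *= np.exp(0.1 * rng.standard_normal(n))   # multiplicative noise

X = np.c_[area, saar, farl, bfi]
y = np.log(qmed)                  # train in log space, as flows are skewed

Xs = StandardScaler().fit_transform(X)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(Xs[:600], y[:600])
print("holdout R^2:", round(model.score(Xs[600:], y[600:]), 2))
```

A sensitivity analysis like the one the abstract mentions would then perturb one standardized input at a time and inspect the response of the predicted index flood.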
These models are assessed using a novel sensitivity <span class="hlt">analysis</span> procedure that is designed to expose the internal relationship that has been implemented between each catchment characteristic and the index flood. Results provide some physical explanation of model behaviour - linking catchment characteristics to the calculated index flood. The results are compared with the FEH QMED mathematical model and with older equivalent models developed on the original FEH data set.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2922775','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2922775"><span id="translatedtitle">Time-Frequency Data Reduction for <span class="hlt">Event</span> Related Potentials: Combining Principal Component <span class="hlt">Analysis</span> and Matching Pursuit</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Aviyente, Selin; Bernat, Edward M.; Malone, Stephen M.; Iacono, William G.</p> <p>2010-01-01</p> <p>Joint time-frequency representations offer a rich representation of <span class="hlt">event</span> related potentials (ERPs) that cannot be obtained through individual time or frequency domain <span class="hlt">analysis</span>. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely-used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. 
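The decomposition described above can be sketched in reduced form. The snippet below runs plain matching pursuit with a small Gabor dictionary directly on a 1-D signal, omitting the time-frequency PCA stage of the PCA-Gabor method; every atom parameter is illustrative:

```python
# Simplified matching pursuit with a Gabor dictionary on a 1-D signal.
# (The paper applies MP to time-frequency principal components instead.)
import numpy as np

def gabor(n, center, freq, width):
    """Unit-norm Gabor atom: Gaussian envelope times a cosine carrier."""
    t = np.arange(n)
    g = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
    return g / np.linalg.norm(g)

n = 256
# Small dictionary over a grid of centers, frequencies, and widths
atoms = [gabor(n, c, f, w)
         for c in range(16, n, 32)
         for f in (0.05, 0.1, 0.2)
         for w in (8, 16)]
D = np.array(atoms)                     # shape: (n_atoms, n)

# Signal built from two dictionary atoms plus noise
rng = np.random.default_rng(3)
x = 2.0 * D[5] + 1.0 * D[20] + 0.05 * rng.standard_normal(n)

# Greedy matching pursuit: repeatedly subtract the best-matching atom
residual, picked = x.copy(), []
for _ in range(2):
    scores = D @ residual               # correlations with every atom
    k = int(np.argmax(np.abs(scores)))
    picked.append((k, scores[k]))
    residual = residual - scores[k] * D[k]

print("picked atoms:", [k for k, _ in picked])
print("residual energy fraction:",
      round(np.linalg.norm(residual) ** 2 / np.linalg.norm(x) ** 2, 3))
```

Each picked atom contributes a few interpretable parameters (center, frequency, width, amplitude), which is exactly the data-reduction payoff the abstract describes.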
The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods, such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary, for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions. PMID:20730031</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1710094S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1710094S"><span id="translatedtitle">Subtropical influence on January 2009 major sudden stratospheric warming <span class="hlt">event</span>: diagnostic <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Schneidereit, Andrea; Peters, Dieter; Grams, Christian; Wolf, Gabriel; Riemer, Michael; Gierth, Franziska; Quinting, Julian; Keller, Julia; Martius, Olivia</p> <p>2015-04-01</p> <p>In January 2009 a major sudden stratospheric warming (MSSW) <span class="hlt">event</span> occurred with the strongest NAM anomaly ever observed at 10 hPa. Also, stratospheric Eliassen-Palm flux convergence and the zonal mean eddy heat fluxes of ultra-long waves at the 100 hPa layer were unusually strong in the mid-latitudes just before and after the onset of the MSSW. Besides internal interactions between the background flow and planetary waves, and among the planetary waves themselves, the subtropical tropospheric forcing of these enhanced heat fluxes is still an open question. This study investigates in more detail the dynamical reasons for the pronounced heat fluxes, based on ERA-Interim re-<span class="hlt">analysis</span> data. 
Investigating the regional contributions of the eddy heat flux to the northern hemispheric zonal mean revealed a distinct spatial pattern in that period, with maxima over the Eastern Pacific/North America and the Eastern North Atlantic/Europe. The first region is related to an almost persistent tropospheric blocking high (BH) over the Gulf of Alaska dominating the upper-level flow, and the second to a weaker BH over Northern Europe. The evolution of the BH over the Gulf of Alaska can be explained by a chain of tropospheric weather <span class="hlt">events</span> linked to and maintained by subtropical and tropical influences: the MJO (phase 7-8) and the developing cold phase of ENSO (La Niña), which are in coherence over the Eastern Pacific, favor enhanced subtropical baroclinicity. In turn, extratropical cyclone activity increases and shifts further poleward, associated with an increased frequency of warm conveyor belts (WCBs). These WCBs support enhanced poleward-directed eddy heat fluxes in the Eastern Pacific/North American region. The positive heat flux anomaly over the Eastern North Atlantic/Europe is associated with a blocking high over Scandinavia. This BH is maintained by an eastward-propagating Rossby wave train emanating from the block over the Gulf of Alaska. 
Eddy feedback processes support this high pressure</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1614980B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1614980B"><span id="translatedtitle">Assessment of realistic nowcasting lead-times based on predictability <span class="hlt">analysis</span> of Mediterranean Heavy Precipitation <span class="hlt">Events</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bech, Joan; Berenguer, Marc</p> <p>2014-05-01</p> <p>' precipitation forecasts showed some skill (improvement over persistence) for lead times up to 60' for moderate intensities (up to 1 mm in 30') and up to 2.5h for lower rates (above 0.1 mm). However, an important <span class="hlt">event-to-event</span> variability has been found, as illustrated by the fact that hit rates of rain/no-rain forecasts reached 60% at lead times of up to 90' in the 7 September 2005 case but only up to 40' in the 2 November 2008 case. The discussion of these results provides useful information on the potential application of nowcasting systems and realistic values to be contrasted with specific end-user requirements. This work has been done in the framework of the Hymex research programme and has been partly funded by the ProFEWS project (CGL2010-15892). References Bech J, N Pineda, T Rigo, M Aran, J Amaro, M Gayà, J Arús, J Montanyà, O van der Velde, 2011: A Mediterranean nocturnal heavy rainfall and tornadic <span class="hlt">event</span>. Part I: Overview, damage survey and radar <span class="hlt">analysis</span>. Atmospheric Research 100:621-637 http://dx.doi.org/10.1016/j.atmosres.2010.12.024 Bech J, R Pascual, T Rigo, N Pineda, JM López, J Arús, and M Gayà, 2007: An observational study of the 7 September 2005 Barcelona tornado outbreak. 
Natural Hazards and Earth System Science 7:129-139 http://dx.doi.org/10.5194/nhess-7-129-2007 Berenguer M, C Corral, R Sánchez-Diezma, D Sempere-Torres, 2005: Hydrological validation of a radar-based nowcasting technique. Journal of Hydrometeorology 6: 532-549 http://dx.doi.org/10.1175/JHM433.1 Berenguer M, D Sempere, G Pegram, 2011: SBMcast - An ensemble nowcasting technique to assess the uncertainty in rainfall forecasts by Lagrangian extrapolation. Journal of Hydrology 404: 226-240 http://dx.doi.org/10.1016/j.jhydrol.2011.04.033 Pierce C, A Seed, S Ballard, D Simonin, Z Li, 2012: Nowcasting. In Doppler Radar Observations (J Bech, JL Chau, ed.) Ch. 13, 98-142. InTech, Rijeka, Croatia http://dx.doi.org/10.5772/39054</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.gpo.gov/fdsys/pkg/FR-2011-11-15/pdf/2011-29436.pdf','FEDREG'); return false;" href="https://www.gpo.gov/fdsys/pkg/FR-2011-11-15/pdf/2011-29436.pdf"><span id="translatedtitle">76 FR 70768 - Common-Cause Failure <span class="hlt">Analysis</span> in <span class="hlt">Event</span> and Condition Assessment: Guidance and Research, Draft...</span></a></p> <p><a target="_blank" href="http://www.gpo.gov/fdsys/browse/collection.action?collectionCode=FR">Federal Register 2010, 2011, 2012, 2013, 2014</a></p> <p></p> <p>2011-11-15</p> <p>... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Common-Cause Failure <span class="hlt">Analysis</span> in <span class="hlt">Event</span> and Condition Assessment: Guidance and Research, Draft... November 2, 2011 (76 FR 67764). 
This action is necessary to correct an erroneous date for submission...</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013ISPAr.XL4c..71H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013ISPAr.XL4c..71H"><span id="translatedtitle">Using Web Crawler Technology for Text <span class="hlt">Analysis</span> of Geo-<span class="hlt">Events</span>: A Case Study of the Huangyan Island Incident</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hu, H.; Ge, Y. J.</p> <p>2013-11-01</p> <p>As social networking and the socialisation of networks have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study the phenomena and disciplines of the natural sciences has prompted many specialists and scholars to innovate their research. Although politics has been integrally involved in hyperlinked-world issues since the 1990s, and the automatic assembly of different geospatial web services and distributed geospatial information systems using service chaining has recently been explored and built, the information collection and data visualisation of geo-<span class="hlt">events</span> have always faced the bottleneck of traditional manual <span class="hlt">analysis</span> because of the sensitivity, complexity, relativity, timeliness and unexpected character of political <span class="hlt">events</span>. Based on the Heritrix framework and the <span class="hlt">analysis</span> of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology with text <span class="hlt">analysis</span> methods. 
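The crawl-then-count workflow described in the Huangyan Island study can be illustrated with a minimal word-frequency tally over fetched page text. This is a hedged sketch using only the Python standard library; the sample documents, stopword list and function name are invented for illustration and are not part of the Heritrix-based system the authors used:

```python
from collections import Counter
import re

def word_frequencies(documents, stopwords=frozenset({"the", "a", "of", "and", "in", "to", "as"})):
    """Tally word frequencies across a corpus of crawled page texts."""
    counts = Counter()
    for text in documents:
        # Lowercase and keep only alphabetic tokens (plus apostrophes).
        tokens = re.findall(r"[a-z']+", text.lower())
        counts.update(t for t in tokens if t not in stopwords)
    return counts

# Hypothetical crawled snippets standing in for pages fetched by a crawler.
docs = [
    "The incident drew wide attention in the region.",
    "Attention to the incident grew as more pages appeared.",
]
freq = word_frequencies(docs)
print(freq.most_common(2))  # 'incident' and 'attention' each appear twice
```

In a real pipeline the `docs` list would be replaced by the text extracted from each crawled page, and the resulting counts would feed the tag cloud and frequency map mentioned below.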
The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the data collection and processing not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationships of text information and to trace the dissemination of geo-<span class="hlt">events</span>, text <span class="hlt">analysis</span> of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion in political <span class="hlt">events</span>.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=kaplan&pg=6&id=EJ1036084','ERIC'); return false;" href="http://eric.ed.gov/?q=kaplan&pg=6&id=EJ1036084"><span id="translatedtitle">Time-to-<span class="hlt">Event</span> <span class="hlt">Analysis</span> of Individual Variables Associated with Nursing Students' Academic Failure: A Longitudinal Study</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Dante, Angelo; Fabris, Stefano; Palese, Alvisa</p> <p>2013-01-01</p> <p>Empirical studies and conceptual frameworks presented in the extant literature offer a static picture of academic failure. Time-to-<span class="hlt">event</span> <span class="hlt">analysis</span> captures the dynamism of individual factors as they determine failure, and thus allows timely strategies to be properly tailored, but it requires longitudinal studies, which are still lacking within the field. 
The…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=recent&pg=2&id=EJ876599','ERIC'); return false;" href="http://eric.ed.gov/?q=recent&pg=2&id=EJ876599"><span id="translatedtitle">Arrests, Recent Life Circumstances, and Recurrent Job Loss for At-Risk Young Men: An <span class="hlt">Event</span>-History <span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Wiesner, Margit; Capaldi, Deborah M.; Kim, Hyoun K.</p> <p>2010-01-01</p> <p>This study used longitudinal data from 202 at-risk young men to examine effects of arrests, prior risk factors, and recent life circumstances on job loss across a 7-year period in early adulthood. Repeated failure-time continuous <span class="hlt">event</span>-history <span class="hlt">analysis</span> indicated that occurrence of job loss was primarily related to prior mental health problems,…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/21170908','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/21170908"><span id="translatedtitle">Robust non-parametric one-sample tests for the <span class="hlt">analysis</span> of recurrent <span class="hlt">events</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia</p> <p>2010-12-30</p> <p>One-sample non-parametric tests are proposed here for inference on recurring <span class="hlt">events</span>. The focus is on the marginal mean function of <span class="hlt">events</span> and the basis for inference is the standardized distance between the observed and the expected number of <span class="hlt">events</span> under a specified reference rate. 
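The standardized-distance statistic just described can be sketched for the simplest case: an unweighted comparison of the total observed count against its expectation under a homogeneous Poisson reference rate. This is an illustrative simplification, not the authors' weighted, robust or stratified versions, and the cohort data are invented:

```python
import math

def one_sample_event_test(event_counts, followup_times, ref_rate):
    """Standardized distance between observed and expected total event
    counts under a homogeneous Poisson reference rate (unweighted case)."""
    observed = sum(event_counts)
    expected = ref_rate * sum(followup_times)
    # Under the Poisson null, the variance of the total count equals its mean.
    return (observed - expected) / math.sqrt(expected)

# Invented cohort: per-subject event counts and years of follow-up,
# tested against a reference rate of 0.5 events per person-year.
counts = [3, 1, 4, 0, 2]
years = [2.0, 1.5, 2.5, 1.0, 2.0]
z = one_sample_event_test(counts, years, ref_rate=0.5)
print(round(z, 2))  # → 2.59
```

A |z| above 1.96 would indicate a departure from the reference rate at the conventional 5% level; the weights discussed in the abstract enter by weighting each subject's contribution rather than summing raw counts.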
Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent <span class="hlt">events</span> process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying <span class="hlt">event</span> generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of <span class="hlt">event</span> generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent <span class="hlt">events</span> can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AtmRe.135..415S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AtmRe.135..415S"><span id="translatedtitle">Multi-instrumental <span class="hlt">analysis</span> of large sprite <span class="hlt">events</span> and their producing storm in southern France</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Soula, S.; Iacovella, F.; van der Velde, O.; Montanyà, J.; Füllekrug, M.; Farges, T.; Bór, J.; Georgis, J.-F.; NaitAmor, S.; Martin, J.-M.</p> <p>2014-01-01</p> <p>During the night of 01-02 September, 2009, seventeen distinct sprite <span class="hlt">events</span> including 3 halos were observed above a storm over the north-western Mediterranean Sea, with a video camera at Pic du Midi (42.93N; 0.14E; 2877 m). 
The sprites occurred at distances between 280 and 390 km, estimated from their parent CG locations. The MCS-type storm was characterized by a trailing-stratiform structure and a very circular shape with a size of about 70,000 km² (cloud-top temperature lower than −35 °C) when the TLEs were observed. The cloud-to-ground (CG) flash rate was high (45 min⁻¹) one hour before the TLE observation and very low (< 5 min⁻¹) during it. Out of the 17 sprite <span class="hlt">events</span>, 15 parent + CG (P + CG) strokes were identified; their average peak current is 87 kA (67 kA for the 14 <span class="hlt">events</span> without halo), while the associated charge moment changes (CMC) that could be determined range from 424 to 2088 ± 20% C km. Several 2-second videos contain multiple sprite <span class="hlt">events</span>: one with four <span class="hlt">events</span>, one with three <span class="hlt">events</span> and three with two <span class="hlt">events</span>. Column- and carrot-type sprites are identified, either together or separately. All P + CG strokes are clearly located within the stratiform region of the storm, and the second P + CG stroke of a multiple <span class="hlt">event</span> also lies within the stratiform region. Groups of large and bright carrots reach ~ 70 km height and ~ 80 km horizontal extent. These groups are associated with a second pulse of electric field radiation in the ELF range which occurs ~ 5 ms after the P + CG stroke and exhibits the same polarity, which is evidence for current in the sprite body. 
VLF perturbations associated with the sprite <span class="hlt">events</span> were recorded by a station in Algiers.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2763866','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2763866"><span id="translatedtitle">Systematic review and meta-<span class="hlt">analysis</span> on the adverse <span class="hlt">events</span> of rimonabant treatment: Considerations for its potential use in hepatology</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2009-01-01</p> <p>Background The cannabinoid-1 receptor blockers have been proposed for the management of obesity and obesity-related liver diseases (fatty liver as NAFLD or NASH). Due to the increasing number of patients potentially to be treated and the need to assess the advantage of this treatment in terms of risk/benefit, we analyze the adverse <span class="hlt">events</span> reported during treatment with rimonabant through a systematic review and meta-<span class="hlt">analysis</span> of all randomized controlled studies. Methods All published randomized controlled trials using rimonabant versus placebo in adult subjects were retrieved. Relative risks (RR) with 95% confidence intervals for relevant adverse <span class="hlt">events</span> and the number needed to harm were calculated. Results Nine trials (n = 9635) were considered. Rimonabant 20 mg was associated with an increased risk of adverse <span class="hlt">events</span> (RR 1.35; 95%CI 1.17-1.56), increased discontinuation rate (RR 1.79; 95%CI 1.35-2.38), psychiatric (RR 2.35; 95%CI 1.66-3.34), and nervous system adverse <span class="hlt">events</span> (RR 2.35; 95%CI 1.49-3.70). The number needed to harm for psychiatric adverse <span class="hlt">events</span> is 30. 
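The relation between a relative risk, its 95% confidence interval and the number needed to harm can be made concrete with a short calculation. The counts below are invented for illustration and do not reproduce the pooled trial data above; the CI uses the standard delta-method formula for the log relative risk:

```python
import math

def relative_risk(a, n1, c, n0):
    """RR with a 95% CI: a events among n1 treated, c events among n0 controls."""
    rr = (a / n1) / (c / n0)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n0)  # SE of log(RR)
    return rr, rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)

def number_needed_to_harm(a, n1, c, n0):
    """Reciprocal of the absolute risk increase."""
    return 1 / (a / n1 - c / n0)

# Invented counts: 120/1800 psychiatric events on drug vs 50/1700 on placebo.
rr, lo, hi = relative_risk(120, 1800, 50, 1700)
nnh = number_needed_to_harm(120, 1800, 50, 1700)
```

An RR whose lower confidence bound exceeds 1 indicates a statistically significant excess of events, while the NNH translates the same excess into the number of patients treated per additional event.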
Conclusion Rimonabant is associated with an increased risk of adverse <span class="hlt">events</span>. Despite increasing interest in its use for fatty liver, its safety profile and efficacy need to be carefully assessed before it can be recommended. At present, the use of rimonabant for fatty liver cannot be recommended. PMID:19818116</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://ntrs.nasa.gov/search.jsp?R=20070031685&hterms=Wavelet&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3DWavelet','NASA-TRS'); return false;" href="http://ntrs.nasa.gov/search.jsp?R=20070031685&hterms=Wavelet&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3DWavelet"><span id="translatedtitle">Reduced-Order Modeling and Wavelet <span class="hlt">Analysis</span> of Turbofan Engine Structural Response Due to Foreign Object Damage "FOD" <span class="hlt">Events</span></span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Turso, James A.; Lawrence, Charles; Litt, Jonathan S.</p> <p>2007-01-01</p> <p>The development of a wavelet-based feature extraction technique specifically targeting FOD-<span class="hlt">event</span> induced vibration signal changes in gas turbine engines is described. The technique performs wavelet <span class="hlt">analysis</span> of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/ health parameter estimation for FOD-<span class="hlt">event</span> detection via information fusion from these (and perhaps other) sources. 
Due to the lack of high-frequency FOD-<span class="hlt">event</span> test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite-element model modal <span class="hlt">analysis</span> to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-<span class="hlt">event</span> detection. In the presence of significant noise, precise location of the FOD <span class="hlt">event</span> in time was obtained using the developed wavelet-based feature.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://hdl.handle.net/2060/20040110830','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20040110830"><span id="translatedtitle">Reduced-Order Modeling and Wavelet <span class="hlt">Analysis</span> of Turbofan Engine Structural Response Due to Foreign Object Damage (FOD) <span class="hlt">Events</span></span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Turso, James; Lawrence, Charles; Litt, Jonathan</p> <p>2004-01-01</p> <p>The development of a wavelet-based feature extraction technique specifically targeting FOD-<span class="hlt">event</span> induced vibration signal changes in gas turbine engines is described. The technique performs wavelet <span class="hlt">analysis</span> of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-<span class="hlt">event</span> detection via information fusion from these (and perhaps other) sources. 
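The idea of wavelet-based localization of an impulsive event in a noisy vibration signal can be sketched with level-1 Haar detail coefficients. This is a minimal stand-in, assuming a synthetic noisy trace with an injected impulse; it is not the authors' reduced-order model, accelerometer data or actual feature:

```python
import numpy as np

def haar_detail(x):
    """Level-1 Haar wavelet detail coefficients of an even-length signal."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def locate_event(signal):
    """Sample index of the strongest local transient in the signal."""
    d = haar_detail(signal)
    return 2 * int(np.argmax(np.abs(d)))

# Synthetic 'accelerometer' trace: white noise plus a sharp impulse at sample 300.
rng = np.random.default_rng(0)
sig = 0.1 * rng.standard_normal(1024)
sig[300] += 5.0
print(locate_event(sig))  # → 300
```

Because the impulse dwarfs the noise floor, the largest detail coefficient pinpoints the event in time even without denoising; a realistic FOD signature would call for multi-level decomposition and coefficient thresholding.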
Due to the lack of high-frequency FOD-<span class="hlt">event</span> test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal <span class="hlt">analysis</span> to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-<span class="hlt">event</span> detection. In the presence of significant noise, precise location of the FOD <span class="hlt">event</span> in time was obtained using the developed wavelet-based feature.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2005GeoJI.163..559D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2005GeoJI.163..559D"><span id="translatedtitle">An <span class="hlt">analysis</span> of P times reported in the Reviewed <span class="hlt">Event</span> Bulletin for Chinese underground explosions</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Douglas, A.; O'Mongain, A. M.; Porter, D.; Young, J. B.</p> <p>2005-11-01</p> <p><span class="hlt">Analysis</span> of variance is used to estimate the measurement error and path effects in the P times reported in the Reviewed <span class="hlt">Event</span> Bulletins (REBs, produced by the provisional International Data Center, Arlington, USA) and in times we have read, for explosions at the Chinese Test Site. Path effects are those differences between traveltimes calculated from tables and the true times that result in epicentre error. 
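The variance-components idea behind the analysis of variance just described, separating within-station scatter (measurement error) from between-station offsets (path effects), can be sketched with a one-way random-effects calculation. The residuals below are invented and the helper is hypothetical, not the study's actual code:

```python
import statistics

def variance_components(residuals_by_station):
    """One-way random-effects decomposition of traveltime residuals:
    within-station variance estimates measurement error; the excess
    variance of station means estimates path effects."""
    within = statistics.mean(statistics.variance(r) for r in residuals_by_station)
    means = [statistics.mean(r) for r in residuals_by_station]
    n = len(residuals_by_station[0])  # readings per station (balanced design)
    # Remove the measurement-error contribution to the spread of station means.
    between = max(statistics.variance(means) - within / n, 0.0)
    return within, between

# Invented residuals (seconds) for three stations, three readings each.
stations = [[0.5, 0.7, 0.6], [-0.4, -0.6, -0.5], [0.0, 0.2, 0.1]]
meas_err_var, path_var = variance_components(stations)
```

With these numbers the measurement-error variance comes out at 0.01 s² and the path-effect variance at 0.30 s², i.e. a path-effect standard deviation of about 0.55 s, the same order as the 0.6 s reported for the REB data.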
The main conclusions of the study are: (1) the estimated variance of the measurement error for P times reported in the REB at large signal-to-noise ratio (SNR) is 0.04 s², the bulk of the readings being analyst-adjusted automatic detections, whereas for our times the variance is 0.01 s²; and (2) the standard deviation of the path effects for both sets of observations is about 0.6 s. The study shows that measurement error is about twice (~0.2 s rather than ~0.1 s) and path effects about half the values assumed for the REB times. However, uncertainties in the estimated epicentres are poorly described by treating path effects as a random variable with a normal distribution. Only by estimating path effects and using these to correct onset times can reliable estimates of epicentre uncertainty be obtained. There is currently an international programme to do just this. The results imply that with P times from explosions at three or four stations with good SNR (so that the measurement error is around 0.1 s) and well distributed in azimuth, then with correction for path effects the area of the 90 per cent coverage ellipse should be much less than 1000 km² (the area allowed for an on-site inspection under the Comprehensive Test Ban Treaty) and should cover the true epicentre with the given probability.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3818028','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3818028"><span id="translatedtitle">Association between triglycerides and cardiovascular <span class="hlt">events</span> in primary populations: a meta-regression <span class="hlt">analysis</span> and synthesis of evidence</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Stauffer, Melissa E; Weisenfluh, Lauren; Morrison, Alan</p> <p>2013-01-01</p> <p>Background 
Triglyceride levels were found to be independently predictive of the development of primary coronary heart disease in epidemiologic studies. The objective of this study was to determine whether triglyceride levels were predictive of cardiovascular <span class="hlt">events</span> in randomized controlled trials (RCTs) of lipid-modifying drugs. Methods We performed a systematic review and meta-regression <span class="hlt">analysis</span> of 40 RCTs of lipid-modifying drugs with cardiovascular <span class="hlt">events</span> as an outcome. The log of the rate ratio of cardiovascular <span class="hlt">events</span> (eg, coronary death or myocardial infarction) was plotted against the proportional difference between treatment and control groups in triglyceride and other lipid levels (high density lipoprotein cholesterol [HDL-C], low density lipoprotein cholesterol [LDL-C], and total cholesterol) for all trials and for trials of primary and secondary prevention populations. Linear regression was used to determine the statistical significance of the relationship between lipid values and cardiovascular <span class="hlt">events</span>. Results The proportional difference in triglyceride levels was predictive of cardiovascular <span class="hlt">events</span> in all trials (P=0.005 for the slope of the regression line; N=40) and in primary prevention trials (P=0.010; N=11), but not in secondary prevention trials (P=0.114; N=25). The proportional difference in HDL-C was not predictive of cardiovascular <span class="hlt">events</span> in all trials (P=0.822; N=40), or in trials of primary (P=0.223; N=11) or secondary (P=0.487; N=25) prevention. LDL-C levels were predictive of cardiovascular <span class="hlt">events</span> in both primary (P=0.002; N=11) and secondary (P<0.001; N=25) populations. Conclusions Changes in triglyceride levels were predictive of cardiovascular <span class="hlt">events</span> in RCTs. 
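The meta-regression step, fitting a line to per-trial points of log rate ratio against proportional lipid difference, can be sketched with ordinary least squares. The five trial points below are invented for illustration and are not the 40 RCTs analysed:

```python
def ols(x, y):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b  # intercept, slope

# Invented per-trial points: proportional triglyceride difference between arms
# (negative = lower on treatment) vs log rate ratio of cardiovascular events.
tg_diff = [-0.30, -0.20, -0.10, -0.05, 0.00]
log_rr = [-0.25, -0.18, -0.08, -0.05, 0.01]
intercept, slope = ols(tg_diff, log_rr)
```

A positive slope here means larger triglyceride reductions go together with lower event rates; in the actual analysis it is the statistical significance of such a slope (the reported P values) that determines whether a lipid fraction counts as predictive.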
This relationship was significant in primary prevention populations but not in secondary prevention populations. PMID:24204156</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012PhDT.......458S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012PhDT.......458S"><span id="translatedtitle">Tracing footprints of environmental <span class="hlt">events</span> in tree ring chemistry using neutron activation <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sahin, Dagistan</p> <p></p> <p>The aim of this study is to identify environmental effects on tree-ring chemistry. It is known that industrial pollution, volcanic eruptions, dust storms, acid rain and similar <span class="hlt">events</span> can cause substantial changes in soil chemistry. Establishing whether a particular group of trees is sensitive to these changes in the soil environment and registers them in the elemental chemistry of contemporary growth rings is the overriding goal of any dendrochemistry research. In this study, elemental concentrations were measured in tree-ring samples from eleven absolutely dated modern forest trees grown in the Mediterranean region of Turkey, collected and dated by the Malcolm and Carolyn Wiener Laboratory for Aegean and Near Eastern Dendrochronology at Cornell University. Correlations between measured elemental concentrations in the tree-ring samples were analyzed using statistical tests to answer two questions. Does the current concentration of a particular element depend on any other element within the tree? And are there any elements showing correlated abnormal concentration changes across the majority of the trees? 
Based on the detailed <span class="hlt">analysis</span> results, the low mobility of sodium and bromine, positive correlations between calcium, zinc and manganese, and positive correlations between the trace elements lanthanum, samarium, antimony, and gold within tree-rings were recognized. Moreover, zinc, lanthanum, samarium and bromine showed strong, positive correlations among the trees and were identified as possible environmental signature elements. The new dendrochemistry information found in this study would also be useful in explaining tree physiology and elemental chemistry in Pinus nigra species grown in Turkey. Elemental concentrations in tree-ring samples were measured using Neutron Activation <span class="hlt">Analysis</span> (NAA) at the Pennsylvania State University Radiation Science and Engineering Center (RSEC). Through this study, advanced methodologies for methodological, computational and</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_16 --> <div id="page_17" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="321"> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.4745P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.4745P"><span id="translatedtitle">Two damaging hydrogeological <span class="hlt">events</span> in Calabria, September 2000 and November 2015. Comparative <span class="hlt">analysis</span> of causes and effects</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela</p> <p>2016-04-01</p> <p>Each year, especially during the winter season, episodes of intense rain affect Calabria, the southernmost Italian peninsular region, triggering flash floods and mass movements that cause damage and fatalities. This work presents a comparative <span class="hlt">analysis</span> of two <span class="hlt">events</span> that affected the southeast sector of the region in 2000 and 2015, respectively. The <span class="hlt">event</span> that occurred between 9th and 10th September 2000 is known in Italy as the Soverato <span class="hlt">event</span>, after the name of the municipality where it reached the highest damage severity. In the Soverato area, more than 200 mm of rain that fell in 24 hours caused a disastrous flood that swept away a campsite at about 4 a.m., killing 13 people and injuring 45. In addition, the rain affected a larger area, causing damage in 89 (out of 409) municipalities of the region. Flooding was the most common process, damaging housing and trade. Landslides mostly affected the road network, housing and cultivations. The most recent <span class="hlt">event</span> affected the same regional sector between 30th October and 2nd November 2015. 
The daily rain recorded at some of the rain gauges in the area almost reached 400 mm. Of the 409 municipalities of Calabria, 109 suffered damage. The most frequent types of processes were flash floods and landslides. The most heavily damaged element was the road network: the representative picture of the <span class="hlt">event</span> is a railway bridge destroyed by the river flow. Housing was damaged too, and 486 people were temporarily evacuated from their homes. The <span class="hlt">event</span> also caused one fatality, a person killed by a flood. The <span class="hlt">event</span>-centred study approach aims to highlight differences and similarities in both the causes and the effects of the two <span class="hlt">events</span>, which occurred 15 years apart. The comparative <span class="hlt">analysis</span> focuses on three main aspects: the intensity of the triggering rain, the modifications of urbanised areas, and the evolution of emergency management. The comparative <span class="hlt">analysis</span> of rain is made by comparing the return period of both daily and</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4916577','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4916577"><span id="translatedtitle">Predictors of seeking emergency medical help during overdose <span class="hlt">events</span> in a provincial naloxone distribution programme: a retrospective <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Ambrose, Graham; Amlani, Ashraf; Buxton, Jane A</p> <p>2016-01-01</p> <p>Objectives This study sought to identify factors that may be associated with help-seeking by witnesses during overdoses where naloxone is administered. 
Setting Overdose <span class="hlt">events</span> occurred in and were reported from the five regional health authorities across British Columbia, Canada. Naloxone administration forms completed following overdose <span class="hlt">events</span> were submitted to the British Columbia Take Home Naloxone programme. Participants All 182 reported naloxone administration <span class="hlt">events</span>, reported by adult men and women and occurring between 31 August 2012 and 31 March 2015, were considered for inclusion in the <span class="hlt">analysis</span>. Of these, 18 were excluded: 10 <span class="hlt">events</span> which were reported by the person who overdosed, and 8 <span class="hlt">events</span> for which completed forms did not indicate whether or not emergency medical help was sought. Primary and secondary outcome measures Seeking emergency medical help (calling 911), as reported by participants, was the sole outcome measure of this <span class="hlt">analysis</span>. Results Medical help was sought (emergency services—911 called) in 89 (54.3%) of 164 overdoses where naloxone was administered. The majority of administration <span class="hlt">events</span> occurred in private residences (50.6%) and on the street (23.4%), where reported rates of calling 911 were 27.5% and 81.1%, respectively. Overdoses occurring on the street (compared to private residence) were significantly associated with higher odds of calling 911 in multivariate <span class="hlt">analysis</span> (OR=10.68; 95% CI 2.83 to 51.87; p<0.01), after adjusting for other variables. Conclusions Overdoses occurring on the street were associated with higher odds of seeking emergency medical help by responders. Further research is needed to determine if sex and stimulant use by the person who overdosed are associated with seeking emergency medical help. 
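The reported association (OR=10.68 after adjustment) can be illustrated with a crude odds ratio from a 2x2 table using the Woolf log-OR confidence interval. The counts below are invented to approximate the reported calling rates (81.1% on the street, 27.5% in residences) and do not reproduce the adjusted multivariate estimate:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude OR with Woolf 95% CI for a 2x2 table:
    a called / b did not (street); c called / d did not (private residence)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Invented counts: 30 of 37 street overdoses called 911; 23 of 83 at residences.
o, lo, hi = odds_ratio(30, 7, 23, 60)
```

A lower bound above 1 indicates the street/residence difference is unlikely to be chance alone; adjusting for covariates in a multivariate model, as the study did, can shift the crude point estimate.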
The results of this study will inform interventions within the British Columbia Take Home Naloxone programme and other jurisdictions to encourage seeking emergency medical help. PMID:27329442</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/servlets/purl/915532','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/servlets/purl/915532"><span id="translatedtitle">A METHOD FOR SELECTING SOFTWARE FOR DYNAMIC <span class="hlt">EVENT</span> <span class="hlt">ANALYSIS</span> I: PROBLEM SELECTION</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>J. M. Lacy; S. R. Novascone; W. D. Richins; T. K. Larson</p> <p>2007-08-01</p> <p>New nuclear power reactor designs will require resistance to a variety of possible malevolent attacks, as well as traditional dynamic accident scenarios. The design/<span class="hlt">analysis</span> team may be faced with a broad range of phenomena including air and ground blasts, high-velocity penetrators or shaped charges, and vehicle or aircraft impacts. With a host of software tools available to address these high-energy <span class="hlt">events</span>, the <span class="hlt">analysis</span> team must evaluate and select the software most appropriate for their particular set of problems. The accuracy of the selected software should then be validated with respect to the phenomena governing the interaction of the threat and structure. In this paper, we present a method for systematically comparing current high-energy physics codes for specific applications in new reactor design. Several codes are available for the study of blast, impact, and other shock phenomena. Historically, these packages were developed to study specific phenomena such as explosives performance, penetrator/target interaction, or accidental impacts. 
As developers generalize the capabilities of their software, legacy biases and assumptions can remain that could affect the applicability of the code to other processes and phenomena. R&D institutions generally adopt one or two software packages and use them almost exclusively, performing benchmarks on a single-problem basis. At the Idaho National Laboratory (INL), new comparative information was desired to permit researchers to select the best code for a particular application by matching its characteristics to the physics, materials, and rate scale (or scales) representing the problem at hand. A study was undertaken to investigate the comparative characteristics of a group of shock and high-strain rate physics codes including ABAQUS, LS-DYNA, CTH, ALEGRA, ALE-3D, and RADIOSS. A series of benchmark problems were identified to exercise the features and capabilities of the subject software. To be useful, benchmark problems</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/servlets/purl/1012895','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/servlets/purl/1012895"><span id="translatedtitle">Low Probability Tail <span class="hlt">Event</span> <span class="hlt">Analysis</span> and Mitigation in the BPA Control Area</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Lu, Shuai; Brothers, Alan J.; McKinstry, Craig A.; Jin, Shuangshuang; Makarov, Yuri V.</p> <p>2010-10-31</p> <p>This report investigated the uncertainties with the operations of the power system and their contributions to tail <span class="hlt">events</span>, especially under high penetration of wind. A Bayesian network model is established to quantify the impact of these uncertainties on system imbalance. 
The framework is presented for a decision support tool, which can help system operators better estimate the need for balancing reserves and prepare for tail <span class="hlt">events</span>.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1614364V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1614364V"><span id="translatedtitle"><span class="hlt">Analysis</span> of geohazards <span class="hlt">events</span> along Swiss roads from autumn 2011 to present</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Voumard, Jérémie; Jaboyedoff, Michel; Derron, Marc-Henri</p> <p>2014-05-01</p> <p>In Switzerland, roads and railways are threatened throughout the year by several natural hazards. Some of these <span class="hlt">events</span> reach transport infrastructure many time per year leading to the closing of transportation corridors, loss of access, deviation travels and sometimes infrastructures damages and loss of human lives (3 fatalities during the period considered). The aim of this inventory of <span class="hlt">events</span> is to investigate the number of natural <span class="hlt">events</span> affecting roads and railways in Switzerland since autumn 2011 until now. Natural hazards affecting roads and railway can be classified in five categories: rockfalls, landslides, debris flows, snow avalanches and floods. They potentially cause several important direct damages on transportation infrastructure (roads, railway), vehicles (slightly or very damaged) or human life (slightly or seriously injured person, death). These direct damages can be easily evaluated from press articles or from Swiss police press releases. Indirect damages such as deviation cost are not taken into account in this work. 
During the last two and a half years, about 50 <span class="hlt">events</span> affecting Swiss road and railway infrastructure were inventoried. The proportion of <span class="hlt">events</span> due to rockfalls is 45%, to landslides 25%, to debris flows 15%, to snow avalanches 10% and to floods 5%. During this period, three people were killed and two injured, while 23 vehicles (cars, trains and a coach) and 24 roads and railways were damaged. Floods occur mainly on the Swiss Plateau, whereas rockfalls, debris flows, snow avalanches and landslides are mostly located in the Alpine area. Most <span class="hlt">events</span> occur on secondary mountain roads and railways. The <span class="hlt">events</span> are well distributed across the whole Alpine area except for the Gotthard hotspot, where an important European North-South motorway (hit in 2003, with two fatalities) and railway (hit three times in 2012, with one fatality) are more frequently affected. According to the observed <span class="hlt">events</span> in border regions of
The use of individual patient data allows for the re-<span class="hlt">analysis</span> of each study in a consistent fashion and thus makes meta-<span class="hlt">analysis</span> of time-to-<span class="hlt">event</span> data feasible. Time-to-<span class="hlt">event</span> data can be…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016ClDy...46.1065R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016ClDy...46.1065R"><span id="translatedtitle">Non-linear time series <span class="hlt">analysis</span> of precipitation <span class="hlt">events</span> using regional climate networks for Germany</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rheinwalt, Aljoscha; Boers, Niklas; Marwan, Norbert; Kurths, Jürgen; Hoffmann, Peter; Gerstengarbe, Friedrich-Wilhelm; Werner, Peter</p> <p>2016-02-01</p> <p>Synchronous occurrences of heavy rainfall <span class="hlt">events</span> and the study of their relation in time and space are of large socio-economical relevance, for instance for the agricultural and insurance sectors, but also for the general well-being of the population. In this study, the spatial synchronization structure is analyzed as a regional climate network constructed from precipitation <span class="hlt">event</span> series. The similarity between <span class="hlt">event</span> series is determined by the number of synchronous occurrences. We propose a novel standardization of this number that results in synchronization scores which are not biased by the number of <span class="hlt">events</span> in the respective time series. Additionally, we introduce a new version of the network measure directionality that measures the spatial directionality of weighted links by also taking account of the effects of the spatial embedding of the network. 
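The synchronization count between two event series, used in the precipitation-network study above as the similarity measure, can be sketched minimally as follows (the function name and tolerance parameter are our illustrative choices; the paper's bias-corrected standardization against event counts is not reproduced here):

```python
import numpy as np

def sync_count(times_a, times_b, tau=2):
    """Count events in series A that have at least one event in series B
    within +/- tau time steps (a minimal, unstandardized notion of
    synchronous occurrence)."""
    times_a = np.asarray(times_a, dtype=float)
    times_b = np.asarray(times_b, dtype=float)
    # pairwise |t_a - t_b| distances, then nearest B event per A event
    nearest = np.min(np.abs(times_a[:, None] - times_b[None, :]), axis=1)
    return int(np.sum(nearest <= tau))
```

For example, with event times `[1, 10, 20]` and `[2, 11, 40]` and `tau=2`, the first two events are matched and the count is 2; a proper synchronization score would then standardize this count so that series with many events are not automatically scored as more synchronous.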
This measure provides an estimate of heavy precipitation isochrones by pointing out directions along which rainfall <span class="hlt">events</span> synchronize. We propose a climatological interpretation of this measure in terms of propagating fronts or <span class="hlt">event</span> traces and confirm it for Germany by comparing our results to known atmospheric circulation patterns.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4670968','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4670968"><span id="translatedtitle">Association between use of warfarin with common sulfonylureas and serious hypoglycemic <span class="hlt">events</span>: retrospective cohort <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Romley, John A; Gong, Cynthia; Jena, Anupam B; Goldman, Dana P; Williams, Bradley</p> <p>2015-01-01</p> <p>Study question Is warfarin use associated with an increased risk of serious hypoglycemic <span class="hlt">events</span> among older people treated with the sulfonylureas glipizide and glimepiride? Methods This was a retrospective cohort <span class="hlt">analysis</span> of pharmacy and medical claims from a 20% random sample of Medicare fee for service beneficiaries aged 65 years or older. It included 465 918 beneficiaries with diabetes who filled a prescription for glipizide or glimepiride between 2006 and 2011 (4 355 418 person quarters); 71 895 (15.4%) patients also filled a prescription for warfarin (416 479 person quarters with warfarin use). 
The main outcome measure was emergency department visit or hospital admission with a primary diagnosis of hypoglycemia in person quarters with concurrent fills of warfarin and glipizide/glimepiride compared with the rates in quarters with glipizide/glimepiride fills only. Multivariable logistic regression was used to adjust for individual characteristics. Secondary outcomes included fall related fracture and altered consciousness/mental status. Summary answer and limitations In quarters with glipizide/glimepiride use, hospital admissions or emergency department visits for hypoglycemia were more common in person quarters with concurrent warfarin use compared with quarters without warfarin use (294/416 479 v 1903/3 938 939; adjusted odds ratio 1.22, 95% confidence interval 1.05 to 1.42). The risk of hypoglycemia associated with concurrent use was higher among people using warfarin for the first time, as well as in those aged 65-74 years. Concurrent use of warfarin and glipizide/glimepiride was also associated with hospital admission or emergency department visit for fall related fractures (3919/416 479 v 20 759/3 938 939; adjusted odds ratio 1.47, 1.41 to 1.54) and altered consciousness/mental status (2490/416 479 v 14 414/3 938 939; adjusted odds ratio 1.22, 1.16 to 1.29).
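As a rough check on magnitudes, the unadjusted odds ratio implied by the raw hypoglycemia counts above (294/416 479 exposed quarters v 1903/3 938 939 unexposed) can be computed directly. This is an illustrative sketch, not the study's analysis: the crude estimate differs from the reported adjusted odds ratio of 1.22, which comes from multivariable logistic regression.

```python
import math

def odds_ratio_ci(a, n1, c, n2, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from event counts a/n1
    (exposed) and c/n2 (unexposed)."""
    b, d = n1 - a, n2 - c
    ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(ratio) - z * se)
    hi = math.exp(math.log(ratio) + z * se)
    return ratio, lo, hi

or_, lo, hi = odds_ratio_ci(294, 416_479, 1903, 3_938_939)
# crude OR is about 1.46 (95% CI roughly 1.29 to 1.65), higher than the
# adjusted OR of 1.22, illustrating the effect of covariate adjustment
```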
Unmeasured factors could be correlated with both warfarin use and</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1612887G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1612887G"><span id="translatedtitle">Multivariate spatial <span class="hlt">analysis</span> of a heavy rain <span class="hlt">event</span> in a densely populated delta city</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gaitan, Santiago; ten Veldhuis, Marie-claire; Bruni, Guenda; van de Giesen, Nick</p> <p>2014-05-01</p> <p>Delta cities account for half of the world's population and host key infrastructure and services for the global economic growth. Due to the characteristic geography of delta areas, these cities face high vulnerability to extreme weather and pluvial flooding risks, that are expected to increase as climate change drives heavier rain <span class="hlt">events</span>. Besides, delta cities are subjected to fast urban densification processes that progressively make them more vulnerable to pluvial flooding. Delta cities need to be adapted to better cope with this threat. The mechanism leading to damage after heavy rains is not completely understood. For instance, current research has shown that rain intensities and volumes can only partially explain the occurrence and localization of rain-related insurance claims (Spekkers et al., 2013). The goal of this paper is to provide further insights into spatial characteristics of the urban environment that can significantly be linked to pluvial-related flooding impacts. To that end, a study-case has been selected: on October 12 to 14 2013, a heavy rain <span class="hlt">event</span> triggered pluvial floods in Rotterdam, a densely populated city which is undergoing multiple climate adaptation efforts and is located in the Meuse river Delta. 
While the average yearly precipitation in this city is around 800 mm, local rain gauge measurements ranged from approx. 60 to 130 mm just during these three days. More than 600 citizens' telephone complaints reported impacts related to rainfall. The registry of those complaints, which comprises around 300 calls made to the municipality and another 300 to the fire brigade, was made available for research. Other accessible information about this city includes a series of rainfall measurements with up to 1 min time-step at 7 different locations around the city, ground-based radar rainfall data (1 km^2 spatial resolution and 5 min time-step), a digital elevation model (50 cm horizontal resolution), a model of overland-flow paths, cadastral
We propose a systematic way of characterizing such Cox-based models using four key components: risk intervals, baseline hazard, risk set, and correlation adjustment. From the definitions of risk interval and risk set there are conceptually seven such Cox-based models that are permissible, five of which are those previously identified. The two new variant models are termed the 'total time - restricted' (TT-R) and 'gap time - unrestricted' (GT-UR) models. The aim of the paper is to determine which models are appropriate for recurrent <span class="hlt">event</span> data using the key components. The models are fitted to simulated data sets and to a data set of childhood recurrent infectious diseases. The LWA model is not appropriate for recurrent <span class="hlt">event</span> data because it allows a subject to be at risk several times for the same <span class="hlt">event</span>. The WLW model overestimates treatment effect and is not recommended. We conclude that PWP-GT and TT-R are useful models for analysing recurrent <span class="hlt">event</span> data, providing answers to slightly different research questions. Further, applying a robust variance to any of these models does not adequately account for within-subject correlation.
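The distinction between total-time and gap-time risk intervals, which separates the PWP-CP and PWP-GT families above, can be illustrated with a small sketch (the function and scheme names are ours; a real counting-process analysis would also carry censoring times and covariates):

```python
def risk_intervals(event_times, scheme="total"):
    """Build (stratum, start, stop] rows for one subject's ordered
    event times: 'total' keeps a running clock (PWP-CP style), while
    'gap' resets the clock after each event (PWP-GT style)."""
    rows, prev = [], 0.0
    for k, t in enumerate(event_times, start=1):
        if scheme == "total":
            rows.append((k, prev, t))        # clock never resets
        else:
            rows.append((k, 0.0, t - prev))  # time since last event
        prev = t
    return rows
```

For event times 5, 12, 20 the total-time rows are (1, 0, 5], (2, 5, 12], (3, 12, 20], while the gap-time rows are (1, 0, 5], (2, 0, 7], (3, 0, 8]; the stratum index is what keeps the baseline hazard event-order specific in both PWP variants.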
PMID:10623910</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/18845199','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/18845199"><span id="translatedtitle"><span class="hlt">Analysis</span> of adverse <span class="hlt">events</span> of potential autoimmune aetiology in a large integrated safety database of AS04 adjuvanted vaccines.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Verstraeten, Thomas; Descamps, Dominique; David, Marie-Pierre; Zahaf, Toufik; Hardt, Karin; Izurieta, Patricia; Dubin, Gary; Breuer, Thomas</p> <p>2008-12-01</p> <p>Newly licensed vaccines against human papillomavirus (HPV) and hepatitis B (HBV), and several vaccines in development, including a vaccine against genital herpes simplex virus (HSV), contain a novel Adjuvant System, AS04, composed of 3-O-desacyl-4' monophosphoryl lipid A and aluminium salts. Given the background incidence of autoimmune disorders in some of the groups targeted for immunisation with these vaccines, it is likely that autoimmune <span class="hlt">events</span> will be reported in temporal association with vaccination, even in the absence of a causal relationship. The objective of this integrated <span class="hlt">analysis</span> was to assess safety of AS04 adjuvanted vaccines with regard to adverse <span class="hlt">events</span> (AEs) of potential autoimmune aetiology, particularly in adolescents and young adults. All randomised, controlled trials of HPV-16/18, HSV and HBV vaccines were analysed in an integrated <span class="hlt">analysis</span> of individual data (N = 68,512). A separate <span class="hlt">analysis</span> of the HPV-16/18 vaccine trials alone was also undertaken (N = 39,160). 
All data were collected prospectively during the vaccine development programmes (mean follow-up of 21.4 months), and included in the <span class="hlt">analysis</span> up to a pre-defined data lock point. Reporting rates of overall autoimmune <span class="hlt">events</span> were around 0.5% and did not differ between the AS04 and control groups. The relative risk (AS04/control) of experiencing any autoimmune <span class="hlt">event</span> was 0.98 (95% confidence interval 0.80, 1.21) in the integrated <span class="hlt">analysis</span> and 0.92 (0.70, 1.22) in the HPV-16/18 vaccine <span class="hlt">analysis</span>. Relative risks calculated overall, for disease category, or for individual <span class="hlt">events</span> were close to 1, and all confidence intervals around the relative risk included 1, indicating no statistically significant difference in <span class="hlt">event</span> rates between the AS04 and control groups. This integrated <span class="hlt">analysis</span> of over 68,000 participants who received AS04 adjuvanted vaccines or controls demonstrated a low rate of autoimmune disorders, without evidence of an increase in relative risk associated with AS04 adjuvanted</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17.5079C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17.5079C"><span id="translatedtitle">Relationship between catchment <span class="hlt">events</span> (earthquake and heavy rain) and sediment core <span class="hlt">analysis</span> result in Taiwan.</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chen, Hsin-Ying; Lin, Jiun-Chuan</p> <p>2015-04-01</p> <p>Lake sediments contain material from the catchment. These sediments include features that can indicate the characteristics or status of the catchment.
These features were formed by different mechanisms, including <span class="hlt">events</span> such as earthquakes or heavy rain, which are very common in Taiwan. By analyzing these sediment features, there is a chance to identify historical <span class="hlt">events</span> and reconstruct catchment history. In this study, we compare sediment-core features (including density, mineral grain size, whole grain size, and biogenic silica content) with earthquake and precipitation records. Sediment cores were collected in 2014 from Emerald Peak Lake (24.514980, 121.605844; 77.5, 77.2, 64 cm depth), Liyutan Lake (23.959878, 120.996585; 43.2, 78.1 cm depth), Sun Moon Lake (23.847043, 120.909869; 181 cm depth), and Dongyuan Lake (22.205742, 120.854984; 45.1, 44.2 cm depth). We assume a regular output of material and organic matter in the catchments, with rain providing the impetus to move material into the lakes: the heavier the rain, the larger the material it can move. Thus, after a heavy rainfall <span class="hlt">event</span>, the grain size of lake sediment may increase. Earthquakes, in turn, produce additional material with a lower organic content than usual, so after an earthquake more material is stored in the catchment, and a subsequent rainfall <span class="hlt">event</span> moves it into the lake, producing more sediment with a higher mineral content. Compared with the earthquake record (from 1949, USGS) and the precipitation record (from 1940, Central Weather Bureau, Taiwan), few earthquakes of magnitude greater than ML 7 occurred near the lakes. There were 28 rainfall <span class="hlt">events</span> near Emerald Peak Lake, 32 near Liyutan Lake and Sun Moon Lake, and 58 near Dongyuan Lake (rainfall <span class="hlt">event</span>: >250 mm/day). In the sediment analytical results, the ratio of whole to mineral grain size indeed shows trends similar to the earthquake record.
However, rainfall</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.8259B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.8259B"><span id="translatedtitle">Towards a unified study of extreme <span class="hlt">events</span> using universality concepts and transdisciplinary <span class="hlt">analysis</span> methods</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Balasis, George; Donner, Reik V.; Donges, Jonathan F.; Radebach, Alexander; Eftaxias, Konstantinos; Kurths, Jürgen</p> <p>2013-04-01</p> <p>The dynamics of many complex systems is characterized by the same universal principles. In particular, systems which are otherwise quite different in nature show striking similarities in their behavior near tipping points (bifurcations, phase transitions, sudden regime shifts) and associated extreme <span class="hlt">events</span>. Such critical phenomena are frequently found in diverse fields such as climate, seismology, or financial markets. Notably, the observed similarities include a high degree of organization, persistent behavior, and accelerated energy release, which are common to (among others) phenomena related to geomagnetic variability of the terrestrial magnetosphere (intense magnetic storms), seismic activity (electromagnetic emissions prior to earthquakes), solar-terrestrial physics (solar flares), neurophysiology (epileptic seizures), and socioeconomic systems (stock market crashes). It is an open question whether the spatial and temporal complexity associated with extreme <span class="hlt">events</span> arises from the system's structural organization (geometry) or from the chaotic behavior inherent to the nonlinear equations governing the dynamics of these phenomena. 
On the one hand, the presence of scaling laws associated with earthquakes and geomagnetic disturbances suggests understanding these <span class="hlt">events</span> as generalized phase transitions similar to nucleation and critical phenomena in thermal and magnetic systems. On the other hand, because of the structural organization of the systems (e.g., as complex networks) the associated spatial geometry and/or topology of interactions plays a fundamental role in the emergence of extreme <span class="hlt">events</span>. Here, a few aspects of the interplay between geometry and dynamics (critical phase transitions) that could result in the emergence of extreme <span class="hlt">events</span>, which is an open problem, will be discussed.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17177494','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17177494"><span id="translatedtitle"><span class="hlt">Event</span>-specific qualitative and quantitative polymerase chain reaction <span class="hlt">analysis</span> for genetically modified canola T45.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Yang, Litao; Pan, Aihu; Zhang, Haibo; Guo, Jinchao; Yin, Changsong; Zhang, Dabing</p> <p>2006-12-27</p> <p>Polymerase chain reaction (PCR) methods have been the main technical support for the detection of genetically modified organisms (GMOs). To date, GMO-specific PCR detection strategies have been developed basically at four different levels, such as screening-, gene-, construct-, and <span class="hlt">event</span>-specific detection methods. <span class="hlt">Event</span>-specific PCR detection method is the primary trend in GMO detection because of its high specificity based on the flanking sequence of exogenous integrant. 
GM canola, <span class="hlt">event</span> T45, with tolerance to glufosinate ammonium, is one of the commercial genetically modified (GM) canola <span class="hlt">events</span> approved in China. In this study, the 5'-integration junction sequence between host plant DNA and the integrated gene construct of T45 canola was cloned and revealed by means of TAIL-PCR. Specific PCR primers and TaqMan probes were designed based upon the revealed sequence, and qualitative and quantitative TaqMan real-time PCR detection assays employing these primers and probe were developed. In qualitative PCR, the limit of detection (LOD) was 0.1% for T45 canola in 100 ng of genomic DNA. The quantitative PCR assay showed limits of detection and quantification (LOD and LOQ) of 5 and 50 haploid genome copies, respectively. In addition, three mixed canola samples with known GM contents were tested using the developed real-time PCR assay, and the expected results were obtained. These results indicated that the developed <span class="hlt">event</span>-specific PCR methods can be used for identification and quantification of T45 canola and its derivatives.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26409433','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26409433"><span id="translatedtitle">Consensus <span class="hlt">analysis</span> of networks with time-varying topology and <span class="hlt">event</span>-triggered diffusions.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Han, Yujuan; Lu, Wenlian; Chen, Tianping</p> <p>2015-11-01</p> <p>This paper studies the consensus problem of networks with time-varying topology. <span class="hlt">Event</span>-triggered rules are employed in diffusion coupling terms to reduce the updating load of the coupled system.
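A minimal numerical sketch of such an event-triggered diffusion update follows (the threshold rule, step size, and ring graph are our own illustrative choices, not the paper's algorithm): each node re-broadcasts its state only when it has drifted from its last broadcast, and the diffusion term uses broadcast states only.

```python
import numpy as np

def event_triggered_consensus(A, x0, eps=0.2, threshold=0.01, steps=500):
    """Discrete-time consensus in which each node re-broadcasts its state
    only when it drifts more than `threshold` from its last broadcast."""
    x = np.array(x0, dtype=float)
    xhat = x.copy()                              # last broadcast states
    deg = A.sum(axis=1)
    n_events = 0
    for _ in range(steps):
        trigger = np.abs(x - xhat) > threshold   # event condition
        xhat[trigger] = x[trigger]               # re-broadcast on trigger
        n_events += int(trigger.sum())
        x = x + eps * (A @ xhat - deg * xhat)    # Laplacian-style diffusion
    return x, n_events

# ring of four nodes: states approach consensus using few broadcasts
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
x, n_events = event_triggered_consensus(A, [0.0, 1.0, 2.0, 3.0])
```

With the threshold rule, residual disagreement stays on the order of the threshold rather than vanishing exactly, which is the price paid for reducing the updating load.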
Two strategies are considered: an <span class="hlt">event</span>-triggered strategy, in which each node observes the state information continuously to determine the next triggering <span class="hlt">event</span> time, and a self-triggered strategy, in which each node only needs to observe the state information at the <span class="hlt">event</span> time to predict the next triggering <span class="hlt">event</span> time. In each strategy, two kinds of algorithms are considered: the pull-based algorithm, in which the diffusion coupling term of every node is updated with the latest observations of its neighborhood at its own triggered times, and the push-based algorithm, in which the diffusion coupling term of every node uses the state information of its neighborhood at their latest triggered times. It is proved that if the coupling matrix across time intervals with length less than some given constant has spanning trees, then the proposed algorithms can realize consensus. Examples with numerical simulation are provided to show the effectiveness of the theoretical results.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2015EGUGA..17.7215K&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2015EGUGA..17.7215K&link_type=ABSTRACT"><span id="translatedtitle">Definition of Stratospheric Sudden Warming <span class="hlt">Events</span> for Multi-Model <span class="hlt">Analysis</span> and Its Application to the CMIP5</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kim, Junsu; Son, Seok-Woo; Park, Hyo-Seok</p> <p>2015-04-01</p> <p>The onset of major stratospheric sudden warming (SSW) <span class="hlt">events</span> has often been defined as the date when the westerly at 10 hPa and 60°N turns to easterly during winter, corresponding to a polar stratosphere warmer than the mid-latitudes.
This simple definition effectively detects the observed characteristics of SSW, but its application to climate models, which have different background flows and temporal variability, is often challenging. For example, a model whose stratospheric mean wind is too weak tends to overestimate the frequency of zonal-wind reversals and SSW <span class="hlt">events</span>. In this study we propose a simple definition of major SSW <span class="hlt">events</span> that is applicable to multi-model <span class="hlt">analysis</span>. Specifically, SSW <span class="hlt">events</span> are defined when the tendency of the zonal-mean zonal wind at 10 hPa and 60°N crosses -1 m/s/day within 30 to 40 days while growing in magnitude. This tendency-based definition, which is independent of the mean wind, is applied to both the ERA40 reanalysis and the CMIP5 models. The models are further grouped into high-top models with a well-resolved stratosphere and low-top models with a relatively simple stratosphere. The new definition successfully reproduces the mean frequency of SSW <span class="hlt">events</span> identified by the wind-reversal approach, i.e., about 6 <span class="hlt">events</span> per decade in ERA40. High-top models capture this frequency well. Although low-top models underestimate the frequency, in contrast to previous studies the difference from high-top models is not statistically significant. Likewise, no significant difference is found in the downward coupling between the high-top and low-top models.
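A simplified reading of the tendency-based criterion above can be sketched as follows (only the threshold crossing is shown; the additional 30-40 day growth-in-magnitude requirement is omitted, and the function name and synthetic series are our own):

```python
import numpy as np

def tendency_events(u, rate=-1.0, dt=1.0):
    """Indices where the zonal-mean zonal-wind tendency du/dt (m/s/day)
    drops below `rate`, for a daily wind series u (e.g. at 10 hPa, 60N)."""
    dudt = np.gradient(u, dt)        # centered differences, one-sided at ends
    return np.flatnonzero(dudt < rate)

# synthetic series: 10 quiet days at 30 m/s, then a 2 m/s/day deceleration
u = np.concatenate([np.full(10, 30.0), 30.0 - 2.0 * np.arange(1, 11)])
idx = tendency_events(u)             # flags only the deceleration phase
```

Because the criterion acts on the tendency rather than the wind itself, a model with a weak climatological jet is not automatically penalized with spurious easterly reversals, which is the point of the definition.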
These results indicate that model vertical resolution itself may not be a key factor in simulating SSW <span class="hlt">events</span> and the associated downward coupling.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/24403259','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/24403259"><span id="translatedtitle">Therapeutic potential and adverse <span class="hlt">events</span> of everolimus for treatment of hepatocellular carcinoma - systematic review and meta-<span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Yamanaka, Kenya; Petrulionis, Marius; Lin, Shibo; Gao, Chao; Galli, Uwe; Richter, Susanne; Winkler, Susanne; Houben, Philipp; Schultze, Daniel; Hatano, Etsuro; Schemmer, Peter</p> <p>2013-12-01</p> <p>Everolimus is an orally administered mammalian target of rapamycin (mTOR) inhibitor. Several large-scale randomized controlled trials (RCTs) have demonstrated the survival benefits of everolimus at the dose of 10 mg/day for solid cancers. Furthermore, mTOR-inhibitor-based immunosuppression is associated with survival benefits for patients with hepatocellular carcinoma (HCC) who have received liver transplantation. However, a low rate of tumor reduction and some adverse <span class="hlt">events</span> have been reported. This review summarizes the antitumor effects and adverse <span class="hlt">events</span> of everolimus and evaluates its possible application in advanced HCC. For the meta-<span class="hlt">analysis</span> of adverse <span class="hlt">events</span>, we used the RCTs for solid cancers. The odds ratios of adverse <span class="hlt">events</span> were calculated using the Peto method. Many preclinical studies demonstrated that everolimus had antitumor effects such as antiproliferation and antiangiogenesis.
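The Peto method mentioned above combines an observed-minus-expected event count with a hypergeometric variance; a minimal sketch for a single 2×2 table follows (the example counts are illustrative, not the review's data):

```python
import math

def peto_odds_ratio(a, n1, c, n2):
    """Peto one-step odds ratio for a 2x2 table: a events among n1
    treated participants, c events among n2 controls."""
    N, m1 = n1 + n2, a + c                            # totals and total events
    expected = n1 * m1 / N                            # E[a] under the null
    variance = n1 * n2 * m1 * (N - m1) / (N**2 * (N - 1))
    return math.exp((a - expected) / variance)        # psi = exp((O - E) / V)

# e.g. 10/100 events on treatment v 5/100 on control gives roughly 2.05
psi = peto_odds_ratio(10, 100, 5, 100)
```

In a meta-analysis, the per-trial (O - E) and V terms are summed across trials before exponentiating, which is what makes the Peto method well suited to the rare adverse events pooled in the review above.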
However, some differences in the effects were observed among in vivo animal studies for HCC treatment. Meanwhile, clinical studies demonstrated that the response rate of single-agent everolimus was low, though survival benefits could be expected. The meta-<span class="hlt">analysis</span> revealed the following odds ratios (95% confidence intervals [CIs]): stomatitis, 5.42 [4.31-6.73]; hyperglycemia, 3.22 [2.37-4.39]; anemia, 3.34 [2.37-4.67]; pneumonitis, 6.02 [3.95-9.16]; elevated aspartate aminotransferase levels, 2.22 [1.37-3.62]; and elevated serum alanine aminotransferase levels, 2.94 [1.72-5.02]. Everolimus at a dose of 10 mg/day significantly increased the risk of these adverse <span class="hlt">events</span>. In order to enable its application alongside the standard therapies for HCC, further studies are required to enhance the antitumor effects and manage the adverse <span class="hlt">events</span> of everolimus. PMID:24403259</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFMNH51C1910C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFMNH51C1910C"><span id="translatedtitle">Inverse modeling of storm intensity based on grain-size <span class="hlt">analysis</span> of hurricane-induced <span class="hlt">event</span> beds</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Castagno, K. A.; Donnelly, J. P.</p> <p>2015-12-01</p> <p>As the coastal population continues to grow in size and wealth, increased hurricane frequency and intensity present a growing threat of property damage and loss of life. Recent reconstructions of past intense-hurricane landfalls from sediment cores in southeastern New England identify a series of active intervals over the past 2,000 years, with the last few centuries among the most quiescent intervals. 
The frequency of intense-hurricane landfalls in southeastern New England is well constrained, but the intensity of these storms, particularly prehistoric <span class="hlt">events</span>, is not. This study analyzes the grain sizes of major storm <span class="hlt">event</span> beds along a transect of sediment cores in Salt Pond, Falmouth, MA. Several prehistoric <span class="hlt">events</span> contain more coarse material than any of the deposits from the historical interval, suggesting that landfalling hurricanes in the northeastern United States may have been more intense than the historically encountered category 2 and 3 storms. The intensity of major storm <span class="hlt">events</span> is estimated using grain-size <span class="hlt">analysis</span> with a digital image processing, size, and shape analyzer. Since <span class="hlt">event</span> deposits in Salt Pond result from a combination of coastal inundation and wave action, a large population of both historical and synthetic storms is used to assess the storm characteristics that could result in the wave heights inversely modeled from grain size trends. Intense-hurricane activity may be closely tied to warming in sea surface temperature. 
As such, the prehistoric intervals of increased frequency and intensity provide potential analogs for current and future hurricane risk in the northeastern United States.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5063115','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5063115"><span id="translatedtitle">Life <span class="hlt">events</span>, anxiety, social support, personality, and alexithymia in female patients with chronic pain: A path <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Zeng, Fanmin; Yang, Bangxiang; Fu, Xiaoqian</p> <p>2015-01-01</p> <p>Abstract Introduction This study sought to identify a model that explains the relationship between psychosocial factors and chronic pain in female patients, and to explore all of these constructs in a single study and provide a more holistic examination of the overall psychosocial factors that female patients with chronic pain encounter. Methods Female patients with chronic pain (n = 147), aged 20–65 (M = 34.9 years, SD = 11.25), from an outpatient pain clinic completed a cross‐sectional self‐report questionnaire on anxiety, life <span class="hlt">events</span>, personality, social support, and alexithymia. Data were analyzed by means of path <span class="hlt">analysis</span>. Results The direct effect of anxiety on female patients with chronic pain was greatest among all the paths. Personality and alexithymia led to chronic pain in female patients only indirectly, mediated by life <span class="hlt">events</span>. The personality factors of neuroticism and extraversion were associated positively with social support, which had an indirect effect on the influence of life <span class="hlt">events</span> on chronic pain. 
However, alexithymia was associated negatively with social support, which had an indirect effect on the influence of life <span class="hlt">events</span> on chronic pain. Discussion Our findings provide evidence that life <span class="hlt">events</span> are a mediator in the relationship between personality, social support, alexithymia, and chronic pain in female patients. PMID:26568558</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1812396B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1812396B"><span id="translatedtitle">Multiple <span class="hlt">event</span> location <span class="hlt">analysis</span> of aftershock sequences in the Pannonian basin</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bekesi, Eszter; Sule, Balint; Bondar, Istvan</p> <p>2016-04-01</p> <p>Accurate seismic <span class="hlt">event</span> location is crucial to understanding tectonic processes such as crustal faults, which are most commonly investigated by studying seismic activity. Location errors can be significantly reduced using multiple <span class="hlt">event</span> location methods. We applied the double-difference method to relocate the earthquake that occurred near Oroszlány and its 200 aftershocks to identify the geometry of the related fault. We used the extended ISC location algorithm, iLoc, to determine the absolute single <span class="hlt">event</span> locations for the Oroszlány aftershock sequence and then applied the double-difference algorithm to the new hypocenters. To improve location precision, we added differential times from waveform cross-correlation to the multiple <span class="hlt">event</span> location process to increase the accuracy of arrival time readings. We also tested the effect of various local 1-D velocity models on the results. 
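The core of the double-difference method mentioned above can be illustrated with a minimal sketch (illustrative values only, not the hypoDD implementation): for an event pair observed at a common station, the inversion minimizes the residual between observed and model-predicted differential arrival times, so path and model errors shared by the two nearby rays cancel.

```python
def double_difference(t_obs_j, t_obs_k, t_calc_j, t_calc_k):
    """Double-difference residual for one event pair (j, k) at one station:
    the observed differential arrival time minus the differential time
    predicted by the velocity model and the current hypocenters."""
    return (t_obs_j - t_obs_k) - (t_calc_j - t_calc_k)

# hypothetical picks and predictions (seconds): the 0.07 s mismatch is what
# the relocation step tries to drive toward zero by perturbing hypocenters
residual = double_difference(12.31, 11.49, 12.20, 11.45)
```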
We compared hypoDD results of bulletin and iLoc hypocenters to investigate the effect of initial hypocenter parameters on the relocation process. We show that hypoDD collapses the initial, rather diffuse locations into a smaller cluster and the vertical cross-sections show sharp images of seismicity. Unsurprisingly, the combined use of catalog and cross-correlation data sets provides the more accurate locations. Some of the relocated <span class="hlt">events</span> in the cluster are ground truth quality with a location accuracy of 5 km or better. Having achieved accurate locations for the <span class="hlt">event</span> cluster we are able to resolve the fault plane ambiguity in the moment tensor solutions and determine the accurate strike of the fault.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li class="active"><span>17</span></li> <li><a href="#" onclick='return showDiv("page_18");'>18</a></li> <li><a href="#" onclick='return showDiv("page_19");'>19</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_17 --> <div id="page_18" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li class="active"><span>18</span></li> <li><a href="#" onclick='return showDiv("page_19");'>19</a></li> <li><a href="#" onclick='return showDiv("page_20");'>20</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="341"> 
<li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016ascl.soft03004M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016ascl.soft03004M"><span id="translatedtitle">gPhoton: Time-tagged GALEX photon <span class="hlt">events</span> <span class="hlt">analysis</span> tools</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Million, Chase C.; Fleming, S. W.; Shiao, B.; Loyd, P.; Seibert, M.; Smith, M.</p> <p>2016-03-01</p> <p>Written in Python, gPhoton calibrates and sky-projects the ~1.1 trillion ultraviolet photon <span class="hlt">events</span> detected by the microchannel plates on the Galaxy Evolution Explorer Spacecraft (GALEX), archives these <span class="hlt">events</span> in a publicly accessible database at the Mikulski Archive for Space Telescopes (MAST), and provides tools for working with the database to extract scientific results, particularly over short time domains. 
The software includes a re-implementation of core functionality of the GALEX mission calibration pipeline to produce photon list files from raw spacecraft data as well as a suite of command line tools to generate calibrated light curves, images, and movies from the MAST database.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009EGUGA..11.7089K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009EGUGA..11.7089K"><span id="translatedtitle">Impacts of extreme temperature <span class="hlt">events</span> on mortality: <span class="hlt">analysis</span> over individual seasons</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kysely, J.; Plavcova, E.; Kyncl, J.; Kriz, B.; Pokorna, L.</p> <p>2009-04-01</p> <p>Extreme temperature <span class="hlt">events</span> influence human society in many ways, including impacts on morbidity and mortality. While the effects of hot summer periods are relatively direct in mid-latitudinal regions, much less is known and little consensus has been achieved about possible consequences of both positive and negative temperature extremes in other parts of the year. The study examines links between spells of hot and cold temperature anomalies and daily all-cause (total) mortality and mortality due to cardiovascular diseases in the population of the Czech Republic (central Europe) in individual seasons (DJF, MAM, JJA, SON). The datasets cover the period 1986-2006. Hot (cold) spells are defined in terms of anomalies of average daily temperature from the mean annual cycle as periods of at least 2 successive days on which the anomalies are above (below) the 95% (5%) quantile of the empirical distribution of the anomalies. 
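As an illustration (not the authors' code), the spell definition above can be sketched for a daily anomaly series; the minimum run length of 2 days and the 95%/5% quantile thresholds come from the text, while the function name and the synthetic data are assumptions.

```python
import numpy as np

def find_spells(anom, q=0.95, min_len=2, hot=True):
    """Return (start, end) index pairs of runs of at least `min_len`
    consecutive days whose anomaly lies above the q-quantile (hot spells)
    or below the (1 - q)-quantile (cold spells) of the whole series."""
    thr = np.quantile(anom, q if hot else 1.0 - q)
    mask = anom > thr if hot else anom < thr
    spells, start = [], None
    for i, flag in enumerate(mask):
        if flag and start is None:
            start = i                      # a run begins
        elif not flag and start is not None:
            if i - start >= min_len:       # keep only runs of >= min_len days
                spells.append((start, i - 1))
            start = None
    if start is not None and len(mask) - start >= min_len:
        spells.append((start, len(mask) - 1))
    return spells

# synthetic anomaly series: one 3-day warm excursion and one isolated warm day
anom = np.zeros(100)
anom[10:13] = 5.0
anom[50] = 5.0
hot_spells = find_spells(anom, hot=True)
```

Note that the single warm day is discarded by the 2-day minimum, mirroring the "at least 2 successive days" requirement.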
Excess daily mortality is established by calculating deviations of the observed number of deaths from the expected number of deaths, which takes into account effects of long-term changes in mortality and the annual cycle. Periods when mortality is affected by influenza and acute respiratory infection outbreaks have been identified and excluded from the datasets before the <span class="hlt">analysis</span>. The study is carried out for several population groups in order to identify dependence of the mortality impacts on age and gender; in particular, we focus on differences in the impacts on the elderly (70+ yrs) and younger age groups (0-69 yrs). Although results for hot- and cold-related mortality are less conclusive in seasons outside summer, significant links are found in several cases. The <span class="hlt">analysis</span> reveals that - the largest effects of either hot or cold spells are observed for hot spells in JJA, with a 14% (16%) increase in mortality for the 1-day lag for all ages (70+ yrs); - much smaller but still significant effects are associated with hot spells in MAM; - the</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=placebo&pg=2&id=EJ788818','ERIC'); return false;" href="http://eric.ed.gov/?q=placebo&pg=2&id=EJ788818"><span id="translatedtitle">Meta-<span class="hlt">Analysis</span> of Suicide-Related Behavior <span class="hlt">Events</span> in Patients Treated with Atomoxetine</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Bangs, Mark E.; Tauscher-Wisniewski, Sitra; Polzer, John; Zhang, Shuyu; Acharya, Nayan; Desaiah, Durisala; Trzepacz, Paula T.; Allen, Albert J.</p> <p>2008-01-01</p> <p>A study to examine suicide-related <span class="hlt">events</span> in acute, double-blind, and placebo-controlled trials with atomoxetine is conducted. 
Results indicate that suicide-related events were more frequent in children with ADHD treated with atomoxetine than in those treated with placebo.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..14.6560C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..14.6560C"><span id="translatedtitle">Exploratory <span class="hlt">analysis</span> of rainfall <span class="hlt">events</span> in Coimbra, Portugal: variability of raindrop characteristics</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Carvalho, S. C. P.; de Lima, M. I. P.; de Lima, J. L. M. P.</p> <p>2012-04-01</p> <p>Laser disdrometers can efficiently monitor rainfall characteristics at small temporal scales, providing data on rain intensity, raindrop diameter and fall speed, and raindrop counts over time. This type of data allows for an increased understanding of rainfall structure at small time scales. Of particular interest for many hydrological applications is the characterization of the properties of extreme <span class="hlt">events</span>, including the intra-<span class="hlt">event</span> variability, which are affected by different factors (e.g. geographical location, rainfall generating mechanisms). These properties depend on the microphysical, dynamical and kinetic processes that interact to produce rain. In this study we explore rainfall data obtained during two years with a laser disdrometer installed in the city of Coimbra, in the centre region of mainland Portugal. The equipment was developed by Thies Clima. The data temporal resolution is one minute. 
Descriptive statistics of time series of raindrop diameter (D), fall speed, kinetic energy, and rain rate were studied at the <span class="hlt">event</span> scale; for different variables, the average, maximum, minimum, median, variance, standard deviation, quartiles, coefficient of variation, skewness and kurtosis were determined. The empirical raindrop size distribution, N(D), was also calculated. Additionally, the parameterization of rainfall was attempted by investigating the applicability of different theoretical statistical distributions to fit the empirical data (e.g. exponential, gamma and lognormal distributions). As expected, preliminary results show that rainfall properties and structure vary with rainfall type and weather conditions over the year. Although only two years were investigated, some insight into the structure of different rain <span class="hlt">events</span> was already obtained.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=extreme+AND+event&pg=4&id=EJ792171','ERIC'); return false;" href="http://eric.ed.gov/?q=extreme+AND+event&pg=4&id=EJ792171"><span id="translatedtitle">Descriptive <span class="hlt">Analysis</span> of Classroom Setting <span class="hlt">Events</span> on the Social Behaviors of Children with Autism Spectrum Disorder</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Boyd, Brian A.; Conroy, Maureen A.; Asmus, Jennifer M.; McKenney, Elizabeth L. W.; Mancil, G. Richmond</p> <p>2008-01-01</p> <p>Children with Autism Spectrum Disorder (ASD) are characterized by extreme deficits in social relatedness with same-age peers. 
The purpose of this descriptive study was to identify naturally occurring antecedent variables (i.e., setting <span class="hlt">events</span>) in the classroom environments of children with ASD that promoted their engagement in peer-related social…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=parental+AND+conflict+AND+child+AND+well&id=EJ959041','ERIC'); return false;" href="http://eric.ed.gov/?q=parental+AND+conflict+AND+child+AND+well&id=EJ959041"><span id="translatedtitle">Parental Separation and Child Aggressive and Internalizing Behavior: An <span class="hlt">Event</span> History Calendar <span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Averdijk, Margit; Malti, Tina; Eisner, Manuel; Ribeaud, Denis</p> <p>2012-01-01</p> <p>This study investigated the relationship between parental separation and aggressive and internalizing behavior in a large sample of Swiss children drawn from the ongoing Zurich Project on the Social Development of Children and Youths. 
Parents retrospectively reported life <span class="hlt">events</span> and problem behavior for the first 7 years of the child's life on a…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=willpower+OR+self+AND+control&pg=3&id=EJ980651','ERIC'); return false;" href="http://eric.ed.gov/?q=willpower+OR+self+AND+control&pg=3&id=EJ980651"><span id="translatedtitle">Further <span class="hlt">Analysis</span> of Variables That Affect Self-Control with Aversive <span class="hlt">Events</span></span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Perrin, Christopher J.; Neef, Nancy A.</p> <p>2012-01-01</p> <p>The purpose of this study was to examine variables that affect self-control in the context of academic task completion by elementary school children with autism. In the baseline assessment of Study 1, mathematics problem completion was shown to be an aversive <span class="hlt">event</span>, and sensitivity to task magnitude, task difficulty, and delay to task completion…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016DyAtO..75...22Y','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016DyAtO..75...22Y"><span id="translatedtitle">An <span class="hlt">analysis</span> of extreme intraseasonal rainfall <span class="hlt">events</span> during January-March 2010 over eastern China</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Yao, Suxiang; Huang, Qian</p> <p>2016-09-01</p> <p>The precipitation over eastern China during January-March 2010 exhibited a marked intraseasonal oscillation (ISO) and a dominant period of 10-60 days. There were two active intraseasonal rainfall periods. 
The physical mechanisms responsible for the onset of the two rainfall <span class="hlt">events</span> were investigated using ERA-interim data. In the first ISO <span class="hlt">event</span>, anomalous ascending motion was triggered by vertically integrated (1000-300 hPa) warm temperature advection. In addition to southerly anomalies on the intraseasonal (10-60-day) timescale, synoptic-scale southeasterly winds helped advect warm air from the South China Sea and western Pacific into the rainfall region. In the second ISO <span class="hlt">event</span>, anomalous convection was triggered by a convectively unstable stratification, which was caused primarily by anomalous moisture advection in the lower troposphere (1000-850 hPa) from the Bay of Bengal and the Indo-China Peninsula. Both the intraseasonal and the synoptic winds contributed to the anomalous moisture advection. Therefore, winter intraseasonal rainfall <span class="hlt">events</span> over East Asia could be affected not only by intraseasonal activity but also by higher-frequency disturbances.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016SPD....4710101R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016SPD....4710101R"><span id="translatedtitle">Transition Region Explosive <span class="hlt">Events</span> in He II 304Å: Observation and <span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rust, Thomas; Kankelborg, Charles C.</p> <p>2016-05-01</p> <p>We present examples of transition region explosive <span class="hlt">events</span> observed in the He II 304Å spectral line with the Multi Order Solar EUV Spectrograph (MOSES). 
With small (<5000 km) spatial scale and large non-thermal (100-150 km/s) velocities these <span class="hlt">events</span> satisfy the observational signatures of transition region explosive <span class="hlt">events</span>. Derived line profiles show distinct blue and red velocity components with very little broadening of either component. We observe little to no emission from low velocity plasma, making the plasmoid instability reconnection model unlikely as the plasma acceleration mechanism for these <span class="hlt">events</span>. Rather, the single speed, bi-directional jet characteristics suggested by these data are consistent with acceleration via Petschek reconnection.Observations were made during the first sounding rocket flight of MOSES in 2006. MOSES forms images in 3 orders of a concave diffraction grating. Multilayer coatings largely restrict the passband to the He II 303.8Å and Si XI 303.3Å spectral lines. The angular field of view is about 8.5'x17', or about 20% of the solar disk. These images constitute projections of the volume I(x,y,λ), the intensity as a function of sky plane position and wavelength. Spectral line profiles are recovered via tomographic inversion of these projections. 
Inversion is carried out using a multiplicative algebraic reconstruction technique.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/23322764','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/23322764"><span id="translatedtitle">Automated detection of instantaneous gait <span class="hlt">events</span> using time frequency <span class="hlt">analysis</span> and manifold embedding.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Aung, Min S H; Thies, Sibylle B; Kenney, Laurence P J; Howard, David; Selles, Ruud W; Findlow, Andrew H; Goulermas, John Y</p> <p>2013-11-01</p> <p>Accelerometry is a widely used sensing modality in human biomechanics due to its portability, non-invasiveness, and accuracy. However, difficulties lie in signal variability and interpretation in relation to biomechanical <span class="hlt">events</span>. In walking, heel strike and toe off are primary gait <span class="hlt">events</span> where robust and accurate detection is essential for gait-related applications. This paper describes a novel and generic <span class="hlt">event</span> detection algorithm applicable to signals from tri-axial accelerometers placed on the foot, ankle, shank or waist. Data from healthy subjects undergoing multiple walking trials on flat and inclined, as well as smooth and tactile, paving surfaces are acquired for experimentation. The benchmark timings at which heel strike and toe off occur are determined using kinematic data recorded from a motion capture system. The algorithm extracts features from each of the acceleration signals using a continuous wavelet transform over a wide range of scales. A locality preserving embedding method is then applied to reduce the high dimensionality caused by the multiple scales while preserving salient features for classification. 
A simple Gaussian mixture model is then trained to classify each of the time samples into heel strike, toe off or no <span class="hlt">event</span> categories. Results show good detection and temporal accuracies for different sensor locations and different walking terrains. PMID:23322764</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4295038','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4295038"><span id="translatedtitle">Plasma properties from the multi-wavelength <span class="hlt">analysis</span> of the November 1st 2003 CME/shock <span class="hlt">event</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Benna, Carlo; Mancuso, Salvatore; Giordano, Silvio; Gioannini, Lorenzo</p> <p>2012-01-01</p> <p>The <span class="hlt">analysis</span> of the spectral properties and dynamic evolution of a CME/shock <span class="hlt">event</span> observed on November 1st 2003 in white-light by the LASCO coronagraph and in the ultraviolet by the UVCS instrument operating aboard SOHO, has been performed to compute the properties of some important plasma parameters in the middle corona below about 2R⊙. Simultaneous observations obtained with the MLSO/Mk4 white-light coronagraph, providing both the early evolution of the CME expansion in the corona and the pre-shock electron density profile along the CME front, were also used to study this <span class="hlt">event</span>. By combining the above information with the <span class="hlt">analysis</span> of the metric type II radio emission detected by ground-based radio spectrographs, we finally derive estimates of the values of the local Alfvén speed and magnetic field strength in the solar corona. 
PMID:25685432</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFMDI21A2598K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFMDI21A2598K"><span id="translatedtitle">Applicability of the Multiple-<span class="hlt">event</span> Stacking Technique for Shear-wave Splitting <span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kong, F.; Gao, S. S.; Liu, K. H.</p> <p>2015-12-01</p> <p>For several decades, shear wave splitting (SWS) parameters (fast polarization orientations and splitting times) have been widely measured to reveal the orientation and strength of mantle anisotropy. One of the most widely used techniques for obtaining station-averaged SWS parameters is the multiple-<span class="hlt">event</span> stacking technique (MES). Results from previous studies suggest that the splitting times obtained using MES are frequently smaller than those derived from simple averaging of splitting times obtained using the <span class="hlt">event</span>-specific technique of Silver and Chan (1991) (SC). To confirm such apparent discrepancies between the two widely used methods and to explore the causes, we conduct numerical experiments using both synthetic and observed data. The results show that when the anisotropic structure can be represented by a horizontal single layer of anisotropy with constant or spatially varying splitting times, MES can accurately retrieve the splitting parameters. However, when the fast orientations or both splitting parameters vary azimuthally due to lateral heterogeneities or double-layer anisotropy, the station-averaged fast orientations from MES and SC are mostly comparable, but the splitting times obtained using MES are underestimated. 
For laterally varying fast orientations in the vicinity of a station, the magnitude of the underestimation is dependent on the arrival azimuths of the <span class="hlt">events</span> participating in the stacking; for two-layer models of anisotropy, the resulting splitting parameters using MES are biased towards those of the top layer, due to the dominance of <span class="hlt">events</span> with a back azimuth parallel or orthogonal to the fast orientation of the lower layer. Nevertheless, MES can still be applied in areas with complex or spatially varying anisotropy to obtain reliable results by stacking <span class="hlt">events</span> from narrow back-azimuthal windows, especially when limited amounts of high-quality data are present.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011AGUFM.A23C0178M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011AGUFM.A23C0178M"><span id="translatedtitle"><span class="hlt">Analysis</span> of Extreme <span class="hlt">Events</span> in Regional Climate Model Simulations for the Pacific Northwest using weatherathome</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mera, R. J.; Mote, P.; Weber, J.</p> <p>2011-12-01</p> <p>One of the most prominent impacts of climate change over the Pacific Northwest is the potential for an elevated number of extreme precipitation <span class="hlt">events</span> over the region. Planning for natural hazards such as the increasing number of floods related to high-precipitation <span class="hlt">events</span> has, in general, focused on avoiding development in floodplains and conditioning development to withstand inundation with a minimum of losses. Nationwide, the Federal Emergency Management Agency (FEMA) estimates that about one quarter of its payments cover damage that has occurred outside mapped floodplains. 
It is clear that traditional flood-based planning will not be sufficient to predict and avoid future losses resulting from climate-related hazards such as high-precipitation <span class="hlt">events</span>. In order to address this problem, the present study employs regional climate model output for future climate change scenarios to aid with the development of a map-based inventory of future hazard risks that can contribute to the development of a "planning-scale" decision support system for the Oregon Department of Land Conservation and Development (DLCD). Climate model output is derived from the climateprediction.net (CPDN) weatherathome project, an innovative climate science experiment that utilizes volunteer computers from users worldwide to produce superensembles of hundreds of thousands of regional climate simulations of the Western United States climate from 1950 to 2050. The spatial and temporal distributions of extreme weather <span class="hlt">events</span> are analyzed for the Pacific Northwest to diagnose the model's capabilities as an input for map products such as impacts on hydrology. Special attention is given to the intensity and frequency of Atmospheric River <span class="hlt">events</span> in historical and future climate contexts.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1712643E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1712643E"><span id="translatedtitle">Geohazard assessment through the <span class="hlt">analysis</span> of historical alluvial <span class="hlt">events</span> in Southern Italy</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Esposito, Eliana; Violante, Crescenzo</p> <p>2015-04-01</p> <p>The risk associated with extreme water <span class="hlt">events</span> such as flash floods, results from a combination of overflows and landslides hazards. 
A multi-hazard approach has been utilized to analyze the 1773 flood that occurred in conjunction with heavy rainfall, causing major damage in terms of lost lives and economic cost over an area of 200 km2, including both the coastal strip between Salerno and Maiori and the Apennine hinterland, Campania region - Southern Italy. This area has been affected by a total of 40 flood <span class="hlt">events</span> over the last five centuries, 26 of which occurred between 1900 and 2000. Streamflow <span class="hlt">events</span> have produced severe impacts on Cava de' Tirreni (SA) and its territory; in particular, four catastrophic floods, in 1581, 1773, 1899 and 1954, caused a pervasive pattern of destruction. In the study area, rainstorm <span class="hlt">events</span> typically occur in small and medium-sized fluvial systems, characterized by small catchment areas and high-elevation drainage basins, causing the detachment of large amounts of volcaniclastic and siliciclastic covers from the carbonate bedrock. The mobilization of these deposits (slope debris) mixed with rising floodwaters along the water paths can produce fast-moving streamflows of large proportion with significant hazardous implications (Violante et al., 2009). In this context, the study of the 1773 historical flood allows the detection and definition of those areas where catastrophic <span class="hlt">events</span> repeatedly took place over time.
Moreover, it improves the understanding of the phenomena themselves, including some key elements in the management of risk mitigation, such as the restoration of the damage suffered by the buildings and/or the environmental effects caused by the floods.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26516102','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26516102"><span id="translatedtitle">Adverse <span class="hlt">events</span> and treatment failure leading to discontinuation of recently approved antipsychotic drugs in schizophrenia: A network meta-<span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Tonin, Fernanda S; Piazza, Thais; Wiens, Astrid; Fernandez-Llimos, Fernando; Pontarolo, Roberto</p> <p>2015-12-01</p> <p>Objective: We aimed to gather evidence of the discontinuation rates owing to adverse <span class="hlt">events</span> or treatment failure for four recently approved antipsychotics (asenapine, blonanserin, iloperidone, and lurasidone). Methods: A systematic review followed by pairwise meta-<span class="hlt">analysis</span> and mixed treatment comparison meta-<span class="hlt">analysis</span> (MTC) was performed, including randomized controlled trials (RCTs) that compared the use of the above-mentioned drugs versus placebo in patients with schizophrenia. An electronic search was conducted in PubMed, Scopus, Science Direct, Scielo, the Cochrane Library, and International Pharmaceutical Abstracts (January 2015). The included trials were at least single-blinded. The main outcome measures extracted were discontinuation owing to adverse <span class="hlt">events</span> and discontinuation owing to treatment failure. Results: Fifteen RCTs were identified (n = 5400 participants) and 13 of them were amenable for use in our meta-analyses.
No significant differences were observed between any of the four drugs and placebo as regards discontinuation owing to adverse <span class="hlt">events</span>, whether in pairwise meta-<span class="hlt">analysis</span> or in MTC. All drugs presented a better profile than placebo on discontinuation owing to treatment failure, both in pairwise meta-<span class="hlt">analysis</span> and MTC. Asenapine was found to be the best therapy in terms of discontinuation owing to treatment failure, while lurasidone was the worst treatment in terms of adverse <span class="hlt">events</span>. The evidence around blonanserin is weak. Conclusion: MTCs allowed the creation of two different rank orders of these four antipsychotic drugs in two outcome measures. This evidence-generating method allows direct and indirect comparisons, supporting approval and pricing decisions when sufficient direct head-to-head trials are lacking.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/25673245','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/25673245"><span id="translatedtitle">Development and application of a multi-targeting reference plasmid as calibrator for <span class="hlt">analysis</span> of five genetically modified soybean <span class="hlt">events</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Pi, Liqun; Li, Xiang; Cao, Yiwei; Wang, Canhua; Pan, Liangwen; Yang, Litao</p> <p>2015-04-01</p> <p>Reference materials are important in accurate <span class="hlt">analysis</span> of genetically modified organism (GMO) contents in food/feeds, and the development of novel reference plasmids is a new trend in the research of GMO reference materials.
Herein, we constructed a novel multi-targeting plasmid, pSOY, which contained seven <span class="hlt">event</span>-specific sequences of five GM soybeans (MON89788-5', A2704-12-3', A5547-127-3', DP356043-5', DP305423-3', A2704-12-5', and A5547-127-5') and the sequence of the soybean endogenous reference gene Lectin. We evaluated the specificity, limits of detection and quantification, and applicability of pSOY in both qualitative and quantitative PCR analyses. The limit of detection (LOD) was as low as 20 copies in qualitative PCR, and the limit of quantification (LOQ) in quantitative PCR was 10 copies. In quantitative real-time PCR <span class="hlt">analysis</span>, the PCR efficiencies of all <span class="hlt">event</span>-specific and Lectin assays were higher than 90%, and the squared regression coefficients (R²) were greater than 0.999. The quantification bias varied from 0.21% to 19.29%, and the relative standard deviations were from 1.08% to 9.84% in the <span class="hlt">analysis</span> of simulated samples. All the results demonstrated that the developed multi-targeting plasmid, pSOY, was a credible substitute for matrix-based reference materials, and could be used as a reliable reference calibrator in the identification and quantification of multiple GM soybean <span class="hlt">events</span>.
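The efficiency and copy-number figures quoted above follow from standard-curve arithmetic. As a reminder of that arithmetic, here is a generic qPCR standard-curve sketch; it is not the authors' assay, and the slope and intercept values are hypothetical:

```python
def pcr_efficiency(slope):
    """Amplification efficiency from the standard-curve slope
    (Cq versus log10 copies); -3.32 corresponds to ~100% (perfect doubling)."""
    return 10 ** (-1.0 / slope) - 1.0

def copies_from_cq(cq, intercept, slope):
    """Invert the standard curve Cq = slope * log10(copies) + intercept."""
    return 10 ** ((cq - intercept) / slope)

print(pcr_efficiency(-3.3219))              # ~1.0, i.e. ~100% efficiency
print(copies_from_cq(40.0, 40.0, -3.3219))  # 1 copy at the curve intercept
```

An assay reporting ">90% efficiency" therefore has a standard-curve slope between about -3.32 and -3.58.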
PMID:25673245</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2956185','PMC'); return false;" href="http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2956185"><span id="translatedtitle">The Use of Qualitative Comparative <span class="hlt">Analysis</span> for Critical <span class="hlt">Event</span> Research in Alcohol and HIV in Mumbai, India</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Chandran, Devyani; Singh, S. K.; Berg, Marlene; Singh, Sharad; Gupta, Kamla</p> <p>2010-01-01</p> <p>In this paper we use Qualitative Comparative <span class="hlt">Analysis</span> (QCA) in critical <span class="hlt">event</span> <span class="hlt">analysis</span> to identify under what conditions alcohol is necessary in contributing to unprotected sex. The paper is based on a set of in-depth interviews with 84 men aged 18–29 from three typical low income communities in Mumbai who reported using alcohol and having sex with at least one nonspousal partner once or more in the 30 days prior to the interview. The interviews included narratives of critical <span class="hlt">events</span> defined as recent (past 30–60 day) <span class="hlt">events</span> involving sexual behavior with or without alcohol. The paper identifies themes related to alcohol, sexuality and condom use, uses QCA to identify and explain configurations leading to protected and unprotected sex, and explains the differences.
The <span class="hlt">analysis</span> shows that alcohol alone is not sufficient to explain any cases involving unprotected sex, but alcohol in combination with partner type and contextual factors does explain unprotected sex for subsets of married and unmarried men. PMID:20563636</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/1236923','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/1236923"><span id="translatedtitle">Compression Algorithm <span class="hlt">Analysis</span> of In-Situ (S)TEM Video: Towards Automatic <span class="hlt">Event</span> Detection and Characterization</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.</p> <p>2015-09-23</p> <p>Precise <span class="hlt">analysis</span> of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time-consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses <span class="hlt">analysis</span> of video compression statistics for detecting and characterizing <span class="hlt">events</span> in large data sets. This software works by converting the data into a series of images, which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file.
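The core detection idea, flagging frames whose per-frame statistic deviates sharply from the recent baseline, can be illustrated with a toy spike detector. This is a sketch on synthetic numbers, not the avconv log format or the authors' software:

```python
from statistics import mean, stdev

def flag_events(frame_stats, window=10, k=3.0):
    """Flag frame indices whose statistic deviates more than k sigma
    from the preceding window's mean -- a crude spike detector."""
    events = []
    for i in range(window, len(frame_stats)):
        base = frame_stats[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma > 0 and abs(frame_stats[i] - mu) > k * sigma:
            events.append(i)
    return events

# Synthetic "frame quality" trace: gently fluctuating baseline, one abrupt jump
# (standing in for, say, the onset of dendrite growth in the video).
trace = [100.0 + (i % 3) for i in range(40)]
trace[25] = 160.0
print(flag_events(trace))  # the spike at index 25 is flagged
```

In practice one would feed the detector the actual per-frame statistics parsed from the encoder log, one statistic at a time.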
This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect <span class="hlt">events</span> in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each <span class="hlt">analysis</span>. Going forward, an algorithm for detecting and possibly describing <span class="hlt">events</span> automatically can be written based on statistic(s) for each data type.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2005JCoAM.184..320Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2005JCoAM.184..320Z"><span id="translatedtitle">Modeling and <span class="hlt">analysis</span> of early <span class="hlt">events</span> in T-lymphocyte antigen-activated intracellular-signaling pathways</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zheng, Yanan; Balakrishnan, Venkataramanan; Buzzard, Greg; Geahlen, Robert; Harrison, Marietta; Rundell, Ann</p> <p>2005-12-01</p> <p>The T-cell antigen-activated signaling pathway is a highly regulated intracellular biochemical system that is crucial for initiating an appropriate adaptive immune response. To improve the understanding of the complex regulatory mechanisms controlling the early <span class="hlt">events</span> in T-cell signaling, a detailed mathematical model was developed that utilizes ordinary differential equations to describe chemical reactions of the signaling pathway. 
The model parameter values were constrained by experimental data on the activation of a specific signaling intermediate and indicated an initial rapid cascade of phosphorylation <span class="hlt">events</span> followed by a comparatively slow signal downregulation. Nonlinear <span class="hlt">analysis</span> of the model suggested that thresholding and bistability occur as a result of the embedded positive and negative feedback loops within the model. These nonlinear system properties may enhance the T-cell receptor specificity and provide sub-threshold noise filtering with switch-like behavior to ensure proper cell response. Additional <span class="hlt">analysis</span> using a reduced second-order model led to further understanding of the observed system behavior. Moreover, the interactions between the positive and negative feedback loops enabled the model to exhibit, among a variety of other feasible dynamics, a sustained oscillation that corresponds to a stable limit cycle in the two-dimensional phase plane. Quantitative <span class="hlt">analysis</span> in this paper has helped identify potential regulatory mechanisms in the early T-cell signaling <span class="hlt">events</span>. 
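The thresholding produced by interacting positive and negative feedback can be caricatured with just two variables. The following is a sketch under assumed rate constants, far simpler than the paper's full reaction network: a weak stimulus settles at a low state, a stronger one switches the activator high.

```python
def simulate(stimulus, t_end=50.0, dt=0.01):
    """Euler integration of a toy activator x with Hill-type positive
    feedback and a slow inhibitor y providing negative feedback."""
    x = y = 0.0
    for _ in range(int(t_end / dt)):
        dx = stimulus + 2.0 * x**2 / (1.0 + x**2) - x - 0.5 * x * y
        dy = 0.1 * (x - y)          # slow negative feedback
        x += dx * dt
        y += dy * dt
    return x

print(simulate(0.05))  # sub-threshold input: stays low
print(simulate(0.5))   # supra-threshold input: switches to the high state
```

The Hill term supplies the switch-like positive feedback; the slow inhibitor damps the response, the same ingredients the analysis identifies as the source of thresholding and oscillatory behavior.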
This integrated approach provides a framework to quantify and discover the ensemble of interconnected T-cell antigen-activated signaling pathways from limited experimental data.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_18 --> <div id="page_19" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="361"> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2013EGUGA..15.4368S&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2013EGUGA..15.4368S&link_type=ABSTRACT"><span id="translatedtitle"><span class="hlt">Analysis</span> of Microphysics Mechanisms in Icing Aircraft <span class="hlt">Events</span>: A Case Study</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sanchez, Jose
Luis; Fernández, Sergio; Gascón, Estibaliz; Weigand, Roberto; Hermida, Lucia; Lopez, Laura; García-Ortega, Eduardo</p> <p>2013-04-01</p> <p>The appearance of Supercooled Large Drops (SLD) can give rise to aircraft icing. In these cases, atmospheric icing causes an unusual loss of lift due to the rapid accumulation of ice on the wings or measurement instruments. There are two possible ways that SLD can form. The first is through a process called "warm nose", followed by "resupercooling"; this process is usually associated with the arrival of warm fronts. The second possibility is that drops form by condensation and grow to sizes of at least 50 µm through collision-coalescence, in environments with temperatures below 0°C at all times but without freezing taking place. Some authors point out that approximately 75% of freezing precipitation <span class="hlt">events</span> are produced by this second mechanism. Within the framework of the TECOAGUA Project, a series of scientific flights was performed to collect data in winter cloud systems capable of producing precipitation and of creating environments favorable to aircraft icing. These flights were carried out using a C 212-200 aircraft belonging to the National Institute of Aerospace Technology (INTA), with a CAPS installed. On 1 February 2012, the C 212-200 took off from the airport in Torrejón de Ardoz (Madrid) and flew about 70 km to position itself over the northern side of the Central System, where it encountered, at a flight level of 3500 m, an elevated concentration of SLD at temperatures around -12°C, with liquid water content up to 0.44 g/m3, which caused ice to accumulate on the aircraft's wings and forced the flight to be aborted. A microwave radiometer (MWR) was installed in the vicinity of the flight area.
An area of instability between 750 hPa and 600 hPa was identified in the vertical MWR profiles of temperature and humidity during the hour of the flight. It is mainly in this</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/11538031','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/11538031"><span id="translatedtitle"><span class="hlt">Analysis</span> of radiation risk from alpha particle component of solar particle <span class="hlt">events</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Cucinotta, F A; Townsend, L W; Wilson, J W; Golightly, M J; Weyland, M</p> <p>1994-01-01</p> <p>Solar particle <span class="hlt">events</span> (SPEs) contain a primary alpha particle component, representing a possible increase in the potential risk to astronauts during an SPE beyond that of the more often studied proton component. We discuss the physical interactions of alpha particles important in describing the transport of these particles through spacecraft and body shielding. Models of light ion reactions are presented, and their effects on energy and linear energy transfer (LET) spectra in shielding are discussed. We present predictions of particle spectra, dose, and dose equivalent in organs of interest for SPE spectra typical of those occurring in recent solar cycles. The large <span class="hlt">events</span> of solar cycle 19 are found to show a substantial increase in biological risk from alpha particles, including a large increase in secondary neutron production from alpha particle breakup.
PMID:11538031</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22020989','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22020989"><span id="translatedtitle">Parental separation and child aggressive and internalizing behavior: an <span class="hlt">event</span> history calendar <span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Averdijk, Margit; Malti, Tina; Eisner, Manuel; Ribeaud, Denis</p> <p>2012-04-01</p> <p>This study investigated the relationship between parental separation and aggressive and internalizing behavior in a large sample of Swiss children drawn from the ongoing Zurich Project on the Social Development of Children and Youths. Parents retrospectively reported life <span class="hlt">events</span> and problem behavior for the first 7 years of the child's life on a quarterly basis (N = 995; 28,096 time points) using an <span class="hlt">Event</span> History Calendar. The time sequences of separation and child problem behavior were analyzed. Parental separation affected both aggressive and internalizing behavior even when maternal depression, financial difficulties, and parental conflict were included. 
Parental separation exerted a direct effect on child problem behavior as well as an indirect effect via maternal depression.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.2208K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.2208K"><span id="translatedtitle">From <span class="hlt">event</span> <span class="hlt">analysis</span> to global lessons: disaster forensics for building resilience</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Keating, Adriana; Venkateswaran, Kanmani; Szoenyi, Michael; MacClune, Karen; Mechler, Reinhard</p> <p>2016-04-01</p> <p>With unprecedented growth in disaster risk, there is an urgent need for enhanced learning about and understanding disasters, particularly in relation to the trends in the drivers of increasing risk. Building on the disaster forensics field, we introduce the Post <span class="hlt">Event</span> Review Capability (PERC) methodology for systematically and holistically analyzing disaster <span class="hlt">events</span>, and identifying actionable recommendations. PERC responds to a need for learning about the successes and failures in disaster risk management and resilience, and uncovers the underlying drivers of increasing risk. We draw generalizable insights identified from seven applications of the methodology to date, where we find that across the globe policy makers and practitioners in disaster risk management face strikingly similar challenges despite variations in context, indicating encouraging potential for mutual learning. These lessons highlight the importance of integrated risk reduction strategies. 
We invite others to utilize the freely available PERC approach and contribute to building a repository of learnings on disaster risk management and resilience.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016NHESS..16.1603K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016NHESS..16.1603K"><span id="translatedtitle">From <span class="hlt">event</span> <span class="hlt">analysis</span> to global lessons: disaster forensics for building resilience</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Keating, Adriana; Venkateswaran, Kanmani; Szoenyi, Michael; MacClune, Karen; Mechler, Reinhard</p> <p>2016-07-01</p> <p>With unprecedented growth in disaster risk, there is an urgent need for enhanced learning and understanding of disasters, particularly in relation to the trends in drivers of increasing risk. Building on the disaster forensics field, we introduce the post-<span class="hlt">event</span> review capability (PERC) methodology for systematically and holistically analysing disaster <span class="hlt">events</span>, and identifying actionable recommendations. PERC responds to a need for learning about the successes and failures in disaster risk management and resilience, and uncovers the underlying drivers of increasing risk. We draw generalisable insights identified from seven applications of the methodology to date, where we find that across the globe policy makers and practitioners in disaster risk management face strikingly similar challenges despite variations in context, indicating encouraging potential for mutual learning. These lessons highlight the importance of integrated risk reduction strategies. 
We invite others to utilise the freely available PERC approach and contribute to building a repository of learning on disaster risk management and resilience.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://hdl.handle.net/2060/20130008675','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20130008675"><span id="translatedtitle">Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-<span class="hlt">Event</span> Simulation <span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Bradley, James R.</p> <p>2012-01-01</p> <p>This paper evaluates how quickly students can be trained to construct useful discrete-<span class="hlt">event</span> simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach those skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-<span class="hlt">event</span> simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models.
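The fundamental functionality the author refers to, random event times, a clock, and state updates, is compact in any language, not just Excel. A minimal single-server (M/M/1) queue sketch, illustrative rather than the paper's supply-chain model:

```python
import random

def mm1_mean_wait(arrival_rate, service_rate, n_customers=10000, seed=1):
    """Discrete-event simulation of an M/M/1 queue via the waiting-time
    recurrence; returns the mean time customers wait before service."""
    rng = random.Random(seed)
    t_arrival = rng.expovariate(arrival_rate)
    server_free_at = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        start = max(t_arrival, server_free_at)       # wait if the server is busy
        total_wait += start - t_arrival
        server_free_at = start + rng.expovariate(service_rate)
        t_arrival += rng.expovariate(arrival_rate)   # schedule the next arrival
    return total_wait / n_customers

# Queueing theory gives Wq = lam / (mu * (mu - lam)) for the M/M/1 queue.
print(mm1_mean_wait(0.5, 1.0))  # statistically close to the theoretical 1.0
```

Heavier load (arrival rate closer to the service rate) yields sharply longer simulated waits, as theory predicts.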
This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011ASTRA...7....1P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011ASTRA...7....1P"><span id="translatedtitle">Solar particle <span class="hlt">event</span> <span class="hlt">analysis</span> using the standard radiation environment monitors: applying the neutron monitor's experience</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Papaioannou, A.; Mavromichalaki, H.; Gerontidou, M.; Souvatzoglou, G.; Nieminen, P.; Glover, A.</p> <p>2011-01-01</p> <p>The Standard Radiation Environment Monitor (SREM) is a particle detector developed by the European Space Agency for satellite applications, with the main purpose of providing radiation hazard alarms to the host spacecraft. SREM units have been constructed within a radiation-hardening concept and are therefore able to register extreme solar particle <span class="hlt">events</span> (SPEs). Large SPEs are registered at Earth by ground-based detectors such as neutron monitors, in the form of Ground Level Enhancements (GLEs) of solar cosmic rays. In this work, a feasibility study of a possible radiation alert deduced from SREM measurements was implemented for the <span class="hlt">event</span> of 20 January 2005. Taking advantage of the neutron monitor's experience, the steps of the GLE alert algorithm were put into practice on SREM measurements.
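At its core, a GLE-style alert flags a sustained count-rate excess above a running background. A schematic sketch with illustrative thresholds and synthetic rates, not the operational neutron-monitor or SREM algorithm:

```python
def alert(counts, background_window=12, threshold=1.05, consecutive=3):
    """Return the onset index at which `consecutive` successive samples
    exceed `threshold` times the running background mean, else None."""
    streak = 0
    for i in range(background_window, len(counts)):
        background = sum(counts[i - background_window:i]) / background_window
        streak = streak + 1 if counts[i] > threshold * background else 0
        if streak >= consecutive:
            return i - consecutive + 1   # first sample of the streak
    return None

# Synthetic count rates: flat background, then a rising SPE-like enhancement.
rates = [100.0] * 20 + [104.0, 112.0, 130.0, 150.0]
print(alert(rates))  # onset detected at index 21
```

Requiring several consecutive exceedances is what suppresses false alarms from single noisy samples.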
The outcome was that the SREM units registered the SPE in time, and that they could serve as indicators of radiation hazards, leading to successful alerts.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4127216','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4127216"><span id="translatedtitle">Covert Network <span class="hlt">Analysis</span> for Key Player Detection and <span class="hlt">Event</span> Prediction Using a Hybrid Classifier</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus</p> <p>2014-01-01</p> <p>National security has gained vital importance due to the increasing number of suspicious and terrorist <span class="hlt">events</span> across the globe. The use of different subfields of information technology to design systems that can detect the key members actually responsible for such <span class="hlt">events</span> has also attracted much attention from researchers and practitioners. In this paper, we present a novel method to predict key players in a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies a novel hybrid classifier for the detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies destabilize the network involved. As a proof of concept, the proposed framework has been implemented and tested using different case studies, including two publicly available datasets and one local network.
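The centrality-scoring stage of such a pipeline can be sketched in plain Python. The toy network and the degree-plus-closeness score below are hypothetical illustrations, not the paper's hybrid classifier:

```python
from collections import deque

def bfs_dists(adj, s):
    """Shortest-path hop counts from node s via breadth-first search."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def rank_key_players(adj):
    """Rank nodes by a combined normalized-degree + closeness score."""
    n = len(adj)
    scores = {}
    for u in adj:
        degree = len(adj[u]) / (n - 1)
        d = bfs_dists(adj, u)
        closeness = (len(d) - 1) / sum(d.values()) if len(d) > 1 else 0.0
        scores[u] = degree + closeness
    return sorted(scores, key=scores.get, reverse=True)

# Toy covert network: nodes 'b' and 'd' bridge two tight cells.
net = {"a": ["b", "c"], "c": ["a", "b"], "b": ["a", "c", "d"],
       "d": ["b", "e", "f"], "e": ["d", "f"], "f": ["d", "e"]}
print(rank_key_players(net))  # the bridge nodes rank first
```

A classifier would then consume such centrality vectors as features rather than ranking on a fixed sum.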
PMID:25136674</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=%22collective+violence%22&pg=2&id=ED173212','ERIC'); return false;" href="http://eric.ed.gov/?q=%22collective+violence%22&pg=2&id=ED173212"><span id="translatedtitle">Final Report for Dynamic Models for Causal <span class="hlt">Analysis</span> of Panel Data. Alternative Estimation Procedures for <span class="hlt">Event</span>-History <span class="hlt">Analysis</span>: A Monte Carlo Study. Part III, Chapter 5.</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Carroll, Glenn R.; And Others</p> <p></p> <p>This document is part of a series of chapters described in SO 011 759. The chapter examines the merits of four estimators in the causal <span class="hlt">analysis</span> of <span class="hlt">event</span>-histories (data giving the number, timing, and sequence of changes in a categorical dependent variable). The four procedures are ordinary least squares, Kaplan-Meier least squares, maximum…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2009LNCS.5215....7G&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2009LNCS.5215....7G&link_type=ABSTRACT"><span id="translatedtitle">Max-plus Algebraic Tools for Discrete <span class="hlt">Event</span> Systems, Static <span class="hlt">Analysis</span>, and Zero-Sum Games</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gaubert, Stéphane</p> <p></p> <p>The max-plus algebraic approach to timed discrete <span class="hlt">event</span> systems emerged in the eighties, after the discovery that synchronization phenomena can be modeled in a linear way in the max-plus setting.
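The linearity claim can be made concrete: with ⊕ = max and ⊗ = +, synchronized firing times obey x(k+1) = A ⊗ x(k), a linear recurrence in the max-plus semiring. A small sketch with assumed (hypothetical) processing delays for two machines:

```python
def maxplus_matvec(A, x):
    """Max-plus matrix-vector product: (A ⊗ x)_i = max_j (A[i][j] + x[j])."""
    return [max(a + xj for a, xj in zip(row, x)) for row in A]

# A[i][j] = delay from event j of cycle k to event i of cycle k+1
# (illustrative processing/transfer times for a two-machine system).
A = [[3.0, 7.0],
     [2.0, 4.0]]
x = [0.0, 0.0]          # firing times of the first events
for _ in range(3):
    x = maxplus_matvec(A, x)
print(x)                # [16.0, 13.0]
```

The long-term growth rate of these firing times (the system's cycle time, hence its throughput) is the max-plus eigenvalue of A, which is what the spectral-theory methods mentioned next compute.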
This led to a number of results, like the determination of long term characteristics (throughput, stationary regime) by spectral theory methods or the representation of the input-output behavior by rational series.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015MSSP...58..308D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015MSSP...58..308D"><span id="translatedtitle">On the identification of piston slap <span class="hlt">events</span> in internal combustion engines using tribodynamic <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Dolatabadi, N.; Theodossiades, S.; Rothberg, S. J.</p> <p>2015-06-01</p> <p>Piston slap is a major source of vibration and noise in internal combustion engines. Therefore, better understanding of the conditions favouring piston slap can be beneficial for the reduction of engine Noise, Vibration and Harshness (NVH). Past research has attempted to determine the exact position of piston slap <span class="hlt">events</span> during the engine cycle and correlate them to the engine block vibration response. Validated numerical/analytical models of the piston assembly can be very useful towards this aim, since extracting the relevant information from experimental measurements can be a tedious and complicated process. In the present work, a coupled simulation of piston dynamics and engine tribology (tribodynamics) has been performed using quasi-static and transient numerical codes. Thus, the inertia and reaction forces developed in the piston are calculated. 
The occurrence of piston slap <span class="hlt">events</span> in the engine cycle is monitored by introducing six alternative concepts: (i) the quasi-static lateral force, (ii) the transient lateral force, (iii) the minimum film thickness occurrence, (iv) the maximum energy transfer, (v) the lubricant squeeze velocity and (vi) the piston-impact angular duration. The validation of the proposed methods is achieved using experimental measurements taken from a single-cylinder petrol engine under laboratory conditions. The surface acceleration of the engine block is measured at the thrust- and anti-thrust-side locations. The correlation between the theoretically predicted <span class="hlt">events</span> and the measured acceleration signals has been satisfactory in determining piston slap incidents, using the aforementioned concepts. The results also exhibit good repeatability throughout the set of measurements obtained in terms of the number of <span class="hlt">events</span> occurring and their locations during the engine cycle.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17.5402K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17.5402K"><span id="translatedtitle">Quantitative <span class="hlt">analysis</span> of the 16-17 September 2013 resuspended ash <span class="hlt">event</span> in Iceland</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kylling, Arve; Beckett, Frances; Sigurdardottir, Gudmunda Maria; von Loewis, Sibylle; Witham, Claire</p> <p>2015-04-01</p> <p>In Iceland, more than 20,000 km2 of sandy deserts are active with aeolian processes. On average, 34-135 days a year are dusty, making it one of the dustiest areas of the world. 
Substantial amounts of dust are transported southward and deposited in the North-Atlantic possibly providing significant iron fertilization to regions deficient in iron. Volcanic ash including resuspended ash may have an adverse effect on ecosystems and human health, and resuspended ash levels may be high enough to cause problems to aviation. A strong gale force northerly wind prevailed over south east Iceland on 16-17 September, 2013. During this period ash from the recent eruptions of Eyjafjallajokull (2010) and Grimsvotn (2011) was resuspended into the air and blown southwards. The <span class="hlt">event</span> was captured by surface based optical particle counters (OPC) in Iceland, and cloudless skies south of Iceland made it possible to observe the resuspended ash by the Moderate Resolution Imaging Spectroradiometer (MODIS) as the ash was transported more than 320 km over the ocean. The aim of this study is to quantify the amount of ash that was resuspended during the <span class="hlt">event</span>. Simulations of the <span class="hlt">event</span> using the Numerical Atmospheric dispersion Modeling Environment (NAME) agree well with the location of the resuspended ash cloud observed by MODIS. By comparing the simulated height of the resuspended ash cloud to meteorological data we show that the maximum height of the cloud coincides with a temperature inversion at about 1300 m asl. The total mass column loading was retrieved from infrared MODIS channels using the ash cloud height identified from the dispersion model output. The OPC data provide surface ash concentrations. 
Using the satellite and OPC measurements, the NAME dispersion model output was calibrated and the total resuspended ash amount for the whole <span class="hlt">event</span> was estimated.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..14.1831H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..14.1831H"><span id="translatedtitle">What can be learned from combined <span class="hlt">event</span> runoff and tracer <span class="hlt">analysis</span> in a semi-arid, data-scarce catchment?</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hrachowitz, M.; Bohte, R.; Mul, M. L.; Bogaard, T. A.; Savenije, H. H. G.; Uhlenbrook, S.</p> <p>2012-04-01</p> <p>Hydrological processes in small catchments are still not well understood, particularly in data-scarce, semi-arid regions. This stands in contrast with the need for a better understanding of water fluxes and the interactions between surface water and groundwater in order to facilitate sustainable water resources management in such environments, where both floods and droughts can result in severe crop loss. In this study, <span class="hlt">event</span> runoff coefficient <span class="hlt">analysis</span> and limited tracer data of four small, nested sub-catchments (0.4 - 25.3 km2) in a data-scarce, semi-arid region of Tanzania helped to characterize the distinct response of the study catchments and to gain insights into the dominant runoff processes. The estimated <span class="hlt">event</span> runoff coefficients were very low and did not exceed 0.09. They were found to be significantly related to the 5-day antecedent precipitation totals as well as to base flow. This indicated a close relation to changes in soil moisture and thus potential switches in runoff generation processes. 
The time scales of the "direct flow" reservoirs, used to compute the <span class="hlt">event</span> runoff coefficients, were reduced by up to one order of magnitude for extreme <span class="hlt">events</span> compared to "average" <span class="hlt">events</span>. This suggested the activation of at least a third flow component, besides base- and direct flow, assumed to be infiltration overland flow. <span class="hlt">Analysis</span> of multiple tracers highlighted the importance of pre-<span class="hlt">event</span> water contributions to total runoff, even during intense and high-yield precipitation <span class="hlt">events</span>. It further illustrated the distinct nature of the catchments, in particular with respect to the available water storage, which was suggested by different degrees of tracer damping in the individual streams. The use of multiple tracers subsequently allowed estimation of the uncertainties in hydrograph separations arising from the use of different tracers. The results highlight the presence of considerable uncertainties, emphasizing the need for multiple tracers in order to avoid</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1712934H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1712934H"><span id="translatedtitle"><span class="hlt">Analysis</span> of Severe Weather <span class="hlt">Events</span> by Integration of Civil Protection Operation Data</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Heisterkamp, Tobias; Kox, Thomas</p> <p>2015-04-01</p> <p>In Germany, winter storms are among the natural hazards responsible for the largest damages (GDV 2014). This poses a major challenge for civil protection, especially in metropolitan areas like Berlin. 
Nowadays, large-scale storm <span class="hlt">events</span> are generally well predictable, but detailed forecasts on urban district or even street level are still out of range. Fire brigades, as major stakeholders covering severe weather consequences, operate on this small scale and across the whole area due to their jurisdiction. For the forensic investigation of disasters, this presentation offers an additional approach that uses the documentation of fire brigade operations as a new data source. Hazard dimensions and consequences of severe weather <span class="hlt">events</span> are reconstructed via GIS-based analyses of these operations. Local case studies of recent storms are used as a comparison and as additional information to the three WMO weather stations in Berlin. Thus, hot spots of these selected <span class="hlt">events</span> can be identified by accumulations of operation sites. Further indicators for Berlin are added to detect aspects that determine vulnerabilities. The conclusion discusses the potential of this approach as well as possible benefits of integration into warning systems.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2004ESASP.550E..37B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2004ESASP.550E..37B"><span id="translatedtitle">Remote Sensing for Ground Deformation <span class="hlt">Analysis</span> during the Eruptive <span class="hlt">Event</span> of July 2001 at Mt. 
Etna</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bonforte, A.; Colesanti, C.; Ferretti, A.; Guglielmino, F.; Palano, M.; Prati, C.; Puglisi, G.; Rocca, F.</p> <p>2004-06-01</p> <p>The July-August 2001 Mt. Etna eruption has been studied using the DInSAR technique and monitored through continuous GPS measurements on a network of permanent and static stations, as well as daily static and kinematic GPS measurements, made by INGV-CT, on geodetic networks. This eruption, one of the most important lateral eruptive <span class="hlt">events</span> in the last 30 years, was characterized by an unusual eruptive style, with lava flow emissions at different altitudes along a complex fracture system. A seismic swarm started on July 12th 2001, with most of the <span class="hlt">events</span> located beneath the upper southern flank of the volcano. The number of daily <span class="hlt">events</span> gradually decreased until July 18th. The eruption began with the opening of eruptive vents at 2700, 2500 and 2100 m altitude, from July 17th to 19th. Lava flows came out from these vents and covered the upper and middle southern flanks of the volcano. A small flow also came out on the north-eastern flank, from an eruptive fracture opened later in the Valle del Leone area.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014NHESD...2.4363G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014NHESD...2.4363G"><span id="translatedtitle"><span class="hlt">Analysis</span> of extreme wave <span class="hlt">events</span> in the southern coast of Brazil</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Guimarães, P. 
V.; Farina, L.; Toldo, E.</p> <p>2014-06-01</p> <p>Using the model SWAN, high waves on the Southwestern Atlantic generated by extra-tropical cyclones are simulated from 2000 to 2010 and their impact on the Rio Grande do Sul coast is studied. The modeled waves are compared with buoy data and good agreement is found. The six extreme <span class="hlt">events</span> in the period that presented significant wave heights above 5 m, at a particular point of interest, are investigated in detail. It is found that the cyclogenetic pattern between the latitudes 31.5 and 34° S is the most favorable for developing high waves. Hovmöller diagrams for deep water show that the region between the south of Rio Grande do Sul up to latitude 31.5° S is the most energetic during a cyclone's passage, although the <span class="hlt">event</span> of May 2008 indicates that the location of this region can vary, depending on the cyclone's displacement. On the other hand, the Hovmöller diagrams for shallow water show that the different shoreface morphologies were responsible for focusing or dissipating the waves' energy; the regions found are in agreement with the observations of erosion and progradation regions. 
It can be concluded that some of the urban areas of the beaches of Hermenegildo, Cidreira, Pinhal, Tramandaí, Imbé and Torres have been more exposed during the extreme wave <span class="hlt">events</span> on the Rio Grande do Sul coast, and are more vulnerable to this natural hazard.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/5552391','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/5552391"><span id="translatedtitle">Solar flare protection for manned lunar missions - <span class="hlt">Analysis</span> of the October 1989 proton flare <span class="hlt">event</span></span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Simonsen, L.C.; Nealy, J.E.; Townsend, L.W.; Sauer, H.H. NOAA, Space Environment Laboratory, Boulder, CO )</p> <p>1991-07-01</p> <p>Several large solar proton <span class="hlt">events</span> occurred in the latter half of 1989. For a moderately shielded spacecraft in free space, the potential exposure would have been greatest for the flare that occurred from October 19 to 27, 1989. The temporal variations of the proton energy spectra at approximately 1 AU were monitored by the GOES-7 satellite. These data, recorded and processed at the NOAA-Boulder Space Environment Laboratory, provide the opportunity to analyze dose rates and cumulative doses which might be incurred by astronauts in transit to, or on, the moon. Of particular importance in such an <span class="hlt">event</span> is the time development of exposure in the early phases of the flare, for which dose rates may range over many orders of magnitude in the first few hours. The cumulative dose as a function of time for the entire <span class="hlt">event</span> is also predicted. In addition to the basic shield calculations, dose rate contours are constructed for flare shelters in free space and on the lunar surface. 
14 refs.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014NHESS..14.3195G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014NHESS..14.3195G"><span id="translatedtitle"><span class="hlt">Analysis</span> of extreme wave <span class="hlt">events</span> on the southern coast of Brazil</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Guimarães, P. V.; Farina, L.; Toldo, E. E., Jr.</p> <p>2014-12-01</p> <p>Using the wave model SWAN (simulating waves nearshore), high waves on the southwestern Atlantic generated by extra-tropical cyclones are simulated from 2000 to 2010, and their impact on the Rio Grande do Sul (RS) coast is studied. The modeled waves are compared with buoy data and good agreement is found. The six extreme <span class="hlt">events</span> in the period that presented significant wave heights above 5 m, on a particular point of interest, are investigated in detail. It is found that the cyclogenetic pattern between the latitudes 31.5 and 34° S is the most favorable for developing high waves. Hovmöller diagrams for deep water show that the region between the south of Rio Grande do Sul up to a latitude of 31.5° S is the most energetic during a cyclone's passage, although the <span class="hlt">event</span> of May 2008 indicates that the location of this region can vary, depending on the cyclone's displacement. On the other hand, the Hovmöller diagrams for shallow water show that the different shoreface morphologies were responsible for focusing or dissipating the waves' energy; the regions found are in agreement with the observations of erosion and progradation regions. 
It can be concluded that some of the urban areas of the beaches of Hermenegildo, Cidreira, Pinhal, Tramandaí, Imbé and Torres have been more exposed during the extreme wave <span class="hlt">events</span> on the Rio Grande do Sul coast, and are more vulnerable to this natural hazard.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AAS...22335524T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AAS...22335524T"><span id="translatedtitle">A Spectral <span class="hlt">Analysis</span> of a Rare "Dwarf Eat Dwarf" Cannibalism <span class="hlt">Event</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Theakanath, Kuriakose; Toloba, E.; Guhathakurta, P.; Romanowsky, A. J.; Ramachandran, N.; Arnold, J.</p> <p>2014-01-01</p> <p>We have used Keck/DEIMOS to conduct the first detailed spectroscopic study of the recently discovered stellar stream in the Large Magellanic Cloud analog NGC 4449. Martinez-Delgado et al. (2012), using the tip of the red giant branch (TRGB), found that both objects, the stream and NGC 4449, are at the same distance, which suggests that this stream is the remnant of the first known ongoing dwarf-dwarf cannibalism <span class="hlt">event</span>. Learning about the orbital properties of this <span class="hlt">event</span> is a powerful tool for constraining the physical conditions involved in dwarf-dwarf merger <span class="hlt">events</span>. The low surface brightness of this structure makes it impossible to obtain integrated-light spectroscopic measurements, and its distance (3.8 Mpc) is too large to observe stars individually. In the color-magnitude diagram of the stellar stream there is an excess of objects brighter than the TRGB that are potential star blends. 
We designed our DEIMOS mask to contain as many of these objects as possible and, while some of them turned out to be background galaxies, a handful happened to be star blends in the stream. Our velocity measurements along the stream prove that it is gravitationally bound to NGC 4449 and put strong constraints on the orbital properties of the infall. This research was carried out under the auspices of UCSC's Science Internship Program. We thank the National Science Foundation for funding support. ET was supported by a Fulbright fellowship.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25639077','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25639077"><span id="translatedtitle">[Characteristic <span class="hlt">analysis</span> of a multi-day pollution <span class="hlt">event</span> in Chang-Zhu-Tan Metropolitan Area during October 2013].</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Liao, Zhi-heng; Fan, Shao-jia; Huang, Juan; Sun, Jia-ren</p> <p>2014-11-01</p> <p>Chang-Zhu-Tan Metropolitan Area experienced a typical multi-day pollution <span class="hlt">event</span> in October 2013. Based on the air pollution index, conventional pollutant observations, surface meteorological observations and sounding data, the relationships between air pollution, large-scale circulation and boundary layer meteorology during this <span class="hlt">event</span> were comprehensively analyzed. Additionally, the sources and transport paths of pollutants were investigated by application of satellite remote sensing data and the HYSPLIT4 model. The results showed that pollutants gradually accumulated in the earlier stage of the <span class="hlt">event</span> (October 21st to 26th), while in the later stage (October 27th to 31st) the characteristic pollutants of crop residue burning (PM2.5, CO, NO2) sharply increased. 
The deterioration of air quality in the later stage was mainly related to the remote transport of pollutants caused by straw burning. <span class="hlt">Analysis</span> of HYSPLIT4 model simulations and fire spots showed that the air currents mainly came from Anhui and Hubei Provinces in the earlier stage, while in the later stage they mainly came from Jiangxi Province, where fire spots were concentrated. Stable atmospheric stratification, caused by a steady, uniform high-pressure field, and light winds due to the confrontation of cold and warm currents greatly contributed to the development, maintenance and reinforcement of the pollution <span class="hlt">event</span>. The remote transport of pollutants had a significant impact on the ambient air quality of Chang-Zhu-Tan Metropolitan Area.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_19 --> <div id="page_20" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol 
class="result-class" start="381"> <li> <p><a target="_blank" onclick="trackOutboundLink('http://hdl.handle.net/2060/20120013096','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20120013096"><span id="translatedtitle">Bayesian <span class="hlt">Analysis</span> for Risk Assessment of Selected Medical <span class="hlt">Events</span> in Support of the Integrated Medical Model Effort</span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.</p> <p>2012-01-01</p> <p>The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical <span class="hlt">events</span> using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical <span class="hlt">events</span> occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical <span class="hlt">analysis</span> is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical <span class="hlt">events</span> occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. 
The IMM team performed Bayesian updates on the following medical <span class="hlt">events</span>: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/16316757','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/16316757"><span id="translatedtitle">Getting the right blood to the right patient: the contribution of near-miss <span class="hlt">event</span> reporting and barrier <span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kaplan, H S</p> <p>2005-11-01</p> <p>Safety and reliability in blood transfusion are not static, but are dynamic non-<span class="hlt">events</span>. Since performance deviations continually occur in complex systems, their detection and correction must be accomplished over and over again. Non-conformance must be detected early enough to allow for recovery or mitigation. Near-miss <span class="hlt">events</span> afford early detection of possible system weaknesses and provide an early chance at correction. National <span class="hlt">event</span> reporting systems, both voluntary and involuntary, have begun to include near-miss reporting in their classification schemes, raising awareness for their detection. MERS-TM is a voluntary safety reporting initiative in transfusion. 
Currently, 22 hospitals submit reports anonymously to a central database that supports <span class="hlt">analysis</span> of a hospital's own data and that of an aggregate database. The system encourages reporting of near-miss <span class="hlt">events</span>, where the patient is protected from receiving an unsuitable or incorrect blood component by a planned or unplanned recovery step. MERS-TM data suggest approximately 90% of <span class="hlt">events</span> are near-misses, with 10% caught after issue but before transfusion. Near-miss reporting may increase total reports ten-fold. The ratio of near-misses to <span class="hlt">events</span> with harm is 339:1, consistent with other industries' ratio of 300:1, which has been proposed as a measure of reporting in <span class="hlt">event</span> reporting systems. Use of a risk matrix and an <span class="hlt">event</span>'s relation to protective barriers allows prioritization of these <span class="hlt">events</span>. Near-misses recovered by planned barriers occur ten times more frequently than unplanned recoveries. A bedside check of the patient's identity against that on the blood component is an essential, final barrier. How the typical two-person check is performed is critical. Even properly done, this check is ineffective against sampling and testing errors. Blood testing at the bedside just prior to transfusion minimizes the risk of such upstream <span class="hlt">events</span>. 
However, even with simple and well-designed devices, training may be a</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012SoPh..281..333M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012SoPh..281..333M"><span id="translatedtitle">Scientific <span class="hlt">Analysis</span> within SEPServer - New Perspectives in Solar Energetic Particle Research: The Case Study of the 13 July 2005 <span class="hlt">Event</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Malandraki, O. E.; Agueda, N.; Papaioannou, A.; Klein, K.-L.; Valtonen, E.; Heber, B.; Dröge, W.; Aurass, H.; Nindos, A.; Vilmer, N.; Sanahuja, B.; Kouloumvakos, A.; Braune, S.; Preka-Papadema, P.; Tziotziou, K.; Hamadache, C.; Kiener, J.; Tatischeff, V.; Riihonen, E.; Kartavykh, Y.; Rodríguez-Gasén, R.; Vainio, R.</p> <p>2012-11-01</p> <p>Solar energetic particle (SEP) <span class="hlt">events</span> are a key ingredient of solar-terrestrial physics both for fundamental research and space weather applications. Multi-satellite observations are an important and incompletely exploited tool for studying the acceleration and the coronal and interplanetary propagation of the particles. While STEREO uses two identical sets of instrumentation for this diagnostic, many earlier observations were carried out with different spacecraft. It is the aim of the SEPServer project to make these data and <span class="hlt">analysis</span> tools available to a broad user community. The consortium will carry out data-driven <span class="hlt">analysis</span> and simulation-based data <span class="hlt">analysis</span> capable of deconvolving the effects of interplanetary transport and solar injection from SEP observations, and will compare the results with the electromagnetic signatures. 
The tools and results will be provided on the web server of the project in order to facilitate further <span class="hlt">analysis</span> by the research community. This paper describes the data products and <span class="hlt">analysis</span> strategies with one specific <span class="hlt">event</span>, the case study of 13 July 2005. The release times of protons and electrons are derived using data-driven and simulation-based analyses, and compared with hard X-ray and radio signatures. The interconnection of the experimental and the simulation-based results is discussed in detail.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/servlets/purl/6737551','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/servlets/purl/6737551"><span id="translatedtitle">Human factors <span class="hlt">analysis</span> and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized <span class="hlt">event</span>-tree <span class="hlt">analysis</span> technique. [CETAT computer program</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Casey, S.M.; Deretsky, Z.</p> <p>1980-08-01</p> <p>This document provides detailed instructions for using the Computerized <span class="hlt">Event</span>-Tree <span class="hlt">Analysis</span> Technique (CETAT), a program designed to assist a human factors analyst in predicting <span class="hlt">event</span> probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT <span class="hlt">analysis</span>, (b) develop operator performance data, (c) enter an <span class="hlt">event</span>-tree structure, (d) modify a data base, and (e) analyze <span class="hlt">event</span> paths and man-machine system configurations. 
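The path-probability bookkeeping that an event-tree tool such as CETAT automates can be sketched in a few lines of Python; the branch points and probabilities below are hypothetical and are not taken from the CETAT documentation:

```python
from itertools import product

# Hypothetical operator event tree: each branch point maps to
# (outcome, probability) pairs that sum to 1.
tree = {
    "detect_alarm": [("success", 0.95), ("failure", 0.05)],
    "execute_procedure": [("success", 0.90), ("failure", 0.10)],
}

def path_probabilities(tree):
    """Enumerate every path through the event tree; each path's
    probability is the product of its branch probabilities."""
    paths = {}
    for combo in product(*tree.values()):
        labels = tuple(outcome for outcome, _ in combo)
        p = 1.0
        for _, prob in combo:
            p *= prob
        paths[labels] = p
    return paths

paths = path_probabilities(tree)
```

Enumerating paths this way also makes the completeness check trivial: the probabilities of all paths must sum to 1.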
Designed to serve as a tool for developing, organizing, and analyzing operator-initiated <span class="hlt">event</span> probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time-consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3445446','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3445446"><span id="translatedtitle">Effect of Statins on Venous Thromboembolic <span class="hlt">Events</span>: A Meta-<span class="hlt">analysis</span> of Published and Unpublished Evidence from Randomised Controlled Trials</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Rahimi, Kazem; Bhala, Neeraj; Kamphuisen, Pieter; Emberson, Jonathan; Biere-Rafi, Sara; Krane, Vera; Robertson, Michele; Wikstrand, John; McMurray, John</p> <p>2012-01-01</p> <p>Background It has been suggested that statins substantially reduce the risk of venous thromboembolic <span class="hlt">events</span>. We sought to test this hypothesis by performing a meta-<span class="hlt">analysis</span> of both published and unpublished results from randomised trials of statins. Methods and Findings We searched MEDLINE, EMBASE, and Cochrane CENTRAL up to March 2012 for randomised controlled trials comparing statin with no statin, or comparing high dose versus standard dose statin, with 100 or more randomised participants and at least 6 months' follow-up. 
Investigators were contacted for unpublished information about venous thromboembolic <span class="hlt">events</span> during follow-up. Twenty-two trials of statin versus control (105,759 participants) and seven trials of an intensive versus a standard dose statin regimen (40,594 participants) were included. In trials of statin versus control, allocation to statin therapy did not significantly reduce the risk of venous thromboembolic <span class="hlt">events</span> (465 [0.9%] statin versus 521 [1.0%] control, odds ratio [OR] = 0.89, 95% CI 0.78–1.01, p = 0.08) with no evidence of heterogeneity between effects on deep vein thrombosis (266 versus 311, OR 0.85, 95% CI 0.72–1.01) and effects on pulmonary embolism (205 versus 222, OR 0.92, 95% CI 0.76–1.12). Exclusion of the trial result that provided the motivation for our meta-<span class="hlt">analysis</span> (JUPITER) had little impact on the findings for venous thromboembolic <span class="hlt">events</span> (431 [0.9%] versus 461 [1.0%], OR = 0.93 [95% CI 0.82–1.07], p = 0.32 among the other 21 trials). There was no evidence that higher dose statin therapy reduced the risk of venous thromboembolic <span class="hlt">events</span> compared with standard dose statin therapy (198 [1.0%] versus 202 [1.0%], OR = 0.98, 95% CI 0.80–1.20, p = 0.87). Risk of bias overall was small but a certain degree of effect underestimation due to random error cannot be ruled out. Please see later in the article for the Editors' Summary. 
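The headline result can be approximately reproduced from the reported counts with Woolf's log-odds-ratio interval. The arm sizes below are an assumption (the abstract gives only the 105,759 total, split here roughly in half), so the bounds differ slightly from the published 0.78–1.01:

```python
import math

def odds_ratio_ci(a, n1, c, n2, z=1.96):
    """Odds ratio and Woolf (log-scale) confidence interval for a 2x2
    table: a events among n1 participants vs. c events among n2."""
    b, d = n1 - a, n2 - c
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# 465 vs. 521 venous thromboembolic events; arm sizes assumed ~equal
or_, lo, hi = odds_ratio_ci(465, 52880, 521, 52879)
# or_ ≈ 0.89, close to the published OR = 0.89 (95% CI 0.78-1.01)
```

The small discrepancy in the lower bound reflects the equal-arms assumption and trial-level pooling rather than a patient-level analysis.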
Conclusions The findings from this meta-<span class="hlt">analysis</span> do not support the</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=violent&pg=7&id=EJ1010254','ERIC'); return false;" href="http://eric.ed.gov/?q=violent&pg=7&id=EJ1010254"><span id="translatedtitle">Joint Utility of <span class="hlt">Event</span>-Dependent and Environmental Crime <span class="hlt">Analysis</span> Techniques for Violent Crime Forecasting</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Caplan, Joel M.; Kennedy, Leslie W.; Piza, Eric L.</p> <p>2013-01-01</p> <p>Violent crime incidents occurring in Irvington, New Jersey, in 2007 and 2008 are used to assess the joint analytical capabilities of point pattern <span class="hlt">analysis</span>, hotspot mapping, near-repeat <span class="hlt">analysis</span>, and risk terrain modeling. One approach to crime <span class="hlt">analysis</span> suggests that the best way to predict future crime occurrence is to use past behavior, such as…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.8466F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.8466F"><span id="translatedtitle">Remote sensing <span class="hlt">analysis</span> of the Tiber River sediment plume (Tyrrhenian Sea): spectral signature of erratic vs. 
persistent <span class="hlt">events</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Falcini, Federico; Di Cicco, Annalisa; Pitarch, Jaime; Marullo, Salvatore; Colella, Simone; Volpe, Gianluca; Nardin, William; Margiotta, Francesca; Santoleri, Rosalia</p> <p>2016-04-01</p> <p>During the last decade, several regions along the western Tyrrhenian coast have been dramatically affected by intense river runoffs, which delivered a significant amount of sediment off and along shore. A crucial question that coastal geomorphologists and marine scientists need to face is about the fate and impact of this impulsive sediment load, especially with respect to the historical trend, seasonal variability, and persistent <span class="hlt">events</span>. A satellite-based <span class="hlt">analysis</span> of these sediment discharges is a key ingredient for such a study since it represents the primary dataset for the recognition of coastal patterns of Total Suspended Matter (TSM) that may reflect erosional or depositional processes along the coast. In this regard, we developed and implemented a TSM regional product from remote sensing, which was calibrated and validated by in situ measurements collected in the Tyrrhenian Sea. We discuss spatial patterns and spectral signature of the TSM that we observe during the 2012 high river discharge <span class="hlt">event</span> of the Tiber River. 
Our <span class="hlt">analysis</span> gives some insight into the main differences in the geomorphological impact of erratic vs. persistent <span class="hlt">events</span>.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/27131183','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/27131183"><span id="translatedtitle">Predicting ground contact <span class="hlt">events</span> for a continuum of gait types: An application of targeted machine learning using principal component <span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Osis, Sean T; Hettinga, Blayne A; Ferber, Reed</p> <p>2016-05-01</p> <p>An ongoing challenge in the application of gait <span class="hlt">analysis</span> to clinical settings is the standardized detection of temporal <span class="hlt">events</span>, with unobtrusive and cost-effective equipment, for a wide range of gait types. The purpose of the current study was to investigate a targeted machine learning approach for the prediction of timing for foot strike (or initial contact) and toe-off, using only kinematics for walking, forefoot running, and heel-toe running. Data were categorized by gait type and split into a training set (∼30%) and a validation set (∼70%). A principal component <span class="hlt">analysis</span> was performed, and separate linear models were trained and validated for foot strike and toe-off, using ground reaction force data as a gold-standard for <span class="hlt">event</span> timing. Results indicate the model predicted both foot strike and toe-off timing to within 20 ms of the gold-standard for more than 95% of cases in walking and running gaits. The machine learning approach continues to provide robust timing predictions for clinical use, and may offer a flexible methodology to handle new <span class="hlt">events</span> and gait types. 
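The modelling pipeline described in the abstract above (principal component analysis on kinematic features, then separate linear models per event trained against force-plate timings) can be sketched with synthetic data; the feature dimensions, the 3-component truncation, and the timing model below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for stacked kinematic features (strides x features);
# real inputs would be marker trajectories around each gait cycle.
n_strides, n_feat = 200, 12
latent = rng.normal(size=(n_strides, 3))
X = latent @ rng.normal(size=(3, n_feat))            # low-rank "kinematics"
t_event = X @ rng.normal(size=n_feat) * 0.01 + 0.7   # event time (s), synthetic

# Principal component analysis via SVD on the centered feature matrix
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T                               # first 3 PC scores

# One linear model per event type (foot strike shown), fit by least squares
A = np.column_stack([scores, np.ones(n_strides)])
coef, *_ = np.linalg.lstsq(A, t_event, rcond=None)
pred = A @ coef
```

On this noiseless synthetic data the fit is essentially exact; the 20 ms / 95% figure quoted above is what the analogous model achieved on real gait data.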
PMID:27131183</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011AdSR....6..187M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011AdSR....6..187M"><span id="translatedtitle">Synoptic-mesoscale <span class="hlt">analysis</span> and numerical modeling of a tornado <span class="hlt">event</span> on 12 February 2010 in northern Greece</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Matsangouras, I. T.; Nastos, P. T.; Pytharoulis, I.</p> <p>2011-07-01</p> <p>Tornadoes are furious convective weather phenomena, with the maximum frequency over Greece during the cold period (autumn, winter). This study analyzes the tornado <span class="hlt">event</span> that occurred on 12 February 2010 near Vrastama village, in Chalkidiki prefecture, a non-urban area 45 km southeast of Thessaloniki in northern Greece. The tornado developed approximately between 17:10 and 17:35 UTC and was characterized as F2 (Fujita Scale). The tornado <span class="hlt">event</span> caused damage to an industrial building and to several olive-tree farms. A synoptic survey is presented along with satellite images, radar products and a vertical profile of the atmosphere. Additionally, the nonhydrostatic WRF-ARW atmospheric numerical model (version 3.2.0) was utilized in <span class="hlt">analysis</span> and forecast mode using very high horizontal resolution (1.333 km × 1.333 km) in order to represent the ambient atmospheric conditions. A comparison of statistical errors between WRF-ARW forecasts and ECMWF <span class="hlt">analysis</span> is presented, accompanied by LGTS 12:00 UTC soundings (Thessaloniki Airport) and forecast soundings in order to verify the WRF-ARW model. Additionally, a comparison between WRF-ARW and ECMWF thermodynamic indices is also presented. 
The WRF-ARW high spatial resolution model appeared to simulate with significant accuracy a severe convective <span class="hlt">event</span> with a lead period of 18 h.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/10156743','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/10156743"><span id="translatedtitle"><span class="hlt">Analysis</span> and modeling of flow blockage-induced steam explosion <span class="hlt">events</span> in the High-Flux Isotope Reactor</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Taleyarkhan, R.P.; Georgevich, V.; Lestor, C.W.; Gat, U.; Lepard, B.L.; Cook, D.H.; Freels, J.; Chang, S.J.; Luttrell, C.; Gwaltney, R.C.; Kirkpatrick, J.</p> <p>1993-06-01</p> <p>This paper provides a perspective overview of the <span class="hlt">analysis</span> and modeling work done to evaluate the threat from steam explosion loads in the High-Flux Isotope Reactor during flow blockage <span class="hlt">events</span>. The overall workscope included modeling and <span class="hlt">analysis</span> of core melt initiation, melt propagation, bounding and best-estimate steam explosion energetics, vessel failure from fracture, bolt failure from exceedance of elastic limits, and finally, missile evolution and transport. Aluminum ignition was neglected. Evaluations indicated that a thermally driven steam explosion with more than 65 MJ of energy insertion in the core region over several milliseconds would be needed to cause a sufficiently energetic missile with a capacity to cause early confinement failure. This amounts to about 65% of the HFIR core mass melting and participating in a steam explosion. Conservative melt propagation analyses have indicated that at most only 24% of the HFIR core mass could melt during flow blockage <span class="hlt">events</span> under full-power conditions. 
Therefore, it is judged that the HFIR vessel and top head structure will be able to withstand loads generated from thermally driven steam explosions initiated by any credible flow blockage <span class="hlt">event</span>. A substantial margin to safety was demonstrated.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/6593144','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/6593144"><span id="translatedtitle"><span class="hlt">Analysis</span> and modeling of flow blockage-induced steam explosion <span class="hlt">events</span> in the High-Flux Isotope Reactor</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Taleyarkhan, R.P.; Georgevich, V.; Lestor, C.W.; Gat, U.; Lepard, B.L.; Cook, D.H.; Freels, J.; Chang, S.J.; Luttrell, C.; Gwaltney, R.C.; Kirkpatrick, J.</p> <p>1993-01-01</p> <p>This paper provides a perspective overview of the <span class="hlt">analysis</span> and modeling work done to evaluate the threat from steam explosion loads in the High-Flux Isotope Reactor during flow blockage <span class="hlt">events</span>. The overall workscope included modeling and <span class="hlt">analysis</span> of core melt initiation, melt propagation, bounding and best-estimate steam explosion energetics, vessel failure from fracture, bolt failure from exceedance of elastic limits, and finally, missile evolution and transport. Aluminum ignition was neglected. Evaluations indicated that a thermally driven steam explosion with more than 65 MJ of energy insertion in the core region over several milliseconds would be needed to cause a sufficiently energetic missile with a capacity to cause early confinement failure. This amounts to about 65% of the HFIR core mass melting and participating in a steam explosion. 
Conservative melt propagation analyses have indicated that at most only 24% of the HFIR core mass could melt during flow blockage <span class="hlt">events</span> under full-power conditions. Therefore, it is judged that the HFIR vessel and top head structure will be able to withstand loads generated from thermally driven steam explosions initiated by any credible flow blockage <span class="hlt">event</span>. A substantial margin to safety was demonstrated.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFMNH13C1942N','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFMNH13C1942N"><span id="translatedtitle">Identification of synoptic precursors to extreme precipitation <span class="hlt">events</span> in the Swiss Alps by the <span class="hlt">analysis</span> of backward trajectories</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Nguyen, L.; Horton, P.; Jaboyedoff, M.</p> <p>2015-12-01</p> <p>One of the most expensive natural disasters in Switzerland consists in floods related to heavy precipitation. The occurrence of heavy rainfall may induce landslides and debris flows, as observed during three major precipitation <span class="hlt">events</span> that occurred recently in the Swiss Alps (August 1987, September 1993 and October 2000). Even though all these <span class="hlt">events</span> took place under a southerly circulation, especially in autumn, not all southerly circulations lead to heavy precipitation. Although many studies tried to understand them, they are still difficult to forecast. Therefore, this work aims to identify synoptic precursors to such <span class="hlt">events</span> throughout backward trajectories <span class="hlt">analysis</span>. 
Part one tests as many combinations of tools, datasets and methods as possible in order to compare the trajectories in the case of heavy precipitation in the Alps and to reduce the number of models to be assessed for the second part by selecting the most relevant. As a result, we removed models yielding similar results by using an absolute horizontal transport deviation measure (ATEH). The trajectories were processed with tools (HYSPLIT, a Matlab script developed at the University of Lausanne, and METEX) based on different reanalyses (NCEP/NCAR, ECMWF, Japanese and NASA). Moreover, different types of trajectories (3D, isobaric, isentropic, isosigma, and constant density) have been used. As a result, 21 trajectory models were compared, and 9 were selected. Results show that most of the differences between trajectories are mainly related to the dataset rather than to the model. In part two, the 9 selected models were used to search for precursors leading to heavy precipitation. 10-day backward trajectories were processed for the Binn station at different pressure levels, for all the days between 1961 and 2014 characterized by a southerly circulation in autumn. Based on these trajectories, two <span class="hlt">analyses</span> for the identification of precursors were conducted. 
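The abstract does not define the ATEH measure itself; a plausible stand-in for comparing two trajectory models is the mean great-circle separation of their time-matched points (haversine distance):

```python
import math

R_EARTH_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + \
        math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R_EARTH_KM * math.asin(math.sqrt(a))

def mean_horizontal_deviation(traj_a, traj_b):
    """Average point-by-point horizontal separation of two equal-length
    trajectories given as lists of (lat, lon) at matching time steps."""
    assert len(traj_a) == len(traj_b)
    total = sum(haversine_km(*p, *q) for p, q in zip(traj_a, traj_b))
    return total / len(traj_a)
```

Two models whose mean deviation stays below a chosen threshold would be treated as redundant and collapsed into one, as done for the 21-to-9 reduction described above.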
First, the ATEH was used to assess</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011ACPD...1115631M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011ACPD...1115631M"><span id="translatedtitle"><span class="hlt">Analysis</span> of ΔO2/ΔCO2 ratios for the pollution <span class="hlt">events</span> observed at Hateruma Island, Japan</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Minejima, C.; Kubo, M.; Tohjima, Y.; Yamagishi, H.; Koyama, Y.; Maksyutov, S.; Kita, K.; Mukai, H.</p> <p>2011-05-01</p> <p>In-situ observations of atmospheric CO2 and O2 concentrations at Hateruma Island (HAT, 24° N, 124° E) often show synoptic scale pollution <span class="hlt">events</span> when air masses are transported from East Asian source regions. We calculate the regression slopes (-ΔO2/ΔCO2 molar ratios) of the correlation plots between O2 and CO2 for selected pollution <span class="hlt">events</span> observed between October 2006 and December 2008. The observed -ΔO2/ΔCO2 ratios vary from 1.0 to 1.7. Categorizing the air mass origins for the pollution <span class="hlt">events</span> by using back trajectory <span class="hlt">analysis</span>, we find that there is a significant difference in the average -ΔO2/ΔCO2 ratios between <span class="hlt">events</span> from China (1.14±0.12, n = 25) and Japan/Korea (1.37±0.15, n = 16). These values are comparable to the -O2:CO2 molar exchange ratios, which are estimated from the national fossil fuel inventories from CDIAC. Simulations using a particle dispersion model reveal that the pollution <span class="hlt">events</span> at HAT are predominantly CO2 emissions from the combustion of fossil fuels in East Asian countries, which is consistent with the above observational results. 
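The -ΔO2/ΔCO2 ratio for a pollution event is the negated slope of the regression of the O2 deviations on the CO2 deviations. A minimal least-squares version, with synthetic event data generated at an assumed exchange ratio of 1.2 (inside the observed 1.0–1.7 range):

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Synthetic pollution event: CO2 enhancements (ppm) and the corresponding
# O2 drawdown at an assumed 1.2 mol O2 per mol CO2.
dco2 = [0.0, 2.0, 5.0, 8.0, 12.0]
do2 = [-1.2 * c for c in dco2]
ratio = -ols_slope(dco2, do2)   # recovers the -dO2/dCO2 ratio
```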
Although the average value of the model-predicted -ΔO2/ΔCO2 ratios for Japan/Korea origin is underestimated in comparison with the observation, that for China origin agrees well with the observation. The sensitivity experiment suggests that the -ΔO2/ΔCO2 ratio at HAT reflects about 90% of the change in the -O2:CO2 exchange ratio for the fossil carbon emissions from China.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://hdl.handle.net/2060/20120003001','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20120003001"><span id="translatedtitle"><span class="hlt">Analysis</span> of the March 30, 2011 Hail <span class="hlt">Event</span> at Shuttle Launch Pad 39A</span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Lane, John E.; Doesken, Nolan J.; Kasparis, Takis C.; Sharp, David W.</p> <p>2012-01-01</p> <p>The Kennedy Space Center (KSC) Hail Monitor System, a joint effort of the NASA KSC Physics Lab and the KSC Engineering Services Contract (ESC) Applied Technology Lab, was first deployed for operational testing in the fall of 2006. Volunteers from the Community Collaborative Rain, Hail, and Snow Network (CoCoRaHS) in conjunction with Colorado State University have been instrumental in validation testing using duplicate hail monitor systems at sites in the hail-prone high plains of Colorado. The KSC Hail Monitor System (HMS), consisting of three stations positioned approximately 500 ft from the launch pad and forming an approximate equilateral triangle, as shown in Figure 1, was first deployed to Pad 39B for support of STS-115. Two months later, the HMS was deployed to Pad 39A for support of STS-116. During support of STS-117 in late February 2007, an unusually intense (for Florida standards) hail <span class="hlt">event</span> occurred in the immediate vicinity of the exposed space shuttle and launch pad. 
Hail data from this <span class="hlt">event</span> were collected by the HMS and analyzed. Support of STS-118 revealed another important application of the hail monitor system. Ground Instrumentation personnel check the hail monitors daily when a vehicle is on the launch pad, with special attention after any storm suspected of containing hail. If no hail is recorded by the HMS, the vehicle and pad inspection team has no need to conduct a thorough inspection of the vehicle immediately following a storm. On the afternoon of July 13, 2007, hail on the ground was reported by observers at the Vertical Assembly Building (VAB) and Launch Control Center (LCC), about three miles west of Pad 39A, as well as at several other locations at KSC. The HMS showed no impact detections, indicating that the shuttle had not been damaged by any of the numerous hail <span class="hlt">events</span> which occurred on that day.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/16036283','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/16036283"><span id="translatedtitle">Recurrent <span class="hlt">event</span> <span class="hlt">analysis</span> of lapse and recovery in a smoking cessation clinical trial using bupropion.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Wileyto, E Paul; Patterson, Freda; Niaura, Raymond; Epstein, Leonard H; Brown, Richard A; Audrain-McGovern, Janet; Hawk, Larry W; Lerman, Caryn</p> <p>2005-04-01</p> <p>We report a reanalysis of data from a prior study describing the <span class="hlt">event</span> history of quitting smoking aided by bupropion, using recurrent-<span class="hlt">event</span> models to determine the effect of the drug on occurrence of lapses and recoveries from lapse (resumption of abstinence). 
Data were collected on 1,070 subjects across two similar double-blind randomized clinical trials of bupropion versus placebo and fitted with separate Cox regression models for lapse and recovery. Analyses were split using discrete time-varying covariates between the treatment (weeks 1-10) and follow-up phases (end of treatment to 12 months). Bupropion was associated with slower lapse during treatment for both sexes, and being female was associated with faster lapse across both phases. Drug did not affect time to recovery for males but was associated with faster recovery among females, allowing women to recover as quickly as men. High levels of nicotine dependence did not affect time to lapse but were associated with slower recovery from lapse across treatment and follow-up phases. During the treatment phase, higher levels of baseline depression symptoms had no effect on time to lapse but were associated with slower recovery from lapse. Results highlight the asymmetry in factors preventing lapse versus promoting recovery. Specifically, dependence, depression symptoms, and a sex x drug interaction were found to affect recovery but not lapse. 
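Fitting separate Cox models for lapse and recovery presupposes recasting each subject's timeline into alternating at-risk intervals (the counting-process layout). A sketch of that bookkeeping, with hypothetical day numbers:

```python
def spells(quit_day, lapse_days, recovery_days, end_day):
    """Split one subject's follow-up into alternating 'abstinent'
    (at risk of lapse) and 'lapsed' (at risk of recovery) intervals;
    the final interval is censored (event None)."""
    events = sorted([(d, "lapse") for d in lapse_days] +
                    [(d, "recovery") for d in recovery_days])
    state, start, rows = "abstinent", quit_day, []
    for day, kind in events:
        rows.append((start, day, state, kind))      # interval ends in an event
        state = "lapsed" if kind == "lapse" else "abstinent"
        start = day
    rows.append((start, end_day, state, None))      # censored tail
    return rows

# Hypothetical subject: lapses on days 14 and 40, recovers on day 21
rows = spells(quit_day=0, lapse_days=[14, 40], recovery_days=[21], end_day=84)
```

The "abstinent" rows would feed the lapse model and the "lapsed" rows the recovery model, with treatment arm and phase entering as (time-varying) covariates.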
Further research disentangling lapse and recovery <span class="hlt">events</span> from summary abstinence measures is needed to help us develop interventions that take advantage of bupropion at its best and that compensate where it is weak.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17526890','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17526890"><span id="translatedtitle">Spatial <span class="hlt">analysis</span> of a large magnitude erosion <span class="hlt">event</span> following a Sierran wildfire.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Carroll, Erin M; Miller, Wally W; Johnson, Dale W; Saito, Laurel; Qualls, Robert G; Walker, Roger F</p> <p>2007-01-01</p> <p>High intensity wildfire due to long-term fire suppression and heavy fuels buildup can render watersheds highly susceptible to wind and water erosion. The 2002 "Gondola" wildfire, located just southeast of Lake Tahoe, NV-CA, was followed 2 wk later by a severe hail and rainfall <span class="hlt">event</span> that deposited 7.6 to 15.2 mm of precipitation over a 3 to 5 h time period. This resulted in a substantive upland ash and sediment flow with subsequent down-gradient riparian zone deposition. Point measurements and ESRI ArcView were applied to spatially assess source area contributions and the extent of ash and sediment flow deposition in the riparian zone. A deposition mass of 380 Mg of ash and sediment over 0.82 ha and pre-wildfire surface bulk density measurements were used in conjunction with two source area assessments to generate an estimation of 10.1 mm as the average depth of surface material eroded from the upland source area. Compared to previous measurements of erosion during rainfall simulation studies, the erosion of 1800 to 6700 g m(-2) mm(-1) determined from this study was as much as four orders of magnitude larger. 
Wildfire, followed by the single <span class="hlt">event</span> documented in this investigation, enhanced soil water repellency and contributed 17 to 67% of the reported 15 to 60 mm ky(-1) of non-glacial, baseline erosion rates occurring in mountainous, granitic terrain sites in the Sierra Nevada. High fuel loads now common to the Lake Tahoe Basin increase the risk that similar erosion <span class="hlt">events</span> will become more commonplace, potentially contributing to the accelerated degradation of Lake Tahoe's water clarity.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.6373A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.6373A"><span id="translatedtitle">Advanced Geospatial Hydrodynamic Signals <span class="hlt">Analysis</span> for Tsunami <span class="hlt">Event</span> Detection and Warning</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Arbab-Zavar, Banafshe; Sabeur, Zoheir</p> <p>2013-04-01</p> <p>Current early tsunami warning can be issued upon the detection of a seismic <span class="hlt">event</span> which may occur at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best matching seismic <span class="hlt">event</span> from a database of pre-computed tsunami scenarios. Nevertheless, it remains difficult and challenging to obtain the rupture parameters of the tsunamigenic earthquakes in real time and simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach, in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. 
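One simple way to attempt such re-identification at successive stations is normalized cross-correlation of the candidate signature against each station's water-level record; this sketch on synthetic data is illustrative only, not the detector used in the study:

```python
import numpy as np

def best_match(record, template):
    """Return (lag, score): the offset in `record` where the normalized
    cross-correlation with `template` peaks, and the peak score."""
    m = len(template)
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    best_score, best_lag = -np.inf, None
    for lag in range(len(record) - m + 1):
        w = record[lag:lag + m]
        w = w - w.mean()
        norm = np.linalg.norm(w)
        if norm == 0.0:
            continue
        score = float(t @ (w / norm))
        if score > best_score:
            best_score, best_lag = score, lag
    return best_lag, best_score

# Synthetic test: the same wave packet buried 50 samples into noise,
# standing in for the signature arriving later at the next station
rng = np.random.default_rng(1)
packet = np.sin(np.linspace(0.0, 3.0 * np.pi, 40))
record = rng.normal(scale=0.05, size=200)
record[50:90] += packet
lag, score = best_match(record, packet)
```

The recovered lag corresponds to the propagation delay between stations, which is what makes the spatial-range check against wind waves possible.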
The combination of relatively low amplitudes of a tsunami signal at deep waters and the frequent occurrence of background signals and noise contributes to a generally low signal to noise ratio for the tsunami signal; which in turn makes the detection of this signal difficult. In order to improve the accuracy and confidence of detection, a re-identification framework in which a tsunamigenic signal is detected via the scan of a network of hydrodynamic stations with water level sensing is performed. The aim is to attempt the re-identification of the same signatures as the tsunami wave spatially propagates through the hydrodynamic stations sensing network. The re-identification of the tsunamigenic signal is technically possible since the tsunami signal at the open ocean itself conserves its birthmarks relating it to the source <span class="hlt">event</span>. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and thereby it can be used to facilitate the identification of certain background signals such as wind waves which do not have as large a spatial reach as tsunamis. In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been achieved using open data from NOAA with a recorded</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009pcms.confE..94A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009pcms.confE..94A"><span id="translatedtitle">The Strong Wind <span class="hlt">event</span> of 24th January 2009 in Catalonia: a social impact <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Amaro, J.; Aran, M.; Barberia, L.; Llasat, M. 
C.</p> <p>2009-09-01</p> <p>Although strong winds are frequent in Catalonia, one of the <span class="hlt">events</span> with the strongest impact in recent years was on January 24th 2009. An explosive cyclogenesis process took place in the Atlantic: pressure fell 30 hPa in less than 24 hours. The strong wind storm pounded the north of Spain and the south of France with some fatalities and important economic losses in these regions. Several automatic weather stations recorded wind gusts higher than 100 km/h in Catalonia. Emergency services received more than 20,000 calls in 24 hours and there were 497 interventions in only 12 hours. Fallen and uprooted trees damaged railway and road infrastructure, and more than 30,000 customers had no electricity for 24 hours. Unfortunately, there were a total of 6 fatalities, two of them because of fallen trees and the others when a sports centre collapsed over a group of children. In Spain, insurance policies cover damages due to strong winds when fixed thresholds are exceeded and, according to the Royal Decree 300/2004 of 20th February, extraordinary risks are assumed by the Consorcio de Compensación de Seguros. Subsequently, Public Weather Services (PWS) saw an increase in the number of requests received from people affected by this <span class="hlt">event</span> and from insurance companies, seeking to establish whether the corresponding indemnity applied. As an example, during the first month after the <span class="hlt">event</span>, the Servei Meteorològic de Catalunya (SMC) received more than 600 requests related solely to these damages (on average, the PWS of the SMC receives 400 requests per month). Following the research started by the Social Impact Research Group of the MEDEX project, a good vulnerability indicator of a meteorological risk can be the number of requests reported. 
This study uses the information received by the PWS of the SMC during the six months after the <span class="hlt">event</span>, according to the criteria and methodology established in Gayà et al. (2008). The objective is to compare the vulnerability with the</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008APS..OSF.P1028L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008APS..OSF.P1028L"><span id="translatedtitle">Kinematics from footprints: <span class="hlt">Analysis</span> of a possible dinosaur predation <span class="hlt">event</span> in the Cretaceous Era</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Lee, Scott</p> <p>2008-10-01</p> <p>Motivation is enhanced by challenging students with interesting and open-ended questions. In this talk, a methodology for studying the locomotion of extinct animals based on their footprint trackways is developed and applied to a possible predation <span class="hlt">event</span> recorded in a Cretaceous Era deposit [J.O. Farlow, "Lower Cretaceous Dinosaur Tracks, Paluxy River Valley, Texas," South Central Geological Society of America, Baylor University, 1987]. Students usually love learning about dinosaurs, an unexpected treat in a physics class.
This example can be used in the classroom to help build critical thinking skills as the students decide whether the evidence supports a predation scenario or not.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_18");'>18</a></li> <li><a href="#" onclick='return showDiv("page_19");'>19</a></li> <li class="active"><span>20</span></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_20 --> <div id="page_21" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_19");'>19</a></li> <li><a href="#" onclick='return showDiv("page_20");'>20</a></li> <li class="active"><span>21</span></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="401"> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/20633728','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/20633728"><span id="translatedtitle">Quasifree expansion picture of break-up <span class="hlt">events</span>: An <span class="hlt">analysis</span> of ionizing systems</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Errea, L.F.; Mendez, L.; Pons, B.; Riera, A.; Sevila, I.</p> <p>2003-02-01</p> <p>We derive some general characteristics of the wave function representing a break-up 
<span class="hlt">event</span>, in the asymptotic region. They have a strong bearing on the validity of some classical pictures, on the correlation between spatial and momentum variables that develops in the course of the dissociation process, and on stringent requirements on the basis sets that are employed to approximate the wave function. Although other calculations are mentioned to underline the generality of our reasoning, we restrict most of the presentation, and all of the illustrations, to the case of ionization.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4916627','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4916627"><span id="translatedtitle">Incidence of adverse <span class="hlt">events</span> in paediatric procedural sedation in the emergency department: a systematic review and meta-<span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Bellolio, M Fernanda; Puls, Henrique A; Anderson, Jana L; Gilani, Waqas I; Murad, M Hassan; Barrionuevo, Patricia; Erwin, Patricia J; Wang, Zhen; Hess, Erik P</p> <p>2016-01-01</p> <p>Objective and design We conducted a systematic review and meta-<span class="hlt">analysis</span> to evaluate the incidence of adverse <span class="hlt">events</span> in the emergency department (ED) during procedural sedation in the paediatric population. Randomised controlled trials and observational studies from the past 10 years were included. We adhered to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Setting ED. Participants Children. Interventions Procedural sedation. Outcomes Adverse <span class="hlt">events</span> such as vomiting, agitation, hypoxia and apnoea.
Meta-<span class="hlt">analysis</span> was performed with a random-effects model and reported as incidence rates with 95% CIs. Results A total of 1177 studies were retrieved for screening and 258 were selected for full-text review. 41 studies reporting on 13 883 procedural sedations in 13 876 children (≤18 years) were included. The most common adverse <span class="hlt">events</span> (all reported per 1000 sedations) were: vomiting 55.5 (CI 45.2 to 65.8), agitation 17.9 (CI 12.2 to 23.7), hypoxia 14.8 (CI 10.2 to 19.3) and apnoea 7.1 (CI 3.2 to 11.0). The need to intervene with either bag valve mask, oral airway or positive pressure ventilation occurred in 5.0 per 1000 sedations (CI 2.3 to 7.6). The incidences of severe respiratory <span class="hlt">events</span> were: 34 cases of laryngospasm among 8687 sedations (2.9 per 1000 sedations, CI 1.1 to 4.7; absolute rate 3.9 per 1000 sedations), 4 intubations among 9136 sedations and 0 cases of aspiration among 3326 sedations. 33 of the 34 cases of laryngospasm occurred in patients who received ketamine. Conclusions Serious adverse respiratory <span class="hlt">events</span> are very rare in paediatric procedural sedation in the ED. Emesis and agitation are the most frequent adverse <span class="hlt">events</span>. Hypoxia, a late indicator of respiratory depression, occurs in 1.5% of sedations. Laryngospasm, though rare, happens most frequently with ketamine.
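The "absolute rate" quoted for laryngospasm (34 cases among 8,687 sedations → 3.9 per 1,000) can be reproduced with a crude single-proportion calculation; as a hedged sketch, the snippet below uses a normal-approximation (Wald) interval, whereas the abstract's pooled CIs come from the random-effects meta-analysis and will differ. The function name is an illustrative assumption, not the authors' code.

```python
import math

def incidence_per_1000(events, n, z=1.96):
    """Crude incidence per 1,000 with a normal-approximation (Wald) 95% CI.
    Illustrative only: the review pools study rates with a random-effects model."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return 1000 * p, 1000 * (p - z * se), 1000 * (p + z * se)

# Laryngospasm: 34 cases among 8,687 sedations (the abstract's "absolute rate")
rate, low, high = incidence_per_1000(34, 8687)
print(round(rate, 1))  # 3.9 per 1,000 sedations, matching the abstract
```

The crude interval is wider than the pooled one because it ignores the between-study pooling the review performs.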
The results of this study provide quantitative risk estimates to facilitate shared decision-making, risk communication, informed consent and</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..1510389M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..1510389M"><span id="translatedtitle">Statistical <span class="hlt">analysis</span> of geomagnetic storms, coronal mass ejections and solar energetic particle <span class="hlt">events</span> in the framework of the COMESEP project</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Malandraki, Olga</p> <p>2013-04-01</p> <p>Geomagnetic storms and Solar Energetic Particle (SEP) radiation storms are hazards in space. It is important to mitigate the effects space weather phenomena may have on technology and human life. The aim of the EU FP7 COMESEP (Coronal Mass Ejections and Solar Energetic Particles) project is to develop forecasting tools both for geomagnetic and SEP storms, and relies on both models and data. This includes a statistical <span class="hlt">analysis</span> of geomagnetic storms and SEP <span class="hlt">events</span> during the SOHO era. The goal is to connect the impact of these phenomena with the associated Coronal Mass Ejection (CME) and/or solar flare characteristics. Results of these analyses are being implemented into the COMESEP space weather alert system that is being built based on the produced tools. For the <span class="hlt">analysis</span> of geomagnetic storms, a representative subset of CMEs from the LASCO/SOHO catalog is selected, and includes associations with Dst index values. The main objective is to determine the probability distributions of Dst and other relationships depending on the CME and flare characteristics. 
The effect of multiple CME occurrences on the probability of large Dst index values and the treatment of semiannual variations of storms are also evaluated. The <span class="hlt">analysis</span> of SEP <span class="hlt">events</span> focuses on the quantification of SEP occurrence probabilities and on the identification of correlations between SEPs and solar <span class="hlt">events</span>. Both quantities depend on the flare heliographic location, soft X-ray intensity, the CME speed and width. The SEP parameters studied include peak fluxes, fluences, spectral fit parameters and enhancements in heavy ion fluxes. A preliminary estimation of false alarms for our system based on the statistical <span class="hlt">analysis</span> used is in progress to assess the validity of the alerts. This work has received funding from the European Commission FP7 Project COMESEP (263252).</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26159942','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26159942"><span id="translatedtitle">Work stress and the risk of recurrent coronary heart disease <span class="hlt">events</span>: A systematic review and meta-<span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Li, Jian; Zhang, Min; Loerbroks, Adrian; Angerer, Peter; Siegrist, Johannes</p> <p>2015-01-01</p> <p>Though much evidence indicates that work stress increases the risk of incident coronary heart disease (CHD), little is known about the role of work stress in the development of recurrent CHD <span class="hlt">events</span>. The objective of this study was to review and synthesize the existing epidemiological evidence on whether work stress increases the risk of recurrent CHD <span class="hlt">events</span> in patients with a first CHD.
A systematic literature search in the PubMed database (January 1990 - December 2013) for prospective studies was performed. Inclusion criteria were: peer-reviewed English papers with original data, studies with substantial follow-up (> 3 years), end points defined as cardiac death or nonfatal myocardial infarction, as well as work stress assessed with reliable and valid instruments. Meta-<span class="hlt">analysis</span> using random-effects modeling was conducted in order to synthesize the observed effects across the studies. Five papers derived from 4 prospective studies conducted in Sweden and Canada were included in this systematic review. The measurement of work stress was based on the Demand-Control model (4 papers) or the Effort-Reward Imbalance model (1 paper). Meta-<span class="hlt">analysis</span> based on 4 papers showed a significant effect of work stress on the risk of recurrent CHD <span class="hlt">events</span> (hazard ratio: 1.65, 95% confidence interval: 1.23-2.22). Our findings suggest that, in patients with a first CHD, work stress is associated with a 65% increase in the relative risk of recurrent CHD <span class="hlt">events</span>.
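The random-effects pooling of study-level hazard ratios described above can be sketched with the DerSimonian-Laird estimator. The hazard ratios and standard errors below are purely illustrative stand-ins, not the four studies actually reviewed.

```python
import math

def dersimonian_laird(log_effects, ses):
    """Pool log-scale effect sizes with the DerSimonian-Laird random-effects model."""
    w = [1.0 / s**2 for s in ses]                      # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, log_effects)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_effects))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(ses) - 1)) / c)          # between-study variance
    w_re = [1.0 / (s**2 + tau2) for s in ses]          # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))

# Illustrative inputs only -- NOT the four studies in this review
hrs = [1.8, 1.4, 2.0, 1.5]      # hypothetical study-level hazard ratios
ses = [0.30, 0.25, 0.40, 0.20]  # hypothetical standard errors of log(HR)
hr, ci = dersimonian_laird([math.log(h) for h in hrs], ses)
```

Pooling on the log scale and exponentiating at the end is the standard approach for ratio measures such as hazard ratios.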
Due to the limited literature, more well-designed prospective research is needed to examine this association, in particular, from other than western regions of the world.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/18047528','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/18047528"><span id="translatedtitle">Clustered mixed nonhomogeneous Poisson process spline models for the <span class="hlt">analysis</span> of recurrent <span class="hlt">event</span> panel data.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Nielsen, J D; Dean, C B</p> <p>2008-09-01</p> <p>A flexible semiparametric model for analyzing longitudinal panel count data arising from mixtures is presented. Panel count data refers here to count data on recurrent <span class="hlt">events</span> collected as the number of <span class="hlt">events</span> that have occurred within specific follow-up periods. The model assumes that the counts for each subject are generated by mixtures of nonhomogeneous Poisson processes with smooth intensity functions modeled with penalized splines. Time-dependent covariate effects are also incorporated into the process intensity using splines. Discrete mixtures of these nonhomogeneous Poisson process spline models extract functional information from underlying clusters representing hidden subpopulations.
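The nonhomogeneous Poisson processes at the core of this model can be simulated by thinning (Lewis-Shedler). The sinusoidal intensity below is a smooth stand-in for the penalized-spline intensities the paper actually fits; all names and parameters are illustrative assumptions.

```python
import math
import random

def simulate_nhpp(intensity, t_max, lam_max, rng=random.Random(0)):
    """Simulate a nonhomogeneous Poisson process on [0, t_max] by thinning
    (Lewis-Shedler): propose points from a homogeneous process of rate
    lam_max and keep each with probability intensity(t) / lam_max."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)   # next candidate arrival
        if t > t_max:
            return events
        if rng.random() < intensity(t) / lam_max:
            events.append(t)            # accepted event time

# Smooth stand-in intensity; the paper models this with penalized splines
intensity = lambda t: 2.0 + 1.5 * math.sin(2 * math.pi * t / 10)
event_times = simulate_nhpp(intensity, t_max=30.0, lam_max=3.5)  # lam_max >= sup intensity
```

Panel counts of the kind the paper analyzes would then be obtained by counting `event_times` falling inside each follow-up window.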
The motivating application is an experiment to test the effectiveness of pheromones in disrupting the mating pattern of the cherry bark tortrix moth. Mature moths arise from hidden, but distinct, subpopulations and monitoring the subpopulation responses was of interest. Within-cluster random effects are used to account for correlation structures and heterogeneity common to this type of data. An estimating equation approach to inference requiring only low moment assumptions is developed and the finite sample properties of the proposed estimating functions are investigated empirically by simulation.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22478239','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22478239"><span id="translatedtitle">Reflections on some early <span class="hlt">events</span> related to behavior <span class="hlt">analysis</span> of child development.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Bijou, S W</p> <p>1996-01-01</p> <p>A series of <span class="hlt">events</span> related to the early application of behavioral principles to child behavior and development is described. The <span class="hlt">events</span> began in the 1930s at Columbia University with a solicited letter from John B. Watson suggesting a master's degree thesis problem, and continued through the 1950s and 1960s at the University of Washington. 
Specifically, these happenings resulted in (a) research demonstrating that Skinner's laboratory method for studying nonhuman organisms could be profitably applied to the laboratory study of young normal children; (b) a demonstration that by successive approximations, a normal child can be operantly conditioned to respond to an arbitrary situation; (c) research showing that the effects of simple schedules of reinforcement obtained with nonhuman organisms could be duplicated in young normal and retarded children; (d) the demonstration that Skinner's operant laboratory method could be adapted to study young children in field situations; (e) research showing that operant principles can be successfully applied to the treatment of a young autistic boy with a serious visual handicap; (f) laboratory studies showing that mothers can be trained to treat their own young children who have behavior problems; (g) an in-home study demonstrating that a mother can treat her own child who has behavior problems; (h) a demonstration that operant principles can be applied effectively to teaching reading, writing, and arithmetic to children with retardation; and (i) publication of a book, Child Development: A Systematic and Empirical Theory, in collaboration with Donald M. 
Baer, by Prentice Hall in their Century Psychological Series.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23185635','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23185635"><span id="translatedtitle">Landscape-scale <span class="hlt">analysis</span> of wetland sediment deposition from four tropical cyclone <span class="hlt">events</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Tweel, Andrew W; Turner, R Eugene</p> <p>2012-01-01</p> <p>Hurricanes Katrina, Rita, Gustav, and Ike deposited large quantities of sediment on coastal wetlands after making landfall in the northern Gulf of Mexico. We sampled sediments deposited on the wetland surface throughout the entire Louisiana and Texas depositional surfaces of Hurricanes Katrina, Rita, Gustav, and the Louisiana portion of Hurricane Ike. We used spatial interpolation to model the total amount and spatial distribution of inorganic sediment deposition from each storm. The sediment deposition on coastal wetlands was an estimated 68, 48, and 21 million metric tons from Hurricanes Katrina, Rita, and Gustav, respectively. The spatial distribution decreased in a similar manner with distance from the coast for all hurricanes, but the relationship with distance from the storm track was more variable between <span class="hlt">events</span>. The southeast-facing Breton Sound estuary had significant storm-derived sediment deposition west of the storm track, whereas sediment deposition along the south-facing coastline occurred primarily east of the storm track. Sediment organic content, bulk density, and grain size also decreased significantly with distance from the coast, but were also more variable with respect to distance from the track. On average, eighty percent of the mineral deposition occurred within 20 km from the coast, and 58% was within 50 km of the track. 
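The sediment-deposition abstract above relies on spatial interpolation but does not specify the scheme in this excerpt; as one hedged illustration, inverse-distance weighting estimates deposition at unsampled points from surveyed sites. The coordinates and values below are hypothetical, not the study's measurements.

```python
def idw(samples, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from (x, y, value) samples.
    One common interpolation choice; the paper's actual method may differ."""
    num = den = 0.0
    for x, y, v in samples:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v  # query coincides with a sampled site
        w = d2 ** (-power / 2.0)  # weight ~ 1 / distance^power
        num += w * v
        den += w
    return num / den

# Hypothetical deposition samples: (x km, y km, mineral sediment in kg/m^2)
samples = [(0, 0, 12.0), (5, 0, 8.0), (0, 5, 6.0), (5, 5, 3.0)]
estimate = idw(samples, (2.5, 2.5))  # equidistant samples -> simple mean, 7.25
```

Integrating such an interpolated surface over the depositional footprint is what yields storm-total tonnage estimates of the kind reported.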
These results highlight an important link between tropical cyclone <span class="hlt">events</span> and coastal wetland sedimentation, and are useful in identifying a more complete sediment budget for coastal wetland soils.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3503965','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3503965"><span id="translatedtitle">Landscape-Scale <span class="hlt">Analysis</span> of Wetland Sediment Deposition from Four Tropical Cyclone <span class="hlt">Events</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Tweel, Andrew W.; Turner, R. Eugene</p> <p>2012-01-01</p> <p>Hurricanes Katrina, Rita, Gustav, and Ike deposited large quantities of sediment on coastal wetlands after making landfall in the northern Gulf of Mexico. We sampled sediments deposited on the wetland surface throughout the entire Louisiana and Texas depositional surfaces of Hurricanes Katrina, Rita, Gustav, and the Louisiana portion of Hurricane Ike. We used spatial interpolation to model the total amount and spatial distribution of inorganic sediment deposition from each storm. The sediment deposition on coastal wetlands was an estimated 68, 48, and 21 million metric tons from Hurricanes Katrina, Rita, and Gustav, respectively. The spatial distribution decreased in a similar manner with distance from the coast for all hurricanes, but the relationship with distance from the storm track was more variable between <span class="hlt">events</span>. The southeast-facing Breton Sound estuary had significant storm-derived sediment deposition west of the storm track, whereas sediment deposition along the south-facing coastline occurred primarily east of the storm track. 
Sediment organic content, bulk density, and grain size also decreased significantly with distance from the coast, but were also more variable with respect to distance from the track. On average, eighty percent of the mineral deposition occurred within 20 km from the coast, and 58% was within 50 km of the track. These results highlight an important link between tropical cyclone <span class="hlt">events</span> and coastal wetland sedimentation, and are useful in identifying a more complete sediment budget for coastal wetland soils. PMID:23185635</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/16206848','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/16206848"><span id="translatedtitle">Suspended solids transport: an <span class="hlt">analysis</span> based on turbidity measurements and <span class="hlt">event</span> based fully calibrated hydrodynamic models.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Langeveld, J G; Veldkamp, R G; Clemens, F</p> <p>2005-01-01</p> <p>Modelling suspended solids transport is a key issue for predicting the pollution load discharged by CSOs. Nonetheless, there is still much debate on the main drivers for suspended solids transport and on the modelling approach to be adopted. Current sewer models provide suspended solids transport models. These models, however, rely upon erosion-deposition criteria developed in fluvial environments, therewith oversimplifying the sewer sediment characteristics. Consequently, the performance of these models is poor from a theoretical point of view. To get an improved understanding of the temporal and spatial variations in suspended solids transport, a measuring network was installed in the sewer system of Loenen in conjunction with a hydraulic measuring network from June through December 2001. 
During the measuring period, 15 storm <span class="hlt">events</span> rendered high-quality data on both the hydraulics and the turbidity. For each storm <span class="hlt">event</span>, a hydrodynamic model was calibrated using the Clemens' method. The conclusion of the paper is that modelling of suspended solids transport has been and will be one of the challenges in the field of urban drainage modelling. A direct relation of either shear stress or flow velocity with turbidity could not be found, likely because of the time varying characteristics of the suspended solids.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016PhRvD..94b3009V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016PhRvD..94b3009V"><span id="translatedtitle"><span class="hlt">Analysis</span> of the 4-year IceCube high-energy starting <span class="hlt">events</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Vincent, Aaron C.; Palomares-Ruiz, Sergio; Mena, Olga</p> <p>2016-07-01</p> <p>After four years of data taking, the IceCube neutrino telescope has detected 54 high-energy starting <span class="hlt">events</span> (HESE, or contained-vertex <span class="hlt">events</span>) with deposited energies above 20 TeV. They represent the first detection of high-energy extraterrestrial neutrinos and, therefore, the first step in neutrino astronomy. To study the energy, flavor, and isotropy of the astrophysical neutrino flux arriving at Earth, we perform different analyses of two different deposited energy intervals, [10 TeV-10 PeV] and [60 TeV-10 PeV]. We first consider an isotropic unbroken power-law spectrum and constrain its shape, normalization, and flavor composition. Our results are in agreement with the preliminary IceCube results, although we obtain a slightly softer spectrum. 
We also find that current data are not sensitive to a possible neutrino-antineutrino asymmetry in the astrophysical flux. Then, we show that although a two-component power-law model leads to a slightly better fit, it does not represent a significant improvement with respect to a single power-law flux. Finally, we analyze the possible existence of a north-south asymmetry, hinted at by the combination of the HESE sample with the throughgoing muon data. If we use only HESE data, the scarce statistics from the Northern Hemisphere does not allow us to reach any conclusive answer, which indicates that the HESE sample alone is not driving the potential north-south asymmetry.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://ntrs.nasa.gov/search.jsp?R=19820006379&hterms=feynman&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D70%26Ntt%3Dfeynman','NASA-TRS'); return false;" href="http://ntrs.nasa.gov/search.jsp?R=19820006379&hterms=feynman&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D70%26Ntt%3Dfeynman"><span id="translatedtitle">Three-dimensional <span class="hlt">analysis</span> of charging <span class="hlt">events</span> on days 87 and 114, 1979, from SCATHA</span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Saflekos, N. A.; Tautz, M. F.; Rubin, A. G.; Hardy, D. A.; Mizera, P. F.; Feynman, J.</p> <p>1980-01-01</p> <p>Angular distributions of ions and electrons from the Spacecraft Charging at High Altitudes (SCATHA) were investigated for the floating potential and the differential charging of the spacecraft as deduced from Liouville's theorem.
The following was found: (1) short-time charging <span class="hlt">events</span> on the spacecraft are associated with short-time increases in the intensity of 10 keV to 1 MeV electrons; (2) short-time changes of the spacecraft differential potential are associated with simultaneous short-time changes of the spacecraft floating potential; (3) solar UV intensities in penumbra anticorrelate with the spacecraft floating potentials; (4) NASCAP predicts correct forms of sunshade asymmetric surface potentials; (5) certain enhancements of the intensity of energetic ions diminish the absolute value of the spacecraft surface potential; (6) spacecraft discharging <span class="hlt">events</span> in times shorter than 20 sec did not change the spectrum of the energetic plasma; (7) partial discharging of the spacecraft occurred upon entry into a magnetically depleted region; and (8) steady-state potentials and transient potentials of duration less than 30 seconds are simulated by the NASCAP code.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/20853937','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/20853937"><span id="translatedtitle">The link between alcohol use and aggression toward sexual minorities: an <span class="hlt">event</span>-based <span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Parrott, Dominic J; Gallagher, Kathryn E; Vincent, Wilson; Bakeman, Roger</p> <p>2010-09-01</p> <p>The current study used an <span class="hlt">event</span>-based assessment approach to examine the day-to-day relationship between heterosexual men's alcohol consumption and perpetration of aggression toward sexual minorities.
Participants were 199 heterosexual drinking men between the ages of 18 and 30 who completed (1) separate timeline followback interviews to assess alcohol use and aggression toward sexual minorities during the past year, and (2) written self-report measures of risk factors for aggression toward sexual minorities. Results indicated that aggression toward sexual minorities was twice as likely on days when drinking was reported as on nondrinking days, with over 80% of alcohol-related aggressive acts perpetrated within the group context. Patterns of alcohol use (i.e., number of drinking days, mean drinks per drinking day, number of heavy drinking days) were not associated with perpetration after controlling for demographic variables and pertinent risk factors. Results suggest that it is the acute effects of alcohol, and not men's patterns of alcohol consumption, that facilitate aggression toward sexual minorities. More importantly, these data are the first to support an <span class="hlt">event</span>-based link between alcohol use and aggression toward sexual minorities (or any minority group), and provide the impetus for future research to examine risk factors and mechanisms for intoxicated aggression toward sexual minorities and other stigmatized groups.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25807376','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25807376"><span id="translatedtitle">Words <span class="hlt">analysis</span> of online Chinese news headlines about trending <span class="hlt">events</span>: a complex network perspective.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan</p> <p>2015-01-01</p> <p>Because the volume of information available online is growing at breakneck speed, keeping up with meaning and information communicated by the media and netizens is a
new challenge both for scholars and for companies who must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headlines' keywords and word relationships in online Chinese news using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending <span class="hlt">events</span> in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and considered adjacent relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we developed an integrated mechanism to analyze the features of word networks based on news headlines that can account for all the keywords in the news about a particular <span class="hlt">event</span> and therefore track the evolution of news deeply and rapidly.
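The headline-to-network construction described above (words as nodes, within-headline adjacency as edges) can be sketched in a few lines of Python. The pre-segmented toy headlines below are illustrative stand-ins, since the Simple Chinese Word Segmentation step is not reproduced here:

```python
from collections import Counter

# Toy stand-in for segmented headlines: each headline is already a list of
# words, as a segmenter such as Simple Chinese Word Segmentation would produce.
headlines = [
    ["bohai", "bay", "oil", "spill", "cleanup"],
    ["oil", "spill", "response", "criticized"],
    ["bohai", "bay", "cleanup", "continues"],
]

nodes = Counter()  # word -> frequency (node weight)
edges = Counter()  # (word_a, word_b) -> adjacency count (edge weight)

for words in headlines:
    nodes.update(words)
    for a, b in zip(words, words[1:]):
        edges[tuple(sorted((a, b)))] += 1  # undirected adjacency edge

# Simple network features: the most frequent keywords and the strongest
# word-word relations across all headlines.
top_words = nodes.most_common(3)
top_edges = edges.most_common(3)
```

Running the same construction per month, as the paper does, would simply partition `headlines` by date before building each network.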
PMID:25807376</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/25069255','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/25069255"><span id="translatedtitle">[<span class="hlt">Analysis</span> of the cardiac side effects of antipsychotics: Japanese Adverse Drug <span class="hlt">Event</span> Report Database (JADER)].</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ikeno, Takashi; Okumara, Yasuyuki; Kugiyama, Kiyotaka; Ito, Hiroto</p> <p>2013-08-01</p> <p>We analyzed the cases of side effects due to antipsychotics reported to Japan's Pharmaceuticals and Medical Devices Agency (PMDA) from Jan. 2004 to Dec. 2012. We used the Japanese Adverse Drug <span class="hlt">Event</span> Report Database (JADER) and analyzed 136 of 216,945 cases using the defined terms. We also checked the cardiac adverse effects listed in the package inserts of the antipsychotics involved. We found cases of Ikr blockade resulting in sudden death (49 cases), electrocardiogram QT prolonged (29 cases), torsade de pointes (TdP, 19 cases), ventricular fibrillation (VF, 10 cases). M2 receptor blockade was observed in tachycardia (8 cases) and sinus tachycardia (3 cases). Calmodulin blockade was involved in reported cardiomyopathy (3 cases) and myocarditis (1 case). Multiple adverse <span class="hlt">events</span> were reported simultaneously in 14 cases. Our search of package inserts revealed warnings regarding electrocardiogram QT prolongation (24 drugs), tachycardia (23), sudden death (18), TdP (14), VF (3), myocarditis (1) and cardiomyopathy (1). We suggest that when an antipsychotic is prescribed, the patient should be monitored regularly with ECG, blood tests, and/or biochemical tests to avoid adverse cardiac effects. 
PMID:25069255</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27245021','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27245021"><span id="translatedtitle">[Incidence rate of adverse reaction/<span class="hlt">event</span> by Qingkailing injection: a Meta-<span class="hlt">analysis</span> of single rate].</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ai, Chun-ling; Xie, Yan-ming; Li, Ming-quan; Wang, Lian-xin; Liao, Xing</p> <p>2015-12-01</p> <p>To systematically review the incidence rate of adverse drug reactions/<span class="hlt">events</span> associated with Qingkailing injection. Databases including PubMed, EMbase, the Cochrane Library, CNKI, VIP, WanFang Data and CBM were searched from inception to July 30, 2015. Two reviewers independently screened the literature according to the inclusion and exclusion criteria, extracted the data, and cross-checked the results. Meta-<span class="hlt">analysis</span> was then performed using R 3.2.0 software, and subgroup sensitivity <span class="hlt">analyses</span> were performed based on age, mode of administration, observation time and research quality. Sixty-three studies involving 9,793 patients given Qingkailing injection were included; 367 cases of adverse reactions/<span class="hlt">events</span> were reported in total. The incidence rate of adverse reactions affecting the skin and mucosa was 2% [95% CI (0.02; 0.03)]; the digestive system, 6% [95% CI (0.05; 0.07)]; and the injection site, 4% [95% CI (0.02; 0.07)]. For adverse reactions/<span class="hlt">events</span> presenting mainly in the digestive system, the incidence was 4.6% [0.021 1; 0.097 7] in children and 6.9% [0.053 5; 0.089 8] in adults.
For adverse reactions/<span class="hlt">events</span> presenting mainly as skin and mucous membrane damage, the incidence was 3% [0.012 9; 0.068 3] for observation times > 7 days and 1.9% [0.007 8; 0.046 1] for observation times ≤ 7 days. Subgroup <span class="hlt">analysis</span> showed that, for each type of adverse reaction, the incidence of adverse reactions/<span class="hlt">events</span> was higher with combination therapy than with Qingkailing injection alone, and the difference was statistically significant (P < 0.05). This study suggests that factors such as combination therapy and age influence the occurrence of adverse reactions, and that these factors vary across populations. Clinicians should therefore exercise special care when prescribing Qingkailing injection for children and the elderly, and should implement individualized medication.
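The single-rate pooling step described above (performed by the authors in R 3.2.0) can be illustrated with a minimal fixed-effect sketch on the logit scale, a common textbook approach for meta-analysis of proportions. The study counts below are hypothetical, and the original analysis may well have used a different (e.g. random-effects) model:

```python
import math

def pooled_proportion(studies):
    """Fixed-effect inverse-variance pooling of single proportions on the
    logit scale; returns (estimate, lower 95% CI, upper 95% CI)."""
    num = den = 0.0
    for events, n in studies:
        e, total = events + 0.5, n + 1.0  # 0.5 continuity correction
        p = e / total
        logit = math.log(p / (1.0 - p))
        var = 1.0 / e + 1.0 / (total - e)  # variance of the logit
        w = 1.0 / var                       # inverse-variance weight
        num += w * logit
        den += w
    pooled = num / den
    se = math.sqrt(1.0 / den)
    expit = lambda x: 1.0 / (1.0 + math.exp(-x))  # back-transform to a rate
    return expit(pooled), expit(pooled - 1.96 * se), expit(pooled + 1.96 * se)

# Hypothetical studies: (adverse events, patients enrolled)
est, lo, hi = pooled_proportion([(12, 400), (5, 250), (20, 600)])
```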
PMID:27245021</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2010EGUGA..12.9742D&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2010EGUGA..12.9742D&link_type=ABSTRACT"><span id="translatedtitle">Rainfall estimation in the context of post-<span class="hlt">event</span> flash flood <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Delrieu, Guy; Boudevillain, Brice; Bouilloud, Ludovic</p> <p>2010-05-01</p> <p>Due to their spatial coverage and space-time resolution, operational weather radar networks offer unprecedented opportunities for the observation of flash flood generating storms. However, the radar rainfall estimation quality highly depends on the relative locations of the <span class="hlt">event</span> and the radar(s). A mountainous environment obviously adds to the complexity of the radar quantitative precipitation estimation (QPE). A pragmatic methodology was developed within the EC-funded HYDRATE project to take the best benefit of the existing rainfall observations (radar and raingauge data) for given flash-flood cases: 1) A precise documentation of the radar characteristics (location, parameters, operating protocol, data archives and processing) needs first to be established. The radar(s) detection domain(s) can then be characterized using the "hydrologic visibility" concepts (Pellarin et al. J Hydrometeor 3(5) 539-555 2002). 2) Rather dense raingauge observations (operational, amateur) are usually available at the <span class="hlt">event</span> time scale while few raingauge time series exist at the hydrologic time steps. Such raingauge datasets need to be critically analysed; a geostatistical approach is proposed for this task. 
3) A number of identifications can be implemented prior to the radar data re-processing: a) Special care needs to be paid to (residual) ground clutter, which has a dramatic impact on radar QPE. Dry-weather maps and rainfall accumulation maps may help in this task. b) Various sources of power loss, such as screening, wet radome and attenuation in rain, need to be identified and quantified. It will be shown that mountain returns can be used to quantify attenuation effects at C-band. c) Radar volume data are required to characterize the vertical profile of reflectivity (VPR), possibly conditioned on rain type (convective, widespread). When such data are not available, knowledge of the 0°C isotherm and the scanning protocol may help detect bright-band contaminations that critically</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_21 --> <div id="page_22" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol
class="result-class" start="421"> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2009EGUGA..11.3209B&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2009EGUGA..11.3209B&link_type=ABSTRACT"><span id="translatedtitle">Rainfall estimation in the context of post-<span class="hlt">event</span> flash flood <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bouilloud, L.; Delrieu, G.; Boudevillain, B.</p> <p>2009-04-01</p> <p>Due to their spatial coverage and space-time resolution, operational weather radar networks offer unprecedented opportunities for the observation of flash flood generating storms. However, the radar rainfall estimation quality highly depends on the relative locations of the <span class="hlt">event</span> and the radar(s). A mountainous environment obviously adds to the complexity of the radar quantitative precipitation estimation (QPE). A pragmatic methodology is proposed to take the best benefit of the existing rainfall observations (radar and raingauge data) for given flash-flood cases: 1) A precise documentation of the radar characteristics (location, parameters, operating protocol, data archives and processing) needs first to be established. The radar(s) detection domain(s) can then be characterized using the "hydrologic visibility" concepts (Pellarin et al. J Hydrometeor 3(5) 539-555 2002). 2) Rather dense raingauge observations (operational, amateur) are usually available at the <span class="hlt">event</span> time scale while few raingauge time series exist at the hydrologic time steps. Such raingauge datasets need to be critically analysed; a geostatistical approach is proposed for this task. 
3) A number of identifications can be implemented prior to the radar data re-processing: a) Special care needs to be paid to (residual) ground clutter, which has a dramatic impact on radar QPE. Dry-weather maps and rainfall accumulation maps may help in this task. b) Various sources of power loss, such as screening, wet radome and attenuation in rain, need to be identified and quantified. It will be shown that mountain returns can be used to quantify attenuation effects at C-band. c) Radar volume data are required to characterize the vertical profile of reflectivity (VPR), possibly conditioned on rain type (convective, widespread). When such data are not available, knowledge of the 0°C isotherm and the scanning protocol may help detect bright-band contaminations that critically affect radar QPE. d) With conventional</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2893089','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2893089"><span id="translatedtitle">Triple antiplatelet therapy for preventing vascular <span class="hlt">events</span>: a systematic review and meta-<span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2010-01-01</p> <p>Background Dual antiplatelet therapy is usually superior to monotherapy in preventing recurrent vascular <span class="hlt">events</span> (VEs). This systematic review assesses the safety and efficacy of triple antiplatelet therapy in comparison with dual therapy in reducing recurrent vascular <span class="hlt">events</span>.
Methods Completed randomized controlled trials investigating the effect of triple versus dual antiplatelet therapy in patients with ischaemic heart disease (IHD), cerebrovascular disease or peripheral vascular disease were identified using electronic bibliographic searches. Data were extracted on composite VEs, myocardial infarction (MI), stroke, death and bleeding, and analysed with Cochrane Review Manager software. Odds ratios (OR) and 95% confidence intervals (CI) were calculated using random effects models. Results Twenty-five completed randomized trials (17,383 patients with IHD) were included, involving the use of intravenous (iv) GP IIb/IIIa inhibitors (abciximab, eptifibatide, tirofiban), aspirin, clopidogrel and/or cilostazol. In comparison with aspirin-based therapy, triple therapy using an intravenous GP IIb/IIIa inhibitor significantly reduced composite VEs and MI in patients with non-ST elevation acute coronary syndromes (NSTE-ACS) (VE: OR 0.69, 95% CI 0.55-0.86; MI: OR 0.70, 95% CI 0.56-0.88) and ST elevation myocardial infarction (STEMI) (VE: OR 0.39, 95% CI 0.30-0.51; MI: OR 0.26, 95% CI 0.17-0.38). A significant reduction in death was also noted in STEMI patients treated with GP IIb/IIIa based triple therapy (OR 0.69, 95% CI 0.49-0.99). Increased minor bleeding was noted in STEMI and elective percutaneous coronary intervention (PCI) patients treated with GP IIb/IIIa based triple therapy. Stroke <span class="hlt">events</span> were too infrequent for us to be able to identify meaningful trends and no data were available for patients recruited into trials on the basis of stroke or peripheral vascular disease.
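The odds ratios and 95% confidence intervals reported above can be illustrated for a single 2x2 table using the standard Woolf (log) method; the trial counts below are hypothetical, and the review itself pooled many trials with random-effects models in Cochrane Review Manager rather than computing single-table estimates:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-scale) 95% CI for a 2x2 table:
    a/b = events/non-events on treatment, c/d = events/non-events on control."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical trial: 60/440 events vs non-events on triple therapy,
# 80/420 on dual therapy.
or_, lower, upper = odds_ratio_ci(60, 440, 80, 420)
```

An OR below 1 with a CI excluding 1 would correspond to the significant reductions quoted in the abstract.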
Conclusions Triple antiplatelet therapy based on iv GPIIb/IIIa inhibitors was more</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/22252158','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/22252158"><span id="translatedtitle">Statistical <span class="hlt">analysis</span> of variations in impurity ion heating at reconnection <span class="hlt">events</span> in the Madison Symmetric Torus</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Cartolano, M. S.; Craig, D.; Den Hartog, D. J.; Kumar, S. T. A.; Nornberg, M. D.</p> <p>2014-01-15</p> <p>The connection between impurity ion heating and other physical processes in the plasma is evaluated by studying variations in the amount of ion heating at reconnection <span class="hlt">events</span> in the Madison Symmetric Torus (MST). Correlation of the change in ion temperature with individual tearing mode amplitudes indicates that the edge-resonant modes are better predictors for the amount of global ion heating than the core-resonant modes. There is also a strong correlation between ion heating and current profile relaxation. Simultaneous measurements of the ion temperature at different toroidal locations reveal, for the first time, a toroidal asymmetry to the ion heating in MST. 
These results present challenges for existing heating theories and suggest a stronger connection between edge-resonant tearing modes, current profile relaxation, and ion heating than has been previously thought.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23366957','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23366957"><span id="translatedtitle"><span class="hlt">Analysis</span> of extrinsic and intrinsic factors affecting <span class="hlt">event</span> related desynchronization production.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Takata, Yohei; Kondo, Toshiyuki; Saeki, Midori; Izawa, Jun; Takeda, Kotaro; Otaka, Yohei; Ito, Koji</p> <p>2012-01-01</p> <p>Recently there has been an increase in the number of stroke patients with motor paralysis. Appropriate re-afferent sensory feedback synchronized with a voluntary motor intention would be effective for promoting neural plasticity in stroke rehabilitation. Therefore, BCI technology is considered to be a promising approach in neuro-rehabilitation. To estimate human motor intention, <span class="hlt">event</span>-related desynchronization (ERD), a feature of the electroencephalogram (EEG) evoked by motor execution or motor imagery, is usually used. However, there exist various factors that affect ERD production, and its neural mechanism is still an open question. As a preliminary stage, we evaluate mutual effects of intrinsic (voluntary motor imagery) and extrinsic (visual and somatosensory stimuli) factors on the ERD production.
Experimental results indicate that these three factors are not always additively interacting with each other and affecting the ERD production.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/18664726','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/18664726"><span id="translatedtitle">[<span class="hlt">Analysis</span> of policies in activating the Infectious Disease Specialist Network (IDSN) for bioterrorism <span class="hlt">events</span>].</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kim, Yang Soo</p> <p>2008-07-01</p> <p>Bioterrorism <span class="hlt">events</span> have worldwide impacts, not only in terms of security and public health policy, but also in other related sectors. Many countries, including Korea, have set up new administrative and operational structures and adapted their preparedness and response plans in order to deal with new kinds of threats. Korea has dual surveillance systems for the early detection of bioterrorism. The first is syndromic surveillance that typically monitors non-specific clinical information that may indicate possible bioterrorism-associated diseases before specific diagnoses are made. The other is infectious disease specialist network that diagnoses and responds to specific illnesses caused by intentional release of biologic agents. Infectious disease physicians, clinical microbiologists, and infection control professionals play critical and complementary roles in these networks. 
Infectious disease specialists should develop practical and realistic response plans for their institutions in partnership with local and state health departments, in preparation for a real or suspected bioterrorism attack.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17428067','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17428067"><span id="translatedtitle">Temporal sequence of cell wall disassembly <span class="hlt">events</span> in developing fruits. 1. <span class="hlt">Analysis</span> of raspberry (Rubus idaeus).</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Vicente, Ariel R; Ortugno, Claudia; Powell, Ann L T; Greve, L Carl; Labavitch, John M</p> <p>2007-05-16</p> <p>Raspberry fruits were harvested at five developmental stages, from green to red ripe, and the changes in cell wall composition, pectin and hemicellulose solubilization, and depolymerization were analyzed. Fruit softening at intermediate stages of ripening was associated with increased pectin solubilization, which occurred without depolymerization. Arabinose was found to be the most abundant noncellulosic neutral sugar in the cell wall and showed dramatic solubilization late in ripening. No changes in pectin molecular size were observed even at the 100% red stage. Subsequently, as fruit became fully ripe a dramatic depolymerization occurred. In contrast, the hemicellulosic fractions showed no significant changes in content or polymer size during ripening. 
The paper discusses the sequence of <span class="hlt">events</span> leading to cell wall disassembly in raspberry fruit.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2016ASPC..504..115S&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2016ASPC..504..115S&link_type=ABSTRACT"><span id="translatedtitle">High Cadence Observations and <span class="hlt">Analysis</span> of Spicular-type <span class="hlt">Events</span> Using CRISP Onboard SST</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Shetye, J.; Doyle, J. G.; Scullion, E.; Nelson, C. J.; Kuridze, D.</p> <p>2016-04-01</p> <p>We present spectroscopic and imaging observations of apparent ultra-fast spicule-like features observed with CRisp Imaging SpectroPolarimeter (CRISP) at the Swedish 1-m Solar Telescope (SST). The data show spicules with an apparent velocity above 500 km s⁻¹, very short lifetimes of up to 20 s and length/height around 3500 km. The spicules are seen as dark absorption structures in the Hα wings ±516 mÅ, ±774 mÅ and ±1032 mÅ which suddenly appear and disappear from the FOV. These features show a time delay in their appearance in the blue and red wings by 3-5 s. We suggest that their appearance/disappearance is due to their Doppler motion in and out of the 60 mÅ filter. See Fig.
1 for the evolution of the <span class="hlt">event</span> at two line positions.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/22478239','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/22478239"><span id="translatedtitle">Reflections on some early <span class="hlt">events</span> related to behavior <span class="hlt">analysis</span> of child development.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Bijou, S W</p> <p>1996-01-01</p> <p>A series of <span class="hlt">events</span> related to the early application of behavioral principles to child behavior and development is described. The <span class="hlt">events</span> began in the 1930s at Columbia University with a solicited letter from John B. Watson suggesting a master's degree thesis problem, and continued through the 1950s and 1960s at the University of Washington. Specifically, these happenings resulted in (a) research demonstrating that Skinner's laboratory method for studying nonhuman organisms could be profitably applied to the laboratory study of young normal children; (b) a demonstration that by successive approximations, a normal child can be operantly conditioned to respond to an arbitrary situation; (c) research showing that the effects of simple schedules of reinforcement obtained with nonhuman organisms could be duplicated in young normal and retarded children; (d) the demonstration that Skinner's operant laboratory method could be adapted to study young children in field situations; (e) research showing that operant principles can be successfully applied to the treatment of a young autistic boy with a serious visual handicap; (f) laboratory studies showing that mothers can be trained to treat their own young children who have behavior problems; (g) an in-home study demonstrating that a mother can treat her own child who has behavior problems; (h) a 
demonstration that operant principles can be applied effectively to teaching reading, writing, and arithmetic to children with retardation; and (i) publication of a book, Child Development: A Systematic and Empirical Theory, in collaboration with Donald M. Baer, by Prentice Hall in their Century Psychological Series. PMID:22478239</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2006AGUFM.V41A1695W&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2006AGUFM.V41A1695W&link_type=ABSTRACT"><span id="translatedtitle">Polarization <span class="hlt">Analysis</span> of the September 2005 Northern Cascadia Episodic Tremor and Slip <span class="hlt">Event</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wech, A. G.; Creager, K. C.</p> <p>2006-12-01</p> <p>The region of Northern Cascadia, extending from the Olympic Mountains and Puget Sound to southern Vancouver Island, down-dip of the subduction "locked" zone has repeatedly experienced episodes of slow slip. This episodic slip, observed to take place over a period of two to several weeks, is accompanied by a seismic tremor signal. Based on the average recurrence interval of 14 months, the last episodic tremor and slip (ETS) <span class="hlt">event</span> was expected to occur in September, 2005. Indeed, it began on September 3. In order to record this <span class="hlt">event</span>, we deployed an array of 11 three-component seismometers on the northern side of the Olympic Peninsula augmenting Pacific Northwest Seismographic Network stations as well as the first few EarthScope BigFoot stations and Plate Boundary Observatory borehole seismometers. This seismic array was comprised of six short-period and five broadband instruments with average spacings of 500 m and 2200 m respectively. 
In conjunction with this Earthscope seismic deployment, we also installed a dense network of 29 temporary, continuous GPS stations across the entire Olympic Peninsula to integrate seismic and geodetic observations. Based on past geodetic observations, a dominant assumption for the source of tremor is fault-slip in the direction of subduction, which can be tested using polarization of the seismic tremor. Using waveform cross-correlation to invert for the direction of slowness, we observed the tremor signal to migrate directly under our array. As the source passed beneath the array, tremor polarization stabilized to coincide with the direction of subduction. During a four day period starting September 8, the normalized eigenvalue associated with the dominant linear polarization jumped from ~0.7 to a stable 0.9 value. Also during this time, the polarization azimuth stabilized to a value of 57 +/- 8 degrees, close to the angle of subduction (56 degrees) suggesting that the tremor is caused by slip in the direction of relative plate motion</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/servlets/purl/1009336','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/servlets/purl/1009336"><span id="translatedtitle">Assessing potential impacts associated with contamination <span class="hlt">events</span> in water distribution systems : a sensitivity <span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Davis, M.
J.; Janke, R.; Taxon, T. N.</p> <p>2010-11-01</p> <p>An understanding of the nature of the adverse effects that could be associated with contamination <span class="hlt">events</span> in water distribution systems is necessary for carrying out vulnerability analyses and designing contamination warning systems. This study examines the adverse effects of contamination <span class="hlt">events</span> using models for 12 actual water systems that serve populations ranging from about 10^4 to over 10^6 persons. The measure of adverse effects that we use is the number of people who are exposed to a contaminant above some dose level due to ingestion of contaminated tap water. For this study the number of such people defines the impact associated with an <span class="hlt">event</span>. We consider a wide range of dose levels in order to accommodate a wide range of potential contaminants. For a particular contaminant, dose level can be related to a health effects level. For example, a dose level could correspond to the median lethal dose, i.e., the dose that would be fatal to 50% of the exposed population. Highly toxic contaminants may be associated with a particular response at a very low dose level, whereas contaminants with low toxicity may only be associated with the same response at a much higher dose level. This report focuses on the sensitivity of impacts to five factors that either define the nature of a contamination <span class="hlt">event</span> or involve assumptions that are used in assessing exposure to the contaminant: (1) duration of contaminant injection, (2) time of contaminant injection, (3) quantity or mass of contaminant injected, (4) population distribution in the water distribution system, and (5) the ingestion pattern of the potentially exposed population. For each of these factors, the sensitivities of impacts to injection location and contaminant toxicity are also examined.
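The impact measure described above (the number of people whose ingested dose exceeds a health-effects level) can be sketched as a small calculation. A minimal sketch; the node populations, per-capita doses, and threshold below are hypothetical values, not figures from the report:

```python
import numpy as np

def exposed_population(node_population, node_dose, dose_level):
    """Impact measure: number of people whose ingested dose meets or
    exceeds `dose_level`. `node_population` holds the persons assigned
    to each network node; `node_dose` the per-capita dose (mg) at each
    node, e.g. from concentration histories and an ingestion pattern."""
    node_population = np.asarray(node_population, dtype=float)
    node_dose = np.asarray(node_dose, dtype=float)
    return float(node_population[node_dose >= dose_level].sum())

# Hypothetical 4-node system; impact shrinks as the dose level rises
# (i.e. as only more toxic contaminants would trigger the response).
pop = [1200, 800, 2500, 400]
dose = [0.5, 0.05, 2.0, 0.01]                # mg ingested per person
impact = exposed_population(pop, dose, 0.1)  # 3700.0 (nodes 0 and 2)
```

Sweeping `dose_level` over several orders of magnitude reproduces the report's framing that a single injection scenario yields different impacts depending on contaminant toxicity.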
For all the factors considered, sensitivity tends to increase with dose level (i.e., decreasing toxicity) of the contaminant, with considerable inter-network variability. With the exception of the population distribution (factor 4</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2850216','PMC'); return false;" href="http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2850216"><span id="translatedtitle">Arrests, Recent Life Circumstances, and Recurrent Job Loss for At-Risk Young Men: An <span class="hlt">Event</span>-History <span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Wiesner, Margit; Capaldi, Deborah M.; Kim, Hyoun K.</p> <p>2009-01-01</p> <p>This study used longitudinal data from 202 at-risk young men to examine effects of arrests, prior risk factors, and recent life circumstances on job loss across a 7-year period in early adulthood. Repeated failure-time continuous <span class="hlt">event</span>-history <span class="hlt">analysis</span> indicated that occurrence of job loss was primarily related to prior mental health problems, recent arrests, recent drug use, and recent being married/cohabitation. It is argued that long-term effects of criminal justice contact on employment outcomes should be understood in the context of (shared) prior risk factors and recent life circumstances. 
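Event-history methods such as the repeated failure-time analysis above estimate how the risk of an event (here, job loss) evolves over time, while handling records that end without the event (censoring). As a minimal, self-contained illustration of the underlying survival-analysis machinery (a simple Kaplan-Meier estimator on toy data, not the authors' continuous-time regression model):

```python
import numpy as np

def kaplan_meier(durations, observed):
    """Kaplan-Meier survival curve: S(t) evaluated at each distinct
    observed event time. `observed` is 1 for an event (e.g. job loss),
    0 for a censored record."""
    durations = np.asarray(durations, dtype=float)
    observed = np.asarray(observed, dtype=int)
    event_times = np.unique(durations[observed == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(durations >= t)  # subjects still event-free just before t
        events = np.sum((durations == t) & (observed == 1))
        s *= 1.0 - events / at_risk       # product-limit update
        surv.append(s)
    return event_times, np.array(surv)

# Toy data: 5 subjects, two censored (values are illustrative only).
times, surv = kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 1, 0])
# times -> [2, 3, 5]; surv -> [0.8, 0.6, 0.3]
```

Covariate effects such as recent arrests or drug use would then be modeled on top of this baseline, e.g. with a proportional-hazards regression.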
PMID:20383311</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/10167026','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/10167026"><span id="translatedtitle">Workshop on the use of PRA methodology for the <span class="hlt">analysis</span> of reactor <span class="hlt">events</span> and operational data: Proceedings</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Rasmuson, D.M.; Dingman, S.</p> <p>1992-06-01</p> <p>A workshop entitled "The Use of PRA Methodology for the <span class="hlt">Analysis</span> of Reactor <span class="hlt">Events</span> and Operational Data" was held on January 29-30, 1992 in Annapolis, Maryland. Over 50 participants from the NRC, its contractors, and others participated in the meetings. During the first day, presentations were made by invited speakers to discuss issues in relevant topics. On the second day, discussion groups were held to focus on three areas: risk significance of operational <span class="hlt">events</span>, industry risk profile and generic concerns, and risk monitoring and risk-based performance indicators. Important considerations identified from the workshop are the following: Improve the Accident Sequence Precursor models and data. Improve the SCSS and NPRDS (e.g., by adding detailed performance information on selected components, by improving narratives on failure causes). Develop risk-based performance indicators. Use risk insights to help focus trending and performance analyses of components, systems, initiators, and sequences. Improve the statistical quality of trending and performance analyses. Flag implications of special conditions (e.g., external <span class="hlt">events</span>, containment performance) during data studies. Trend common cause and human performance using appropriate models to obtain a better understanding of the impact and causes of failure.
Develop a method for producing an industry risk profile.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/biblio/22348021','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/biblio/22348021"><span id="translatedtitle">A super-jupiter orbiting a late-type star: A refined <span class="hlt">analysis</span> of microlensing <span class="hlt">event</span> OGLE-2012-BLG-0406</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Tsapras, Y.; Street, R. A.; Choi, J.-Y.; Han, C.; Bozza, V.; Gould, A.; Dominik, M.; Browne, P.; Horne, K.; Hundertmark, M.; Beaulieu, J.-P.; Udalski, A.; Jørgensen, U. G.; Sumi, T.; Bramich, D. M.; Kains, N.; Ipatov, S.; Alsubai, K. A.; Snodgrass, C.; Steele, I. A.; Collaboration: RoboNet Collaboration; MiNDSTEp Collaboration; OGLE Collaboration; PLANET Collaboration; μFUN Collaboration; MOA Collaboration; and others</p> <p>2014-02-10</p> <p>We present a detailed <span class="hlt">analysis</span> of survey and follow-up observations of microlensing <span class="hlt">event</span> OGLE-2012-BLG-0406 based on data obtained from 10 different observatories. Intensive coverage of the light curve, especially the perturbation part, allowed us to accurately measure the parallax effect and lens orbital motion. Combining our measurement of the lens parallax with the angular Einstein radius determined from finite-source effects, we estimate the physical parameters of the lens system. We find that the <span class="hlt">event</span> was caused by a 2.73 ± 0.43 M {sub J} planet orbiting a 0.44 ± 0.07 M {sub ☉} early M-type star. The distance to the lens is 4.97 ± 0.29 kpc and the projected separation between the host star and its planet at the time of the <span class="hlt">event</span> is 3.45 ± 0.26 AU.
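The step described above, combining the lens parallax π_E with the angular Einstein radius θ_E, yields the lens mass and distance through the standard point-lens relations M = θ_E/(κπ_E), with κ = 4G/(c²·au) ≈ 8.144 mas/M_⊙, and π_rel = θ_E·π_E = 1/D_L − 1/D_S. A minimal sketch with illustrative inputs (the θ_E, π_E, and source-parallax values below are assumptions, not the fitted OGLE-2012-BLG-0406 parameters):

```python
# Standard point-lens microlensing relations:
#   M = theta_E / (kappa * pi_E),  kappa = 4G/(c^2 au) ~ 8.144 mas / M_sun
#   pi_rel = theta_E * pi_E = 1/D_L - 1/D_S   (parallaxes in mas, D in kpc)
KAPPA = 8.144  # mas per solar mass

def lens_mass(theta_E_mas, pi_E):
    """Total lens mass in solar masses."""
    return theta_E_mas / (KAPPA * pi_E)

def lens_distance_kpc(theta_E_mas, pi_E, pi_source_mas=0.12):
    """Lens distance in kpc; pi_source_mas = 0.12 assumes a bulge
    source at roughly 8.3 kpc (an illustrative default)."""
    pi_rel = theta_E_mas * pi_E
    return 1.0 / (pi_rel + pi_source_mas)

# Illustrative inputs (NOT the OGLE-2012-BLG-0406 fit):
m_lens = lens_mass(0.5, 0.12)          # ~0.51 M_sun
d_lens = lens_distance_kpc(0.5, 0.12)  # ~5.6 kpc
```

A larger parallax at fixed θ_E implies a lighter, nearer lens, which is why measuring both quantities pins down the physical system.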
We find that the additional coverage provided by follow-up observations, especially during the planetary perturbation, leads to a more accurate determination of the physical parameters of the lens.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/24486355','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/24486355"><span id="translatedtitle">A methodology for interactive mining and visual <span class="hlt">analysis</span> of clinical <span class="hlt">event</span> patterns using electronic health record data.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Gotz, David; Wang, Fei; Perer, Adam</p> <p>2014-04-01</p> <p>Patients' medical conditions often evolve in complex and seemingly unpredictable ways. Even within a relatively narrow and well-defined episode of care, variations between patients in both their progression and eventual outcome can be dramatic. Understanding the patterns of <span class="hlt">events</span> observed within a population that most correlate with differences in outcome is therefore an important task in many types of studies using retrospective electronic health data. In this paper, we present a method for interactive pattern mining and <span class="hlt">analysis</span> that supports ad hoc visual exploration of patterns mined from retrospective clinical patient data. Our approach combines (1) visual query capabilities to interactively specify episode definitions, (2) pattern mining techniques to help discover important intermediate <span class="hlt">events</span> within an episode, and (3) interactive visualization techniques that help uncover <span class="hlt">event</span> patterns that most impact outcome and how those associations change over time. 
In addition to presenting our methodology, we describe a prototype implementation and present use cases highlighting the types of insights or hypotheses that our approach can help uncover. PMID:24486355</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2005AdG.....2..301T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2005AdG.....2..301T"><span id="translatedtitle"><span class="hlt">Analysis</span> of infiltration, seepage processes and slope instability mechanisms during the November 2000 storm <span class="hlt">event</span> in Tuscany</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tofani, V.; Dapporto, S.; Vannocci, P.; Casagli, N.</p> <p>2005-09-01</p> <p>On the days 20-21 November 2000, a storm of exceptional intensity triggered over 50 landslides within the province of Pistoia in Tuscany (Italy). These failures are mostly of complex type, originating as rotational or translational landslides, and transforming into flows. Two of these landslides were investigated in this paper by modelling the ground water infiltration process, the pore water pressure variations, both positive and negative, and the effects of these variations on slope stability during the rainfall <span class="hlt">event</span>. Morphometric and geotechnical analyses were carried out for both sites through a series of in-situ and laboratory tests, the results of which were used as input for the modelling process. In a first step the surface infiltration rate was simulated using a modified Chu (1978) approach for the Green and Ampt (1911) equations in case of unsteady rainfall together with a surficial water balance.
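The Green and Ampt (1911) model referenced above gives cumulative infiltration F implicitly through K·t = F − ψΔθ·ln(1 + F/(ψΔθ)). A minimal sketch solving this by fixed-point iteration, with assumed soil parameters; the modified Chu (1978) treatment of unsteady rainfall and ponding-time bookkeeping is omitted:

```python
import math

def green_ampt_F(t, K, psi, dtheta, tol=1e-9):
    """Cumulative infiltration F(t) under ponded conditions from the
    Green-Ampt relation K*t = F - psi*dtheta*ln(1 + F/(psi*dtheta)),
    solved by fixed-point iteration (units: metres and hours)."""
    pd = psi * dtheta        # wetting-front suction times moisture deficit
    F = max(K * t, tol)      # starting guess
    while True:
        F_new = K * t + pd * math.log(1.0 + F / pd)
        if abs(F_new - F) < tol:
            return F_new
        F = F_new

def infiltration_capacity(F, K, psi, dtheta):
    """Potential infiltration rate f = K * (1 + psi*dtheta / F),
    which decays toward K as the wetted depth grows."""
    return K * (1.0 + psi * dtheta / F)

# Assumed illustrative soil: K = 0.01 m/h, psi = 0.2 m, dtheta = 0.3
F1 = green_ampt_F(1.0, 0.01, 0.2, 0.3)  # cumulative infiltration after 1 h
```

The iteration converges because the map F ↦ K·t + ψΔθ·ln(1 + F/(ψΔθ)) is a contraction for F > 0; the computed rate then feeds the seepage analysis as a surface boundary condition, as in the abstract's workflow.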
A finite element seepage <span class="hlt">analysis</span> for transient conditions was then employed to model the changes in pore water pressure during the <span class="hlt">event</span>, using the computed infiltration rate as the ground surface boundary condition. Finally, once again using the data from the previous step as input, the limit equilibrium Morgenstern-Price (1965) slope stability method was applied to calculate the variations in the factor of safety during the <span class="hlt">event</span> and thereby determine the most critical time of instability. In both sites this method produced a curve for the factor of safety that indicated that the most critical time for failure occurred a few hours after the peak of rainfall.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4316557','PMC'); return false;" href="http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4316557"><span id="translatedtitle">The impact of economic austerity and prosperity <span class="hlt">events</span> on suicide in Greece: a 30-year interrupted time-series <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Branas, Charles C; Kastanaki, Anastasia E; Michalodimitrakis, Manolis; Tzougas, John; Kranioti, Elena F; Theodorakis, Pavlos N; Carr, Brendan G; Wiebe, Douglas J</p> <p>2015-01-01</p> <p>Objectives To complete a 30-year interrupted time-series <span class="hlt">analysis</span> of the impact of austerity-related and prosperity-related <span class="hlt">events</span> on the occurrence of suicide across Greece. Setting Greece from 1 January 1983 to 31 December 2012. Participants A total of 11 505 suicides, 9079 by men and 2426 by women, occurring in Greece over the study period. 
Primary and secondary outcomes National data from the Hellenic Statistical Authority assembled as 360 monthly counts of: all suicides, male suicides, female suicides and all suicides plus potentially misclassified suicides. Results Over the 30-year period, the months with the highest suicide counts in Greece occurred in 2012. The passage of new austerity measures in June 2011 marked the beginning of significant, abrupt and sustained increases in total suicides (+35.7%, p<0.001) and male suicides (+18.5%, p<0.01). Sensitivity analyses that accounted for undercounting of suicides also found a significant, abrupt and sustained increase in June 2011 (+20.5%, p<0.001). Suicides by men in Greece also underwent a significant, abrupt and sustained increase in October 2008 when the Greek recession began (+13.1%, p<0.01), and an abrupt but temporary increase in April 2012 following a public suicide committed in response to austerity conditions (+29.7%, p<0.05). Suicides by women in Greece also underwent an abrupt and sustained increase in May 2011 following austerity-related <span class="hlt">events</span> (+35.8%, p<0.05). One prosperity-related <span class="hlt">event</span>, the January 2002 launch of the Euro in Greece, marked an abrupt but temporary decrease in male suicides (−27.1%, p<0.05). Conclusions This is the first multidecade, national <span class="hlt">analysis</span> of suicide in Greece using monthly data. Select austerity-related <span class="hlt">events</span> in Greece corresponded to statistically significant increases for suicides overall, as well as for suicides among men and women.
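Interrupted time-series analyses of this kind are commonly fitted as segmented regressions with a level-change and a trend-change term at each event date. A minimal single-break sketch on synthetic monthly counts (illustrative data, not the Greek suicide series; a full analysis would also model seasonality and autocorrelation):

```python
import numpy as np

def its_fit(y, t0):
    """Fit the segmented-regression model used in interrupted
    time-series studies:
        y = b0 + b1*t + b2*step(t>=t0) + b3*(t-t0)*step(t>=t0)
    b2 is the abrupt level change at the event, b3 the slope change."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y), dtype=float)
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, pre-trend, level change, trend change]

# Synthetic monthly counts with a +10 abrupt, sustained shift at month 60:
rng = np.random.default_rng(0)
months = np.arange(120)
counts = 30.0 + 0.05 * months + 10.0 * (months >= 60) + rng.normal(0.0, 1.0, 120)
beta = its_fit(counts, 60)  # beta[2] recovers a level change near +10
```

Distinguishing "abrupt and sustained" from "abrupt but temporary" effects, as the study does, amounts to testing b2 against models that let the shift decay.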
The consideration of future austerity measures should give greater weight to the unintended mental health consequences that may follow and the public</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1811488R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1811488R"><span id="translatedtitle">Numerical Simulation and <span class="hlt">Analysis</span> of the Localized Heavy Precipitation <span class="hlt">Event</span> in South Korea based on diagnostic variables</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Roh, Joon-Woo; Choi, Young-Jean</p> <p>2016-04-01</p> <p>Accurate prediction of precipitation is one of the most difficult and significant tasks in weather forecasting. Heavy precipitation in the Korean Peninsula is caused by various physical mechanisms, including shortwave troughs, quasi-stationary moisture convergence zones among varying air masses, and direct or indirect effects of tropical cyclones. Many previous studies have used observations, numerical modeling, and statistics to investigate the potential causes of warm-season heavy precipitation in South Korea. In particular, the frequency of warm-season torrential rainfall <span class="hlt">events</span> exceeding 30 mm/h has increased threefold in Seoul, a metropolitan city in South Korea, over the past 30 years. Localized heavy rainfall <span class="hlt">events</span> in South Korea generally arise from mesoscale convective systems embedded in synoptic-scale disturbances along the Changma front, or from convective instabilities resulting from unstable air masses. To investigate a localized heavy precipitation system in the Seoul metropolitan area, <span class="hlt">analysis</span> and numerical experiments were performed for a typical <span class="hlt">event</span> on 20 June 2014.
This case exhibited a baroclinic instability structure associated with a short-wave trough from the northwest and moist, warm air supplied by a thermal low from the southwest of the Korean Peninsula. We investigated the localized heavy precipitation in a narrow zone of the Seoul urban area using convective-scale numerical simulations based on the Weather Research and Forecasting (WRF) model. Revised U.S. Geological Survey (USGS) topography and land-use data and an appropriate set of physics scheme options were selected for the WRF model simulation. The simulation experiments reproduced the primary physical structures related to the localized heavy precipitation in diagnostic fields, namely storm-relative helicity (SRH), updraft helicity (UH), and instantaneous contraction rates (ICON). SRH and UH are dominantly related to</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_22 --> <div id="page_23" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right">
</div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="441"> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3419745','PMC'); return false;" href="http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3419745"><span id="translatedtitle">Comparative <span class="hlt">Analysis</span> of Three Brevetoxin-Associated Bottlenose Dolphin (Tursiops truncatus) Mortality <span class="hlt">Events</span> in the Florida Panhandle Region (USA)</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Twiner, Michael J.; Flewelling, Leanne J.; Fire, Spencer E.; Bowen-Stevens, Sabrina R.; Gaydos, Joseph K.; Johnson, Christine K.; Landsberg, Jan H.; Leighfield, Tod A.; Mase-Guthrie, Blair; Schwacke, Lori; Van Dolah, Frances M.; Wang, Zhihong; Rowles, Teresa K.</p> <p>2012-01-01</p> <p>In the Florida Panhandle region, bottlenose dolphins (Tursiops truncatus) have been highly susceptible to large-scale unusual mortality <span class="hlt">events</span> (UMEs) that may have been the result of exposure to blooms of the dinoflagellate Karenia brevis and its neurotoxin, brevetoxin (PbTx). Between 1999 and 2006, three bottlenose dolphin UMEs occurred in the Florida Panhandle region. The primary objective of this study was to determine if these mortality <span class="hlt">events</span> were due to brevetoxicosis. <span class="hlt">Analysis</span> of over 850 samples from 105 bottlenose dolphins and associated prey items were analyzed for algal toxins and have provided details on tissue distribution, pathways of trophic transfer, and spatial-temporal trends for each mortality <span class="hlt">event</span>. In 1999/2000, 152 dolphins died following extensive K. brevis blooms and brevetoxin was detected in 52% of animals tested at concentrations up to 500 ng/g. 
In 2004, 105 bottlenose dolphins died in the absence of an identifiable K. brevis bloom; however, 100% of the tested animals were positive for brevetoxin at concentrations up to 29,126 ng/mL. Dolphin stomach contents frequently consisted of brevetoxin-contaminated menhaden. In addition, another potentially toxigenic algal species, Pseudo-nitzschia, was present and low levels of the neurotoxin domoic acid (DA) were detected in nearly all tested animals (89%). In 2005/2006, 90 bottlenose dolphins died that were initially coincident with high densities of K. brevis. Most (93%) of the tested animals were positive for brevetoxin at concentrations up to 2,724 ng/mL. No DA was detected in these animals despite the presence of an intense DA-producing Pseudo-nitzschia bloom. In contrast to the absence or very low levels of brevetoxins measured in live dolphins, and those stranding in the absence of a K. brevis bloom, these data, taken together with the absence of any other obvious pathology, provide strong evidence that brevetoxin was the causative agent involved in these bottlenose dolphin mortality</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3718719','PMC'); return false;" href="http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3718719"><span id="translatedtitle">Artesunate versus quinine in the treatment of severe imported malaria: comparative <span class="hlt">analysis</span> of adverse <span class="hlt">events</span> focussing on delayed haemolysis</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p></p> <p>2013-01-01</p> <p>Background Severe malaria is a potentially life-threatening infectious disease. It has been conclusively shown that artesunate compared to quinine is superior in antiparasitic efficacy and in lowering mortality showing a better short-term safety profile. 
Regarding longer-term effects, reports of delayed haemolysis after parenteral artesunate for severe malaria in returning travellers have been published recently. So far, delayed haemolysis has not been described after the use of parenteral quinine. Methods In this retrospective study, all patients treated for severe malaria at the University Medical Centre Hamburg-Eppendorf were included between 2006 and 2012. The primary endpoint was the proportion of delayed haemolysis in patients treated with quinine versus those who received artesunate. As secondary endpoint, the proportion of any adverse <span class="hlt">event</span> was assessed. Results A total of 36 patients with severe malaria were included in the <span class="hlt">analysis</span>. Of these, 16 patients contributed sufficient data to assess the endpoint delayed haemolysis. Twelve were treated primarily with intravenous quinine – with four patients having received intrarectal artesunate as an adjunct treatment – and five patients were treated primarily with artesunate. Five cases of delayed haemolysis could be detected – two in patients treated with quinine and intrarectal artesunate and three in patients treated with artesunate. No case of delayed haemolysis was detected in patients treated with quinine alone. While adverse <span class="hlt">events</span> observed in patients treated with artesunate were limited to delayed haemolysis (three patients, 60%) and temporary deterioration in renal function (three patients, 60%), patients treated with quinine showed a more diverse picture of side effects with 22 patients (71%) experiencing at least one adverse <span class="hlt">event</span>. 
The most common adverse <span class="hlt">events</span> after quinine were hearing disturbances (12 patients, 37%), hypoglycaemia (10 patients, 32%) and cardiotoxicity (three patients, 14</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24748687','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24748687"><span id="translatedtitle">Single <span class="hlt">event</span> time series <span class="hlt">analysis</span> in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria).</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Mayaud, C; Wagner, T; Benischke, R; Birk, S</p> <p>2014-04-16</p> <p>The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two <span class="hlt">events</span> recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series <span class="hlt">analysis</span> (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single <span class="hlt">events</span>. In order to support the interpretation of the results from the time series <span class="hlt">analysis</span> a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. 
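The autocorrelation and cross-correlation analysis described in the abstract above identifies the time lag at which two discharge series are most similar, which is how a delayed overflow response can be detected. A minimal sketch on synthetic hydrographs (an illustrative Gaussian recharge pulse, not the Lurbach observations):

```python
import numpy as np

def best_lag(x, y, max_lag):
    """Lag k (in samples) maximizing the normalized cross-correlation of
    x and y; positive k means features in y lag those in x by k samples."""
    x = (np.asarray(x, float) - np.mean(x)) / np.std(x)
    y = (np.asarray(y, float) - np.mean(y)) / np.std(y)
    best_k, best_r = 0, -np.inf
    for k in range(-max_lag, max_lag + 1):
        # Overlap the two standardized series at shift k and average.
        r = float(np.mean(x[:len(x) - k] * y[k:]) if k >= 0
                  else np.mean(x[-k:] * y[:k]))
        if r > best_r:
            best_k, best_r = k, r
    return best_k, best_r

# Synthetic setup: the overflow spring responds 5 samples after the
# main spring to the same recharge pulse.
t = np.arange(200)
main_spring = np.exp(-((t - 50) / 10.0) ** 2)
overflow_spring = np.roll(main_spring, 5)
lag, r = best_lag(main_spring, overflow_spring, 20)  # lag == 5
```

Applied event-by-event, as in the study, such lag estimates can reveal when the inter-catchment overflow is active versus absent.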
Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series <span class="hlt">analysis</span>. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series <span class="hlt">analysis</span>. Comparing the results from the time series <span class="hlt">analysis</span> of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3990444','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3990444"><span id="translatedtitle">Single <span class="hlt">event</span> time series <span class="hlt">analysis</span> in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria)</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Mayaud, C.; Wagner, T.; Benischke, R.; Birk, S.</p> <p>2014-01-01</p> <p>Summary The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). 
Detailed data from two <span class="hlt">events</span> recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series <span class="hlt">analysis</span> (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single <span class="hlt">events</span>. In order to support the interpretation of the results from the time series <span class="hlt">analysis</span>, a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series <span class="hlt">analysis</span>. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series <span class="hlt">analysis</span>. A comparison of the results from the time series <span class="hlt">analysis</span> of the observed data with those of the synthetic data showed good agreement. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. 
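The autocorrelation and cross-correlation tools applied to the single-event hydrographs above can be sketched in a few lines. The recharge pulse, exponential transfer kernel, and all parameter values below are illustrative assumptions, not the Lurbach data or MODFLOW output:

```python
import numpy as np

def autocorr(x, max_lag):
    """Sample autocorrelation r(k) of a series for lags 0..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / var for k in range(max_lag + 1)])

def crosscorr(x, y, max_lag):
    """Cross-correlation r_xy(k) comparing x(t) with y(t + k), lags 0..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    denom = np.sqrt(np.dot(x, x) * np.dot(y, y))
    return np.array([np.dot(x[:len(x) - k], y[k:]) / denom for k in range(max_lag + 1)])

# Synthetic single event: a recharge pulse and a delayed, smoothed spring response.
t = np.arange(200)
recharge = np.exp(-0.5 * ((t - 30) / 3.0) ** 2)   # allogenic input pulse (assumed shape)
kernel = np.exp(-np.arange(50) / 8.0)              # assumed aquifer transfer function
discharge = np.convolve(recharge, kernel)[:200]    # spring hydrograph

r_xy = crosscorr(recharge, discharge, 60)
lag_of_peak = int(np.argmax(r_xy))                 # proxy for the system response time
```

The lag and height of the cross-correlogram peak are the quantities the abstract compares between observed and synthetic discharge series.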
The comparable behaviors of the real and</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..16.4999M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..16.4999M"><span id="translatedtitle"><span class="hlt">Analysis</span> of the slow slip <span class="hlt">events</span> of Guerrero, Mexico: implications for numerical modeling.</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Maury, Julie; Aochi, Hideo; Radiguet, Mathilde</p> <p>2014-05-01</p> <p>Guerrero, in Mexico, is one of the subduction zones where long term slow slip <span class="hlt">events</span> (SSEs) have been observed recurrently. Understanding the mechanics of these <span class="hlt">events</span> is important to determine their role in the seismic cycle. SSEs in Guerrero have been found to have the same characteristics, along the interface of subduction, as classical earthquakes but with much longer slip time (around a year) and lower stress drop (0.1 MPa). We investigate the slip models of the Guerrero SSEs of 2006 and 2009 (Radiguet et al., JGR 2012). The kinematic slip models have been determined by inversion of GPS time series using two different methods. From these slip histories, the constitutive relation between stress and slip (or slip rate) on each subfault is determined. Analytical Green functions are used to calculate the shear stress in a homogeneous, elastic, isotropic medium. Whatever the kinematic slip modeling method used, a clear slip weakening law can be retrieved over the whole slipping area. While some spatial variation in the parameters of the slip weakening law is observed, a mean value of about 0.1 m for the slip weakening distance and 2.5 kJ/m2 for the fracture energy can be extracted on each subfault. 
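The slip-weakening parameters quoted above can be put into a minimal sketch of the linear slip-weakening law. The values used (a slip-weakening distance of about 0.1 m and a weakening rate of about 1 MPa/m) are order-of-magnitude illustrations from the abstract, not the inverted per-subfault parameters:

```python
def slip_weakening_stress(slip, tau_p, tau_r, d_c):
    """Linear slip-weakening law: strength falls from tau_p to tau_r over slip d_c."""
    if slip >= d_c:
        return tau_r
    return tau_p - (tau_p - tau_r) * slip / d_c

def fracture_energy(tau_p, tau_r, d_c):
    """Breakdown work G_c = 1/2 * (tau_p - tau_r) * d_c for the linear law."""
    return 0.5 * (tau_p - tau_r) * d_c

# Illustrative magnitudes of the order quoted for the Guerrero SSEs (assumed values):
d_c = 0.1                             # slip-weakening distance, m
w = 1.0e6                             # slip-weakening rate, Pa/m
breakdown_drop = w * d_c              # strength drop over d_c: 0.1 MPa
g_c = fracture_energy(breakdown_drop, 0.0, d_c)   # J/m^2
```

With these magnitudes the breakdown work comes out at a few kJ/m2, the same order as the fracture energy quoted above.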
Moreover, the slip-weakening rate seems quite homogeneous (around 1 MPa/m), and this is roughly the same as the value found in coseismic processes. The yield stress is of the order of 0.01 MPa, a low value compared to a stress drop of 0.1 MPa. The stress-slip rate relationship presents a loop trajectory coherent with the one observed in classical earthquakes. The results of these analyses are used to numerically model the Guerrero SSEs. The aim is to reproduce the slip pattern using the mechanical laws determined in the study of the slip model. If a simple slip weakening law, with the parameters found above, is used, we observe a rapid propagation of the crack-like slip area. This is different from the observed migration of localized slip. So a slowing mechanism</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1710392V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1710392V"><span id="translatedtitle"><span class="hlt">Analysis</span> of a creeping marls <span class="hlt">event</span> in the coastal cliffs of Bessin, Basse-Normandie, France</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Vioget, Alizée; Michoud, Clément; Jaboyedoff, Michel; Maquaire, Olivier; Costa, Stéphane; Davidson, Robert; Derron, Marc-Henri</p> <p>2015-04-01</p> <p>The cliffs' retreat is a major issue for the management of coastal territories. Two coastal areas in "Calvados" and "Pays de Caux", French Normandy, are studied. The Bessin cliff is about 4.3 km long and lies between the World War II artillery batteries of Longues-sur-Mer and Arromanches-les-Bains. On the coastline, the cliff's height varies between 10 and 75 meters above sea level. The site's lithology is mainly composed of two formations: the Bessin limestones lie on top of the Port marls, which act as an aquitard. 
Water outflows of varying importance are therefore observed at the contact between the marls and the limestone. This communication focuses on a complex landslide that occurred in May 2013 near Cape Manvieux, estimating its volume and modelling the landslide kinematics. For that purpose, field observations and measurements were made in order to build a realistic profile and to understand the steps that led to this 27 m high and 110 m wide <span class="hlt">event</span>. In addition, a terrestrial LiDAR (Optech Ilris3D) acquisition of the instability was performed in July 2013 and compared with the Litto3D dataset (the continuous DEM over land and sea) acquired in 2011 by the IGN. This comparison shows a maximum cliff retreat of about 27 m and 30'000 m3, and a deposit accumulation about 8 m high. In addition, a limestone rock column of 2'000 m3 and 18 m height within the toppled deposits could still collapse in the short term. Up to now, these site-specific investigations, set in the context of instabilities within the entire study area, suggest that the current state of the instability was created by multiple successive <span class="hlt">events</span>. 
The landslide hence appears to be caused by a complex combination of marl creep, controlled by water content and by the pressure of the overlying formations, and of limestone toppling destabilised by the formation of subvertical back cracks due to debuttressing following limestone exhumation.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AtmEn..78..219A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AtmEn..78..219A"><span id="translatedtitle"><span class="hlt">Analysis</span> of source regions for smoke <span class="hlt">events</span> in Singapore for the 2009 El Nino burning season</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Atwood, Samuel A.; Reid, Jeffrey S.; Kreidenweis, Sonia M.; Yu, Liya E.; Salinas, Santo V.; Chew, Boon Ning; Balasubramanian, Rajasekhar</p> <p>2013-10-01</p> <p>As part of the 7 SouthEast Asian Studies (7SEAS) program, a solar radiation and chemistry sampling site (“supersite”) was developed at the National University of Singapore (NUS) to monitor regional air quality. The first intensive operations period for this site occurred between August and October 2009, a period that coincided with a moderate El Nino <span class="hlt">event</span> and enhanced tropical burning, particularly in peatlands. We use data from this period to analyze the transport of biomass burning emissions in the Maritime Continent (MC) to the NUS supersite. An overview of the aerosol environment is provided for Singapore, followed by more detailed discussion of four aerosol <span class="hlt">events</span>. The 2009 burning season was similar to those described in previous analyses, which showed that fire activity begins in the western half of the MC in Sumatra and propagates eastward in time. Similarly, agricultural burning occurs first, generally followed by deforestation and peatland fires. 
Some of the biomass burning emissions make their way into the free troposphere, where they are transported regionally by the prevailing wind patterns. Our analyses show that the seasonal winds at 850 hPa (˜1500 m) shift transport patterns from source regions to the southwest of Singapore, to regions to the southeast over the course of the summer monsoon, patterns that allow Singapore to be impacted by peak burning regions in the MC. In contrast, winds at the surface are more typically from the south and southeast, demonstrating the prevalence of vertical wind shear over the region. As a result of the variable source regions influencing different levels of the atmosphere over Singapore, in-situ surface observations of aerosol mass concentrations are not always consistent with inferences of the presence of enhanced aerosol concentration from column optical depth. Our findings confirm the complexity of aerosol sources and transport over the MC, and the key role that biomass burning emissions play in</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4767362','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4767362"><span id="translatedtitle">In situ <span class="hlt">analysis</span> of intrahepatic virological <span class="hlt">events</span> in chronic hepatitis B virus infection</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Zhang, Xiaonan; Lu, Wei; Zheng, Ye; Wang, Weixia; Bai, Lu; Chen, Liang; Feng, Yanling; Zhang, Zhanqing</p> <p>2016-01-01</p> <p>Persistent hepatitis B virus (HBV) infection is established by the formation of an intranuclear pool of covalently closed circular DNA (cccDNA) in the liver. Very little is known about the intrahepatic distribution of HBV cccDNA in infected patients, particularly at the single-cell level. 
Here, we established a highly sensitive and specific ISH assay for the detection of HBV RNA, DNA, and cccDNA. The specificity of our cccDNA probe set was confirmed by its strict intranuclear signal and by a series of Southern blot analyses. Use of our in situ assay in conjunction with IHC or immunofluorescence uncovered a surprisingly mosaic distribution of viral antigens and nucleic acids. Most strikingly, a mutually exclusive pattern was found between HBV surface antigen–positive (HBsAg-positive) and HBV DNA– and cccDNA-positive cells. A longitudinal observation of patients over a 1-year period of adefovir therapy confirmed the persistence of a nuclear reservoir of viral DNA, although cytoplasmic DNA was effectively depleted in these individuals. In conclusion, our method for detecting viral nucleic acids, including cccDNA, with single-cell resolution provides a means for monitoring intrahepatic virological <span class="hlt">events</span> in chronic HBV infection. More important, our observations unravel the complexity of the HBV life cycle in vivo. PMID:26901811</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/26062134','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/26062134"><span id="translatedtitle">Political Imprisonment and Adult Functioning: A Life <span class="hlt">Event</span> History <span class="hlt">Analysis</span> of Palestinians.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>McNeely, Clea; Barber, Brian K; Spellings, Carolyn; Belli, Robert; Giacaman, Rita; Arafat, Cairo; Daher, Mahmoud; El Sarraj, Eyad; Mallouh, Mohammed Abu</p> <p>2015-06-01</p> <p>Political imprisonment is a traumatic <span class="hlt">event</span>, often accompanied by torture and deprivation. 
This study explores the association of political imprisonment between 1987 and 2011 with political, economic, community, psychological, physical, and family functioning in a population-based sample of Palestinian men ages 32-43 years (N = 884) derived from a dataset collected in 2011. Twenty-six percent (n = 233) had been politically imprisoned. Men imprisoned between 1987 and 2005 reported functioning as well as never-imprisoned men in most domains, suggesting that men imprisoned as youth have moved forward with their lives in ways similar to their nonimprisoned counterparts. In an exception to this pattern, men imprisoned during the Oslo Accords period (1994-1999) reported higher levels of trauma-related stress (B = 0.24, p = .027) compared to never-imprisoned men. Men imprisoned since 2006 reported lower functioning in multiple domains: human insecurity (B = 0.33, p = .023), freedom of public expression (B = -0.48, p = .017), perceived government stability (B = -0.38, p = .009), feeling broken or destroyed (B = 0.59, p = .001), physical limitations (B = 0.55, p = .002), and community belonging (B = -0.33, p = .048). Findings pointed to the value of examining the effects of imprisonment on functioning in multiple domains. 
PMID:26062134</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23761353','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23761353"><span id="translatedtitle">Metagenomic <span class="hlt">analysis</span> of the coral holobiont during a natural bleaching <span class="hlt">event</span> on the Great Barrier Reef.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Littman, Raechel; Willis, Bette L; Bourne, David G</p> <p>2011-12-01</p> <p>Understanding the effects of elevated seawater temperatures on each member of the coral holobiont (the complex comprised of coral polyps and associated symbiotic microorganisms, including Bacteria, viruses, Fungi, Archaea and endolithic algae) is becoming increasingly important as evidence accumulates that microbial members contribute to overall coral health, particularly during thermal stress. Here we use a metagenomic approach to identify metabolic and taxonomic shifts in microbial communities associated with the hard coral Acropora millepora throughout a natural thermal bleaching <span class="hlt">event</span> at Magnetic Island (Great Barrier Reef). A direct comparison of metagenomic data sets from healthy versus bleached corals indicated major shifts in microbial associates during heat stress, including Bacteria, Archaea, viruses, Fungi and micro-algae. Overall, metabolism of the microbial community shifted from autotrophy to heterotrophy, including increases in genes associated with the metabolism of fatty acids, proteins, simple carbohydrates, phosphorus and sulfur. In addition, the proportion of virulence genes was higher in the bleached library, indicating an increase in microorganisms capable of pathogenesis following bleaching. 
These results demonstrate that thermal stress results in shifts in coral-associated microbial communities that may lead to deteriorating coral health.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/25646012','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/25646012"><span id="translatedtitle">Specific deterrence, community context, and drunk driving: an <span class="hlt">event</span> history <span class="hlt">analysis</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Lee, Chang-Bae; Teske, Raymond H C</p> <p>2015-03-01</p> <p>Previous studies about recidivism of offenders have focused primarily on the nature of the sanctions and factors specific to the individual offender. This study addressed both individual and community factors, using a cohort of felony-level, driving while intoxicated (DWI) probationers (N = 370) charged in Harris County, Texas. The study investigated specific deterrent effects of sanctions on success or failure of probationers while controlling for the community contexts to observe how informal social control processes contextualize individual-level predictors. Results of a series of <span class="hlt">event</span> history analyses tracking probationers for a period of 8 years indicated that severity of punishment, swiftness of punishment, criminal history, and completion of DWI education programs significantly affected the probationer's survival time, whereas no significant influence of community contexts on survival time or success was observed. Reducing the felony charge to a misdemeanor, a shorter period of probation, and past criminal history, combined with an almost immediate guilty plea, were significantly associated with short-term failure on probation. 
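Event history (survival) analyses like the probationer study above model time-to-failure with censored follow-up. A minimal nonparametric Kaplan-Meier estimator illustrates that framing; the follow-up data below are hypothetical, and the published analyses presumably used richer regression-based event history models than this sketch:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate from follow-up times (years) and
    event flags (1 = failure observed, 0 = censored at that time)."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv, curve = 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = sum(e for tt, e in pairs if tt == t)
        n_at_t = n_at_risk
        # drop every subject whose follow-up ends at time t from the risk set
        while i < len(pairs) and pairs[i][0] == t:
            n_at_risk -= 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_t
            curve.append((t, surv))
    return curve

# Hypothetical follow-up over an 8-year tracking window (years, failure flag):
times  = [0.5, 1.0, 1.0, 2.0, 3.5, 4.0, 6.0, 8.0, 8.0, 8.0]
events = [1,   1,   0,   1,   1,   0,   1,   0,   0,   0]
curve = kaplan_meier(times, events)   # stepwise survival probabilities
```

Covariates such as charge reduction or sentence length would then enter through a regression model (e.g., Cox proportional hazards) rather than this plain estimator.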
PMID:25646012</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016NIMPA.834..118W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016NIMPA.834..118W"><span id="translatedtitle">Image <span class="hlt">analysis</span> of single <span class="hlt">event</span> transient effects on charge coupled devices irradiated by protons</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wang, Zujun; Xue, Yuanyuan; Liu, Jing; He, Baoping; Yao, Zhibin; Ma, Wuying</p> <p>2016-10-01</p> <p>Experiments on single <span class="hlt">event</span> transient (SET) effects in charge coupled devices (CCDs) irradiated by protons are presented. The radiation experiments were carried out with accelerator protons at energies of 200 MeV and 60 MeV. The protons were incident at 30° and 90° to the plane of the CCDs to obtain images for both inclined and perpendicular incidence. The experimental results show that the typical signature of SET effects in a CCD irradiated by protons is the generation of a large number of dark signal spikes (hot pixels) randomly distributed in the "pepper" images. The characteristics of the SET effects were investigated by observing the same imaging area at different times during proton irradiation to verify the transient nature of the effects. The results also show that the number of dark signal spikes increases with increasing integration time during proton irradiation. The CCDs were tested both on-line and off-line to distinguish radiation damage induced by SET effects from that induced by displacement damage (DD) effects. 
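Counting dark signal spikes (hot pixels) in such test frames is essentially a thresholding exercise. Below is a sketch using a robust median/MAD threshold; the frame statistics, spike amplitude, and threshold factor are assumptions for illustration, not the paper's calibration:

```python
import numpy as np

def dark_spike_mask(frame, k=5.0):
    """Flag dark-signal spikes: pixels exceeding the frame median by more
    than k robust standard deviations (MAD-based, outlier-resistant)."""
    med = np.median(frame)
    mad = np.median(np.abs(frame - med))
    sigma = 1.4826 * mad if mad > 0 else 1.0   # MAD -> Gaussian-equivalent sigma
    return frame > med + k * sigma

# Synthetic dark frame: Gaussian read noise plus a few proton-induced spikes.
rng = np.random.default_rng(0)
frame = rng.normal(100.0, 2.0, size=(256, 256))
for r in (10, 50, 200):
    frame[r, r] += 500.0          # injected hot pixels (assumed amplitude)

mask = dark_spike_mask(frame)
n_spikes = int(mask.sum())        # spike count per frame
```

Tracking `n_spikes` across frames taken at different integration times is the kind of trend the abstract reports.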
The mechanisms of dark signal spike generation induced by the SET effects and the DD effects, respectively, are discussed.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/1992JApMe..31..849M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/1992JApMe..31..849M"><span id="translatedtitle">Synoptic <span class="hlt">Analysis</span> of the GUFMEX Return-Flow <span class="hlt">Event</span> of 10-12 March 1988.</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Merrill, Robert T.</p> <p>1992-08-01</p> <p>Return flow is the moist southerly wind that develops over the Gulf of Mexico after an outbreak of polar air. Surface, aircraft, and special rawinsonde data collected during the Gulf of Mexico Experiment (GUFMEX) are used to describe the return-flow <span class="hlt">event</span> of 10-12 March 1988. The return flow at the surface contained both modified polar air and prefrontal air. The surface moist layer was capped by a stable subsident layer except over its western extremity, where an elevated mixed layer was observed. At later stages, a moist layer aloft was present as well. These complex airmass structures arose because both advective and diabatic processes are significant in a return flow. The roles of each are inferred qualitatively by comparing the observed mixing-ratio distribution to the equilibrium conditions expected for the observed sea surface temperatures. The surface moisture distribution can be explained by rapid modification of offshore flow to near equilibrium, followed by onshore (return) flow of the modified air with little additional change. The structure above the surface moist layer is explained by differential advection that juxtaposed three different airstreams. 
Though no significant severe weather followed this particular case, the processes that typically lead to a favorable severe-weather environment are evident.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22667562','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22667562"><span id="translatedtitle">Hydrogen scrambling in ethane induced by intense laser fields: statistical <span class="hlt">analysis</span> of coincidence <span class="hlt">events</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kanya, Reika; Kudou, Tatsuya; Schirmel, Nora; Miura, Shun; Weitzel, Karl-Michael; Hoshina, Kennosuke; Yamanouchi, Kaoru</p> <p>2012-05-28</p> <p>Two-body Coulomb explosion processes of ethane (CH(3)CH(3)) and its isotopomers (CD(3)CD(3) and CH(3)CD(3)) induced by an intense laser field (800 nm, 1.0 × 10(14) W/cm(2)) with three different pulse durations (40 fs, 80 fs, and 120 fs) are investigated by a coincidence momentum imaging method. On the basis of statistical treatment of the coincidence data, the contributions from false coincidence <span class="hlt">events</span> are estimated and the relative yields of the decomposition pathways are determined with sufficiently small uncertainties. The branching ratios of the two body decomposition pathways of CH(3)CD(3) from which triatomic hydrogen molecular ions (H(3)(+), H(2)D(+), HD(2)(+), D(3)(+)) are ejected show that protons and deuterons within CH(3)CD(3) are scrambled almost statistically prior to the ejection of a triatomic hydrogen molecular ion. The branching ratios were estimated by statistical Rice-Ramsperger-Kassel-Marcus calculations by assuming a transition state with a hindered-rotation of a diatomic hydrogen moiety. 
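The "almost statistical" scrambling conclusion above can be made concrete: if the three protons and three deuterons of CH(3)CD(3) are fully randomized before a triatomic hydrogen ion is ejected, the branching ratios follow a hypergeometric draw of 3 atoms out of 6, giving H3+ : H2D+ : HD2+ : D3+ = 1 : 9 : 9 : 1. A sketch of that combinatorial limit (the generic function signature is mine, not the paper's):

```python
from math import comb

def statistical_branching(n_h=3, n_d=3, size=3):
    """Hypergeometric branching ratios for a hydrogen molecular ion whose
    `size` atoms are picked at random from n_h protons and n_d deuterons
    (the complete-scrambling limit)."""
    total = comb(n_h + n_d, size)
    return {f"H{size - k}D{k}": comb(n_h, size - k) * comb(n_d, k) / total
            for k in range(size + 1)}

ratios = statistical_branching()   # CH3CD3: 3 H and 3 D, eject 3 atoms
# complete scrambling predicts H3+ : H2D+ : HD2+ : D3+ = 1 : 9 : 9 : 1
```

Deviations of the measured yields from these 1:9:9:1 proportions quantify how far the scrambling is from fully statistical.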
The hydrogen scrambling dynamics followed by the two body decomposition processes are also discussed using the anisotropies in the ejection directions of the fragment ions and the kinetic energy distribution of the two body decomposition pathways.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/26901811','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/26901811"><span id="translatedtitle">In situ <span class="hlt">analysis</span> of intrahepatic virological <span class="hlt">events</span> in chronic hepatitis B virus infection.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zhang, Xiaonan; Lu, Wei; Zheng, Ye; Wang, Weixia; Bai, Lu; Chen, Liang; Feng, Yanling; Zhang, Zhanqing; Yuan, Zhenghong</p> <p>2016-03-01</p> <p>Persistent hepatitis B virus (HBV) infection is established by the formation of an intranuclear pool of covalently closed circular DNA (cccDNA) in the liver. Very little is known about the intrahepatic distribution of HBV cccDNA in infected patients, particularly at the single-cell level. Here, we established a highly sensitive and specific ISH assay for the detection of HBV RNA, DNA, and cccDNA. The specificity of our cccDNA probe set was confirmed by its strict intranuclear signal and by a series of Southern blot analyses. Use of our in situ assay in conjunction with IHC or immunofluorescence uncovered a surprisingly mosaic distribution of viral antigens and nucleic acids. Most strikingly, a mutually exclusive pattern was found between HBV surface antigen-positive (HBsAg-positive) and HBV DNA- and cccDNA-positive cells. A longitudinal observation of patients over a 1-year period of adefovir therapy confirmed the persistence of a nuclear reservoir of viral DNA, although cytoplasmic DNA was effectively depleted in these individuals. 
In conclusion, our method for detecting viral nucleic acids, including cccDNA, with single-cell resolution provides a means for monitoring intrahepatic virological <span class="hlt">events</span> in chronic HBV infection. More important, our observations unravel the complexity of the HBV life cycle in vivo. PMID:26901811</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015JPhCS.608a2036A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015JPhCS.608a2036A"><span id="translatedtitle">STAR Online Framework: from Metadata Collection to <span class="hlt">Event</span> <span class="hlt">Analysis</span> and System Control</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Arkhipkin, D.; Lauret, J.</p> <p>2015-05-01</p> <p>In preparation for the new era of RHIC running (RHIC-II upgrades and possibly, the eRHIC era), the STAR experiment is expanding its modular Message Interface and Reliable Architecture framework (MIRA). MIRA allowed STAR to integrate meta-data collection, monitoring, and online QA components in a very agile and efficient manner using a messaging infrastructure approach. In this paper, we briefly summarize our past achievements, provide an overview of the recent development activities focused on messaging patterns and describe our experience with the complex <span class="hlt">event</span> processor (CEP) recently integrated into the MIRA framework. CEP was used in the recent RHIC Run 14, which provided practical use cases. Finally, we present our requirements and expectations for the planned expansion of our systems, which will allow our framework to acquire features typically associated with Detector Control Systems. 
Special attention is given to aspects related to latency, scalability, and interoperability within the heterogeneous set of services and the various data and meta-data acquisition components coexisting in the STAR online domain.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/23761353','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/23761353"><span id="translatedtitle">Metagenomic <span class="hlt">analysis</span> of the coral holobiont during a natural bleaching <span class="hlt">event</span> on the Great Barrier Reef.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Littman, Raechel; Willis, Bette L; Bourne, David G</p> <p>2011-12-01</p> <p>Understanding the effects of elevated seawater temperatures on each member of the coral holobiont (the complex comprised of coral polyps and associated symbiotic microorganisms, including Bacteria, viruses, Fungi, Archaea and endolithic algae) is becoming increasingly important as evidence accumulates that microbial members contribute to overall coral health, particularly during thermal stress. Here we use a metagenomic approach to identify metabolic and taxonomic shifts in microbial communities associated with the hard coral Acropora millepora throughout a natural thermal bleaching <span class="hlt">event</span> at Magnetic Island (Great Barrier Reef). A direct comparison of metagenomic data sets from healthy versus bleached corals indicated major shifts in microbial associates during heat stress, including Bacteria, Archaea, viruses, Fungi and micro-algae. Overall, metabolism of the microbial community shifted from autotrophy to heterotrophy, including increases in genes associated with the metabolism of fatty acids, proteins, simple carbohydrates, phosphorus and sulfur. 
In addition, the proportion of virulence genes was higher in the bleached library, indicating an increase in microorganisms capable of pathogenesis following bleaching. These results demonstrate that thermal stress results in shifts in coral-associated microbial communities that may lead to deteriorating coral health. PMID:23761353</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/12214359','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/12214359"><span id="translatedtitle">Crash reconstruction and injury-mechanism <span class="hlt">analysis</span> using <span class="hlt">event</span> data recorder technology.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Donnelly, B R; Galganski, R A; Lawrence, R D; Blatt, A</p> <p>2001-01-01</p> <p>Sophisticated onboard crash-<span class="hlt">event</span> data recorders (EDRs) that log key vehicle dynamics information can be used to improve crash reconstruction, model occupant response, study the mechanisms of injury, and estimate occupant injury probabilities in near-real time. Such an EDR was developed and utilized as part of the Automatic Collision Notification (ACN) system for the National Highway Traffic Safety Administration. This paper presents the results of a study in which the reconstruction of an actual crash was augmented using EDR/ACN-supplied three-dimensional acceleration and other data in a vehicle occupant model configured using the Articulated Total Body (ATB) computer code. ATB-generated occupant-motion imagery and body-region acceleration response information provided valuable insights that permitted crash-reconstruction specialists to ascertain the true nature of the collision and identify the probable cause of an injury suffered by one of the victims. 
The authors also posit that the use of EDR data from an ACN-type system as inputs to occupant crash-response modeling may someday support crash-victim emergency medical treatment and triage.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://hdl.handle.net/2060/20160006409','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20160006409"><span id="translatedtitle">A Simple Engineering <span class="hlt">Analysis</span> of Solar Particle <span class="hlt">Event</span> High Energy Tails and Their Impact on Vehicle Design</span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Singleterry, Robert C., Jr.; Walker, Steven A.; Clowdsley, Martha S.</p> <p>2016-01-01</p> <p>The mathematical models for Solar Particle <span class="hlt">Event</span> (SPE) high energy tails are constructed with several different algorithms. Since limited measured data exist above energies around 400 MeV, this paper arbitrarily defines the high energy tail as any proton with an energy above 400 MeV. In order to better understand the importance of accurately modeling the high energy tail for SPE spectra, the contribution to astronaut whole body effective dose equivalent of the high energy portions of three different SPE models has been evaluated. To ensure completeness of this <span class="hlt">analysis</span>, simple and complex geometries were used. This <span class="hlt">analysis</span> showed that the high energy tail of certain SPEs can be relevant to astronaut exposure and hence safety. 
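Whatever functional form a tail model takes, the engineering quantity of interest is the integral proton fluence above the 400 MeV cutoff defined above. A sketch for an assumed power-law differential spectrum; the spectral form and all parameter values are illustrative, not one of the paper's three SPE models:

```python
import math

def tail_fluence_power_law(j0, e0, gamma, e_min=400.0, e_max=10000.0):
    """Integral fluence between e_min and e_max (MeV) for an assumed power-law
    differential spectrum j(E) = j0 * (E / e0)**(-gamma) in protons/(cm^2 MeV).
    Uses the closed-form antiderivative of the power law."""
    if gamma == 1.0:
        return j0 * e0 * math.log(e_max / e_min)
    return j0 * e0 / (gamma - 1.0) * ((e_min / e0) ** (1.0 - gamma)
                                      - (e_max / e0) ** (1.0 - gamma))

# Illustrative (assumed) parameters, not fitted SPE spectra:
hard = tail_fluence_power_law(j0=1e6, e0=100.0, gamma=2.5)
soft = tail_fluence_power_law(j0=1e6, e0=100.0, gamma=4.0)
# A harder spectrum (smaller gamma) carries far more fluence above 400 MeV,
# which is why the tail treatment can matter for shielded dose estimates.
```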
Therefore, models of high energy tails for SPEs should be well analyzed and based on data if possible.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5046117','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5046117"><span id="translatedtitle">Statistical and Ontological <span class="hlt">Analysis</span> of Adverse <span class="hlt">Events</span> Associated with Monovalent and Combination Vaccines against Hepatitis A and B Diseases</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Xie, Jiangan; Zhao, Lili; Zhou, Shangbo; He, Yongqun</p> <p>2016-01-01</p> <p>Vaccinations often induce various adverse <span class="hlt">events</span> (AEs), and sometimes serious AEs (SAEs). While many vaccines are used in combination, the effects of vaccine-vaccine interactions (VVIs) on vaccine AEs are rarely studied. In this study, AE profiles induced by hepatitis A vaccine (Havrix), hepatitis B vaccine (Engerix-B), and hepatitis A and B combination vaccine (Twinrix) were studied using the VAERS data. From May 2001 to January 2015, VAERS recorded 941, 3,885, and 1,624 AE case reports where patients at least 18 years old were vaccinated with only Havrix, Engerix-B, and Twinrix, respectively. Using these data, our statistical <span class="hlt">analysis</span> identified 46, 69, and 82 AEs significantly associated with Havrix, Engerix-B, and Twinrix, respectively. Based on the Ontology of Adverse <span class="hlt">Events</span> (OAE) hierarchical classification, these AEs were enriched in AEs related to behavioral and neurological conditions, immune system, and investigation results. Twenty-nine AEs were classified as SAEs and mainly related to immune conditions.
Using a logistic regression model accompanied with MCMC sampling, 13 AEs (e.g., hepatosplenomegaly) were identified to result from VVI synergistic effects. Classifications of these 13 AEs using OAE and MedDRA hierarchies confirmed the advantages of the OAE-based method over MedDRA in AE term hierarchical <span class="hlt">analysis</span>. PMID:27694888</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_23 --> <div id="page_24" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="461"> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010ems..confE.689M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010ems..confE.689M"><span id="translatedtitle">Synoptic-mesoscale <span class="hlt">analysis</span> and numerical modeling of a tornado <span class="hlt">event</span> on 12 February 2010 in northern Greece</span></a></p> <p><a 
target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Matsangouras, J. T.; Nastos, P. T.; Pytharoulis, I.</p> <p>2010-09-01</p> <p>Tornadoes are violent convective weather phenomena whose frequency has increased, particularly in the cool season, a trend attributed to the higher moisture content of the atmosphere due to global warming. Tornado source regions are most often shallow waters that warm easily, such as the Gulf of Mexico or the Mediterranean Sea. This study analyzes the tornado <span class="hlt">event</span> that occurred on 12 February 2010 in Vrastera, Chalkidiki, a non-urban area 45 km southeast of Thessaloniki in northern Greece. The tornado developed approximately between 17:15 and 18:45 UTC and was characterized as F2 (Fujita Scale). The tornado caused considerable damage to an industrial building and an olive-tree farm. A synoptic <span class="hlt">analysis</span> based on the ECMWF charts is presented along with an extended dataset of satellite images, radar products and a vertical profile of the atmosphere. Additionally, the nonhydrostatic WRF-ARW atmospheric numerical model (version 3.2) was utilized in <span class="hlt">analysis</span> and forecast mode using very high horizontal resolution (1 km x 1 km) in order to represent the ambient atmospheric conditions and examine the prediction of the <span class="hlt">event</span>. 
Sensitivity experiments look into the model performance in the choice of microphysical and boundary layer parameterization schemes.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=networks&pg=3&id=EJ1006498','ERIC'); return false;" href="http://eric.ed.gov/?q=networks&pg=3&id=EJ1006498"><span id="translatedtitle">Social Network Changes and Life <span class="hlt">Events</span> across the Life Span: A Meta-<span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Wrzus, Cornelia; Hanel, Martha; Wagner, Jenny; Neyer, Franz J.</p> <p>2013-01-01</p> <p>For researchers and practitioners interested in social relationships, the question remains as to how large social networks typically are, and how their size and composition change across adulthood. On the basis of predictions of socioemotional selectivity theory and social convoy theory, we conducted a meta-<span class="hlt">analysis</span> on age-related social network…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=leiter&pg=7&id=EJ551434','ERIC'); return false;" href="http://eric.ed.gov/?q=leiter&pg=7&id=EJ551434"><span id="translatedtitle">Child Maltreatment and School Performance Declines: An <span class="hlt">Event</span>-History <span class="hlt">Analysis</span>.</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Leiter, Jeffrey; Johnsen, Matthew C.</p> <p>1997-01-01</p> <p>Presents a longitudinal <span class="hlt">analysis</span> of school performance declines among neglected and abused children, using the maltreatment and school histories of 1,369 children in Mecklenburg County, North Carolina. 
Significant relationships between maltreatment and declines in performance were found in diverse school outcomes. (SLD)</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24037620','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24037620"><span id="translatedtitle">Cure models for the <span class="hlt">analysis</span> of time-to-<span class="hlt">event</span> data in cancer studies.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Jia, Xiaoyu; Sima, Camelia S; Brennan, Murray F; Panageas, Katherine S</p> <p>2013-11-01</p> <p>In settings when it is biologically plausible that some patients are cured after definitive treatment, cure models present an alternative to conventional survival <span class="hlt">analysis</span>. Cure models can inform on the group of patients cured, by estimating the probability of cure, and identifying factors that influence it; while simultaneously focusing on time to recurrence and associated factors for the remaining patients.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://eric.ed.gov/?q=cheating&pg=6&id=EJ826615','ERIC'); return false;" href="http://eric.ed.gov/?q=cheating&pg=6&id=EJ826615"><span id="translatedtitle">A Statistical <span class="hlt">Analysis</span> of Infrequent <span class="hlt">Events</span> on Multiple-Choice Tests that Indicate Probable Cheating</span></a></p> <p><a target="_blank" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Sundermann, Michael J.</p> <p>2008-01-01</p> <p>A statistical <span class="hlt">analysis</span> of multiple-choice answers is performed to identify anomalies that can be used as evidence of student cheating. 
The ratio of exact errors in common (EEIC: two students put the same wrong answer for a question) to differences (D: two students get different answers) was found to be a good indicator of cheating under a wide…</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4590384','PMC'); return false;" href="http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4590384"><span id="translatedtitle">Association of smoking with restenosis and major adverse cardiac <span class="hlt">events</span> after coronary stenting: A meta-<span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Hu, Rui-ting; Liu, Jie; Zhou, You; Hu, Bang-li</p> <p>2015-01-01</p> <p>Background and Objective: The association between smoking and clinical outcomes after coronary stenting is controversial. The aim of this meta-<span class="hlt">analysis</span> was to assess the association between smoking and in stent restenosis (ISR), major adverse cardiac <span class="hlt">events</span> (MACE), or major adverse cardiac and cerebrovascular <span class="hlt">events</span> (MACCE) after coronary stenting. Methods: A search for studies published before December 2014 was conducted in PubMed, Embase, and Cochrane library. An inverse random weighted meta-<span class="hlt">analysis</span> was conducted using logarithm of the odds ratio (OR) and its standard error for each study. Results: Ten studies investigated the association between smoking and ISR. Overall, smoking was not associated with ISR (OR: 1.05, 95% CI: 0.79–1.41; I2 = 47.8%). Subgroup <span class="hlt">analysis</span> also failed to show a significant association between smoking and ISR risk regardless of bare metal stent (BMS) and drug-eluting stent (DES) implantation. 
Eight studies explored the association between smoking and MACE, but no association was found (OR: 0.92, 95% CI: 0.77–1.10; I2 = 25.5%), and subgroup <span class="hlt">analysis</span> revealed no distinct difference between BMS and DES implantation. Three studies investigated the association between smoking and MACCE, and a significant association was found (OR: 2.09, 95% CI: 1.43–3.06; I2 = 21.6%). Conclusions: Our results suggest that in patients undergoing percutaneous coronary intervention with stent implantation, smoking is not associated with ISR and MACE; however, smoking is an independent risk factor for MACCE. PMID:26430448</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.osti.gov/scitech/servlets/purl/910944','SCIGOV-STC'); return false;" href="http://www.osti.gov/scitech/servlets/purl/910944"><span id="translatedtitle">Preliminary Drop Testing Results to Validate an <span class="hlt">Analysis</span> Methodology for Accidental Drop <span class="hlt">Events</span> of Containers for Radioactive Materials</span></a></p> <p><a target="_blank" href="http://www.osti.gov/scitech">SciTech Connect</a></p> <p>Snow, Spencer David; Morton, Dana Keith; Rahl, Tommy Ervin; Ware, Arthur Gates</p> <p>2001-07-01</p> <p>The National Spent Nuclear Fuel Program, operating from the Idaho National Engineering and Environmental Laboratory (INEEL), developed the standardized Department of Energy (DOE) spent nuclear fuel (SNF) canister. During the development of this canister, more than twenty drop tests were completed, evaluating high strain behavior, puncture resistance, maintenance of containment, and other canister responses. Computer analyses of these drop-test specimens/canisters employed the ABAQUS/Explicit software. A pre-drop <span class="hlt">analysis</span> was performed for each test specimen to predict the deformed shape and resulting material straining. 
Typically, a post-drop <span class="hlt">analysis</span> was also performed to better match actual test specifics (actual impact angle, test specimen material properties, etc.). The purpose for this <span class="hlt">analysis</span> effort was to determine the capability of current <span class="hlt">analysis</span> techniques to accurately predict the deformed shape of a standardized DOE SNF canister subjected to a defined drop <span class="hlt">event</span>, without actually having to perform a drop test for every drop <span class="hlt">event</span> of interest. Those analytical efforts yielded very accurate predictions for nearly all of the drop tests. However, it was noted, during one small-scale test, that the calculated deformed shape of the test specimen depended on the modeled frictional behavior as it impacted the essentially unyielding flat surface. In order to calculate the correct deformed shape, the modeled frictional behavior had to be changed to an unanticipated value. This paper will report the results of a preliminary investigation that determined the appropriate frictional modeling for a variety of impact angles. 
That investigation included drop testing performed at the INEEL from September 2000 to January 2001.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5063067','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5063067"><span id="translatedtitle">Enhancing the Effectiveness of Significant <span class="hlt">Event</span> <span class="hlt">Analysis</span>: Exploring Personal Impact and Applying Systems Thinking in Primary Care</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>McNaughton, Elaine; Bruce, David; Holly, Deirdre; Forrest, Eleanor; Macleod, Marion; Kennedy, Susan; Power, Ailsa; Toppin, Denis; Black, Irene; Pooley, Janet; Taylor, Audrey; Swanson, Vivien; Kelly, Moya; Ferguson, Julie; Stirling, Suzanne; Wakeling, Judy; Inglis, Angela; McKay, John; Sargeant, Joan</p> <p>2016-01-01</p> <p>Introduction: Significant <span class="hlt">event</span> <span class="hlt">analysis</span> (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking in establishing why <span class="hlt">events</span> happen and formulating improvements. To enhance SEA effectiveness, we developed and tested “guiding tools” based on human factors principles. Methods: Mixed-methods development of guiding tools (Personal Booklet—to help with emotional demands and apply a human factors <span class="hlt">analysis</span> at the individual level; Desk Pad—to guide a team-based systems <span class="hlt">analysis</span>; and a written Report Format) by a multiprofessional “expert” group and testing with Scottish primary care practitioners who submitted completed enhanced SEA reports. 
Evaluation data were collected through questionnaire, telephone interviews, and thematic <span class="hlt">analysis</span> of SEA reports. Results: Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention (P < .001), while most agreed that the Personal Booklet was practical (88/123, 71.5%) and relevant to dealing with related emotions (93/123, 75.6%). The Desk Pad tool helped focus the SEA on systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%). Discussion: Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with related emotional demands and with more effective learning and improvement. PMID:27583996</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://hdl.handle.net/2060/20110012451','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20110012451"><span id="translatedtitle"><span class="hlt">Analysis</span> of STS-134 Hail <span class="hlt">Event</span> at Pad 39A, March 30, 2011</span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Lane, John E.</p> <p>2011-01-01</p> <p>During the late afternoon of March 30, 2011 at approximately 21:25 - 21:30 GMT, hail monitor stations at Pad 39A recorded rice to pea size hail. The duration of the <span class="hlt">event</span> was approximately 5 minutes. The maximum size detected by the three hail monitors was 10 - 12 mm. The 12 mm marble size value was measured by the active impact sensor at site #2, which experienced high winds. 
This 12 mm measurement may be artificially higher by one or two mm due to the extra hail kinetic energy resulting from the extreme horizontal winds. High winds from the west produced a few notable long streak-like dents in the hail pads. High winds were also responsible for damage to facilities near hail monitor site #2 on the west side of pad A (a dumpster was overturned, and a picnic table roof was demolished). NWS radar volume scan (see Figure I) showed 60-65 dBZ reflectivity values in the lowest 4 scan elevations around and over the pad 39A area. Since the lowest 0.5 degree scan showed a definite 65 dBZ signature, it is unlikely that hail had an opportunity to melt before reaching the ground. Some of the larger passive hail pad dents were shallower than what would be expected from solid frozen ice hydrometeor dents. Therefore, it is possible that the larger pea size hail may have been softer than the smaller rice size hail. This would be consistent with some melting before reaching the ground.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4376610','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4376610"><span id="translatedtitle">Dermatologic adverse <span class="hlt">events</span> in pediatric patients receiving targeted anticancer therapies: a pooled <span class="hlt">analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Pratilas, Christine A.; Sibaud, Vincent; Boralevi, Franck; Lacouture, Mario E.</p> <p>2015-01-01</p> <p>BACKGROUND The dermatologic adverse <span class="hlt">events</span> (AEs) of various molecularly targeted therapies are well-described in adult cancer patients. Little has been reported on the incidence and clinical presentation of such AEs in pediatric patients with cancer. 
To address this gap, we analyzed the dermatologic AEs reported across clinical trials of targeted anticancer therapies in pediatric patients. METHODS We conducted an electronic literature search (PubMed, American Society of Clinical Oncology annual meetings’ abstracts, ClinicalTrials.gov, NCI’s Pediatric Oncology Branch webpage) to identify clinical trials involving targeted anticancer therapies that reported dermatologic AEs in their safety data. Studies were limited to the pediatric population, monotherapy trials (oncology), and English language publications. RESULTS Pooled data from 19 clinical studies investigating 11 targeted anticancer agents (alemtuzumab, rituximab, imatinib, dasatinib, erlotinib, vandetanib, sorafenib, cabozantinib, pazopanib, everolimus, and temsirolimus) were analyzed. The most frequently encountered dermatologic AEs were rash (127/660; 19%), xerosis (18/100; 18%), mucositis (68/402; 17%) and pruritus (12/169; 7%). Other AEs included pigmentary abnormalities of the skin/hair (13%), hair disorders (trichomegaly, hypertrichosis, alopecia and madarosis; 14%), urticaria (7%), palmoplantar erythrodysesthesia (7%), erythema, acne, purpura, skin fissures, other ‘unknown skin changes’, exanthem, infection, flushing, telangiectasia, and photosensitivity. CONCLUSION This study describes the dermatologic manifestations of targeted anticancer therapy-related AEs in the pediatric population. Since these AEs are often associated with significant morbidity, it is imperative that pediatric oncologists be familiar with their recognition and management, to avoid unnecessary dose modifications and/or termination, and to prevent impairments in patients’ quality of life. 
PMID:25683226</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3563043','PMC'); return false;" href="http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3563043"><span id="translatedtitle">Object Substitution Masking in Schizophrenia: An <span class="hlt">Event</span>-Related Potential <span class="hlt">Analysis</span></span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Wynn, Jonathan K.; Mathis, Kristopher I.; Ford, Judith; Breitmeyer, Bruno G.; Green, Michael F.</p> <p>2013-01-01</p> <p>Schizophrenia patients exhibit deficits on visual processing tasks, including visual backward masking, and these impairments are related to deficits in higher-level processes. In the current study we used electroencephalography techniques to examine successive stages and pathways of visual processing in a specialized masking paradigm, four-dot masking, which involves masking by object substitution. Seventy-six schizophrenia patients and 66 healthy controls had <span class="hlt">event</span>-related potentials (ERPs) recorded during four-dot masking. Target visibility was manipulated by changing stimulus onset asynchrony (SOA) between the target and mask, such that performance decreased with increasing SOA. Three SOAs were used: 0, 50, and 100 ms. The P100 and N100 perceptual ERPs were examined. Additionally, the visual awareness negativity (VAN) to correct vs. incorrect responses, an index of reentrant processing, was examined for SOAs 50 and 100 ms. Results showed that patients performed worse than controls on the behavioral task across all SOAs. The ERP results revealed that patients had significantly smaller P100 and N100 amplitudes, though there was no effect of SOA on either component in either group. 
In healthy controls, but not patients, N100 amplitude correlated significantly with behavioral performance at SOAs where masking occurred, such that higher accuracy correlated with a larger N100. Healthy controls, but not patients, exhibited a larger VAN to correct vs. incorrect responses. The results indicate that the N100 appears to be related to attentional effort in the task in controls, but not patients. Considering that the VAN is thought to reflect reentrant processing, one interpretation of the findings is that patients’ lack of VAN response and poorer performance may be related to dysfunctional reentrant processing. PMID:23382723</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2016EGUGA..1815559S&link_type=ABSTRACT','NASAADS'); return false;" href="http://adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=2016EGUGA..1815559S&link_type=ABSTRACT"><span id="translatedtitle">Dynamical <span class="hlt">Analysis</span> of Blocking <span class="hlt">Events</span>: Spatial and Temporal Fluctuations of Covariant Lyapunov Vectors</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Schubert, Sebastian; Lucarini, Valerio</p> <p>2016-04-01</p> <p>One of the most relevant weather regimes in the mid latitudes atmosphere is the persistent deviation from the approximately zonally symmetric jet stream to the emergence of so-called blocking patterns. Such configurations are usually connected to exceptional local stability properties of the flow which come along with an improved local forecast skills during the phenomenon. It is instead extremely hard to predict onset and decay of blockings. 
Covariant Lyapunov Vectors (CLVs) offer a suitable characterization of the linear stability of a chaotic flow, since they represent the full tangent linear dynamics by a covariant basis which explores linear perturbations at all time scales. Therefore, we will test whether CLVs feature a signature of blocking events. We examine the CLVs for a quasi-geostrophic beta-plane two-layer model in a periodic channel baroclinically driven by a meridional temperature gradient ΔT. An orographic forcing enhances the emergence of localized blocked regimes. We detect the blocking <span class="hlt">events</span> of the channel flow with a Tibaldi-Molteni scheme adapted to the periodic channel. When blocking occurs, the global growth rates of the fastest growing CLVs are significantly higher. Hence, contrary to intuition, the circulation is globally more unstable in blocked phases. Such an increase in the finite time Lyapunov exponents with respect to the long term average is attributed to stronger barotropic and baroclinic conversion in the case of high temperature gradients, while for low values of ΔT, the effect is only due to stronger barotropic instability. For the localization of the CLVs, we compare the meridionally averaged variance of the CLVs during blocked and unblocked phases. We find that on average the variance of the CLVs is clustered around the center of blocking. 
These results show that the blocked flow affects all time scales and processes described by the CLVs.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.ncbi.nlm.nih.gov/pubmed/27228436','PUBMED'); return false;" href="http://www.ncbi.nlm.nih.gov/pubmed/27228436"><span id="translatedtitle">Heterogeneous recurrence <span class="hlt">analysis</span> of heartbeat dynamics for the identification of sleep apnea <span class="hlt">events</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Cheng, Changqing; Kan, Chen; Yang, Hui</p> <p>2016-08-01</p> <p>Obstructive sleep apnea (OSA) is a common sleep disorder that affects 24% of adult men and 9% of adult women. It occurs due to the occlusion of the upper airway during sleep, thereby leading to a decrease of blood oxygen level that triggers arousals and sleep fragmentation. OSA significantly impacts the quality of sleep and it is known to be responsible for a number of health complications, such as high blood pressure and type 2 diabetes. Traditional diagnosis of OSA relies on polysomnography, w