Science.gov

Sample records for event analysis atheana

  1. The action characterization matrix: A link between HERA (Human Events Reference for ATHEANA) and ATHEANA (a technique for human error analysis)

    SciTech Connect

    Hahn, H.A.

    1997-12-22

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. ATHEANA is being developed in the context of nuclear power plant (NPP) PRAs, and much of the language used to describe the method and provide examples of its application is specific to that industry. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. Los Alamos National Laboratory's (LANL) Human Factors Group has recently joined the ATHEANA project team; LANL is responsible for further developing the database structure and for analyzing additional exemplar operational events for entry into the database. The Action Characterization Matrix (ACM) is conceived as a bridge between the HERA database structure and ATHEANA. Specifically, the ACM allows each unsafe action or human failure event to be characterized according to its representation along each of six different dimensions: system status, initiator status, unsafe action mechanism, information processing stage, equipment/material conditions, and performance shaping factors. This report describes the development of the ACM and provides details on the structure and content of its dimensions.
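
    As a rough illustration of the six-dimension characterization the ACM performs, one unsafe action might be recorded as below. The field names and values are our placeholders for illustration, not the ACM's actual coding scheme.

```python
from dataclasses import dataclass

@dataclass
class UnsafeActionRecord:
    """One unsafe action characterized along the six ACM dimensions.

    Field names and example values are illustrative placeholders,
    not the coding categories defined in the report itself.
    """
    system_status: str
    initiator_status: str
    ua_mechanism: str
    info_processing_stage: str
    equipment_conditions: list
    performance_shaping_factors: list

record = UnsafeActionRecord(
    system_status="degraded",
    initiator_status="post-initiator",
    ua_mechanism="error of commission",
    info_processing_stage="situation assessment",
    equipment_conditions=["misleading indicator"],
    performance_shaping_factors=["time pressure", "inadequate procedure"],
)
```

    A structure like this makes each HERA event entry queryable along any one dimension, which is precisely the bridge role the ACM is described as playing.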

  2. Technical Basis and Implementation Guidelines for a Technique for Human Event Analysis (ATHEANA)

    DTIC Science & Technology

    2000-05-01

    of operational events and from an attempt to reconcile observed human performance in the most serious of these events with existing theories of human...NRC and the utilities, and others raising questions about human performance, but the application of ATHEANA involves the integration of the issues...alternative strategies such as multiattribute decision analysis. However, in practical applications, once the risk and cost (and their uncertainty) are

  3. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    SciTech Connect

    Forester, J.A.; Bley, D.C.; Cooper, S.E.; Kolaczkowski, A.M.; Thompson, C.; Ramey-Smith, A.; Wreathall, J.

    2000-07-18

    This paper describes the most recent version of a human reliability analysis (HRA) method called "A Technique for Human Event Analysis" (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided and critical and unique aspects of the revised method are discussed.

  4. Human Events Reference for ATHEANA (HERA) Database Description and Preliminary User's Manual

    SciTech Connect

    Auflick, J.L.

    1999-08-12

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database of analyzed operational events, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  5. Human events reference for ATHEANA (HERA) database description and preliminary user's manual

    SciTech Connect

    Auflick, J.L.; Hahn, H.A.; Pond, D.J.

    1998-05-27

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavioral science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. This report documents the initial development efforts for HERA.

  6. Discussion of Comments from a Peer Review of A Technique for Human Event Analysis (ATHEANA)

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Wreathall, J.

    1999-01-28

    In May of 1998, a technical basis and implementation guidelines document for A Technique for Human Event Analysis (ATHEANA) was issued as a draft report for public comment (NUREG-1624). In conjunction with the release of draft NUREG-1624, a peer review of the new human reliability analysis method, its documentation, and the results of an initial test of the method was held over a two-day period in June 1998 in Seattle, Washington. Four internationally known and respected experts in HRA or probabilistic risk assessment were selected to serve as the peer reviewers. In addition, approximately 20 other individuals with an interest in HRA and ATHEANA also attended the peer review and were invited to provide comments. The peer review team was asked to comment on any aspect of the method or the report in which improvements could be made and to discuss its strengths and weaknesses. They were asked to focus on two major aspects: Are the basic premises of ATHEANA on solid ground and is the conceptual basis adequate? Is the ATHEANA implementation process adequate given the description of the intended users in the documentation? The four peer reviewers asked questions and provided oral comments during the peer review meeting and provided written comments approximately two weeks after the completion of the meeting. This paper discusses their major comments.

  7. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-10-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  8. Results of a nuclear power plant application of a new technique for human error analysis (ATHEANA)

    SciTech Connect

    Forester, J.A.; Whitehead, D.W.; Kolaczkowski, A.M.; Thompson, C.M.

    1997-10-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing-contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the "success" of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator "on shift" until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project.

  9. Results of a nuclear power plant application of A New Technique for Human Error Analysis (ATHEANA)

    SciTech Connect

    Whitehead, D.W.; Forester, J.A.; Bley, D.C.

    1998-03-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing-contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the success of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator on shift until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project.

  10. Philosophy of ATHEANA

    SciTech Connect

    Bley, D.C.; Cooper, S.E.; Forester, J.A.; Kolaczkowski, A.M.; Ramey-Smith, A.; Thompson, C.M.; Whitehead, D.W.; Wreathall, J.

    1999-03-24

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing contexts (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  11. A technique for human error analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge- base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  12. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    SciTech Connect

    Taylor, J.H.; Luckas, W.J.; Wreathall, J.

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA) is a critical element of PRA, yet limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission: those errors associated with inappropriate interventions by operators in operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  13. ATHEANA: "a technique for human error analysis" entering the implementation phase

    SciTech Connect

    Taylor, J.; O'Hara, J.; Luckas, W.

    1997-02-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA) is a critical element of PRA, yet limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission: those errors associated with inappropriate interventions by operators in operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled "Improved HRA Method Based on Operating Experience", is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97, ongoing).

  14. Concepts of event-by-event analysis

    SciTech Connect

    Stroebele, H.

    1995-07-15

    The particles observed in the final state of nuclear collisions can be divided into two classes: those which are susceptible to strong interactions and those which are not, like leptons and the photon. The bulk properties of the "matter" in the reaction zone may be read off the kinematical characteristics of the particles observable in the final state. These characteristics are strongly dependent on the last interaction these particles have undergone. In a densely populated reaction zone strongly interacting particles will experience many collisions after they have been formed and before they emerge into the asymptotic final state. For the particles which are not sensitive to strong interactions their formation is also their last interaction. Thus photons and leptons probe the period during which they are produced whereas hadrons reflect the so-called freeze-out processes, which occur during the late stage in the evolution of the reaction when the population density becomes small and the mean free paths long. The disadvantage of the leptons and photons is their small production cross section; they cannot be used in an analysis of the characteristics of individual collision events, because the number of particles produced per event is too small. The hadrons, on the other hand, stem from the freeze-out period. Information from earlier periods requires multiparticle observables in the most general sense. It is one of the challenges of present-day high energy nuclear physics to establish and understand global observables which differentiate between mere hadronic scenarios, i.e., superposition of hadronic interactions, and the formation of a partonic (short duration) steady state which can be considered a new state of matter, the Quark-Gluon Plasma.

  15. EVENT PLANNING USING FUNCTION ANALYSIS

    SciTech Connect

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  16. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  17. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
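
    The rejection-sampling reading of Bayesian updating that BUS builds on can be sketched on a toy problem. The model, observation, and threshold below are our illustrative assumptions, not an example from the paper: draw from the prior, accept each draw with probability proportional to its likelihood, and estimate the rare-event probability over the accepted posterior samples.

```python
import math
import random

def bus_rare_event(n=200_000, seed=1):
    """Toy BUS-style estimate (illustrative assumptions): prior
    mu ~ N(0,1); one observation x_obs = 1.0 of X ~ N(mu,1); the rare
    event is a future X exceeding 4. Prior draws are accepted with
    probability likelihood(mu) / max-likelihood, which is the
    rejection-sampling interpretation of Bayes' rule behind BUS."""
    random.seed(seed)
    x_obs, threshold = 1.0, 4.0
    hits = accepted = 0
    for _ in range(n):
        mu = random.gauss(0.0, 1.0)                  # sample from the prior
        # rejection step: the normalizing constant cancels, leaving exp(...)
        if random.random() < math.exp(-0.5 * (x_obs - mu) ** 2):
            accepted += 1                            # mu is a posterior sample
            if random.gauss(mu, 1.0) > threshold:    # posterior predictive draw
                hits += 1
    return hits / accepted

p_hat = bus_rare_event()  # roughly 2e-3 for this toy setup
```

    As the abstract notes, crude sampling like this is exactly what FORM, importance sampling, or Subset Simulation would replace for genuinely small probabilities, where the accepted-sample count above would be far too low.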

  18. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  19. Human Reliability Analysis in the U.S. Nuclear Power Industry: A Comparison of Atomistic and Holistic Methods

    SciTech Connect

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-09-01

    A variety of methods have been developed to generate human error probabilities for use in the US nuclear power industry. When actual operations data are not available, it is necessary for an analyst to estimate these probabilities. Most approaches, including THERP, ASEP, SLIM-MAUD, and SPAR-H, feature an atomistic approach to characterizing and estimating error. The atomistic approach is based on the notion that events and their causes can be decomposed and individually quantified. In contrast, in the holistic approach, such as found in ATHEANA, the analysis centers on the entire event, which is typically quantified as an indivisible whole. The distinction between atomistic and holistic approaches is important in understanding the nature of human reliability analysis quantification and the utility and shortcomings associated with each approach.
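
    The atomistic style of quantification can be caricatured in a few lines: a nominal error probability is adjusted multiplicatively, one factor per performance shaping factor. The base probability and multiplier values below are placeholders for illustration, not entries from THERP or SPAR-H tables.

```python
def atomistic_hep(base_hep, psf_multipliers):
    """Adjust a nominal human error probability (HEP) by multiplying in
    one factor per performance shaping factor, capping the result at 1.0.
    Values are illustrative placeholders, not method-specific tables."""
    hep = base_hep
    for m in psf_multipliers:
        hep *= m
    return min(hep, 1.0)

# nominal HEP of 0.01 under "high stress" (x2) and "poor ergonomics" (x5)
print(atomistic_hep(0.01, [2.0, 5.0]))  # 0.1
```

    The holistic approach in ATHEANA resists this decomposition: the event is quantified as a whole in its error-forcing context, so no such per-factor multiplication applies.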

  20. Interpretation Analysis as a Competitive Event.

    ERIC Educational Resources Information Center

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  1. Negated bio-events: analysis and identification

    PubMed Central

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems.
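
    A cue-lexicon baseline for this task fits in a few lines. The cue list and window size below are our illustration; the article's framework uses learned features and a trained classifier rather than this heuristic.

```python
# A small, illustrative set of negation cue words (not the article's cue list).
NEGATION_CUES = {"not", "no", "failed", "fails", "unable", "absence", "lack"}

def is_negated(tokens, trigger_idx, window=4):
    """Flag an event trigger as negated if a cue word appears within a
    fixed token window around it. A deliberately simple baseline that a
    learned model with richer features would be expected to beat."""
    lo = max(0, trigger_idx - window)
    return any(tok.lower() in NEGATION_CUES
               for tok in tokens[lo:trigger_idx + window + 1])

sentence = "IL-2 did not induce NF-kB activation".split()
print(is_negated(sentence, sentence.index("induce")))  # True
```

    Scope is the hard part such a baseline ignores: a cue within the window may govern a different event in the same sentence, which is why feature engineering over syntax matters in the article's approach.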

  2. Dynamic Event Tree Analysis Through RAVEN

    SciTech Connect

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that timing/sequencing of events and system dynamics is not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, developed by INL as well. RAVEN performs two main tasks: 1) control logic driver for the new Thermo-Hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which the set of control logic laws (user defined) monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, the Dynamic PRA analysis, using Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.
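
    The branching idea behind a dynamic event tree can be sketched independently of RAVEN. The single-system model, branching times, and failure probability below are our toy assumptions: at each branching time the system either keeps working or fails, and once failed it stays failed.

```python
def dynamic_event_tree(branch_times, p_fail):
    """Enumerate the sequences of one safety system that may fail at each
    branching time (toy stand-in for DET branching on stochastic events).
    Returns (sequence, probability) pairs, one per leaf of the tree."""
    leaves = []

    def branch(i, seq, prob, failed):
        if i == len(branch_times):
            leaves.append((tuple(seq), prob))
            return
        if failed:  # an already-failed system does not branch again
            branch(i + 1, seq + ["failed"], prob, True)
        else:
            branch(i + 1, seq + ["ok"], prob * (1 - p_fail), False)
            branch(i + 1, seq + ["failed"], prob * p_fail, True)

    branch(0, [], 1.0, False)
    return leaves

leaves = dynamic_event_tree([10.0, 20.0, 30.0], p_fail=0.1)
# the probabilities of all leaves must sum to 1
assert abs(sum(p for _, p in leaves) - 1.0) < 1e-12
```

    In a real DET tool the branch points are triggered by the simulated physics (e.g., a RELAP-7 setpoint crossing) rather than fixed times, which is exactly the timing/sequencing information conventional event trees lose.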

  3. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled, and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple enough to run on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved using Petri nets, making discrete-event systems a pragmatic tool for modelling industrial systems. To capture auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the timing of robotic and transmission activities; from spot measurements, graphs are obtained showing the average time for the transport activity for individual sets of finished products.
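
    The token-game core of a place/transition net is only a few lines. The two-place net below is our own minimal example, far simpler than the robotic nets analysed with Visual Object Net++.

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from the pre-set places and
    produce tokens in the post-set places, returning the new marking."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# minimal two-place net: a robot cell moves parts from "buffer" to "done"
marking = {"buffer": 2, "done": 0}
pre, post = {"buffer": 1}, {"done": 1}
while enabled(marking, pre):
    marking = fire(marking, pre, post)
print(marking)  # {'buffer': 0, 'done': 2}
```

    A hierarchical model layers nets like this one: a high-level net's transition stands for an entire lower-level subnet, which is the decomposition the paper applies to the robotic system.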

  4. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. The modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
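
    The tiering idea can be illustrated with a deliberately crude Python sketch in which the flow of a tripped line is split evenly among the surviving lines (a stand-in for a real load-flow solution, not the paper's SERC model); any line pushed over its capacity trips in the next tier.

```python
# Crude cascading-outage sketch: flow from a tripped line is redistributed
# evenly among surviving lines (a simplifying assumption, not a load flow).
def cascade(lines, initiating):
    """lines: {name: (flow, capacity)}; returns a list of tiers of tripped lines."""
    lines = dict(lines)
    tiers, tripped = [], [initiating]
    while tripped:
        tiers.append(tripped)
        shed = sum(lines.pop(name)[0] for name in tripped)  # flow to redistribute
        if not lines:
            break
        extra = shed / len(lines)
        lines = {n: (f + extra, c) for n, (f, c) in lines.items()}
        tripped = [n for n, (f, c) in lines.items() if f > c]  # overloads trip
    return tiers

tiers = cascade({"L1": (80, 100), "L2": (90, 100), "L3": (60, 100)}, "L1")
print(tiers)  # [['L1'], ['L2'], ['L3']]
```

    Ranking initiating events then amounts to running this expansion for each N-1 (or N-k) outage and comparing the resulting tier depths.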

  5. Analysis of a Limb Eruptive Event

    NASA Astrophysics Data System (ADS)

    Kotrč, P.; Kupryakov, Yu. A.; Bárta, M.; Kashapova, L. K.; Liu, W.

    2016-04-01

    We present the analysis of an eruptive event that took place on the eastern limb on April 21, 2015, which was observed by the Ondřejov horizontal telescope and spectrograph. The eruption of the highly twisted prominence was followed by the onset of soft X-ray sources. We identified the structures observed in Hα spectra with the details on the Hα filtergrams and analyzed the evolution of Doppler component velocities. The timing and observed characteristics of the eruption were compared with the prediction of the model based on the twisting of the flux ropes and the kink/torus instability.

  6. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  7. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, as is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  8. Analysis of seismic events in and near Kuwait

    SciTech Connect

    Harris, D B; Mayeda, K M; Rodgers, A J; Ruppert, S D

    1999-05-11

    Seismic data for events in and around Kuwait were collected and analyzed. The authors estimated event moment, focal mechanism and depth by waveform modeling. Results showed that reliable seismic source parameters for events in and near Kuwait can be estimated from a single broadband three-component seismic station. This analysis will advance understanding of earthquake hazard in Kuwait.

  9. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    As terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, the correlations among event items and the development trend, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and how to use thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the feasibility of the methods.
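
    A minimal sketch of a "5W"-style event record, assuming the usual who/what/when/where/why reading of "5W"; the field names and data are hypothetical, not the paper's schema.

```python
# Hypothetical "5W" event record plus one aggregation of the kind a
# thematic map or statistical chart would display. All data are invented.
from collections import Counter
from dataclasses import dataclass

@dataclass
class TerrorismEvent:
    who: str      # perpetrator group
    what: str     # attack type
    when: str     # ISO date
    where: tuple  # (latitude, longitude)
    why: str      # stated motive or related context

events = [
    TerrorismEvent("GroupA", "bombing", "2014-06-01", (33.3, 44.4), "unknown"),
    TerrorismEvent("GroupB", "armed assault", "2014-06-03", (34.0, 45.1), "unknown"),
]

# Count events by type, the raw input for a statistical chart.
by_type = Counter(e.what for e in events)
print(dict(by_type))  # {'bombing': 1, 'armed assault': 1}
```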

  10. Web Video Event Recognition by Semantic Analysis From Ubiquitous Documents.

    PubMed

    Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng Tao

    2016-12-01

    In recent years, the task of event recognition from videos has attracted increasing interest in the multimedia area. While most existing research has focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyze video content without any prior knowledge. Therefore, synthesizing both visual and semantic analysis is a natural way for video event understanding. In this paper, we study the problem of Web video event recognition, where Web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from Web videos. In order to compensate for the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous Web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model, which explores the intrinsic correlation between the visual and textual cues of the videos to learn reliable classifiers. Extensive experiments on two real-world video data sets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of Web video event recognition.

  12. [Analysis of Spontaneously Reported Adverse Events].

    PubMed

    Nakamura, Mitsuhiro

    2016-01-01

    Observational studies are necessary for the evaluation of drug effectiveness in clinical practice. In recent years, the use of spontaneous reporting systems (SRSs) for adverse drug reactions has increased, and they have become an important resource for regulatory science. SRSs are among the largest and best-known databases worldwide and are one of the primary tools used for postmarketing surveillance and pharmacovigilance. To analyze SRS data, the US Food and Drug Administration Adverse Event Reporting System (FAERS) and the Japanese Adverse Drug Event Report Database (JADER) are reviewed. Authorized pharmacovigilance algorithms were used for signal detection, including the reporting odds ratio. An SRS is a passive reporting database and is therefore subject to numerous sources of selection bias, including overreporting, underreporting, and the lack of a denominator. Despite the inherent limitations of spontaneous reporting, SRS databases are a rich resource and data-mining tool that provide powerful means of identifying potential associations between drugs and their adverse effects. Our results, which are based on the evaluation of SRS databases, provide essential knowledge that could improve our understanding of clinical issues.
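
    The reporting odds ratio mentioned above is computed from a 2×2 contingency table of report counts; below is a minimal sketch with made-up counts, using the standard log-scale approximation for the 95% confidence interval.

```python
# Reporting odds ratio (ROR) from a 2x2 table of spontaneous reports.
# The counts are invented for illustration, not taken from FAERS or JADER.
import math

def ror(a, b, c, d):
    """a: drug & event, b: drug & other events,
       c: other drugs & event, d: other drugs & other events."""
    value = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(ROR)
    lo = math.exp(math.log(value) - 1.96 * se)  # 95% CI bounds
    hi = math.exp(math.log(value) + 1.96 * se)
    return value, lo, hi

r, lo, hi = ror(a=20, b=480, c=100, d=9400)
# A common signal criterion: lower 95% CI bound above 1 (plus a minimum
# number of cases); here the hypothetical drug-event pair would flag.
print(round(r, 2), round(lo, 2), round(hi, 2))
```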

  13. Peak event analysis: a novel empirical method for the evaluation of elevated particulate events

    PubMed Central

    2013-01-01

    Background We report on a novel approach to the analysis of suspended particulate data in a rural setting in southern Ontario. Analyses of suspended particulate matter and associated air quality standards have conventionally focussed on 24-hour mean levels of total suspended particulates (TSP) and particulate matter <10 microns, <2.5 microns and <1 micron in diameter (PM10, PM2.5, PM1, respectively). Less emphasis has been placed on brief peaks in suspended particulate levels, which may pose a substantial nuisance, irritant, or health hazard. These events may also represent a common cause of public complaint and concern regarding air quality. Methods Measurements of TSP, PM10, PM2.5, and PM1 levels were taken using an automated device following local complaints of dusty conditions in rural south-central Ontario, Canada. The data consisted of 126,051 by-minute TSP, PM10, PM2.5, and PM1 measurements between May and August 2012. Two analyses were performed and compared. First, conventional descriptive statistics were computed by month for TSP, PM10, PM2.5, and PM1, including mean values and percentiles (70th, 90th, and 95th). Second, a novel graphical analysis method, using density curves and line plots, was conducted to examine peak events occurring at or above the 99th percentile of per-minute TSP readings. We refer to this method as “peak event analysis”. Findings of the novel method were compared with findings from the conventional approach. Results Conventional analyses revealed that mean levels of all categories of suspended particulates and suspended particulate diameter ratios conformed to existing air quality standards. Our novel methodology revealed extreme outlier events above the 99th percentile of readings, with peak PM10 and TSP levels over 20 and 100 times higher than the respective mean values. Peak event analysis revealed and described rare and extreme peak dust events that would not have been detected using conventional descriptive statistics
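
    The 99th-percentile thresholding step of peak event analysis can be sketched as follows; the readings are synthetic, and the percentile uses simple linear interpolation, which may differ in detail from the authors' procedure.

```python
# Flag by-minute readings at or above the 99th percentile (synthetic data).
def percentile(data, q):
    """Linear-interpolation percentile, stdlib-only."""
    s = sorted(data)
    k = (len(s) - 1) * q / 100.0
    f = int(k)
    return s[f] + (s[min(f + 1, len(s) - 1)] - s[f]) * (k - f)

# 990 background readings plus 10 brief peaks that a daily mean would hide.
readings = [10.0] * 990 + [50, 120, 300, 800, 40, 90, 500, 60, 70, 1000]
threshold = percentile(readings, 99)
peaks = [(i, v) for i, v in enumerate(readings) if v >= threshold]
print(round(threshold, 2), len(peaks))
```

    The contrast with conventional statistics is visible here: the mean of these readings stays near background even though individual peaks exceed it a hundredfold.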

  14. Event-based analysis of free-living behaviour.

    PubMed

    Granat, Malcolm H

    2012-11-01

    The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also on how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, dependent on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis on the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.

  15. Event/Time/Availability/Reliability-Analysis Program

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.; Hoffman, D. J.; Carr, Thomas

    1994-01-01

    ETARA is an interactive, menu-driven program that performs simulations for analysis of reliability, availability, and maintainability. Written to evaluate performance of the electrical power system of Space Station Freedom, but the methodology and software apply to any system represented by a block diagram. Program written in IBM APL.
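
    The block-diagram roll-up that such tools perform can be sketched for a toy series/parallel system; the availabilities and topology below are invented, not the Space Station Freedom model.

```python
# Steady-state availability roll-up for a series/parallel block diagram.
# Numbers and structure are illustrative only.
def series(*avail):
    p = 1.0
    for a in avail:
        p *= a          # all series blocks must be up
    return p

def parallel(*avail):
    q = 1.0
    for a in avail:
        q *= (1.0 - a)  # all redundant blocks must fail
    return 1.0 - q

# Source -> two redundant converters -> distribution bus
system = series(0.99, parallel(0.95, 0.95), 0.98)
print(round(system, 4))  # 0.9678
```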

  16. Video analysis of motor events in REM sleep behavior disorder.

    PubMed

    Frauscher, Birgit; Gschliesser, Viola; Brandauer, Elisabeth; Ulmer, Hanno; Peralta, Cecilia M; Müller, Jörg; Poewe, Werner; Högl, Birgit

    2007-07-30

    In REM sleep behavior disorder (RBD), several studies focused on electromyographic characterization of motor activity, whereas video analysis has remained more general. The aim of this study was to undertake a detailed and systematic video analysis. Nine polysomnographic records from 5 Parkinson patients with RBD were analyzed and compared with sex- and age-matched controls. Each motor event in the video during REM sleep was classified according to duration, type of movement, and topographical distribution. In RBD, a mean of 54 +/- 23.2 events/10 minutes of REM sleep (total 1392) were identified and visually analyzed. Seventy-five percent of all motor events lasted <2 seconds. Of these events, 1,155 (83.0%) were classified as elementary, 188 (13.5%) as complex behaviors, 50 (3.6%) as violent, and 146 (10.5%) as vocalizations. In the control group, 3.6 +/- 2.3 events/10 minutes (total 264) of predominantly elementary simple character (n = 240, 90.9%) were identified. Number and types of motor events differed significantly between patients and controls (P < 0.05). This study shows a very high number and great variety of motor events during REM sleep in symptomatic RBD. However, most motor events are minor, and violent episodes represent only a small fraction.

  17. Human performance analysis of industrial radiography radiation exposure events

    SciTech Connect

    Reece, W.J.; Hill, S.G.

    1995-12-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  18. Glaciological parameters of disruptive event analysis

    SciTech Connect

    Bull, C.

    1980-04-01

    The possibility of complete glaciation of the earth is small and probably need not be considered in the consequence analysis by the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program. However, within a few thousand years an ice sheet may well cover proposed waste disposal sites in Michigan. Those in the Gulf Coast region and New Mexico are unlikely to be ice covered. The probability of ice cover at Hanford in the next million years is finite, perhaps about 0.5. Sea level will fluctuate as a result of climatic changes. As ice sheets grow, sea level will fall. Melting of ice sheets will be accompanied by a rise in sea level. Within the present interglacial period there is a definite chance that the West Antarctic ice sheet will melt. Ice sheets are agents of erosion, and some estimates of the amount of material they erode have been made. As an average over the area glaciated by late Quaternary ice sheets, only a few tens of meters of erosion is indicated. There were perhaps 3 meters of erosion per glaciation cycle. Under glacial conditions the surface boundary conditions for ground water recharge will be appreciably changed. In future glaciations melt-water rivers generally will follow pre-existing river courses. Some salt dome sites in the Gulf Coast region could be susceptible to changes in the course of the Mississippi River. The New Mexico site, which is on a high plateau, seems to be immune from this type of problem. The Hanford Site is only a few miles from the Columbia River, and in the future, lateral erosion by the Columbia River could cause changes in its course. A prudent assumption in the AEGIS study is that the present interglacial will continue for only a limited period and that subsequently an ice sheet will form over North America. Other factors being equal, it seems unwise to site a nuclear waste repository (even at great depth) in an area likely to be glaciated.

  19. Nonlinear Analysis for Event Forewarning (NLAfEW)

    SciTech Connect

    Hively, Lee Mizener

    2013-05-23

    The NLAfEW computer code analyses noisy experimental data to forewarn of adverse events. The functionality of the analysis is as follows: it removes artifacts from the data, converts the continuous data values to discrete values, constructs time-delay embedding vectors, converts the resulting states into a graph, compares the unique nodes and links in one graph to those in another, and determines event forewarning on the basis of several successive occurrences of one (or more) of the dissimilarity measures above a threshold.
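
    A minimal sketch of that pipeline, under the assumption that graphs are built from consecutive embedded states and compared by symmetric difference; the actual NLAfEW dissimilarity measures and parameters are not specified here.

```python
# Sketch: symbolize a series, embed it, build a node/link graph, and
# compare graphs. Thresholds and measures are illustrative assumptions.
def symbolize(x, n_symbols=3):
    lo, hi = min(x), max(x)
    w = (hi - lo) / n_symbols or 1.0
    return [min(int((v - lo) / w), n_symbols - 1) for v in x]

def embed(symbols, dim=2, delay=1):
    return [tuple(symbols[i + j * delay] for j in range(dim))
            for i in range(len(symbols) - (dim - 1) * delay)]

def graph(states):
    nodes = set(states)
    links = set(zip(states, states[1:]))  # consecutive states define links
    return nodes, links

def dissimilarity(g1, g2):
    # nodes/links present in one graph but not the other
    return len(g1[0] ^ g2[0]) + len(g1[1] ^ g2[1])

base = graph(embed(symbolize([0, 1, 2, 1, 0, 1, 2, 1, 0])))
test = graph(embed(symbolize([0, 2, 0, 2, 0, 2, 0, 2, 0])))
print(dissimilarity(base, base), dissimilarity(base, test))  # 0 12
```

    Forewarning would then be declared after several successive windows whose dissimilarity to a baseline graph exceeds a chosen threshold.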

  20. Analysis of damaging hydrogeological events in a Mediterranean region (Calabria)

    NASA Astrophysics Data System (ADS)

    Aceto, Luigi; Caloiero, Tommaso; Pasqua, A. A.; Petrucci, Olga

    2016-10-01

    Damaging Hydrogeological Events (DHEs) are periods of severe weather conditions affecting wide areas for several days, causing mainly damaging landslides and floods. In order to characterise DHEs, we analysed the historical series of the events that affected a Mediterranean region (Calabria, southern Italy) throughout 92 years of observation. Depending on their magnitude, we classified the events as major catastrophic, catastrophic, extraordinary or ordinary. In winter events, damaged areas and damage were greater than those resulting from events in the other seasons. Nevertheless, the majority of the events took place in autumn, when, in addition to landslides, a relevant percentage of flash floods and floods also occurred. Results also show that the frequency of major catastrophic and catastrophic events has decreased since 1971 and that, in recent decades, Calabria has suffered damaging effects even though rain did not reach extreme characteristics. In fact, the duration of triggering rain, the maximum daily rain of the events and the frequency of rain with a high return period show a decreasing pattern throughout the study period. As for the damaging phenomena, landslides were identified as the most frequent in every season and in every type of event, and the eastern side of the region was the most frequently and heavily damaged. According to the literature, the trend in the number of victims per event is also decreasing. The proposed analysis can be applied to different study areas in order to assess the relative magnitude of DHEs and their evolution throughout the years. The classification criterion can be useful to compare different events for either scientific or insurance purposes, and to identify the typical rainfall-damage scenario of a study area.

  1. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
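
    The core coincidence-rate computation can be sketched as follows; the event times are invented, and only the simplest precursor-style rate (the fraction of events in one series followed by an event in the other within a time window) is shown, without the null-model significance test.

```python
# Fraction of events in series A followed within (lag, lag + dT) by an
# event in series B. Event times below are invented for illustration.
def coincidence_rate(a_times, b_times, lag=0.0, dT=1.0):
    hits = sum(any(lag <= b - a <= lag + dT for b in b_times)
               for a in a_times)
    return hits / len(a_times)

floods = [1.0, 4.0, 9.0, 15.0]     # hypothetical flood times
outbreaks = [1.5, 9.8, 20.0]       # hypothetical outbreak times
print(coincidence_rate(floods, outbreaks))  # 0.5
```

    Significance testing would compare this rate against its distribution under a null model, e.g. Poisson-distributed surrogate event times.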

  2. External events analysis for the Savannah River Site K reactor

    SciTech Connect

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10⁻⁴ per year, of which seismic events are the major contributor (1.2 × 10⁻⁴ per year). Fire-initiated events contribute 1.4 × 10⁻⁷ per year, tornados 5.8 × 10⁻⁷ per year, dam failures 1.5 × 10⁻⁶ per year and the crane failure scenario less than 10⁻⁴ per year to the core melt frequency. 8 refs., 3 figs., 5 tabs.

  3. Poisson-event-based analysis of cell proliferation.

    PubMed

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture.
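
    A nonhomogeneous Poisson process with an exponentially increasing rate, as fitted above, can be simulated with the standard thinning (Lewis-Shedler) algorithm; the parameters below are illustrative, not the fitted values from the paper.

```python
# Thinning simulation of an NHPP with rate lambda(t) = rate0 * exp(growth * t).
# Parameters are illustrative, not the paper's fitted values.
import math
import random

def nhpp_events(rate0, growth, t_end, seed=1):
    random.seed(seed)
    lam_max = rate0 * math.exp(growth * t_end)  # upper bound on the rate
    t, events = 0.0, []
    while True:
        t += random.expovariate(lam_max)        # candidate from homogeneous PP
        if t > t_end:
            return events
        if random.random() < rate0 * math.exp(growth * t) / lam_max:
            events.append(t)                    # accept with prob lambda(t)/lam_max

ev = nhpp_events(rate0=0.1, growth=0.1, t_end=48.0)
first_half = sum(1 for t in ev if t < 24.0)
# With an exponentially rising rate, events cluster in the second half.
print(len(ev), first_half, len(ev) - first_half)
```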

  4. Severe accident analysis using dynamic accident progression event trees

    NASA Astrophysics Data System (ADS)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce and can also be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In the conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The function of a dynamic APET developed includes prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. Also, the work is focused on treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a
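
    The branch-probability tracking and truncation described above can be sketched with a toy static expansion; a real dynamic event tree derives branch points and timings from the accident simulator itself, and the events and probabilities below are invented.

```python
# Toy event-tree expansion with probability tracking and truncation.
# Events and probabilities are invented; ADAPT would obtain branch points
# and timings from a severe accident code such as MELCOR.
def expand(branch_points, truncate=5e-3):
    """branch_points: list of (event, failure_prob); returns (path, prob) leaves."""
    scenarios = [([], 1.0)]
    for event, p_fail in branch_points:
        nxt = []
        for path, prob in scenarios:
            for outcome, p in (("ok", 1 - p_fail), ("fail", p_fail)):
                q = prob * p
                if q >= truncate:  # prune to keep the tree manageable
                    nxt.append((path + [f"{event}:{outcome}"], q))
        scenarios = nxt
    return scenarios

leaves = expand([("power-recovery", 0.2), ("SG-tube-rupture", 0.05),
                 ("containment-failure", 0.1)])
for path, prob in leaves:
    print(round(prob, 4), " -> ".join(path))
```

    With these numbers, two of the eight scenarios fall below the truncation threshold, so the surviving leaves carry a total probability of 0.995 rather than 1.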

  5. Offense Specialization of Arrestees: An Event History Analysis

    ERIC Educational Resources Information Center

    Lo, Celia C.; Kim, Young S.; Cheng, Tyrone C.

    2008-01-01

    The data set employed in the present study came from interviews with arrestees conducted between 1999 and 2001 as well as from their official arrest records obtained from jail administrators. A total of 238 arrestees ages 18 to 25 constituted the final sample. Event history analysis examined each arrestee's movement from periods of no arrests to…

  6. Predicting Retention of Drug Court Participants Using Event History Analysis

    ERIC Educational Resources Information Center

    Wolf, Elaine M.; Sowards, Kathryn A.; Wolf, Douglas A.

    2003-01-01

    This paper presents the results of a discrete-time event-history analysis of the relationships between client and program characteristics and the length and outcome of participation in a drug court program. We identify factors associated with both successful completion and premature termination. Having an African-American case manager, being…
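
    Discrete-time event-history analyses are typically run on person-period data, one row per subject per period at risk; below is a sketch with hypothetical drug-court subjects (a logistic regression on such rows would then estimate the discrete-time hazard of termination).

```python
# Person-period expansion for discrete-time event-history analysis.
# Subjects and outcomes are hypothetical, not the paper's data.
def person_periods(subjects):
    """subjects: (id, periods_observed, terminated); one row per period at risk."""
    rows = []
    for sid, periods_observed, terminated in subjects:
        for t in range(1, periods_observed + 1):
            # The event indicator is 1 only in the period termination occurs.
            event = 1 if (terminated and t == periods_observed) else 0
            rows.append({"id": sid, "period": t, "event": event})
    return rows

# Subject A terminates in period 3; subject B is censored after period 2.
rows = person_periods([("A", 3, True), ("B", 2, False)])
for r in rows:
    print(r)
```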

  7. Bi-Level Semantic Representation Analysis for Multimedia Event Detection.

    PubMed

    Chang, Xiaojun; Ma, Zhigang; Yang, Yi; Zeng, Zhiqiang; Hauptmann, Alexander G

    2016-03-28

    Multimedia event detection has been one of the major endeavors in video event analysis. A variety of approaches have been proposed recently to tackle this problem. Among others, using semantic representation has been accredited for its promising performance and desirable ability to support human-understandable reasoning. To generate semantic representation, we usually utilize several external image/video archives and apply the concept detectors trained on them to the event videos. Due to the intrinsic differences of these archives, the resulting representation is presumed to have different predicting capabilities for a certain event. Notwithstanding, not much work is available for assessing the efficacy of semantic representation at the source level. On the other hand, it is plausible to perceive that some concepts are noisy for detecting a specific event. Motivated by these two shortcomings, we propose a bi-level semantic representation analyzing method. At the source level, our method learns weights of semantic representations attained from different multimedia archives. Meanwhile, it restrains the negative influence of noisy or irrelevant concepts at the overall concept level. In addition, we particularly focus on efficient multimedia event detection with few positive examples, which is highly appreciated in real-world scenarios. We perform extensive experiments on the challenging TRECVID MED 2013 and 2014 datasets with encouraging results that validate the efficacy of our proposed approach.

  8. Analysis of recurrent event data with incomplete observation gaps.

    PubMed

    Kim, Yang-Jin; Jhun, Myoungshic

    2008-03-30

In the analysis of recurrent event data, recurrent events are not completely observed when the terminating event occurs before the end of a study. To make valid inferences about recurrent events, several methods have been suggested for accommodating the terminating event (Statist. Med. 1997; 16:911-924; Biometrics 2000; 56:554-562). In this paper, our interest is in a particular situation, where intermittent dropouts result in observation gaps during which no recurrent events are observed. In this situation, risk status varies over time and the usual definition of the risk variable is not applicable. In particular, we consider the case where information on the observation gap is incomplete, that is, the starting time of an intermittent dropout is known but the terminating time is not. This incomplete information is modeled in terms of an interval-censoring mechanism. The proposed method is applied to a study of the effect of the Young Traffic Offenders Program on conviction rates, wherein a certain proportion of subjects experienced suspensions with intermittent dropouts during the study.

  9. Industrial accidents triggered by flood events: analysis of past accidents.

    PubMed

    Cozzani, Valerio; Campedel, Michela; Renni, Elisabetta; Krausmann, Elisabeth

    2010-03-15

Industrial accidents triggered by natural events (NaTech accidents) are a significant category of industrial accidents, and several specific elements that characterize NaTech events still need to be investigated. In particular, the damage modes of equipment and the specific final scenarios that may take place in NaTech accidents are key elements for the assessment of the hazard and risk posed by these events. In the present study, data on 272 NaTech events triggered by floods were retrieved from some of the major industrial accident databases. Data on final scenarios highlighted the presence of specific events, such as those due to substances reacting with water, and the importance of scenarios involving consequences for the environment, mainly due to the contamination of floodwater by the hazardous substances released. The analysis of process equipment damage modes allowed the identification of the expected release extents caused by different water impact types during floods. The results were used to generate substance-specific event trees for the quantitative assessment of the consequences of accidents triggered by floods.

  10. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    SciTech Connect

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-11-22

Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in an explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by such events. Two calculations are described: the first estimates the energy released during a fireball event, and the second estimates the pressure inside a capacitor module during a capacitor explosion. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of such events in NIF, capacitor module failure probabilities were also calculated. The paper concludes with estimates of the probabilities of single-module and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility.

  11. Using the DOE Knowledge Base for Special Event Analysis

    SciTech Connect

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA) and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, two basic types of data must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  12. Time-quefrency analysis of overlapping similar microseismic events

    NASA Astrophysics Data System (ADS)

    Nagano, Koji

    2016-05-01

In this paper, I describe a new technique to determine the interval between P-waves in similar, overlapping microseismic events. Similar microseismic events that occur with overlapping waveforms are here called 'proximate microseismic doublets': pairs of microseismic events in which the second event arrives before the attenuation of the first. Proximate microseismic doublets were discarded in previous studies because their usefulness had not been recognized. Analysis of similar events can yield the relative locations of their sources, and proximate microseismic doublets can provide particularly precise relative source locations because variation in the velocity structure has little influence on their relative travel times. Determining these relative source locations requires measuring the interval between the P-waves of the two events. Cepstrum analysis can provide this interval even though the second event overlaps the first. However, the cepstrum of a proximate microseismic doublet generally has two peaks, one representing the interval between the arrivals of the two P-waves and the other the interval between the arrivals of the two S-waves, so it is difficult to determine from the cepstrum alone which peak represents the P-wave interval. I therefore apply window functions in the cepstrum analysis to isolate the first and second P-waves and to suppress the second S-wave. I vary the length of the window function, calculate the cepstrum for each window length, and represent the result as a three-dimensional contour plot of length-quefrency-cepstrum data. The contour plot allows me to identify the cepstrum peak that represents the P-wave interval; the precise quefrency can then be determined from a two-dimensional quefrency-cepstrum graph, provided that the window length is appropriately chosen. I have used both synthetic and field data to demonstrate that this
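The cepstral step at the heart of this approach can be sketched in a few lines: the real cepstrum of a pulse plus a delayed echo shows a peak at the echo delay, which is the quantity the method extracts for a doublet. This is a minimal illustration with a synthetic signal and a plain O(N^2) DFT, not the paper's windowed length-quefrency procedure.

```python
import math

def real_cepstrum(x):
    """Real cepstrum via a plain O(N^2) DFT: inverse DFT of log|DFT(x)|."""
    n = len(x)
    logmag = []
    for k in range(n):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        logmag.append(math.log(math.hypot(re, im) + 1e-12))
    # Inverse DFT of the (real, even) log-magnitude spectrum
    return [sum(logmag[k] * math.cos(2 * math.pi * k * q / n) for k in range(n)) / n
            for q in range(n)]

# Synthetic "doublet": a short decaying pulse plus a half-amplitude copy
# arriving 20 samples later (hypothetical values, for illustration only)
n, delay = 256, 20
pulse = [1.0, 0.8, 0.6, 0.4, 0.2]
x = [0.0] * n
for i, p in enumerate(pulse):
    x[i] += p
    x[i + delay] += 0.5 * p

c = real_cepstrum(x)
# The echo delay appears as a cepstral peak; search away from the
# low-quefrency region, which is dominated by the pulse shape itself
peak = max(range(10, n // 2), key=lambda q: c[q])
```

With overlapping real events the cepstrum has both a P-P and an S-S peak, which is exactly the ambiguity the paper's variable-length windowing resolves.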

  13. Drug Discrimination and the Analysis of Private Events

    PubMed Central

    Kangas, Brian D.; Maguire, David R.

    2016-01-01

    A defining feature of radical behaviorism is the explicit inclusion of private events as material phenomena within a science of behavior. Surprisingly, however, despite much theorizing, there is a notable paucity within behavior analysis of controlled experimentation and analysis of private events, especially in nonhuman animals. One technique that is amenable to the study of private events is drug discrimination. For over 40 years, drug discrimination procedures have been an incredibly effective tool providing a wealth of in vivo pharmacological information about drugs including receptor selectivity, potency, and efficacy. In addition, this procedure has provided important preclinical indications of abuse liability. However, despite its prowess as a pharmacologic tool, or perhaps because of it, empirical investigation of its parameters, procedural elements, and variants is not currently an active research domain. This review highlights the drug discrimination procedure as a powerful means to systematically investigate private events by using drugs as interoceptive stimuli. In addition to the opportunity to study privacy, empirical evaluation of the drug discrimination procedure will likely inform and improve the standard practice for future endeavors in basic and clinical pharmacology. PMID:27928551

  14. An analysis of selected atmospheric icing events on test cables

    SciTech Connect

    Druez, J.; McComber, P.; Laflamme, J.

    1996-12-01

In cold countries, the design of transmission lines and communication networks requires knowledge of ice loads on conductors. Atmospheric icing is a stochastic phenomenon, and probabilistic design is therefore increasingly used for structure icing analysis. For strength and reliability assessments, a database on atmospheric icing is needed to characterize the distributions of ice load and the corresponding meteorological parameters. A test site where icing is frequent is used to obtain field data on atmospheric icing. This test site is located on Mt. Valin, near Chicoutimi, Quebec, Canada. The experimental installation is mainly composed of various instrumented but non-energized test cables, meteorological instruments, a data acquisition system, and a video recorder. Several types of icing events can produce large ice accretions dangerous for land-based structures: rime due to in-cloud icing, glaze caused by freezing rain, wet snow, and mixtures of these types of ice. These icing events have very different characteristics and must be distinguished, before statistical analysis, in a database on atmospheric icing. This is done by comparing data from a precipitation gauge, an icing rate meter, and a temperature sensor. An analysis of selected icing periods recorded on the cables of two perpendicular test lines during the 1992-1993 winter season is presented. Only significant icing events have been considered. A comparative analysis of the ice load on the four test cables is drawn from the data, and typical accretion and shedding parameters are calculated separately for icing events related to in-cloud icing and to precipitation icing.

  15. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  16. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

In this paper, barriers and human failure events (HFEs) for human reliability analysis (HRA) are identified and described. The barriers, called target systems, are identified from risk-significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  17. Topological Analysis of Emerging Bipole Clusters Producing Violent Solar Events

    NASA Astrophysics Data System (ADS)

    Mandrini, C. H.; Schmieder, B.; Démoulin, P.; Guo, Y.; Cristiani, G. D.

    2014-06-01

During the rising phase of Solar Cycle 24, tremendous activity occurred on the Sun, with rapid and compact emergence of magnetic flux leading to bursts of flares (C to M and even X-class). We investigate the violent events occurring in the cluster of two active regions (ARs), NOAA numbers 11121 and 11123, observed in November 2010 with instruments onboard the Solar Dynamics Observatory and from Earth. Within one day the total magnetic flux increased by 70% with the emergence of new groups of bipoles in AR 11123. From all the events on 11 November, we study, in particular, the ones starting at around 07:16 UT in GOES soft X-ray data and the brightenings preceding them. A magnetic-field topological analysis indicates the presence of null points, associated separatrices, and quasi-separatrix layers (QSLs) where magnetic reconnection is prone to occur. The presence of null points is confirmed by both a linear and a non-linear force-free magnetic-field model. Their locations and general characteristics are similar in both modelling approaches, which supports their robustness. However, in order to explain the full extension of the analysed event brightenings, which are not restricted to the photospheric traces of the null separatrices, we compute the locations of QSLs. Based on this more complete topological analysis, we propose a scenario to explain the origin of a low-energy event preceding a filament eruption, which is accompanied by a two-ribbon flare, and a consecutive confined flare in AR 11123. The results of our topology computation can also explain the locations of flare ribbons in two other events, one preceding and one following the ones at 07:16 UT. Finally, this study provides further examples where flare-ribbon locations can be explained when compared to QSLs and only partially when using separatrices.

  18. Velocity analysis with local event slopes related probability density function

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Lu, Wenkai; Zhang, Yingqiang

    2015-12-01

A macro velocity model plays a key role in seismic imaging and inversion. The performance of traditional velocity analysis methods is degraded by multiples and amplitude-versus-offset (AVO) anomalies. Local event slopes, which carry the subsurface velocity information, have been widely used in common time-domain seismic processing, imaging and velocity estimation. In this paper, we propose a method for velocity analysis using a probability density function (PDF) related to local event slopes. We first estimate local event slopes from phase information in the Fourier domain, applying an adaptive filter to improve the performance of the slope estimator in low signal-to-noise ratio (SNR) situations. Second, the PDF is approximated with a histogram function related to attributes derived from local event slopes. As a graphical representation of the data distribution, the histogram function can be computed efficiently. By locating the ray path of the first arrival on the semblance image under a straight-ray-segment assumption, automatic velocity picking is carried out to establish the velocity model. Unlike local-event-slope-based velocity estimation strategies such as averaging filters and image warping, the proposed method does not assume that the errors of mapped velocity values are symmetrically distributed or that the variation of amplitude along the offset is slight. An extension of the method to prestack time-domain migration velocity estimation is also given. With synthetic and field examples, we demonstrate that our method can achieve high resolution, even in the presence of multiples, strong amplitude variations and polarity reversals.

  19. Analysis of Neuropsychiatric Adverse Events in Patients Treated with Oseltamivir in Spontaneous Adverse Event Reports.

    PubMed

    Ueda, Natsumi; Umetsu, Ryogo; Abe, Junko; Kato, Yamato; Nakayama, Yoko; Kato, Zenichiro; Kinosada, Yasutomi; Nakamura, Mitsuhiro

    2015-01-01

There have been concerns that oseltamivir causes neuropsychiatric adverse events (NPAEs). We analyzed the association of age and gender with NPAEs in patients treated with oseltamivir using a logistic regression model, with NPAE data obtained from the U.S. Food and Drug Administration Adverse Event Reporting System (2004 to 2013). The lower limit of the 95% confidence interval (CI) of the reporting odds ratio (ROR) for "abnormal behavior" in Japan, Singapore, and Taiwan was ≥1. The effects of the interaction terms for oseltamivir in male patients aged 10-19 years were statistically significant; the adjusted ROR of "abnormal behavior" was 96.4 (95% CI, 77.5-119.9) in male patients aged 10-19 years treated with oseltamivir. In female patients, the likelihood ratio test for "abnormal behavior" was not statistically significant. The adjusted NPAE RORs were increased in male and female patients under the age of 20 years. Oseltamivir use could be associated with "abnormal behavior" in males aged 10-19 years. Given the causality constraints of the current analysis, further epidemiological studies are recommended.
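A reporting odds ratio of the kind used in this study can be computed from a 2x2 disproportionality table. The sketch below uses hypothetical counts, not the FAERS data, and the usual log-normal approximation for the 95% CI; the "signal" criterion mirrors the abstract's lower-CI-bound ≥ 1 rule.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR for a drug/event 2x2 table:
        a: target drug, target event    b: target drug, other events
        c: other drugs, target event    d: other drugs, other events
    Returns (ROR, 95% CI lower bound, upper bound) using the
    log-normal approximation for the standard error."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

# Hypothetical report counts, not the paper's data
ror, lo, hi = reporting_odds_ratio(40, 960, 200, 48800)
signal = lo >= 1.0   # disproportionality signal if the CI excludes 1 from below
```

Note this is the crude ROR; the paper additionally adjusts for age, gender, and their interactions in a logistic regression.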

  20. Performance Analysis: Work Control Events Identified January - August 2010

    SciTech Connect

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased. The frequency of reporting in each of the significance categories has not changed in 2010 compared with the previous four years: there is no trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. The frequency of events, 42 reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences were reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences were reported as "management concerns" or "near misses," indicating that LLNL is now reporting fewer such occurrences compared with the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun.
In 2009

  1. Advanced analysis and event reconstruction for the CTA Observatory

    NASA Astrophysics Data System (ADS)

    Becherini, Y.; Khélifi, B.; Pita, S.; Punch, M.; CTA Consortium

    2012-12-01

The planned Cherenkov Telescope Array (CTA) is a future observatory for very-high-energy (VHE) gamma-ray astronomy composed of one site per hemisphere [1]. It aims at 10 times better sensitivity, better angular resolution and wider energy coverage than current installations such as H.E.S.S., MAGIC and VERITAS. In order to achieve this level of performance, both the design of the telescopes and the analysis algorithms are being studied and optimized within the CTA Monte-Carlo working group. Here, we present ongoing work on the data analysis for both event reconstruction (energy, direction) and gamma/hadron separation, carried out within the HAP (H.E.S.S. Analysis Package) software framework of the H.E.S.S. collaboration for this initial study. The event reconstruction uses both Hillas-parameter-based algorithms and an improved version of the 3D-Model algorithm [2]. For gamma/hadron discrimination, original and robust discriminant variables are treated with Boosted Decision Trees (BDTs) in the TMVA [3] (Toolkit for Multivariate Data Analysis) framework. With this advanced analysis, known as Paris-MVA [4], the sensitivity is improved by a factor of ~2 in the core energy range of CTA relative to the standard analyses. We present the algorithms used for reconstruction and discrimination together with the resulting performance characteristics; confidence in the method is good, since it has already been successfully applied to H.E.S.S. data.

  2. Mixed-effects Poisson regression analysis of adverse event reports

    PubMed Central

    Gibbons, Robert D.; Segawa, Eisuke; Karabatsos, George; Amatya, Anup K.; Bhaumik, Dulal K.; Brown, C. Hendricks; Kapur, Kush; Marcus, Sue M.; Hur, Kwan; Mann, J. John

    2008-01-01

A new statistical methodology is developed for the analysis of spontaneous adverse event (AE) reports from post-marketing drug surveillance data. The method involves both empirical Bayes (EB) and fully Bayes estimation of rate multipliers for each drug within a class of drugs, for a particular AE, based on a mixed-effects Poisson regression model. Both parametric and semiparametric models for the random-effect distribution are examined. The method is applied to data from the Food and Drug Administration (FDA)'s Adverse Event Reporting System (AERS) on the relationship between antidepressants and suicide. We obtain point estimates and 95 per cent confidence (posterior) intervals for the rate multiplier for each drug, which can be used to determine whether a particular drug has an increased risk of association with a particular AE (e.g. suicide). Confidence (posterior) intervals that do not include 1.0 provide evidence for either a significant protective or a significant harmful association between the drug and the adverse effect. We also examine EB, parametric Bayes, and semiparametric Bayes estimators of the rate multipliers and the associated confidence (posterior) intervals. Our analysis of the FDA AERS data revealed that newer antidepressants are associated with lower rates of suicide adverse event reports than older antidepressants. We recommend improvements to the existing AERS system, which are likely to improve its public health value as an early warning system. PMID:18404622
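A much-simplified gamma-Poisson empirical Bayes shrinkage step (not the paper's full mixed-effects Poisson regression) can illustrate how per-drug rate estimates are pulled toward a class-level mean. The hyperparameters and counts below are hypothetical; in practice the prior would be estimated from the data, e.g. by marginal maximum likelihood.

```python
# Minimal gamma-Poisson empirical Bayes sketch: drug i has y_i AE reports
# over e_i exposure units; rates are assumed Gamma(shape=a, rate=b) a priori,
# so the posterior-mean rate is (a + y_i) / (b + e_i), shrunk toward a/b.
def eb_shrunk_rates(counts, exposures, a=2.0, b=2.0):
    # a, b are assumed (hypothetical) hyperparameters with prior mean a/b = 1
    return [(a + y) / (b + e) for y, e in zip(counts, exposures)]

counts = [0, 3, 12]          # hypothetical AE report counts per drug
exposures = [1.0, 2.0, 4.0]  # hypothetical exposure (e.g. scaled prescriptions)
raw = [y / e for y, e in zip(counts, exposures)]
shrunk = eb_shrunk_rates(counts, exposures)
```

Each shrunk rate lies between the raw rate and the prior mean, with sparse-data drugs shrunk the most, which is the qualitative behavior the paper's rate-multiplier estimates share.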

  3. A Dendrochronological Analysis of Mississippi River Flood Events

    NASA Astrophysics Data System (ADS)

    Therrell, M. D.; Bialecki, M. B.; Peters, C.

    2012-12-01

We used a novel tree-ring record of anatomically anomalous "flood rings" preserved in oak (Quercus sp.) trees growing downstream of the Mississippi and Ohio River confluence to identify spring (MAM) flood events on the lower Mississippi River from C.E. 1694-2009. Our chronology includes virtually all of the observed high-magnitude spring floods of the 20th century, as well as similar flood events in prior centuries, occurring on the Mississippi River adjacent to the Birds Point-New Madrid Floodway. A response index analysis indicates that over half of the floods identified caused anatomical injury to well over 50% of the sampled trees, and many of the greatest flood events are recorded by more than 80% of the trees at the site, including 100% of the trees in the great flood of 1927. Twenty-five of the 40 floods identified as flood rings in the tree-ring record occur during the instrumental observation period at New Madrid, Missouri (1879-2009), and comparison of the response index with average daily river stage height values indicates that the flood ring record can explain significant portions of the variance in both stage height (30%) and number of days in flood (40%) during spring flood events. The flood ring record also suggests that high-magnitude spring flooding is episodic and linked to basin-scale pluvial events driven by decadal-scale variability of the Pacific/North American pattern (PNA). This relationship suggests that the tree-ring record of flooding may also be used as a proxy record of atmospheric variability related to the PNA and related large-scale forcing.

  4. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram-of-optical-flow-orientation descriptor and a classification method. The descriptor, which captures the movement information of the global video frame or of the foreground frame, is described in detail. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. Differences among the abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
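The descriptor stage can be sketched as follows. This is an assumed minimal form of a histogram of optical-flow orientations, not the paper's exact implementation, and the flow vectors are hypothetical; the normalization choice (descriptor sums to 1) is also an assumption.

```python
import math

def orientation_histogram(flow, bins=8):
    """Histogram of optical-flow orientations for a list of (u, v) vectors.
    Each vector's angle is binned over [0, 2*pi); counts are normalized so
    the descriptor sums to 1 (a common choice, assumed here)."""
    hist = [0.0] * bins
    for u, v in flow:
        angle = math.atan2(v, u) % (2 * math.pi)
        hist[int(angle / (2 * math.pi) * bins) % bins] += 1.0
    total = sum(hist)
    return [h / total for h in hist] if total else hist

# Hypothetical flow field: mostly rightward motion plus one upward vector
flow = [(1.0, 0.0), (0.9, 0.1), (1.1, -0.1), (0.0, 1.0)]
h = orientation_histogram(flow)
```

In the full pipeline, descriptors like `h` computed on normal frames would train the one-class SVM (after kernel PCA), and frames whose descriptors fall outside the learned region would be flagged as abnormal.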

  5. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    Dana L. Kelly; Dale M. Rasmuson

    2008-09-01

This paper describes the approach taken by the U.S. Nuclear Regulatory Commission to the treatment of common-cause failure in probabilistic risk assessment of operational events. The approach is based upon the Basic Parameter Model for common-cause failure, and examples are illustrated using the alpha-factor parameterization, the approach adopted by the NRC in its Standardized Plant Analysis Risk (SPAR) models. The cases of a failed component (with and without shared common-cause failure potential) and of a component unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g., failure to start and failure to run) is a new feature of this paper. These methods are being applied by the NRC in assessing the risk significance of operational events for the Significance Determination Process (SDP) and the Accident Sequence Precursor (ASP) program.
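The alpha-factor parameterization mentioned above maps the fractions of failure events involving k components onto common-cause failure probabilities. A minimal sketch follows, assuming the standard non-staggered-testing form of the model and hypothetical alpha values; it is not the NRC's SPAR implementation.

```python
from math import comb

def alpha_factor_ccf(alphas, q_total):
    """Common-cause failure probabilities Q_k for a group of m components
    under the alpha-factor model (non-staggered-testing form, assumed here):
        Q_k = k / C(m-1, k-1) * (alpha_k / alpha_t) * Q_t,
        alpha_t = sum over k of k * alpha_k,
    where alphas[k-1] is the fraction of failure events that involve
    exactly k components and q_total is the total failure probability Q_t."""
    m = len(alphas)
    alpha_t = sum(k * a for k, a in enumerate(alphas, start=1))
    return [k / comb(m - 1, k - 1) * (alphas[k - 1] / alpha_t) * q_total
            for k in range(1, m + 1)]

# Hypothetical alpha factors for a 3-component group (they sum to 1)
alphas = [0.95, 0.03, 0.02]
q = alpha_factor_ccf(alphas, q_total=1e-3)   # [Q_1, Q_2, Q_3]
```

Even a small alpha_3 can make the triple-failure term Q_3 comparable to or larger than Q_2, which is why shared common-cause potential matters in event assessment.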

  6. Event-scale power law recession analysis: quantifying methodological uncertainty

    NASA Astrophysics Data System (ADS)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship and the drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. The results are relevant to watersheds that are relatively steep, forested, and rain-dominated; the highly seasonal mediterranean climate of northern California and southern Oregon ensures that the study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship
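One common way to fit the event-scale power law recession model -dQ/dt = a*Q^b, and a fitting choice whose sensitivity this kind of study examines, is ordinary least squares in log-log space with dQ/dt approximated by first differences. The sketch below uses a synthetic recession, not the study's catchment data.

```python
import math

def fit_recession(q):
    """Fit -dQ/dt = a * Q^b to one recession event by least squares in
    log-log space, with dQ/dt approximated by first differences of daily
    flow. Returns (a, b). This is one of several possible fitting choices."""
    x, y = [], []
    for i in range(len(q) - 1):
        dqdt = q[i + 1] - q[i]
        if dqdt < 0:                      # keep strictly receding steps
            x.append(math.log(q[i]))
            y.append(math.log(-dqdt))
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    a = math.exp(ybar - b * xbar)
    return a, b

# Synthetic recession with known exponent: Q(t) = 1/(1 + 0.1 t) solves
# dQ/dt = -0.1 * Q^2, so the fit should recover b near 2 and a near 0.1
q_series = [1.0 / (1.0 + 0.1 * t) for t in range(30)]
a_hat, b_hat = fit_recession(q_series)
```

The small bias of `b_hat` away from 2 comes from the finite-difference approximation of dQ/dt, which is itself one of the methodological decisions whose impact the paper quantifies.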

  7. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared with traditional hosting, the flexibility of cloud computing comes at the cost of less predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and our findings on the appropriateness of cloud computing for various applications.
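The flavor of such a discrete event simulation can be conveyed by a toy M/M/c queue, in which exponentially distributed service requests contend for a fixed pool of servers. This is a generic sketch under assumed rates, not the authors' two-part model.

```python
import heapq
import random

def mmc_sim(arrival_rate, service_rate, servers, n_jobs, seed=1):
    """Minimal discrete-event simulation of an M/M/c queue: a stand-in for
    service requests contending for a fixed pool of (cloud) servers.
    Returns the mean time a request spends in the system (wait + service)."""
    rng = random.Random(seed)
    free = [0.0] * servers          # earliest time each server becomes free
    heapq.heapify(free)
    total_time, arrival = 0.0, 0.0
    for _ in range(n_jobs):
        arrival += rng.expovariate(arrival_rate)
        start = max(arrival, heapq.heappop(free))  # wait if all servers busy
        finish = start + rng.expovariate(service_rate)
        heapq.heappush(free, finish)
        total_time += finish - arrival
    return total_time / n_jobs

# Lightly vs. heavily loaded pool: more contention means longer sojourn times
light = mmc_sim(arrival_rate=1.0, service_rate=1.0, servers=4, n_jobs=5000)
heavy = mmc_sim(arrival_rate=3.5, service_rate=1.0, servers=4, n_jobs=5000)
```

Sweeping the arrival rate or server count in a model like this is exactly the kind of what-if provisioning question a DES framework answers that static analysis cannot.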

  8. Computational particle physics for event generators and data analysis

    NASA Astrophysics Data System (ADS)

    Perret-Gallix, Denis

    2013-08-01

High-energy physics data analysis relies heavily on the comparison between experimental and simulated data, as stressed lately by the Higgs search at the LHC and the recent identification of a Higgs-like new boson. The first link in the full simulation chain is event generation, both for background and for expected signals. Nowadays event generators are based on the automatic computation of the matrix element or amplitude for each process of interest. Moreover, recent analysis techniques based on the matrix element likelihood method assign probabilities for every event to belong to any of a given set of possible processes. This method, originally used for the top mass measurement, although computing-intensive, has shown its efficiency at the LHC in extracting the new boson signal from the background. Serving both needs, the automatic calculation of matrix elements is therefore more than ever of prime importance for particle physics. Initiated in the 1980s, the techniques have matured for the lowest-order (tree-level) calculations, but become complex and CPU-time-consuming when higher-order calculations involving loop diagrams are necessary, as for QCD processes at the LHC. New calculation techniques for next-to-leading order (NLO) have surfaced, making possible the generation of processes with many final-state particles (up to 6). While NLO calculations are in many cases under control, although not yet fully automatic, even higher-precision calculations involving processes at two loops or more remain a big challenge. After a short introduction to particle physics and to the related theoretical framework, we review some of the computing techniques that have been developed to make these calculations automatic. The main available packages and some of the most important applications for simulation and data analysis, in particular at the LHC, are also summarized (see CCP2012 slides [1]).

  9. Analysis of a new extreme precipitation event in Reykjavik

    NASA Astrophysics Data System (ADS)

    Ólafsson, Haraldur; Ágústsson, Hálfdán

    2013-04-01

    On 28-29 December 2012 a new precipitation record of 70.4 mm in 24 hours was set in Reykjavik, Iceland. This extreme event is explored by means of observations and of numerical simulations with different models and different initialization times. Several key factors in creating the precipitation extreme are identified: a) a slowly moving upper-level low with high values of vorticity and vorticity advection; b) a south-to-north low-level temperature gradient set up by cold advection in the wake of a surface low and warm advection in easterly flow over Iceland, enhanced by the topography (foehn). This temperature gradient leads to strong vertical wind shear, with very weak winds at the surface but up to 40 m/s from the SE in the upper troposphere. As there are no strong winds at low levels, there is hardly any precipitation shadow in Reykjavik, downstream of the Reykjanes mountains. In terms of considerable, but not extreme, precipitation, the event was in general reasonably well forecast 24 to 48 hours ahead. The above analysis leads to a method to identify extreme precipitation of this kind in large-scale models. The method will be used to investigate the frequency of similar events in future climate scenarios.

  10. Bisphosphonates and Risk of Cardiovascular Events: A Meta-Analysis

    PubMed Central

    Kim, Dae Hyun; Rogers, James R.; Fulchino, Lisa A.; Kim, Caroline A.; Solomon, Daniel H.; Kim, Seoyoung C.

    2015-01-01

    Background and Objectives: Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. Methods: A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials longer than 6 months in duration that reported CV events. Absolute risks and Mantel-Haenszel fixed-effects odds ratios (ORs) with 95% confidence intervals (CIs) for total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate type, and route of administration were performed. Results: Absolute risks over 25–36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months had no significant effect on total CV events (14 trials; OR [95% CI]: 0.98 [0.84–1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92–1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69–1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82–1.19]; I2 = 5.8%), or CV death (14 trials; 0.88 [0.72–1.07]; I2 = 0.0%), with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96–1.61]; I2 = 0.0%) but not for oral bisphosphonates (26 trials; 1.02 [0.83–1.24]; I2 = 0.0%). The CV effects did not vary by subgroup or study quality. Conclusions: Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large
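    The pooled fixed-effects odds ratios reported above come from Mantel-Haenszel weighting of per-trial 2x2 tables. A minimal sketch of that pooling, using made-up trial counts rather than the data from this meta-analysis:

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel fixed-effects pooled odds ratio.

    Each table is (events_tx, n_tx, events_ctl, n_ctl) for one trial.
    """
    num = den = 0.0
    for a, n1, c, n0 in tables:
        b = n1 - a              # non-events in the treated arm
        d = n0 - c              # non-events in the control arm
        n = n1 + n0             # trial size
        num += a * d / n
        den += b * c / n
    return num / den

# Illustrative (made-up) counts: events/total in treated and control arms.
trials = [(13, 200, 12, 198), (7, 150, 9, 149), (21, 400, 20, 395)]
pooled = mantel_haenszel_or(trials)   # close to 1, i.e. no pooled effect
```

    For a single table this reduces to the ordinary odds ratio ad/bc; the Mantel-Haenszel weights a*d/n and b*c/n are what make the pooled estimate stable when per-trial event counts are small, as here.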

  11. An analysis of three nuclear events in P-Tunnel

    SciTech Connect

    Fourney, W.L.; Dick, R.D.; Taylor, S.R.; Weaver, T.A.

    1994-05-03

    This report examines experimental results obtained from three P Tunnel events -- Mission Cyber, Disko Elm, and Distant Zenith. The objective of the study was to determine if there were any differences in the explosive source coupling for the three events. It was felt that Mission Cyber might not have coupled well because the ground motions recorded for that event were much lower than expected based on experience from N Tunnel. Detailed examination of the physical and chemical properties of the tuff in the vicinity of each explosion indicated only minor differences. In general, the core samples are strong and competent out to at least 60 m from each working point. Qualitative measures of core sample strength indicate that the strength of the tuff near Mission Cyber may be greater than indicated by results of static testing. Slight differences in mineralogic content and saturation of the Mission Cyber tuff were noted relative to the other two tests, but probably would not result in large differences in ground motions. Examination of scaled free-field stress and acceleration records collected by Sandia National Laboratory (SNL) indicated that Disko Elm showed the least scatter and Distant Zenith the most scatter. Mission Cyber measurements tend to lie slightly below those of Distant Zenith, but still within two standard deviations. Analysis of regional seismic data from networks operated by Lawrence Livermore National Laboratory (LLNL) and SNL also show no evidence of Mission Cyber coupling low relative to the other two events. The overall conclusion drawn from the study is that there were no basic differences in the way that the explosions coupled to the rock.

  12. Multivariate cluster analysis of forest fire events in Portugal

    NASA Astrophysics Data System (ADS)

    Tonini, Marj; Pereira, Mario; Vega Orozco, Carmen; Parente, Joana

    2015-04-01

    Portugal is one of the major fire-prone European countries, mainly due to its favourable climatic, topographic and vegetation conditions. Compared to the other Mediterranean countries, the number of events registered there from 1980 to the present is the highest; likewise, with respect to burnt area, Portugal is the third most affected country. Portuguese mapped burnt areas are available from the website of the Institute for the Conservation of Nature and Forests (ICNF). This official geodatabase is the result of satellite measurements starting from the year 1990. The spatial information, delivered in shapefile format, provides a detailed description of the shape and size of the area burnt by each fire, while the date/time information related to fire ignition is restricted to the year of occurrence. In statistical terms, wildfires can be treated as a stochastic point process, where events are analysed as a set of geographical coordinates corresponding, for example, to the centroid of each burnt area. Analysing the spatio-temporal pattern of a stochastic point process, including cluster analysis, is a basic procedure for discovering predisposing factors as well as for prevention and forecasting purposes. Such studies are primarily focused on investigating the spatial cluster behaviour of environmental data sequences and/or mapping their distribution at different times. To include both dimensions (space and time), a comprehensive spatio-temporal analysis is needed. In the present study the authors attempt to verify whether, in the case of wildfires in Portugal, space and time act independently or whether, conversely, neighbouring events are also closer in time. We present an application of the spatio-temporal K-function to a long dataset (1990-2012) of mapped burnt areas. Moreover, the multivariate K-function allowed us to check for a possible difference in distribution between small and large fires. The final objective is to elaborate a 3D
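    The spatio-temporal K-function counts pairs of events that are close in both space and time. A naive sketch of the estimator (no edge correction, hypothetical fire centroids) under those simplifying assumptions:

```python
from math import hypot

def st_k(events, r, tau, area, period):
    """Naive space-time K-function estimate for events given as (x, y, t).

    Counts ordered pairs within distance r and time lag tau, scaled by
    the study area, the observation period and the number of events.
    """
    n = len(events)
    count = sum(
        1
        for i, (xi, yi, ti) in enumerate(events)
        for j, (xj, yj, tj) in enumerate(events)
        if i != j and hypot(xi - xj, yi - yj) <= r and abs(ti - tj) <= tau
    )
    return area * period * count / (n * (n - 1))

# Three hypothetical fire centroids (x km, y km, year index).
fires = [(0, 0, 0), (1, 0, 0), (10, 10, 5)]
k = st_k(fires, r=2, tau=1, area=100, period=10)
```

    Space-time interaction is then diagnosed by comparing K(r, τ) against the product of the purely spatial and purely temporal K-functions: an excess indicates that events close in space are also closer in time than independence would predict.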

  13. Analytic Perturbation Analysis of Discrete Event Dynamic Systems

    SciTech Connect

    Uryasev, S.

    1994-09-01

    This paper considers a new Analytic Perturbation Analysis (APA) approach for Discrete Event Dynamic Systems (DEDS) with sample-path functions that are discontinuous with respect to control parameters. The performance functions for DEDS are usually formulated as mathematical expectations, which can be calculated only numerically. APA is based on new analytic formulas for the gradients of expectations of indicator functions; therefore, it is called analytic perturbation analysis. The gradient of the performance function may not coincide with the expectation of the gradient of the sample-path function (i.e., the interchange formula for the gradient and expectation signs may not be valid). Estimates of gradients can be obtained with one simulation run of the model.

  14. Systematic Analysis of Adverse Event Reports for Sex Differences in Adverse Drug Events

    PubMed Central

    Yu, Yue; Chen, Jun; Li, Dingcheng; Wang, Liwei; Wang, Wei; Liu, Hongfang

    2016-01-01

    Increasing evidence has shown that sex differences exist in adverse drug events (ADEs). Identifying such sex differences could reduce patients' experience of ADEs and be conducive to the development of personalized medicine. In this study, we analyzed a normalized version of the US Food and Drug Administration Adverse Event Reporting System (FAERS). A chi-squared test was conducted to discover which treatment regimens or drugs had sex differences in adverse events. Moreover, the reporting odds ratio (ROR) and P value were calculated to quantify the signals of sex differences for specific drug-event combinations. Logistic regression was applied to remove the confounding effect of the baseline sex difference of the events. Among the 668 drugs in the 20 most frequent treatment regimens in the United States, we detected 307 drugs with sex differences in ADEs. In addition, we identified 736 unique drug-event combinations with significant sex differences; after removing the confounding effect of the baseline sex difference of the events, 266 combinations remained. Drug labels or previous studies verify some of these findings, while others warrant further investigation. PMID:27102014
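    The reporting odds ratio used above for signal quantification is computed from a 2x2 table of spontaneous-report counts. A minimal sketch with illustrative counts (not real FAERS data; the table layout here is one common convention, labeled in the docstring as an assumption):

```python
import math

def ror(a, b, c, d):
    """Reporting odds ratio with a 95% CI from a 2x2 report-count table.

    Assumed layout: a/b = reports with/without the event for the group of
    interest; c/d = reports with/without the event for the comparator.
    """
    est = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(ROR)
    lo = math.exp(math.log(est) - 1.96 * se)
    hi = math.exp(math.log(est) + 1.96 * se)
    return est, lo, hi

# Made-up counts, not real FAERS data.
est, lo, hi = ror(40, 960, 20, 980)
signal = lo > 1.0   # a common criterion: lower CI bound above 1
```

    Because the CI is built on the log scale, small cell counts widen it sharply, which is why disproportionality signals from few reports are treated cautiously.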

  15. Collective analysis of ORPS-reportable electrical events (June, 2005-August 2009)

    SciTech Connect

    Henins, Rita J; Hakonson-Hayes, Audrey C

    2010-01-01

    The analysis of LANL electrical events between June 30, 2005 and August 31, 2009 provides data that indicate some potential trends regarding ISM failure modes, activity types associated with reportable electrical events, and ORPS causal codes. This report discusses the identified potential trends for Shock events and compares attributes of the Shock events against Other Electrical events and overall ORPS-reportable events during the same time frame.

  16. Whole-Genome Analysis of Gene Conversion Events

    NASA Astrophysics Data System (ADS)

    Hsu, Chih-Hao; Zhang, Yu; Hardison, Ross; Miller, Webb

    Gene conversion events are often overlooked in analyses of genome evolution. In a conversion event, an interval of DNA sequence (not necessarily containing a gene) overwrites a highly similar sequence. The event creates relationships among genomic intervals that can confound attempts to identify orthologs and to transfer functional annotation between genomes. Here we examine 1,112,202 paralogous pairs of human genomic intervals, and detect conversion events in about 13.5% of them. Properties of the putative gene conversions are analyzed, such as the lengths of the paralogous pairs and the spacing between their sources and targets. Our approach is illustrated using conversion events in the beta-globin gene cluster.

  17. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    SciTech Connect

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error agrees better with the physical basis: it correctly partitions the error to include model error as a component of variance, and it correctly reduces the station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope ({beta} = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
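    Fisher's and Tippett's tests combine per-phenomenology p-values into a single screening decision. A stdlib-only sketch (Fisher's statistic always has an even number of chi-square degrees of freedom, for which the survival function has a closed form, so no statistics library is needed; the p-values below are hypothetical):

```python
import math

def fisher_combined_p(pvals):
    """Fisher's method: X = -2*sum(ln p_i) ~ chi-square with 2k df.

    Uses the closed-form chi-square survival function for even df:
    P(X > x) = exp(-x/2) * sum_{i<k} (x/2)^i / i!.
    """
    k = len(pvals)
    half = -sum(math.log(p) for p in pvals)        # X/2
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))

def tippett_combined_p(pvals):
    """Tippett's method: decision driven by the smallest p-value."""
    return 1.0 - (1.0 - min(pvals)) ** len(pvals)

# Hypothetical per-phenomenology screening p-values (e.g. Ms:mb and depth).
combined = fisher_combined_p([0.1, 0.2])
```

    Fisher's method rewards consistent moderate evidence across phenomenologies, while Tippett's is driven by the single strongest one; which behaviour is preferable depends on how the screening criteria are framed.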

  18. Cluster analysis of intermediate deep events in the southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2015-04-01

    The Hellenic subduction zone (HSZ) is the seismically most active region in Europe, where the oceanic African lithosphere is subducting beneath the continental Aegean plate. Although there are numerous studies of seismicity in the HSZ, very few focus on the eastern HSZ and the Wadati-Benioff zone of the subducting slab in that part of the HSZ. To gain a better understanding of the geodynamic processes in the region, a dense local seismic network is required. From September 2005 to March 2007, the temporary seismic network EGELADOS was deployed covering the entire HSZ. It consisted of 56 onshore and 23 offshore broadband stations, with the addition of 19 stations from GEOFON, NOA and MedNet to complete the network. Here, we focus on a cluster of intermediate-depth seismicity recorded by the EGELADOS network within the subducting African slab in the region of the Nisyros volcano. The cluster consists of 159 events at 80 to 190 km depth with magnitudes between 0.2 and 4.1 that were located using the nonlinear location tool NonLinLoc. A double-difference earthquake relocation using the HypoDD software was performed with both manual readings of onset times and differential traveltimes obtained by separate cross-correlation of P- and S-waveforms. Single-event locations are compared to relative relocations. The event hypocenters fall into a thin zone close to the top of the slab, defining its geometry with an accuracy of a few kilometers. At intermediate depth the slab is dipping towards the NW at an angle of about 30°, i.e. steeper than in the western part of the HSZ. The edge of the slab is clearly defined by an abrupt disappearance of intermediate-depth seismicity towards the NE, found approximately beneath the Turkish coastline. Furthermore, results of a cluster analysis based on the cross-correlation of three-component waveforms are shown as a function of frequency, and the spatio-temporal migration of the seismic activity is analysed.

  19. The Tunguska event and Cheko lake origin: dendrochronological analysis

    NASA Astrophysics Data System (ADS)

    Rosanna, Fantucci; Romano, Serra; Gunther, Kletetschka; Mario, Di Martino

    2015-07-01

    Dendrochronological research was carried out on 23 tree samples (Larix sibirica and Picea obovata) collected during the 1999 expedition at two locations, close to the epicentre zone and near Cheko lake (N 60°57', E 101°51'). Basal Area Increment (BAI) analysis has shown a generally long growth suppression before 1908, the year of the Tunguska event (TE), followed by a sudden growth increase due to diminished competition after trees died in the event. In one group of trees, we detected a growth decrease for several years (due to damage to the trunk, branches and crown), followed by a growth increase during the following 4-14 years. We show that trees that germinated after the TE and grew in close proximity to Cheko lake (Cheko lake trees) behaved differently from trees growing further from Cheko lake, inside the forest (Forest trees). Cheko lake trees have shown a vigorous, continuous growth increase. Forest trees have shown vigorous growth during the first 10-30 years of age, followed by a period of suppressed growth, which we interpret as re-established competition with the surrounding trees. The Cheko lake pattern, however, is consistent with the formation of the lake at the time of the TE. This observation supports the hypothesis that Cheko lake was formed by a fragment originating during the TE, creating a small impact crater in the permafrost and soft alluvial deposits of the Kimku River plain. This is further supported by the fact that Cheko lake has an elliptical shape elongated towards the epicentre of the TE.
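    Basal Area Increment converts ring widths into annual cross-sectional area growth, which makes suppression and release episodes easier to see than raw ring widths. A minimal sketch of the conversion:

```python
import math

def basal_area_increment(ring_widths):
    """Annual basal area increments from a series of ring widths.

    BAI_t = pi * (r_t^2 - r_(t-1)^2), where r_t is the cumulative
    radius after year t; units follow the input (e.g. mm -> mm^2).
    """
    bai, radius = [], 0.0
    for width in ring_widths:
        outer = radius + width
        bai.append(math.pi * (outer ** 2 - radius ** 2))
        radius = outer
    return bai

# Constant ring widths still give an increasing BAI (area is added on an
# ever larger circumference), so a flat or falling BAI indicates real
# growth suppression rather than normal ageing.
bai = basal_area_increment([1.0, 1.0, 1.0])
```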

  20. Analysis and Simulations of Space Radiation Induced Single Event Transients

    NASA Astrophysics Data System (ADS)

    Perez, Reinaldo

    2016-05-01

    Spacecraft electronics are affected by the space radiation environment. Among the radiation effects that can affect spacecraft electronics are single event transients (SETs), and the space environment is responsible for many of the SETs that can upset the performance of spacecraft avionics hardware. In this paper we first explore the origins of single event transients and then the modeling of an SET in digital and analog circuits. The paper also addresses the crosstalk that can develop among digital circuits in the presence of an SET, and it ends with a brief discussion of SET hardening. The goal of the paper is to provide methodologies for assessing single event transients and their effects so that spacecraft avionics engineers can develop either hardware or software countermeasures in their designs.

  1. Photographic Analysis Technique for Assessing External Tank Foam Loss Events

    NASA Technical Reports Server (NTRS)

    Rieckhoff, T. J.; Covan, M.; OFarrell, J. M.

    2001-01-01

    A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.

  2. Teleseismic Events Analysis with AQDB and ITAB stations, Brazil

    NASA Astrophysics Data System (ADS)

    Felício, L. D.; Vasconcello, E.; Assumpção, M.; Rodrigues, F.; Facincani, E.; Dias, F.

    2013-05-01

    This work surveys seismic activity originating in the Andean region, at distances over 1500 km, recorded in 2012 by the Brazilian seismographic stations AQDB and ITAB. The stations are located in the cities of Aquidauana and Itajai, Brazil, at coordinates -20°48'S, -55°70'W and -27°24'S, -52°13'W, respectively. We determined the mb and Ms magnitudes, epicentral distances, and experimental and theoretical P-wave arrival times (using the IASP91 model). With the programs SAC (Seismic Analysis Code), TauP and SeisGram (Seismogram Viewer), it was possible to determine the magnitudes mentioned. We identified around twenty events for each station and correlated the calculated magnitudes with those published in the bulletin of the National Earthquake Information Center (NEIC). The linear regression shows that the mb and Ms magnitudes at the two stations are close to the values reported by the NEIC (correlations of 97.1% for mb and 96.5% for Ms). The P-wave arrival times at ITAB and AQDB show average residuals of 2.2 and 2.7 seconds, respectively; in other words, the difference between experimental and theoretical P-wave arrival times may be related to the position of each station and to the heterogeneity of the structure and composition of the rock massif in each region.

  3. Human Reliability Analysis for Small Modular Reactors

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2012-06-01

    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized to SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provide preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  4. Descriptive Analysis of Air Force Non-Fatal Suicide Events

    DTIC Science & Technology

    2006-07-01

    hospitalizations is therefore severely limited. Results: As described in Table 1, the Capture dataset contained 1089 NFSE (SESS events matched to SADR data within +/- 1, 2, or 3 days of the event date) and the Recapture dataset contained 1842 records (SADR/SIDR entries pulled by E codes E950-E959). Of the 1089 NFSE in the Capture dataset, 658 (60.4%) had a corresponding entry in SADR.

  5. Time to tenure in Spanish universities: an event history analysis.

    PubMed

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of the dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility.

  6. Time to Tenure in Spanish Universities: An Event History Analysis

    PubMed Central

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of the dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199
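    Event history analysis of time to tenure rests on survival estimators that handle censored spells (faculty not yet tenured when observed). A minimal Kaplan-Meier sketch with hypothetical durations, not the study's data:

```python
def kaplan_meier(times, observed):
    """Kaplan-Meier survivor curve for time-to-tenure spells.

    times: years from PhD to tenure (or to censoring);
    observed: True if tenure was observed, False if the spell is censored.
    Returns (time, survival probability) pairs at each event time.
    """
    data = sorted(zip(times, observed))
    n_at_risk, surv, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        events = at_t = 0
        while i < len(data) and data[i][0] == t:
            at_t += 1
            events += data[i][1]          # True counts as 1
            i += 1
        if events:
            surv *= 1.0 - events / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_t                 # censored spells leave the risk set
    return curve

# Hypothetical spells: tenure at 2 and 3 years, one censored at 3, one at 5.
curve = kaplan_meier([2, 3, 3, 5], [True, True, False, True])
```

    The covariate effects reported in the abstract would come from a regression model (e.g. a proportional hazards model) rather than from the raw survivor curve, but the curve is the natural descriptive starting point.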

  7. BAYESIAN ANALYSIS OF REPEATED EVENTS USING EVENT-DEPENDENT FRAILTY MODELS: AN APPLICATION TO BEHAVIORAL OBSERVATION DATA

    PubMed Central

    Snyder, James

    2009-01-01

    In social interaction studies, one commonly encounters repeated displays of behaviors along with their duration data. Statistical methods for the analysis of such data use either parametric (e.g., Weibull) or semiparametric (e.g., Cox) proportional hazard models, modified to include random effects (frailty) that account for the correlation of repeated occurrences of behaviors within a unit (dyad). However, dyad-specific random effects by themselves are not able to account for the ordering of event occurrences within dyads: the occurrence of a behavior can make further occurrences of the same behavior more or less likely during an interaction. This paper develops event-dependent random effects models for analyzing repeated behavior data using a Bayesian approach. The models are illustrated with a dataset relating to emotion regulation in families with children who have behavioral or emotional problems. PMID:20161593

  8. Regional Frequency Analysis of extreme rainfall events, Tuscany (Italy)

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Chiarello, V.; Rossi, G.

    2014-12-01

    The assessment of extreme hydrological events at sites with short time series, or where no data record exists, has mainly relied on regional models. Regional frequency analysis based on the index-variable procedure is implemented here to describe the annual maxima of rainfall depth of short durations in the Tuscany region. The Two-Component Extreme Value (TCEV) probability distribution is used within the procedure, with parameter estimation based on a three-level hierarchical approach. The methodology deals with the delineation of homogeneous regions, the identification of a robust regional frequency distribution, and the assessment of the scale factor, i.e. the index rainfall. The dataset includes the annual maxima of daily rainfall from 351 gauge stations with at least 30 years of records in the period 1916-2012, and the extreme rainfalls of short duration: 1 hour and 3, 6, 12 and 24 hours. Different subdivision hypotheses have been verified. A subdivision into four regions, coincident with four subregions and taking into account the orography and the geomorphological and climatic peculiarities of the Tuscany region, has been adopted. In particular, to test regional homogeneity, the cumulative frequency distributions of the observed skewness and variation coefficients of the recorded time series are compared with the theoretical frequency distribution obtained through a Monte Carlo technique. The related L-skewness and L-variation coefficients are also examined. The Student t-test and the Wilcoxon test for the mean, as well as the χ2 test, were also applied. Further tests of the subdivision hypotheses have been made through the discordancy D and heterogeneity H tests and the analysis of the observed and theoretical TCEV growth curves. For each region the daily rainfall growth curve has been estimated. The growth curves for the hourly duration have been estimated when the daily rainfall growth curve
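    The TCEV distribution mixes an ordinary and an outlying extreme-value component, F(x) = exp(-Λ1 e^(-x/θ1) - Λ2 e^(-x/θ2)). A sketch of the CDF and of growth-curve quantiles obtained by bisection; the parameter values below are illustrative, not the fitted Tuscany values:

```python
import math

def tcev_cdf(x, lam1, theta1, lam2, theta2):
    """Two-Component Extreme Value CDF.

    F(x) = exp(-lam1*exp(-x/theta1) - lam2*exp(-x/theta2)): a basic
    (frequent) component plus an outlying (rare, heavy) component.
    """
    return math.exp(-lam1 * math.exp(-x / theta1)
                    - lam2 * math.exp(-x / theta2))

def tcev_quantile(p, lam1, theta1, lam2, theta2, lo=0.0, hi=1e4):
    """Quantile (growth-curve ordinate) by bisection; the CDF is monotone."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if tcev_cdf(mid, lam1, theta1, lam2, theta2) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative parameter values only, not the fitted regional ones.
q99 = tcev_quantile(0.99, 10.0, 1.0, 0.5, 2.5)
```

    In the index-variable procedure the dimensionless quantile from the regional growth curve is multiplied by the local index rainfall to obtain the at-site design value.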

  9. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    PubMed Central

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological model has been profitably expanded. A third area of expansion, the analysis of setting events, is proposed by the authors. The historical development of setting events as a behavior influence concept is traced. Modifications of the basic applied behavior analysis methodology and conceptual systems that seem necessary to setting event analysis are discussed and examples of descriptive and experimental setting event analyses are presented. PMID:16795646

  10. Fault Tree Analysis: An Operations Research Tool for Identifying and Reducing Undesired Events in Training.

    ERIC Educational Resources Information Center

    Barker, Bruce O.; Petersen, Paul D.

    This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault tree analysis investigates potentially undesirable events and then looks for sequences of failures that would lead to their occurrence. Relationships among these events are symbolized by AND or OR logic gates, with AND used when single events must coexist to…
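    Under the usual assumption of independent basic events, gate probabilities propagate as products: P(AND) = Π p_i and P(OR) = 1 - Π (1 - p_i). A minimal sketch with illustrative probabilities and a hypothetical training-failure tree:

```python
def p_and(*probs):
    """Probability that all inputs of an AND gate occur (independence)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):
    """Probability that at least one input of an OR gate occurs."""
    out = 1.0
    for p in probs:
        out *= 1.0 - p
    return 1.0 - out

# Hypothetical top event: a training failure occurs if (unclear instruction
# AND no instructor supervision) OR defective equipment. The probabilities
# are illustrative only.
top = p_or(p_and(0.1, 0.2), 0.05)
```

    Nested calls mirror the tree structure directly, which is why even a hand calculation of a small fault tree proceeds gate by gate from the basic events up to the top event.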

  11. Statistical language analysis for automatic exfiltration event detection.

    SciTech Connect

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language-processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. Application of the rule set typically results in an extensive list of suspect network events that are then explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.

  12. CDAW 9 analysis of magnetospheric events on May 3, 1986 - Event C

    NASA Technical Reports Server (NTRS)

    Baker, D. N.; Pulkkinen, T. I.; Mcpherron, R. L.; Craven, J. D.; Frank, L. A.; Elphinstone, R. D.; Murphree, J. S.; Fennell, J. F.; Lopez, R. E.; Nagai, T.

    1993-01-01

    An intense geomagnetic substorm event on May 3, 1986, occurring toward the end of a strong storm period, is studied. The auroral electrojet indices and global imaging data from both the Northern and Southern Hemispheres clearly revealed the growth phase and expansion phase development for a substorm with an onset at 0111 UT. An ideally located constellation of four spacecraft allowed detailed observation of the substorm growth phase in the near-tail region. A realistic time-evolving magnetic field model provided a global representation of the field configuration throughout the growth and early expansion phase of the substorm. Evidence of a narrowly localized substorm onset region in the near-earth tail is found. This region spread rapidly eastward and poleward after the 0111 UT onset. The results are consistent with a model of late growth phase formation of a magnetic neutral line. This reconnection region caused plasma sheet current diversion before the substorm onset and eventually led to cross-tail current disruption at the time of the substorm onset.

  13. Subchannel analysis of multiple CHF events. [PWR; BWR]

    SciTech Connect

    Reddy, D.G.; Fighetti, C.F.

    1982-08-01

The phenomenon of multiple CHF events in rod bundle heat transfer tests, i.e., the occurrence of CHF on more than one rod or at more than one location on a single rod, is examined. The adequacy of several subchannel CHF correlations presently used in the nuclear industry for predicting higher-order CHF events is assessed on the basis of local coolant conditions obtained with the COBRA IIIC subchannel code. The rod bundle CHF data obtained at the Heat Transfer Research Facility of Columbia University are examined for multiple CHF events using a combination of statistical analyses and parametric studies. These analyses are applied to three data sets from tests simulating both PWR and BWR reactor cores with uniform and non-uniform axial heat flux distributions. The CHF correlations employed in this study are: (1) the CE-1 correlation, (2) the B&W-2 correlation, (3) the W-3 correlation, and (4) the Columbia correlation.

  14. Analysis of Cumulus Solar Irradiance Reflectance (CSIR) Events

    NASA Technical Reports Server (NTRS)

Laird, John L.; Harshvardhan

    1996-01-01

    Clouds are extremely important with regard to the transfer of solar radiation at the earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using Yankee Environmental Systems UVA-1 and UVB-1 pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 Wm(exp -2) and 0.069 Wm(exp -2) were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed.
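The two magnitude measures defined in the abstract, MAC and PAC, reduce to one-line formulas; a minimal sketch with invented irradiance values (not the study's data):

```python
# MAC: absolute excess of observed irradiance over the clear-sky curve.
# PAC: the same excess expressed as a percentage of the clear-sky value.

def mac(observed, clear_sky):
    """Magnitude above clear-sky, in the same units as the inputs (W/m^2)."""
    return observed - clear_sky

def pac(observed, clear_sky):
    """Percentage above clear-sky."""
    return 100.0 * (observed - clear_sky) / clear_sky

obs, clear = 45.0, 41.0   # illustrative UV-A irradiances, W/m^2
print(mac(obs, clear), round(pac(obs, clear), 2))
```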

  15. Analysis of cumulus solar irradiance reflectance (CSIR) events

    NASA Astrophysics Data System (ADS)

    Laird, John L.; Harshvardhan

Clouds are extremely important with regard to the transfer of solar radiation at Earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When Sun-cloud observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using UVA and UVB pyranometers. Observed data were compared to clear-sky curves which were generated using a third degree polynomial best-fit line technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 Wm(exp -2) and 0.0169 Wm(exp -2) were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC) which indicated the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events which is a function of Sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed.

  16. Further Evaluation of Antecedent Social Events during Functional Analysis

    ERIC Educational Resources Information Center

    Kuhn, David E.; Hardesty, Samantha L.; Luczynski, Kevin

    2009-01-01

    The value of a reinforcer may change based on antecedent events, specifically the behavior of others (Bruzek & Thompson, 2007). In the current study, we examined the effects of manipulating the behavior of the therapist on problem behavior while all dimensions of reinforcement were held constant. Both participants' levels of problem behaviors…

  17. Event-related complexity analysis and its application in the detection of facial attractiveness.

    PubMed

    Deng, Zhidong; Zhang, Zimu

    2014-11-01

    In this study, an event-related complexity (ERC) analysis method is proposed and used to explore the neural correlates of facial attractiveness detection in the context of a cognitive experiment. The ERC method gives a quantitative index for measuring the diverse brain activation properties that represent the neural correlates of event-related responses. This analysis reveals distinct effects of facial attractiveness processing and also provides further information that could not have been achieved from event-related potential alone.

  18. Links between Characteristics of Collaborative Peer Video Analysis Events and Literacy Teachers' Outcomes

    ERIC Educational Resources Information Center

    Arya, Poonam; Christ, Tanya; Chiu, Ming

    2015-01-01

    This study examined how characteristics of Collaborative Peer Video Analysis (CPVA) events are related to teachers' pedagogical outcomes. Data included 39 transcribed literacy video events, in which 14 in-service teachers engaged in discussions of their video clips. Emergent coding and Statistical Discourse Analysis were used to analyze the data.…

  19. Analysis and RHBD technique of single event transients in PLLs

    NASA Astrophysics Data System (ADS)

    Zhiwei, Han; Liang, Wang; Suge, Yue; Bing, Han; Shougang, Du

    2015-11-01

    Single-event transient susceptibility of phase-locked loops has been investigated. The charge pump is the most sensitive component of the PLL to SET, and it is hard to mitigate this effect at the transistor level. A test circuit was designed on a 65 nm process using a new system-level radiation-hardening-by-design technique. Heavy-ion testing was used to evaluate the radiation hardness. Analyses and discussion of the feasibility of this method are also presented.

  20. Events in time: Basic analysis of Poisson data

    SciTech Connect

    Engelhardt, M.E.

    1994-09-01

The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.
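The point estimate and an exact (chi-square based) confidence interval for a Poisson rate, of the kind this report covers, can be sketched as follows; SciPy is assumed available and the event count and exposure time are illustrative:

```python
# Point estimate and exact two-sided CI for a Poisson occurrence rate,
# using the standard chi-square relationship for Poisson counts.
from scipy.stats import chi2

def poisson_rate_ci(n_events, exposure_time, conf=0.90):
    """Exact two-sided CI for the rate, given n_events in exposure_time units."""
    alpha = 1.0 - conf
    lower = chi2.ppf(alpha / 2, 2 * n_events) / (2 * exposure_time) if n_events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / (2 * exposure_time)
    return lower, upper

n, t = 5, 10.0    # e.g., 5 events in 10 unit-years of exposure (illustrative)
rate = n / t      # maximum-likelihood point estimate of the rate
lo, hi = poisson_rate_ci(n, t)
print(rate, lo, hi)
```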

  1. Observation and Analysis of Jovian and Saturnian Satellite Mutual Events

    NASA Technical Reports Server (NTRS)

    Tholen, David J.

    2001-01-01

    The main goal of this research was to acquire high time resolution photometry of satellite-satellite mutual events during the equatorial plane crossing for Saturn in 1995 and Jupiter in 1997. The data would be used to improve the orbits of the Saturnian satellites to support Cassini mission requirements, and also to monitor the secular acceleration of Io's orbit to compare with heat flow measurements.

  2. Further evaluation of antecedent social events during functional analysis.

    PubMed

    Kuhn, David E; Hardesty, Samantha L; Luczynski, Kevin

    2009-01-01

    The value of a reinforcer may change based on antecedent events, specifically the behavior of others (Bruzek & Thompson, 2007). In the current study, we examined the effects of manipulating the behavior of the therapist on problem behavior while all dimensions of reinforcement were held constant. Both participants' levels of problem behaviors increased as a function of the altered behavior of the therapist without direct manipulation of states of satiation or deprivation.

  3. Applying Association Rule of the Data Mining Method for the Network Event Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Wankyung; Soh, Wooyoung

    2007-12-01

Network event analysis gives useful information on the network status that helps protect against attacks. It involves finding sets of frequently used packet information, such as IP addresses, and requires real-time processing by its nature. This paper applies association rules to network event analysis. Association rules, originally used for data mining, can be applied to find frequent item sets, so if frequent item sets occur on a network, an information system can infer that there may be a threat. However, existing association-rule algorithms such as the Apriori algorithm are not suitable for analyzing network events in real time because of their high CPU and memory usage and consequently low processing speed. This paper develops a network event audit module by applying association rules to network events using a new algorithm in place of the Apriori algorithm. Test results show that the new algorithm gives drastically lower CPU and memory usage for network event analysis than the existing Apriori algorithm.
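A minimal sketch of frequent-itemset counting over network events, in the spirit of the association-rule idea described above; this is plain pair counting, not the paper's algorithm, and the packet attributes are invented:

```python
# Count co-occurring packet-attribute pairs across events and keep
# those meeting a minimum support threshold.
from itertools import combinations
from collections import Counter

# Each "event" is the set of packet attributes observed together (illustrative values)
events = [
    {"src:10.0.0.5", "dst:10.0.0.9", "port:80"},
    {"src:10.0.0.5", "dst:10.0.0.9", "port:443"},
    {"src:10.0.0.5", "dst:10.0.0.9", "port:80"},
    {"src:10.0.0.7", "dst:10.0.0.2", "port:22"},
]

min_support = 2  # minimum co-occurrence count to call a pair "frequent"
pair_counts = Counter()
for ev in events:
    for pair in combinations(sorted(ev), 2):
        pair_counts[pair] += 1

frequent_pairs = {p: c for p, c in pair_counts.items() if c >= min_support}
print(frequent_pairs)
```

Apriori would prune the candidate space level by level before counting larger itemsets; the paper's contribution is precisely a cheaper replacement for that step.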

  4. Analysis of the September 2010 Los Angeles Extreme Heating Event

    NASA Astrophysics Data System (ADS)

    King, K. C.; Kaplan, M. L.; Smith, C.; Tilley, J.

    2015-12-01

The Southern California coastal region has a temperate climate; however, there are days with extreme heating when temperatures may rise above 37°C, stressing the region's power grid, leading to health issues, and creating environments susceptible to fires. These extreme localized heating events occur over a short period, from a few hours to one or two days, and may or may not occur in conjunction with high winds. The Santa Ana winds are a well-studied example of this type of phenomenon. On September 27, 2010, Los Angeles, CA (LA), reached a record maximum temperature of 45°C during an extreme heating event that was not a Santa Ana event. We analyzed the event using observations, reanalysis data, and mesoscale simulations with the Weather Research and Forecasting Model (WRF) to understand the mechanisms of extreme heating and provide guidance on forecasting similar events. On 26 September 2010, a large synoptic ridge overturned and broke over the midwestern United States (US), driving momentum and internal energy to the southwest. A large pool of hot air at mid-levels over the Four Corners region also shifted west, moving into southern California by 26 September. This hot air resided over the LA basin, just above the surface, by 00 GMT on 27 September. At this time, the pressure gradient at low levels was weak. Based on WRF model and wind profiler/RASS observations, we propose that separate mountain-plains solenoids (MPS) occurred on both 26 and 27 September. The MPS on 26 September moved the hot air into place just above the surface over the LA basin. Overnight, the hot air was trapped near the surface by the action of gravity waves in conjunction with orographic density currents and remnant migrating solenoids that formed over the mountains surrounding LA. When the MPS formed during the late morning on the 27th, the descending return-branch flow plus surface sensible heating provided a mechanism to move the heat to the surface, leading to record temperatures.

  5. Analysis of Adverse Events in Identifying GPS Human Factors Issues

    NASA Technical Reports Server (NTRS)

    Adams, Catherine A.; Hwoschinsky, Peter V.; Adams, Richard J.

    2004-01-01

    The purpose of this study was to analyze GPS related adverse events such as accidents and incidents (A/I), Aviation Safety Reporting System (ASRS) reports and Pilots Deviations (PDs) to create a framework for developing a human factors risk awareness program. Although the occurrence of directly related GPS accidents is small the frequency of PDs and ASRS reports indicated there is a growing problem with situational awareness in terminal airspace related to different types of GPs operational issues. This paper addresses the findings of the preliminary research and a brief discussion of some of the literature on related GPS and automation issues.

  6. Common-Cause Failure Analysis in Event Assessment

    SciTech Connect

    D. M. Rasmuson; D. L. Kelly

    2008-06-01

    This paper reviews the basic concepts of modelling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group.

  7. Analysis of broadband seismograms from selected IASPEI events

    USGS Publications Warehouse

    Choy, G.L.; Engdahl, E.R.

    1987-01-01

Broadband seismograms of body waves that are flat to displacement and velocity in the frequency range from 0.01 to 5.0 Hz can now be routinely obtained for most earthquakes of magnitude greater than about 5.5. These records are obtained either directly or through multichannel deconvolution of waveforms from digitally recording seismograph stations. In contrast to data from conventional narrowband seismographs, broadband records have sufficient frequency content to define the source-time functions of body waves, even for shallow events for which the source functions of direct and surface-reflected phases may overlap. Broadband seismograms for selected IASPEI events are systematically analysed to identify depth phases and the presence of subevents. The procedure results in improved estimates of focal depth, identification of subevents in complex earthquakes, and better resolution of focal mechanisms. We propose that it is now possible for reporting agencies, such as the National Earthquake Information Center, to use broadband digital waveforms routinely in the processing of earthquake data.

  8. Chemical supply chain modeling for analysis of homeland security events

    SciTech Connect

    Ehlen, Mark A.; Sun, Amy C.; Pepple, Mark A.; Eidson, Eric D.; Jones, Brian S.

    2013-09-06

The potential impacts of man-made and natural disasters on chemical plants, complexes, and supply chains are of great importance to homeland security. To be able to estimate these impacts, we developed an agent-based chemical supply chain model that includes chemical plants with enterprise operations such as purchasing, production scheduling, and inventories; merchant chemical markets; and multi-modal chemical shipments. Large-scale simulations of chemical-plant activities and supply chain interactions, running on desktop computers, are used to estimate the scope and duration of disruptive-event impacts, and overall system resilience, based on the extent to which individual chemical plants can adjust their internal operations (e.g., production mixes and levels) versus their external interactions (market sales and purchases, and transportation routes and modes). Finally, to illustrate how the model estimates the impacts of a hurricane disruption, a simple example model centered on 1,4-butanediol is presented.

  9. Analysis of hypoglycemic events using negative binomial models.

    PubMed

    Luo, Junxiang; Qu, Yongming

    2013-01-01

Negative binomial regression is a standard model for analyzing hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, to which negative binomial regression is not robust because of its strong model assumptions. Some literature suggests correcting the standard error of the maximum likelihood estimator by introducing an overdispersion factor, which can be estimated from the Deviance or Pearson chi-square. We propose conducting negative binomial regression using sandwich estimation of the covariance matrix of the parameter estimates together with the Pearson overdispersion correction (denoted NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification and that estimation efficiency is improved by adjusting for baseline hypoglycemia.

  10. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    SciTech Connect

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data was also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information-intensive tasks.
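The dwell-time computation the abstract describes can be sketched directly: take differences of time-stamped events and split them at the thresholds mentioned above (< 2 s for search/filtering, > 10 s for evaluation). The timestamps below are invented, in milliseconds:

```python
# Event dwell times from millisecond time stamps, binned by the
# search-vs-evaluation thresholds quoted in the abstract.
timestamps_ms = [0, 800, 1500, 2100, 14600, 15200, 27000]

# Duration between consecutive events, in seconds
durations_s = [(b - a) / 1000.0 for a, b in zip(timestamps_ms, timestamps_ms[1:])]

search_like = [d for d in durations_s if d < 2.0]       # rapid search/filtering
evaluation_like = [d for d in durations_s if d > 10.0]  # sustained evaluation
print(search_like, evaluation_like)
```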

  11. Discrete Event Simulation Modeling and Analysis of Key Leader Engagements

    DTIC Science & Technology

    2012-06-01

Monterey developed the model to support the analysis of civilian population perception based on friendly and threat actions. The CG Model is built around the concept of reusable…

  12. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.; Brothers, Alan J.; Jin, Shuangshuang

    2009-09-18

    Task report detailing low probability tail event analysis and mitigation in BPA control area. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators falling short of scheduled output, causing the imbalance between generation and load to become very significant.

  13. Chemical supply chain modeling for analysis of homeland security events

    DOE PAGES

    Ehlen, Mark A.; Sun, Amy C.; Pepple, Mark A.; ...

    2013-09-06

The potential impacts of man-made and natural disasters on chemical plants, complexes, and supply chains are of great importance to homeland security. To be able to estimate these impacts, we developed an agent-based chemical supply chain model that includes chemical plants with enterprise operations such as purchasing, production scheduling, and inventories; merchant chemical markets; and multi-modal chemical shipments. Large-scale simulations of chemical-plant activities and supply chain interactions, running on desktop computers, are used to estimate the scope and duration of disruptive-event impacts, and overall system resilience, based on the extent to which individual chemical plants can adjust their internal operations (e.g., production mixes and levels) versus their external interactions (market sales and purchases, and transportation routes and modes). Finally, to illustrate how the model estimates the impacts of a hurricane disruption, a simple example model centered on 1,4-butanediol is presented.

  14. Genome-Wide Analysis of Polyadenylation Events in Schmidtea mediterranea

    PubMed Central

    Lakshmanan, Vairavan; Bansal, Dhiru; Kulkarni, Jahnavi; Poduval, Deepak; Krishna, Srikar; Sasidharan, Vidyanand; Anand, Praveen; Seshasayee, Aswin; Palakodeti, Dasaradhi

    2016-01-01

    In eukaryotes, 3′ untranslated regions (UTRs) play important roles in regulating posttranscriptional gene expression. The 3′UTR is defined by regulated cleavage/polyadenylation of the pre-mRNA. The advent of next-generation sequencing technology has now enabled us to identify these events on a genome-wide scale. In this study, we used poly(A)-position profiling by sequencing (3P-Seq) to capture all poly(A) sites across the genome of the freshwater planarian, Schmidtea mediterranea, an ideal model system for exploring the process of regeneration and stem cell function. We identified the 3′UTRs for ∼14,000 transcripts and thus improved the existing gene annotations. We found 97 transcripts, which are polyadenylated within an internal exon, resulting in the shrinking of the ORF and loss of a predicted protein domain. Around 40% of the transcripts in planaria were alternatively polyadenylated (ApA), resulting either in an altered 3′UTR or a change in coding sequence. We identified specific ApA transcript isoforms that were subjected to miRNA mediated gene regulation using degradome sequencing. In this study, we also confirmed a tissue-specific expression pattern for alternate polyadenylated transcripts. The insights from this study highlight the potential role of ApA in regulating the gene expression essential for planarian regeneration. PMID:27489207

  15. Analysis of sequential events in intestinal absorption of folylpolyglutamate

    SciTech Connect

    Darcy-Vrillon, B.; Selhub, J.; Rosenberg, I.H.

    1988-09-01

Although it is clear that the intestinal absorption of folylpolyglutamates is associated with hydrolysis to monoglutamyl folate, the precise sequence and relative velocity of the events involved in this absorption are not fully elucidated. In the present study, we used biosynthetic, radiolabeled folylpolyglutamates purified by affinity chromatography to analyze the relationship of hydrolysis and transport in rat jejunal loops in vivo. Absorption was best described by a series of first-order processes: luminal hydrolysis to monoglutamyl folate followed by tissue uptake of the product. The rate of hydrolysis in vivo was twice as high as the rate of transport. The latter value was identical to that measured for folic acid administered separately. The relevance of this sequential model was confirmed by data obtained using inhibitors of the individual steps in absorption of "natural" folate. Heparin and sulfasalazine were both effective in decreasing absorption. The former affected hydrolysis solely, whereas the latter acted as a competitive inhibitor of transport of monoglutamyl folate. These studies confirm that hydrolysis is obligatory and that the product is subsequently taken up by a transport process, common to monoglutamyl folates, that is the rate-determining step in transepithelial absorption.

  16. Twelve Tips for Promoting Significant Event Analysis To Enhance Reflection in Undergraduate Medical Students.

    ERIC Educational Resources Information Center

    Henderson, Emma; Berlin, Anita; Freeman, George; Fuller, Jon

    2002-01-01

    Points out the importance of the facilitation of reflection and development of reflective abilities in professional development and describes 12 tips for undergraduate medical students to increase their abilities of writing reflective and creative event analysis. (Author/YDS)

  17. An analysis of fog events at Belgrade International Airport

    NASA Astrophysics Data System (ADS)

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.
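The compound fog criteria reported in the abstract (relative humidity above 97 %, dew-point depression of 0 °C, cloud base below 50 m, calm or weak wind) translate directly into a rule check; the function name, field names, and the 2 m/s "weak wind" cutoff below are our assumptions, not the paper's:

```python
# Hedged sketch of the compound fog criteria quoted above.
def fog_likely(rh_pct, dewpoint_depression_c, cloud_base_m, wind_ms):
    """True when all four high-probability fog conditions hold simultaneously."""
    return (rh_pct > 97.0
            and dewpoint_depression_c == 0.0
            and cloud_base_m < 50.0
            and wind_ms <= 2.0)   # "calm or weak" wind; threshold assumed

print(fog_likely(98.0, 0.0, 30.0, 1.0))   # conditions matching the high-risk case
print(fog_likely(90.0, 2.0, 200.0, 5.0))  # clearly non-foggy conditions
```

The paper attaches a probability (51.2 %) to this combination rather than a hard yes/no, so a production version would return a likelihood rather than a Boolean.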

  18. Subjective well-being and adaptation to life events: a meta-analysis.

    PubMed

    Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E

    2012-03-01

    Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on affective and cognitive well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to 4 family events (marriage, divorce, bereavement, childbirth) and 4 work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being and that for most events the effects of life events on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given.

  19. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    PubMed

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study.

  20. Spectral analysis of snoring events from an Emfit mattress.

    PubMed

    Perez-Macias, Jose Maria; Viik, Jari; Varri, Alpo; Himanen, Sari-Leena; Tenhunen, Mirja

    2016-12-01

The aim of this study is to explore the capability of an Emfit (electromechanical film transducer) mattress to detect snoring (SN) by analyzing the spectral differences between normal breathing (NB) and SN. Episodes of representative NB and SN of a maximum of 10 min were visually selected for analysis from 33 subjects. To define the bands of interest, we studied the statistical differences in the power spectral density (PSD) between both breathing types. Three bands were selected for further analysis: 6-16 Hz (BW1), 16-30 Hz (BW2) and 60-100 Hz (BW3). We characterized the differences between NB and SN periods in these bands using a set of spectral features estimated from the PSD. We found that 15 out of the 29 features reached statistical significance with the Mann-Whitney U-test. Diagnostic properties for each feature were assessed using receiver operating characteristic (ROC) analysis. According to our results, the highest diagnostic performance was achieved using the power ratio between BW2 and BW3 (0.85 area under the ROC curve, 80% sensitivity, 80% specificity and 80% accuracy). We found that there are significant differences in the defined bands between the NB and SN periods. A peak was found in BW3 for SN epochs, which was best detected using power ratios. Our work suggests that it is possible to detect snoring with an Emfit mattress. Mattress-type movement sensors are inexpensive and unobtrusive, and thus provide an interesting tool for sleep research.
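The band-power-ratio feature the study found most discriminative can be sketched with a Welch PSD; the sampling rate and the synthetic test signal below (tones placed in BW2 and BW3) are assumptions for illustration, not Emfit data:

```python
# Welch PSD and the BW2/BW3 band-power ratio described in the abstract,
# applied to a synthetic signal with energy in both bands.
import numpy as np
from scipy.signal import welch

fs = 400.0                       # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
sig = np.sin(2 * np.pi * 20 * t) + 0.3 * np.sin(2 * np.pi * 80 * t)

f, pxx = welch(sig, fs=fs, nperseg=1024)

def band_power(f, pxx, lo, hi):
    """Approximate integral of the PSD over [lo, hi] Hz (rectangle rule)."""
    mask = (f >= lo) & (f <= hi)
    return pxx[mask].sum() * (f[1] - f[0])

ratio = band_power(f, pxx, 16, 30) / band_power(f, pxx, 60, 100)
print(ratio)
```

For this synthetic signal most power sits in BW2, so the ratio comes out well above one; in the study, the same ratio separated snoring from normal breathing.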

  1. Analysis of upwelling event in Southern Makassar Strait

    NASA Astrophysics Data System (ADS)

    Utama, F. G.; Atmadipoera, A. S.; Purba, M.; Sudjono, E. H.; Zuraida, R.

    2017-01-01

    The southeast monsoon (SEM) winds that blow over the southern Makassar Strait generate a coastal upwelling phenomenon. One year of wind data, together with CTD data from the MAJAFLOX cruise, was used to analyze upwelling in this region. During the 2015 SEM, the southeasterly winds averaged 6 m/s, with the highest speeds in August and September. Applying Ekman theory to this monsoon period, we estimate an Ekman transport of about 8.50 m^2/s directed offshore (to the southwest); water upwelled from the deeper layer, starting from the coastal area, with a vertical velocity of about 6.87 x 10^-5 to 7.84 x 10^-5 m/s; and the Ekman layer depth in the upwelling region was approximately 60 m, in good agreement with the CTD observations.
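The transport estimate above follows from the classical Ekman balance, M = tau / (rho_sea * |f|). A minimal sketch, in which the drag coefficient, air and seawater densities, and latitude are assumed values rather than figures from the paper:

```python
import math

rho_air = 1.225      # kg/m^3 (assumed)
rho_sea = 1025.0     # kg/m^3 (assumed)
c_d = 1.3e-3         # bulk drag coefficient (assumed)
omega = 7.292e-5     # Earth's rotation rate, rad/s

def wind_stress(u10):
    """Bulk-formula wind stress (N/m^2) from the 10-m wind speed."""
    return rho_air * c_d * u10 ** 2

def ekman_transport(u10, lat_deg):
    """Depth-integrated Ekman transport magnitude, tau / (rho_sea * |f|), in m^2/s."""
    f = 2.0 * omega * math.sin(math.radians(lat_deg))
    return wind_stress(u10) / (rho_sea * abs(f))

m_e = ekman_transport(6.0, -4.0)   # 6 m/s SEM wind at roughly 4 deg S
print(round(m_e, 2))               # same order as the ~8.5 m^2/s reported
```

With these assumed constants the sketch lands within the same order of magnitude as the reported 8.50 m^2/s; near the equator the result is very sensitive to the latitude used for f.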

  2. Constraining shallow seismic event depth via synthetic modeling for Expert Technical Analysis at the IDC

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.

    2015-12-01

    Depth is an important criterion in seismic event screening at the International Data Centre (IDC), CTBTO. However, a thorough determination of event depth can mostly be conducted only through special analysis, because the IDC's Event Definition Criteria depend, in particular, on depth-estimation uncertainties. As a result, a large number of events in the Reviewed Event Bulletin have their depth constrained to the surface. When the true origin depth is greater than is reasonable for a nuclear test (3 km, based on existing observations), this creates a heavier workload for manually distinguishing shallow from deep events. Moreover, the IDC depth criterion is not applicable to events with a small t(pP-P) travel-time difference, which is precisely the case for a nuclear test. Since the shape of the first few seconds of the signal from very shallow events is very sensitive to the presence of the depth phase, cross-correlation between observed and theoretical seismograms can provide an estimate of the event depth and thus extend the screening process. We exercised this approach mostly with events at teleseismic, and partially at regional, distances. We found that it can be very efficient for seismic event screening, with certain caveats related mostly to poorly defined crustal models at source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the rays we studied. We examined a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNEs) with presumably known depths, and the recent DPRK nuclear tests. The teleseismic synthetics are based on the stationary-phase approximation in Robert Herrmann's hudson96 program, and the regional modelling was done with Vlastislav Cerveny's generalized ray technique, modified to handle complex source topography.

  3. Extreme rainfall analysis based on precipitation events classification in Northern Italy

    NASA Astrophysics Data System (ADS)

    Campo, Lorenzo; Fiori, Elisabetta; Molini, Luca

    2016-04-01

    Extreme rainfall statistical analysis comprises a consolidated family of techniques for studying the frequency and statistical properties of high-intensity meteorological events. These techniques are well established and include standard approaches such as fitting the GEV (Generalized Extreme Value) or TCEV (Two-Component Extreme Value) probability distribution to the data recorded by a given raingauge at a given location. Regionalization techniques, which aim to spatialize the analysis over medium-to-large regions, are also well established and operationally used. In this work a novel procedure is proposed to statistically characterize rainfall extremes in a given region, based on an "event-based" approach. Given a temporal sequence of continuous rain maps, an "event" is defined as an aggregate of cells, continuous in time and space, whose rainfall height exceeds a certain threshold. Based on this definition it is possible to classify, for a given region and period, a population of events and to characterize each with a number of statistics, such as total volume, maximum spatial extension, duration, and average intensity. The population of events so obtained constitutes the input of a novel extreme-value characterization technique: given a certain spatial scale, a moving-window analysis is performed and all the events that fall within the window are analysed from an extreme-value point of view. For each window, the extreme annual events are considered: maximum total volume, maximum spatial extension, maximum intensity and maximum duration are each subjected to an extreme-value analysis and the corresponding probability distributions are fitted. In this way the analysis statistically characterizes the most intense events and, at the same time, spatializes these rain characteristics, exploring their variability in space. This methodology was employed on rainfall fields obtained by interpolation of
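The per-gauge building block the abstract starts from, fitting a GEV to annual maxima and reading off a return level, can be sketched as follows; the annual-maximum series is synthetic, not data from the study:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual rainfall maxima (mm/h) standing in for a raingauge record.
rng = np.random.default_rng(42)
annual_max = rng.gumbel(loc=40.0, scale=12.0, size=60)

# Fit the GEV and compute the 100-year return level (exceeded with p = 1/100 per year).
shape, loc, scale = genextreme.fit(annual_max)
rl_100 = genextreme.isf(1.0 / 100.0, shape, loc=loc, scale=scale)
print(rl_100 > annual_max.mean())  # the 100-yr level exceeds the typical annual max
```

The event-based procedure proposed in the abstract replaces the single-gauge series with per-window populations of event statistics (volume, extension, duration) before this fitting step.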

  4. Analysis of single events in ultrarelativistic nuclear collisions: A new method to search for critical fluctuations

    SciTech Connect

    Stock, R.

    1995-07-15

    The upcoming generation of experiments with ultrarelativistic heavy nuclear projectiles, at the CERN SPS and at RHIC and the LHC, will confront researchers with several thousand identified hadrons per event, suitable detectors provided. Analysis of individual events becomes meaningful for a multitude of hadronic signals thought to reveal a transient deconfinement phase transition, or the related critical precursor fluctuations. Transverse momentum spectra, the kaon-to-pion ratio, and pionic Bose-Einstein correlations are examined, showing how to separate the extreme, probably rare candidate events from the bulk of average events. These observables can already be investigated with the Pb beam of the SPS. The author then discusses single-event signals that add to the above at RHIC and LHC energies: kaon interferometry, rapidity fluctuations, and jet and {gamma} production.

  5. Catchment process affecting drinking water quality, including the significance of rainfall events, using factor analysis and event mean concentrations.

    PubMed

    Cinque, Kathy; Jayasuriya, Niranjali

    2010-12-01

    To ensure the protection of drinking water, an understanding of the catchment processes that can affect water quality is important, as it enables targeted catchment management actions to be implemented. In this study, factor analysis (FA) and the comparison of event mean concentrations (EMCs) with baseline values were used to assess the relationships between water quality parameters and to link those parameters to processes within an agricultural drinking water catchment. FA found that 55% of the variance in the water quality data could be explained by the first factor, which was dominated by parameters usually associated with erosion. Inclusion of pathogenic indicators in an additional FA showed that Enterococcus and Clostridium perfringens (C. perfringens) were also related to the erosion factor. Analysis of the EMCs found that most parameters were significantly higher during periods of rainfall runoff. This study shows that the most dominant processes in an agricultural catchment are surface runoff and erosion. It also shows that it is these processes which mobilise pathogenic indicators and which are therefore most likely to influence the transport of pathogens. Catchment management efforts need to focus on reducing the effect of these processes on water quality.

  6. Stochastic Generation of Drought Events using Reconstructed Annual Streamflow Time Series from Tree Ring Analysis

    NASA Astrophysics Data System (ADS)

    Lopes, A.; Dracup, J. A.

    2011-12-01

    The statistical analysis of multiyear drought events in streamflow records is often restricted by sample size, since only a small number of drought events can be extracted from common river-flow time series. An alternative to those conventional datasets is paleohydrologic data, such as streamflow time series reconstructed from tree-ring analysis. In this study, we analyze the statistical characteristics of drought events present in a 1439-year time series of reconstructed annual streamflow at the Feather River inflow to the Oroville reservoir, California. Probabilistic distributions were used to describe the duration and severity of drought events, and the results were compared with previous studies that used only the observed streamflow data. Finally, a stochastic simulation model was developed to synthetically generate sequences of drought and high-flow events with the same characteristics as the paleohydrologic record. The long-term mean flow was used as the single truncation level to define 248 drought events and 248 high-flow events, each with a specific duration and severity. The longest drought and high-flow events lasted 13 years (1471 to 1483) and 9 years (1903 to 1911), respectively. A strong relationship between event duration and severity was found for both drought and high-flow events, so the longest droughts also corresponded to the most severe ones. The events were therefore classified by duration (in years), and probability distributions were fitted to the frequency distribution of drought and high-flow severity for each duration. As a result, it was found that the gamma distribution describes well the frequency distribution of drought severities for all durations. For high-flow events, the exponential distribution is more adequate for one-year events, while the gamma distribution is better suited to longer events.
Those distributions can be used to estimate the recurrence time of drought events according to
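The truncation-level step described above, splitting an annual flow series at its long-term mean, accumulating the deficit of each below-mean run as a drought "severity", and fitting a gamma distribution, can be sketched as follows; the flow series is synthetic, not the Feather River reconstruction:

```python
import numpy as np
from scipy.stats import gamma

# Synthetic annual flow series standing in for a tree-ring reconstruction.
rng = np.random.default_rng(1)
flow = rng.lognormal(mean=2.0, sigma=0.5, size=1439)
mu = flow.mean()  # long-term mean flow as the single truncation level

severities, deficit = [], 0.0
for q in flow:
    if q < mu:
        deficit += mu - q           # accumulate deficit during a below-mean run
    elif deficit > 0:
        severities.append(deficit)  # run ended: record the event severity
        deficit = 0.0
if deficit > 0:
    severities.append(deficit)      # close a run that reaches the series end

# Fit a gamma distribution to the pooled severities (the study fits per duration).
a, loc, scale = gamma.fit(severities, floc=0.0)
print(len(severities), round(a, 2))
```

The study's classification by duration simply applies this fit separately within each duration class before estimating recurrence times.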

  7. Markov chains and semi-Markov models in time-to-event analysis

    PubMed Central

    Abner, Erin L.; Charnigo, Richard J.; Kryscio, Richard J.

    2014-01-01

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields. PMID:24818062
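A toy illness-death chain illustrates how a Markov model accommodates competing risks and non-constant survival; the three states and the transition probabilities below are invented for illustration:

```python
import numpy as np

# States: 0 = healthy, 1 = ill, 2 = dead (absorbing). Rows sum to 1.
P = np.array([
    [0.92, 0.06, 0.02],   # healthy -> healthy / ill / dead
    [0.10, 0.80, 0.10],   # ill     -> healthy / ill / dead
    [0.00, 0.00, 1.00],   # death is absorbing
])

start = np.array([1.0, 0.0, 0.0])          # everyone starts healthy
dist_5 = start @ np.linalg.matrix_power(P, 5)
surv_5 = dist_5[:2].sum()                  # P(alive after 5 steps)
print(round(surv_5, 3))                    # 0.869
```

Unlike a Kaplan-Meier curve, the same matrix simultaneously yields the competing-risk probability of death, the prevalence of illness, and recovery dynamics, which is the flexibility the abstract highlights.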

  8. An Application of Fuzzy Fault Tree Analysis to Uncontained Events of an Aero-Engine Rotor

    NASA Astrophysics Data System (ADS)

    Li, Yanfeng; Huang, Hong-Zhong; Zhu, Shun-Peng; Liu, Yu; Xiao, Ning-Cong

    2012-12-01

    Fault tree analysis is an important tool for system reliability analysis. In this article, a fuzzy fault tree analysis of uncontained events for an aero-engine rotor is performed. In addition, a new methodology based on fuzzy set theory is used in the fault tree analysis to quantify the failure probabilities of basic events. The theory of fuzzy fault trees is introduced first. Then the fault tree for uncontained events of an aero-engine rotor is established, in which the descending method is used to determine the minimal cut sets. Furthermore, an interval representation and calculation strategy is presented, using symmetrical L-R type fuzzy numbers to describe the failure probabilities, and the resulting fault tree is analyzed quantitatively in a case study.

  9. [Analysis of the impact of two typical air pollution events on the air quality of Nanjing].

    PubMed

    Wang, Fei; Zhu, Bin; Kang, Han-Qing; Gao, Jin-Hui; Wang, Yin; Jiang, Qi

    2012-10-01

    Nanjing and the surrounding area experienced two consecutive serious air pollution events from late October to early November 2009. The first was a long-lasting haze pollution event, and the second resulted from the mixed impact of crop-residue burning and local transportation. The effects of regional transport and local sources on the two events were assessed by cluster analysis, using surface meteorological observations, the air pollution index, satellite remote sensing of fire hot-spot data, and a back-trajectory model. The results showed that accumulation-mode aerosol number concentrations were higher than those of any other aerosol mode in both pollution processes. The peak of the aerosol particle number concentration shifted to larger particle sizes compared with previous studies in this area. The ratio of SO4(2-)/NO3(-) was 1.30 and 0.99, respectively, indicating that stationary sources were more important than traffic sources in the first event, and the reverse in the second. Affected by local sources from the east and south, particle counts below 0.1 microm gradually accumulated during the first event. The second event was mainly affected by short-distance transport from the northeast and by local sources from the southwest and especially the south; the concentration of aerosol particles from these directions was higher than from the others, indicating that the crop-residue burning sources lay mainly in those directions.

  10. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    PubMed

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd.
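The censoring pitfall described above can be made concrete with toy numbers (all figures invented): two arms with identical naive event proportions but very different follow-up, where only the exposure-adjusted incidence rate reflects the difference in time at risk.

```python
# Toy adverse-event summary; patient counts, events and follow-up are invented.
arm = {
    "drug":    {"patients": 100, "events": 10, "patient_years": 50.0},
    "control": {"patients": 100, "events": 10, "patient_years": 100.0},
}

for name, a in arm.items():
    prop = a["events"] / a["patients"]        # naive proportion, ignores follow-up
    rate = a["events"] / a["patient_years"]   # events per patient-year at risk
    print(name, prop, round(rate, 2))

# Identical proportions (0.10 vs 0.10), yet the drug arm accrues events at twice
# the rate (0.20 vs 0.10 per patient-year) because its follow-up is shorter.
```

As the abstract notes, even incidence rates mislead for recurrent events with unequal follow-up, which is why survival-time methods accounting for time dependency are recommended.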

  11. Single-Event Correlation Analysis of Quantum Key Distribution with Single-Photon Sources

    NASA Astrophysics Data System (ADS)

    Shangli Dong; Xiaobo Wang; Guofeng Zhang; Liantuan Xiao; Suotang Jia

    2010-04-01

    Practical single-photon sources emit a multiple-photon background that allows efficient eavesdropping strategies, threatening the security of quantum key distribution. In this paper, we theoretically discuss the photon correlations between authorized partners for practical single-photon sources that include such a multiple-photon background. To investigate the feasibility of intercept-resend attacks, the cross-correlations and the maximum intercept-resend ratio caused by the background signal are determined using single-event correlation analysis based on single-event detection.

  12. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold) increase in the number of located events compared with the original catalogue.
Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
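The core of the detection step, sliding a parent template along continuous data and picking child events where the normalized cross-correlation exceeds a threshold, can be illustrated in single-channel form; the real MFA stacks this over channels and components, and the template, noise level and threshold here are invented:

```python
import numpy as np

def matched_filter(data, template, threshold=0.8):
    """Normalized cross-correlation of a template with continuous data;
    returns detection indices and the full correlation trace."""
    n = template.size
    t = (template - template.mean()) / template.std()
    cc = np.empty(data.size - n + 1)
    for i in range(cc.size):
        w = data[i:i + n]
        sd = w.std()
        cc[i] = 0.0 if sd == 0 else np.dot((w - w.mean()) / sd, t) / n
    return np.flatnonzero(cc >= threshold), cc

rng = np.random.default_rng(3)
template = np.sin(2 * np.pi * np.arange(100) / 20) * np.hanning(100)  # "parent"
data = 0.05 * rng.standard_normal(2000)
data[500:600] += template          # a "child" event buried in the noise
picks, cc = matched_filter(data, template)
print(int(cc.argmax()))            # correlation peaks near the insertion index
```

Because the correlation is normalized, the same threshold works even when the child event is much weaker than its parent, which is what lets MFA recover events missed by conventional triggering.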

  13. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, M.; Meier, T. M.; Becker, D.; Brüstle, A.

    2015-12-01

    In order to gain a better understanding of geodynamic processes in the Hellenic subduction zone (HSZ), in particular in its eastern part, we analyze a cluster of intermediate deep events in the region of Nisyros volcano. The events were recorded by the temporary seismic network EGELADOS, deployed from September 2005 to March 2007. The network covered the entire Hellenic subduction zone and consisted of 23 offshore and 56 onshore broadband stations, completed by 19 permanent stations from NOA, GEOFON and MedNet. The cluster of intermediate deep seismicity consists of 159 events with local magnitudes ranging from 0.2 to 4.1 at depths from 80 to 200 km. The events occur close to the top of the slab in an approximately 30 km thick zone. The spatio-temporal clustering is studied using three-component similarity analysis. Single-event locations obtained using the nonlinear location tool NonLinLoc are compared to relative relocations calculated using the double-difference earthquake relocation software HypoDD. The relocation is performed with both manual readings of onset times and differential traveltimes obtained by separate cross-correlation of P- and S-waveforms. The three-component waveform cross-correlation was performed for all events using data from 45 stations. The results of the similarity analysis are shown as a function of frequency for individual stations and averaged over the network. Average similarities between the waveforms of all event pairs reveal a small number of highly similar events but a large number of moderate similarities. Interestingly, the single-station similarities between event pairs show (1) in general decreasing similarity with increasing epicentral distance, (2) reduced similarities for paths crossing the boundaries of slab segments, and (3) the influence of strong local heterogeneity, leading to a considerable reduction of waveform similarities, e.g. in the center of the Santorini volcano.

  14. Similarity and Cluster Analysis of Intermediate Deep Events in the Southeastern Aegean

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Brüstle, Andrea; Meier, Thomas

    2016-04-01

    In order to gain a better understanding of geodynamic processes in the Hellenic subduction zone (HSZ), in particular in the eastern part of the HSZ, we analyze a cluster of intermediate deep events in the region of Nisyros volcano. The cluster, recorded during the deployment of the temporary seismic network EGELADOS, consists of 159 events at 80 to 200 km depth with local magnitudes ranging from 0.2 to 4.1. The network itself consisted of 56 onshore and 23 offshore broadband stations, completed by 19 permanent stations from NOA, GEOFON and MedNet. It was deployed from September 2005 to March 2007 and covered the entire HSZ. Here, both spatial and temporal clustering of the recorded events is studied using three-component similarity analysis. The waveform cross-correlation was performed for all event combinations using data recorded at 45 onshore stations. The results are shown as a function of frequency for individual stations and as averaged values over the network. The cross-correlation coefficients at the single stations show a decreasing similarity with increasing epicentral distance, as well as the effect of local heterogeneities at particular stations, causing noticeable differences in waveform similarities. Event relocation was performed using the double-difference earthquake relocation software HypoDD, and the results are compared with previously obtained single-event locations calculated using the nonlinear location tool NonLinLoc and station corrections. For the relocation, both differential travel times obtained by separate cross-correlation of P- and S-waveforms and manual readings of onset times are used. It is shown that after relocation the inter-event distance for highly similar events is reduced. By comparing the results of the cluster analysis with results obtained from synthetic catalogs, in which the event rate, portion and occurrence time of the aftershocks are varied, it is shown that the event

  15. Statistical analysis of solar energetic particle events and related solar activity

    NASA Astrophysics Data System (ADS)

    Dierckxsens, Mark; Patsou, Ioanna; Tziotziou, Kostas; Marsh, Michael; Lygeros, Nik; Crosby, Norma; Dalla, Silvia; Malandraki, Olga

    2013-04-01

    The FP7 COMESEP (COronal Mass Ejections and Solar Energetic Particles: forecasting the space weather impact) project is developing tools for forecasting geomagnetic storms and solar energetic particle (SEP) radiation storms. Here we present preliminary results on a statistical analysis of SEP events and their parent solar activity during Solar Cycle 23. The work aims to identify correlations between solar events and SEP events relevant for space weather, as well as to quantify SEP event probabilities for use within the COMESEP alert system. The data sample covers the SOHO era and is based on the SEPEM reference event list [http://dev.sepem.oma.be/]. Events are subdivided if separate enhancements are observed in higher energy channels as defined for the list of Cane et al (2010). Energetic Storm Particle (ESP) enhancements during these events are identified by associating ESP-like increases in the proton channels with shocks detected in ACE and WIND data. Their contribution has been estimated and subtracted from the proton fluxes. Relationships are investigated between solar flare parameters such as X-ray intensity and heliographic location on the one hand, and the probability of occurrence and strength of energetic proton flux increases on the other hand. The same exercise is performed using the velocity and width of coronal mass ejections to examine their SEP productiveness. Relationships between solar event characteristics and SEP event spectral indices and fluences are also studied, as well as enhancements in heavy ion fluxes measured by the SIS instrument on board the ACE spacecraft during the same event periods. This work has received funding from the European Commission FP7 Project COMESEP (263252).

  16. Sources of Error and the Statistical Formulation of M_S:m_b Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H_0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh-wave magnitude (denoted M_S) is a measure of later-arriving surface-wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical-correction-model inadequacy in the standard error of the test statistic. With this hypothesis-test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H_0: explosion characteristics.
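The flavour of such a screening test can be sketched as a one-sided z-score on M_S - m_b whose standard error is inflated by an extra variance term for model inadequacy. The offset, variance components and magnitudes below are illustrative numbers, not the IDC's actual formulation:

```python
import math

def ms_mb_score(ms, mb, offset=1.25, var_meas=0.04, var_model=0.02):
    """One-sided z-score for H0: explosion characteristics (illustrative).
    var_model is the extra 'model inadequacy' variance the abstract argues for."""
    se = math.sqrt(var_meas + var_model)   # inadequacy term inflates the SE
    return (ms - mb + offset) / se

z_eq = ms_mb_score(ms=4.8, mb=4.5)   # earthquake-like: strong surface waves
z_ex = ms_mb_score(ms=3.4, mb=4.7)   # explosion-like: weak surface waves
print(z_eq > 1.645, z_ex > 1.645)    # screen out only the earthquake-like event
```

Inflating the standard error makes the screen conservative: borderline events are less likely to be screened out, which is the behaviour the abstract's DPRK example relies on.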

  17. Regression analysis of mixed panel count data with dependent terminal events.

    PubMed

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types described earlier, may occur; furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrence of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to a childhood cancer study that motivated this study. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Determination of microseismic event azimuth from S-wave splitting analysis

    NASA Astrophysics Data System (ADS)

    Yuan, Duo; Li, Aibing

    2017-02-01

    P-wave hodogram analysis has been the only reliable method for obtaining microseismic event azimuths in one-well monitoring. However, microseismic data usually have weak or even no P-waves, owing to near double-couple focal mechanisms and limited ray-path coverage, which causes large uncertainties in the determined azimuths and event locations. To solve this problem, we take advantage of S-waves, which are often much stronger than P-waves in microseismic data, and determine event azimuths by analyzing S-wave splitting data. This approach exploits the positive correlation between the accuracy of the event azimuth and the effectiveness of measuring S-wave splitting parameters, and finds the optimal azimuth through a grid search. We have demonstrated that event azimuths can be well constrained by S-wave splitting analysis using both synthetic and field microseismic data. This method is less sensitive to noise than the routine P-wave hodogram method and provides a new way of determining microseismic event azimuths.

  19. Complete dose analysis of the November 12, 1960 solar cosmic ray event.

    PubMed

    Masley, A J; Goedeke, A D

    1963-01-01

    A detailed analysis of the November 12, 1960 solar cosmic ray event is presented as an integrated space flux and dose. This event is probably the most interesting solar cosmic ray event studied to date. Direct measurements were made of solar protons from 10 MeV to 6 GeV. During the double-peaked, high-energy part of the event, evidence is presented for the trapping of relativistic particles in a magnetic cloud. The proton energy spectrum is divided into three energy intervals, with separate energy power-law exponents and time profiles carried through for each. Included in the analysis are the results of rocket measurements, which determined the spectrum down to 10 MeV twice during the event; balloon results from Fort Churchill and Minneapolis; earth satellite measurements; neutron monitors in New Hampshire and at both the North and South Pole; and riometer results from Alaska and Kiruna, Sweden. The results are given in Table 1 [see text]. The results of our analyses of other solar cosmic ray events are also included, with a general discussion of solar flare hazards in space.

  20. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    SciTech Connect

    Lisbeth A. Mitchell

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not-reportable events) identified at INL during the period October 2012 through September 2013.

  1. Analysis of Pressurized Water Reactor Primary Coolant Leak Events Caused by Thermal Fatigue

    SciTech Connect

    Atwood, Corwin Lee; Shah, Vikram Naginbhai; Galyean, William Joseph

    1999-09-01

    We present statistical analyses of pressurized water reactor (PWR) primary coolant leak events caused by thermal fatigue, and discuss their safety significance. Our worldwide data contain 13 leak events (through-wall cracking) in 3509 reactor-years, all in stainless steel piping with diameter less than 25 cm. Several types of data analysis show that the frequency of leak events (events per reactor-year) is increasing with plant age, and the increase is statistically significant. When an exponential trend model is assumed, the leak frequency is estimated to double every 8 years of reactor age, although this result should not be extrapolated to plants much older than 25 years. Difficulties in arresting this increase include lack of quantitative understanding of the phenomena causing thermal fatigue, lack of understanding of crack growth, and difficulty in detecting existing cracks.
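    The exponential trend model in the abstract implies a simple log-linear fit: if the leak frequency grows as exp(b·age), the doubling time is ln(2)/b. A minimal sketch of that calculation, using invented event counts per age bin (not the paper's worldwide data):

    ```python
    import math

    # Hypothetical (age midpoint, leak events, reactor-years) per 5-year
    # plant-age bin -- illustrative numbers only, not the study's data.
    bins = [(2.5, 1, 900), (7.5, 2, 1000), (12.5, 3, 800),
            (17.5, 4, 500), (22.5, 3, 309)]

    # Log-linear least-squares fit: log(rate) = a + b * age
    xs = [age for age, ev, ry in bins]
    ys = [math.log(ev / ry) for age, ev, ry in bins]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))

    # Years of reactor age needed for the leak frequency to double.
    doubling_time = math.log(2) / b
    ```

    With these made-up counts the fitted doubling time comes out near the order of magnitude the paper reports; as the abstract warns, such a fit should not be extrapolated beyond the age range of the data.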

  2. FFTF Event Fact Sheet root cause analysis calendar year 1985 through 1988

    SciTech Connect

    Griffin, G.B.

    1988-12-01

    The Event Fact Sheets written from January 1985 through mid-August 1988 were reviewed to determine their root causes. The review group represented many of the technical disciplines present in plant operation. The review was initiated as an internal critique aimed at maximizing the "lessons learned" from the event reporting system. The root causes were subjected to a Pareto analysis to determine the significant causal factor groups. Recommendations for correcting the high-frequency causal factors were then developed and presented to FFTF Plant management. In general, the distributions of the causal factors were found to follow industry averages closely. The impacts of the events were also studied, and it was determined that we generally report events of a severity below that of the available studies. It is therefore concluded that the recommendations for corrective action are ones to improve the overall quality of operations, not to correct significant operational deficiencies. 17 figs.
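    A Pareto analysis of root causes, as described above, ranks causal factor groups by frequency and accumulates their share of all events until the dominant ~80% is covered. A minimal sketch with invented cause tallies (the category names are hypothetical, not FFTF's):

    ```python
    from collections import Counter

    # Hypothetical root-cause tallies from event fact sheets (illustrative only).
    causes = (["procedure deficiency"] * 40 + ["personnel error"] * 25 +
              ["design deficiency"] * 15 + ["equipment failure"] * 12 +
              ["external"] * 8)

    counts = Counter(causes)
    total = sum(counts.values())

    # Rank causes by frequency and accumulate their share of all events.
    pareto = []
    cumulative = 0.0
    for cause, n in counts.most_common():
        cumulative += n / total
        pareto.append((cause, n, round(cumulative, 2)))

    # The "significant causal factor groups" are those covering ~80% of events.
    significant = [cause for cause, n, cum in pareto if cum <= 0.80]
    ```

    The high-frequency groups in `significant` would be the targets for the corrective-action recommendations.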

  3. Dispelling Illusions of Reflection: A New Analysis of the 2007 May 19 Coronal "Wave" Event

    NASA Astrophysics Data System (ADS)

    Attrill, Gemma D. R.

    2010-07-01

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of purely an MHD wave interpretation for the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified "reflections" are actually optical illusions and result from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  4. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    SciTech Connect

    Attrill, Gemma D. R.

    2010-07-20

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of purely an MHD wave interpretation for the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions and result from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  5. Brain Network Activation Analysis Utilizing Spatiotemporal Features for Event Related Potentials Classification

    PubMed Central

    Stern, Yaki; Reches, Amit; Geva, Amir B.

    2016-01-01

    The purpose of this study was to introduce an improved tool for automated classification of event-related potentials (ERPs) using spatiotemporally parcellated events incorporated into a functional brain network activation (BNA) analysis. The auditory oddball ERP paradigm was selected to demonstrate and evaluate the improved tool. Methods: The ERPs of each subject were decomposed into major dynamic spatiotemporal events. Then, a set of spatiotemporal events representing the group was generated by aligning and clustering the spatiotemporal events of all individual subjects. The temporal relationship between the common group events generated a network, which is the spatiotemporal reference BNA model. Scores were derived by comparing each subject's spatiotemporal events to the reference BNA model and were then entered into a support vector machine classifier to classify subjects into relevant subgroups. The reliability of the BNA scores (test-retest repeatability using intraclass correlation) and their utility as a classification tool were examined in the context of Target-Novel classification. Results: BNA intraclass correlation values of repeatability ranged between 0.51 and 0.82 for the known ERP components N100, P200, and P300. Classification accuracy was high when the trained data were validated on the same subjects for different visits (AUCs 0.93 and 0.95). The classification accuracy remained high for a test group recorded at a different clinical center with a different recording system (AUCs 0.81, 0.85 for 2 visits). Conclusion: The improved spatiotemporal BNA analysis demonstrates high classification accuracy. The BNA analysis method holds promise as a tool for diagnosis, follow-up and drug development associated with different neurological conditions. PMID:28066224

  6. An Analysis of the Muon-Like Events as the Fully Contained Events in the Super-Kamiokande through the Computer Numerical Experiment

    NASA Astrophysics Data System (ADS)

    Konishi, E.; Minorikawa, Y.; Galkin, V.I.; Ishiwata, M.; Nakamura, I.; Takahashi, N.; Kato, M.; Misaki, A.

    We analyze the muon-like events (single-ring images) in the Super-Kamiokande (SK) by computer numerical experiment. Assuming the neutrino oscillation parameters obtained by the SK, which characterize the type of the oscillation, we reproduce the zenith angle distribution of the muon-like events and compare it with the real distribution obtained by the SK. We also carry out the L/E analysis of the muon-like events by computer numerical experiment and compare it with that of the SK.
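    The L/E dependence behind such an analysis follows the standard two-flavor survival probability, P(νμ→νμ) = 1 − sin²(2θ)·sin²(1.27 Δm² L/E). A sketch with typical atmospheric-oscillation parameter values (not the SK best-fit numbers):

    ```python
    import math

    def survival_probability(L_km, E_GeV, sin2_2theta=1.0, dm2_eV2=2.4e-3):
        """Two-flavor muon-neutrino survival probability:
        P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
        with L in km, E in GeV, dm2 in eV^2 (standard convention)."""
        return 1.0 - sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

    # Downward-going neutrinos travel only tens of km; upward-going ones
    # cross the Earth (~1.3e4 km), hence the zenith-angle asymmetry.
    p_down = survival_probability(L_km=20, E_GeV=1.0)
    p_up = survival_probability(L_km=12800, E_GeV=1.0)
    ```

    The short-baseline probability stays near 1 while the long-baseline one is suppressed, which is the deficit the zenith angle distribution exposes.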

  7. Effects of Interventions on Relapse to Narcotics Addiction: An Event-History Analysis.

    ERIC Educational Resources Information Center

    Hser, Yih-Ing; And Others

    1995-01-01

    Event-history analysis was applied to the life history data of 581 male narcotics addicts to specify the concurrent, postintervention, and durational effects of social interventions on relapse to narcotics use. Results indicate the advisability of supporting methadone maintenance with other prevention strategies. (SLD)
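    Event-history analysis of relapse can be sketched in discrete time: estimate a hazard (probability of relapse in month t given no relapse before t) and the implied relapse-free survival curve. The records below are invented for illustration and the method shown is a generic discrete-time hazard estimate, not the study's actual models:

    ```python
    # (months observed, relapsed?) pairs; False means censored at that month.
    records = [(3, True), (5, True), (5, False), (8, True), (12, False),
               (2, True), (7, True), (12, False), (4, True), (12, False)]

    max_t = max(t for t, _ in records)
    hazard = {}
    for t in range(1, max_t + 1):
        at_risk = sum(1 for obs_t, ev in records if obs_t >= t)
        events = sum(1 for obs_t, ev in records if obs_t == t and ev)
        hazard[t] = events / at_risk if at_risk else 0.0

    # Kaplan-Meier-style survival: probability of remaining relapse-free.
    survival = []
    s = 1.0
    for t in range(1, max_t + 1):
        s *= 1.0 - hazard[t]
        survival.append(s)
    ```

    Covariates such as concurrent methadone maintenance would enter by fitting the hazard as a function of time-varying predictors, e.g. with a discrete-time logistic model.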

  8. Pitfalls in Pathways: Some Perspectives on Competing Risks Event History Analysis in Education Research

    ERIC Educational Resources Information Center

    Scott, Marc A.; Kennedy, Benjamin B.

    2005-01-01

    A set of discrete-time methods for competing risks event history analysis is presented. The approach used is accessible to the practitioner and the article describes the strengths, weaknesses, and interpretation of both exploratory and model-based tools. These techniques are applied to the impact of "nontraditional" enrollment features (working,…

  9. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    ERIC Educational Resources Information Center

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  10. Uniting Secondary and Postsecondary Education: An Event History Analysis of State Adoption of Dual Enrollment Policies

    ERIC Educational Resources Information Center

    Mokher, Christine G.; McLendon, Michael K.

    2009-01-01

    This study, as the first empirical test of P-16 policy antecedents, reports the findings from an event history analysis of the origins of state dual enrollment policies adopted between 1976 and 2005. First, what characteristics of states are associated with the adoption of these policies? Second, to what extent do conventional theories on policy…

  11. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...

  12. Rare events analysis of temperature chaos in the Sherrington-Kirkpatrick model

    NASA Astrophysics Data System (ADS)

    Billoire, Alain

    2014-04-01

    We investigate the question of temperature chaos in the Sherrington-Kirkpatrick spin glass model, applying a recently proposed rare events based data analysis method to existing Monte Carlo data. Thanks to this new method, temperature chaos is now observable for this model, even with the limited size systems that can currently be simulated.

  13. Adverse events with bismuth salts for Helicobacter pylori eradication: Systematic review and meta-analysis

    PubMed Central

    Ford, Alexander C; Malfertheiner, Peter; Giguère, Monique; Santana, José; Khan, Mostafizur; Moayyedi, Paul

    2008-01-01

    AIM: To assess the safety of bismuth used in Helicobacter pylori (H pylori) eradication therapy regimens. METHODS: We conducted a systematic review and meta-analysis. MEDLINE and EMBASE were searched (up to October 2007) to identify randomised controlled trials comparing bismuth with placebo or no treatment, or bismuth salts in combination with antibiotics as part of eradication therapy with the same dose and duration of antibiotics alone or, in combination, with acid suppression. Total numbers of adverse events were recorded. Data were pooled and expressed as relative risks with 95% confidence intervals (CI). RESULTS: We identified 35 randomised controlled trials containing 4763 patients. There were no serious adverse events occurring with bismuth therapy. There was no statistically significant difference detected in total adverse events with bismuth [relative risk (RR) = 1.01; 95% CI: 0.87-1.16], specific individual adverse events, with the exception of dark stools (RR = 5.06; 95% CI: 1.59-16.12), or adverse events leading to withdrawal of therapy (RR = 0.86; 95% CI: 0.54-1.37). CONCLUSION: Bismuth for the treatment of H pylori is safe and well-tolerated. The only adverse event occurring significantly more commonly was dark stools. PMID:19109870
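    The pooled relative risks above come from standard inverse-variance meta-analysis on the log scale. A minimal fixed-effect sketch; the per-trial counts below are invented (the review pooled 35 real trials):

    ```python
    import math

    # (events, total) in the bismuth arm and control arm per trial
    # -- illustrative numbers only.
    trials = [(12, 100, 10, 100), (8, 80, 9, 85), (20, 150, 18, 145)]

    weights, log_rrs = [], []
    for a, n1, c, n2 in trials:
        rr = (a / n1) / (c / n2)
        # Delta-method variance of log(RR).
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2
        weights.append(1 / var)
        log_rrs.append(math.log(rr))

    # Inverse-variance weighted mean of log RRs, then back-transform.
    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se = 1 / math.sqrt(sum(weights))
    pooled_rr = math.exp(pooled_log)
    ci = (math.exp(pooled_log - 1.96 * se), math.exp(pooled_log + 1.96 * se))
    ```

    A pooled RR whose 95% CI spans 1 (as for total adverse events in the review) indicates no detectable difference between arms.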

  14. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data

    PubMed Central

    Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-01-01

    Background Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. Objectives The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trials patients and participants. Methods Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data were obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight different age groups to support cross-age group comparison. Results In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years-old group was 31.41%, approximately 1.51 times higher (P=.04) than the young adult group (20-29 years) at 20.76%. The second-highest group is the 50-59 years-old group with an incidence rate of 30.09%, significantly higher (P<.001) when compared with the lowest incidence in the 20-29 years-old group. The adverse event diversity also increased with increase in patient age. Clinical studies that recruited older

  15. The Time-Scaling Issue in the Frequency Analysis of Multidimensional Extreme Events

    NASA Astrophysics Data System (ADS)

    Gonzalez, J.; Valdes, J. B.

    2004-05-01

    Extreme events, such as droughts, appear as periods during which water availability departs exceptionally from normal conditions. Several characteristics of this departure are important in analyzing drought recurrence frequency (e.g., magnitude, maximum intensity, duration, severity). In such problems, the time scale applied in the analysis becomes an issue when conventional frequency analysis approaches, generally based on run theory, are applied. Usually only one or two main event characteristics are used, and when the time scale changes by orders of magnitude, the derived frequency changes significantly, so the characterization is poor. For example, a short time scale emphasizes characteristics such as intensity, whereas a long time scale emphasizes magnitude. That variability may be overcome using a new approach in which events are treated as multidimensional in time. This is studied in this work by comparing analyses using the conventional approach and the new multidimensional approach, at time scales from daily to decadal. The main outcome of the study is the improved performance of the multidimensional technique, with which the frequency remains well characterized even across time scales differing by orders of magnitude. The ability to incorporate implicitly all event features in the time distribution makes it possible to characterize the events independently of the time scale, provided the scale does not hide the extreme features.
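    In run theory, a drought event is a run of values below a truncation threshold, summarized by duration, magnitude (cumulative deficit), and maximum intensity. A minimal sketch on an invented supply series:

    ```python
    # Run-theory sketch: identify below-threshold runs and summarize each
    # event. Series values and the threshold are illustrative only.
    series = [5.0, 4.2, 2.9, 2.1, 2.5, 4.8, 5.1, 3.0, 1.9, 1.5, 2.2, 4.9]
    threshold = 3.0

    events, run = [], []
    for value in series + [threshold + 1]:   # sentinel closes a trailing run
        if value < threshold:
            run.append(threshold - value)    # deficit below the threshold
        elif run:
            events.append({
                "duration": len(run),              # time steps in the run
                "magnitude": sum(run),             # cumulative deficit
                "max_intensity": max(run),         # worst single-step deficit
            })
            run = []
    ```

    Aggregating the series to a coarser time step before thresholding changes which of these characteristics dominate, which is exactly the time-scaling sensitivity the abstract describes.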

  16. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report

    SciTech Connect

    Lu, Shuai; Makarov, Yuri V.

    2009-04-01

    This is a report for task one of the tail event analysis project for BPA. A tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, so that the imbalance between generation and load becomes very significant. Events of this type occur infrequently and appear on the tails of the distribution of system power imbalance; they are therefore referred to as tail events. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify the sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) Large mismatch between generation and load can be caused by load forecast error, wind forecast error, and generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements; the resources need to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and wind can both become causes leading to serious issues; (4) A look-ahead tool evaluating the system balancing requirement during real-time operations and comparing it with available system resources should be very helpful to system operators in predicting similar forthcoming events and planning ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and arrest frequency deviation in an emergency situation.
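    The superposition of error sources described above can be sketched as a simple screening calculation: sum the contributions to the generation-load gap per interval and flag intervals beyond a threshold. The sample numbers and the threshold are illustrative, not BPA or ERCOT criteria:

    ```python
    # Per-hour contributions to the generation-load gap, in MW:
    # (load underforecast, wind overforecast, traditional-generator shortfall).
    # Each term adds to the shortage; values are invented for illustration.
    samples = [
        (120, 80, 0), (400, 350, 150), (50, 0, 0),
        (200, 150, 50), (600, 500, 300), (10, 0, 0),
    ]

    imbalances = [sum(s) for s in samples]

    # Flag "tail" hours: total imbalance beyond a fixed MW threshold.
    TAIL_MW = 700
    tail_hours = [i for i, mw in enumerate(imbalances) if mw >= TAIL_MW]
    ```

    A real look-ahead tool would also compare each interval against available balancing capacity (MW) and ramp rate (MW/min), per lesson (2).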

  17. Analysis of the longitudinal dependence of the downstream fluence of large solar energetic proton events

    NASA Astrophysics Data System (ADS)

    Pacheco, Daniel; Sanahuja, Blai; Aran, Angels; Agueda, Neus; Jiggens, Piers

    2016-07-01

    Simulations of solar energetic particle (SEP) intensity-time profiles are needed to estimate the radiation environment for interplanetary missions. At present, the physics-based models applied for this purpose that include a moving source of particles are not able to model the portion of the SEP intensity enhancement occurring after the coronal/interplanetary shock crossing by the observer (a.k.a. the downstream region). This is the case, for example, of the shock-and-particle model used to build the SOLPENCO2 code. SOLPENCO2 provides the statistical modelling tool developed in the ESA/SEPEM project for interplanetary missions, with synthetic SEP event simulations for virtual spacecraft located at heliocentric distances between 0.2 AU and 1.6 AU (http://dev.sepem.oma.be/). In this work we present an analysis of 168 individual SEP events observed at 1 AU from 1988 to 2013. We identify the solar eruptive phenomena associated with these SEP events, as well as the in-situ passage of interplanetary shocks. For each event, we quantify the fluence accumulated in the downstream region, i.e. after the passage of the shock, in the 11 SEPEM reference energy channels (from 5 to 300 MeV protons). First, from the subset of SEP events simultaneously detected by near-Earth spacecraft (using SEPEM reference data) and by one of the STEREO spacecraft, we select those events for which the downstream region can be clearly determined. From the 8 selected multi-spacecraft events, we find that the western observations of each event show a smaller downstream contribution than their eastern counterparts, and that the downstream-to-total fluence ratio of these events decreases as a function of energy. Hence, there is a variation of the downstream fluence with heliolongitude in SEP events. Based on this result, we study the variation of the downstream-to-total fluence ratios of the total set of individual events. We confirm the eastern-to-western decrease of the
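    The downstream-to-total fluence ratio studied above is just the time integral of the intensity profile after shock passage divided by the integral over the whole event. A sketch with trapezoidal integration on an invented profile (times, intensities, and shock arrival are illustrative):

    ```python
    # Intensity-time profile of one SEP event in a single energy channel.
    # Times in hours, intensities in arbitrary proton-flux units (invented).
    times = [0, 6, 12, 18, 24, 30, 36, 48]
    intensity = [0.0, 5.0, 9.0, 12.0, 8.0, 4.0, 2.0, 0.5]
    t_shock = 18   # in-situ shock passage splits upstream/downstream

    def trapz(xs, ys):
        """Trapezoidal-rule integral of ys over xs."""
        return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2
                   for i in range(len(xs) - 1))

    total = trapz(times, intensity)
    down = [i for i, t in enumerate(times) if t >= t_shock]
    downstream = trapz([times[i] for i in down], [intensity[i] for i in down])

    ratio = downstream / total   # downstream-to-total fluence ratio
    ```

    Repeating this per energy channel and per observer longitude gives the ratios whose east-west and energy dependence the study reports.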

  18. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
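    The simulation loop described above (scheduling effects with time delays and popping an event queue until it empties) is the classic discrete event pattern. A minimal generic sketch, not the patented tool's actual design:

    ```python
    import heapq

    class Simulator:
        """Minimal discrete event simulator: a clock plus a priority queue."""

        def __init__(self):
            self.queue = []
            self.now = 0.0
            self.log = []
            self._seq = 0   # tie-breaker keeps equal-time events ordered

        def schedule(self, delay, name, action=None):
            """Effect statement with a time delay: enqueue a future event."""
            heapq.heappush(self.queue, (self.now + delay, self._seq, name, action))
            self._seq += 1

        def run(self):
            """Execute events in time order until the event queue is empty."""
            while self.queue:
                self.now, _, name, action = heapq.heappop(self.queue)
                self.log.append((self.now, name))
                if action:
                    action(self)

    # A two-mode component whose continuous behavior is discretized as
    # timed mode-transition events (hypothetical "valve" example).
    def open_valve(sim):
        sim.schedule(5.0, "valve_closed")

    sim = Simulator()
    sim.schedule(1.0, "valve_open", open_valve)
    sim.schedule(2.0, "sensor_read")
    sim.run()
    ```

    The `log` list plays the role of the module's log files for later experimentation and analysis.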

  19. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  20. The use of significant event analysis and personal development plans in developing CPD: a pilot study.

    PubMed

    Wright, P D; Franklin, C D

    2007-07-14

    This paper describes the work undertaken by the Postgraduate Primary Care Trust (PCT) Dental Tutor for South Yorkshire and East Midlands Regional Postgraduate Dental Education Office during the first year of a two-year pilot. The tutor has special responsibility for facilitating the writing of Personal Development Plans (PDPs) and the introduction of Significant Event Analysis to the 202 general dental practitioners in the four Sheffield PCTs. Data were collected on significant events and the educational needs highlighted as a result. A hands-on workshop format was used in small practice groups and 45% of Sheffield general dental practitioners now have written PDPs compared with a 16% national average. A library of significant events has also been collated from the data collected.

  1. Analysis and visualization of single-trial event-related potentials

    NASA Technical Reports Server (NTRS)

    Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.

    2001-01-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image

  2. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    PubMed

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events.

  3. Digital Instrumentation and Control Failure Events Derivation and Analysis by Frame-Based Technique

    SciTech Connect

    Hui-Wen Huang; Chunkuan Shih; Swu Yih; Yen-Chang Tzeng; Ming-Huei Chen

    2006-07-01

    A frame-based technique, comprising physical, logical, and cognitive frames, was adopted to perform digital I and C failure event derivation and analysis for a generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, chapter 15 accident analysis of the generic SAR, and the reported NPP I and C software failure events. The case study of this research includes (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derived from actual non-ABWR digital I and C software failure events, which were reported to the LER of USNRC or the IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status were successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  4. A Unified Scenario of Near-Earth Substorm Onset: Analysis of THEMIS Events

    NASA Astrophysics Data System (ADS)

    Zhu, P.; Raeder, J.; Bhattacharjee, A.; Germaschewski, K.; Hegna, C.

    2008-12-01

    We propose an alternative scenario for the substorm onset process, based on ideal ballooning stability analysis of the near-Earth plasma sheet during recent THEMIS substorm events. In this scenario, the ballooning instability is initiated by the magnetic reconnection in the near-Earth plasma sheet, which in turn directly contributes to the trigger of a full onset. Using the solar wind data from WIND satellite observation for the substorm event as an input at dayside, we reconstructed a sequence of global magnetospheric configurations around the substorm onset by means of OpenGGCM simulation. These simulations have reproduced most of the salient features, including the onset timing, observed in the THEMIS substorm events [Raeder et al, 2008]. The ballooning instability criterion and growth rate are evaluated for the near-Earth plasma sheet region where the configuration satisfies a quasi-static equilibrium condition. Our analysis of the evolution of the near-Earth magnetotail region during the substorm events reveals a correlation between the breaching of the ballooning stability condition and the substorm onset in both temporal and spatial domains. The analysis suggests that the Earthward bulk plasma flow induced by the reconnection event in the near-Earth plasma sheet leads to the pressure build-up and creates a favorable condition for the initiation of the ballooning instability in that same region. This new alternative scenario further elaborates earlier conjectures on the roles of reconnection and ballooning instability [Bhattacharjee et al, 1998], and has the potential to integrate both the near-Earth neutral-line model [McPherron et al, 1973] and the near-Earth current-sheet-disruption model [Lui et al, 1988] into a unified model of the near-Earth substorm onset. Research supported by U.S. NSF Grant No. ATM-0542954.

  5. The association between B vitamins supplementation and adverse cardiovascular events: a meta-analysis

    PubMed Central

    Li, Wen-Feng; Zhang, Dan-Dan; Xia, Ji-Tian; Wen, Shan-Fan; Guo, Jun; Li, Zi-Cheng

    2014-01-01

    This study explores the association between B vitamin supplementation and adverse cardiovascular events. RevMan 5.1 and Stata 11.0 software were applied for the meta-analysis. The number of cardiovascular events was collected and analyzed using odds ratios (ORs) with 95% confidence intervals, in a fixed-effects or a random-effects model as appropriate. The meta-analysis includes 15 studies comprising 37,358 subjects (experimental group: 19,601; control group: 17,757). The pooled OR was 1.01 (95% CI = 0.96~1.06, P > 0.05) for the experimental group (B vitamin supplementation) vs. the control group (placebo or regular treatment), indicating no significant difference between the two groups in the overall number of cardiovascular events. Subgroup analyses likewise found no significant differences between the two groups. No publication bias was detected by Egger's linear regression test (P > 0.05). Our results indicate that the number of cardiovascular events with B vitamin supplementation during treatment is comparable to that with placebo or regular treatment; further studies are therefore necessary. PMID:25232372
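
    The fixed-effects pooling described above can be sketched as a standard inverse-variance combination of study-level log odds ratios. This is a minimal illustration, not the authors' RevMan/Stata workflow, and the 2x2 tables below are hypothetical.

```python
import math

def pooled_or_fixed(studies):
    """Fixed-effects (inverse-variance) pooling of odds ratios.

    studies: list of 2x2 tables (a, b, c, d) =
      (events treated, non-events treated, events control, non-events control).
    Returns (pooled OR, lower 95% CI, upper 95% CI).
    """
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d  # Woolf variance of log(OR)
        w = 1.0 / var                        # inverse-variance weight
        num += w * log_or
        den += w
    pooled_log, se = num / den, math.sqrt(1.0 / den)
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * se),
            math.exp(pooled_log + 1.96 * se))

# Two hypothetical trials (not data from the meta-analysis above)
or_hat, ci_lo, ci_hi = pooled_or_fixed([(30, 970, 28, 972),
                                        (45, 1455, 50, 1450)])
```

    A random-effects model would additionally estimate a between-study variance (e.g. DerSimonian-Laird) and fold it into each weight.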

  6. Fixation based event-related fmri analysis: using eye fixations as events in functional magnetic resonance imaging to reveal cortical processing during the free exploration of visual images.

    PubMed

    Marsman, Jan Bernard C; Renken, Remco; Velichkovsky, Boris M; Hooymans, Johanna M M; Cornelissen, Frans W

    2012-02-01

    Eye movements, comprising predominantly fixations and saccades, are known to reveal information about perception and cognition, and they provide an explicit measure of attention. Nevertheless, fixations have not been considered as events in the analyses of data obtained during functional magnetic resonance imaging (fMRI) experiments. Most likely, this is due to their brevity and statistical properties. Despite these limitations, we used fixations as events to model brain activation in a free viewing experiment with standard fMRI scanning parameters. First, we found that fixations on different objects in different task contexts resulted in distinct cortical patterns of activation. Second, using multivariate pattern analysis, we showed that the BOLD signal revealed meaningful information about the task context of individual fixations and about the object being inspected during these fixations. We conclude that fixation-based event-related (FIBER) fMRI analysis creates new pathways for studying human brain function by enabling researchers to explore natural viewing behavior.
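
    The core of a fixation-based event-related analysis is a design regressor built by convolving fixation onsets with a haemodynamic response function. The sketch below is a generic illustration, not the authors' pipeline; the SPM-style double-gamma shape, the onset times, and the scanning parameters are all assumptions.

```python
import numpy as np
from math import gamma

def hrf(t, a1=6.0, a2=16.0, ratio=1.0 / 6.0):
    """Double-gamma haemodynamic response function (assumed canonical shape)."""
    g = lambda t, a: t ** (a - 1) * np.exp(-t) / gamma(a)
    return g(t, a1) - ratio * g(t, a2)

tr, n_scans = 2.0, 150                # assumed repetition time (s) and scan count
dt = 0.1                              # high-resolution time grid (s)
t_hires = np.arange(0, n_scans * tr, dt)

# Hypothetical fixation onsets (seconds) from an eye tracker
onsets = np.array([3.2, 3.55, 7.8, 12.4, 12.7, 20.1])
sticks = np.zeros_like(t_hires)
sticks[np.searchsorted(t_hires, onsets)] = 1.0   # brief "stick" events

# Convolve the fixation events with the HRF, then sample once per TR
kernel = hrf(np.arange(0, 32, dt))
regressor = np.convolve(sticks, kernel)[: len(t_hires)]
regressor = regressor[:: int(round(tr / dt))]    # one value per scan
```

    The resulting regressor would enter a general linear model alongside nuisance terms; modelling each fixation separately (for pattern analysis) follows the same construction per event.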

  7. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    NASA Astrophysics Data System (ADS)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events and droughts, or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal prediction systems such as the UK MetOffice GloSea5 (Global seasonal forecasting system) ensemble forecasting method. In this way, current weather conditions can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in a world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations
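
    Once the factual and counterfactual ensembles are in hand, the attribution statement reduces to comparing exceedance probabilities across the two worlds. A minimal sketch, with synthetic Gaussian ensembles standing in for the model output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the two large ensembles: a "world as observed"
# and a counterfactual "world that might have been" without anthropogenic
# forcing (here: a simple 1-degree shift in, e.g., seasonal-max temperature).
factual = rng.normal(loc=1.0, scale=1.0, size=10_000)
counterfactual = rng.normal(loc=0.0, scale=1.0, size=10_000)

threshold = 2.0                            # magnitude of the observed extreme
p1 = (factual > threshold).mean()          # exceedance prob., observed climate
p0 = (counterfactual > threshold).mean()   # exceedance prob., no human influence

risk_ratio = p1 / p0        # PR > 1: climate change made the event more likely
far = 1.0 - p0 / p1         # fraction of attributable risk
```

    Real analyses quantify the sampling uncertainty of p1 and p0 (e.g. by bootstrap) before reporting the probability ratio.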

  8. Identification and Analysis of Storm Tracks Associated with Extreme Flood Events in Southeast and South Brazil

    NASA Astrophysics Data System (ADS)

    Lima, Carlos; Lopes, Camila

    2015-04-01

    Flood is the main natural disaster in Brazil, affecting practically all regions of the country and causing extensive economic damage and loss of life. In traditional hydrology, the study of floods is focused on a frequency analysis of the extreme events and on the fit of statistical models to define flood quantiles associated with pre-specified return periods or exceedance probabilities. The basic assumptions are randomness and temporal stationarity of the streamflow data. In this paper we seek to advance traditional flood frequency studies by using the ideas developed in the area of flood hydroclimatology, which is defined as the study of climate in the flood framework, i.e., the understanding of long-term changes in the frequency, magnitude, duration, location and seasonality of floods as driven by the interaction of regional and global patterns of ocean and atmospheric circulation. That being said, flood events are not treated as random and stationary, but as resulting from a causal chain in which exceptional floods in basins of different sizes are related to large-scale anomalies in atmospheric and ocean circulation patterns. Hence, such studies enrich the classical assumption of stationary flood hazard adopted in most flood frequency studies through a formal consideration of the physical mechanisms responsible for the generation of extreme floods, which implies recognizing the natural climate variability due to persistent and oscillatory regimes (e.g. ENSO, NAO, PDO) at many temporal scales (interannual, decadal, etc.), and climate fluctuations in response to anthropogenic changes in the atmosphere, soil use and vegetation cover. Under this framework and based on streamflow gauge and reanalysis data, we identify and analyze here the storm tracks that preceded extreme flood events in key flood-prone regions of the country (e.g. Parana and Rio Doce River basins), with such events defined based on the magnitude, duration and volume of the

  9. Contextual determinants of condom use among female sex exchangers in East Harlem, NYC: an event analysis.

    PubMed

    McMahon, James M; Tortu, Stephanie; Pouget, Enrique R; Hamid, Rahul; Neaigus, Alan

    2006-11-01

    Recent studies have revealed a variety of contexts involving HIV risk behaviors among women who exchange sex for money or drugs. Event analysis was used to identify the individual, relationship, and contextual factors that contribute to these high-risk sex exchange practices. Analyses were conducted on data obtained from 155 drug-using women who reported details of their most recent sex exchange event with male clients. The majority of sex exchange encounters (78%) involved consistent condom use. In multivariable analysis, protective behavior was associated primarily with situational and relationship variables, such as exchange location, substance use, sexual practices, and respondent/client discussion and control. In order to inform HIV prevention programs targeted to women sex exchangers, further research is needed on the contextual determinants of risk, especially with regard to condom-use negotiation and factors involving substance use that adversely affect women's ability to manage protective behavior in the context of sex exchange.

  10. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2016-06-01

    Mine systems such as ventilation systems, strata support systems, and flame-proof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, and temperature, and safety improvement of such systems is best carried out during the planning and design stage. However, existing safety analysis methods do not explicitly handle accident initiation and progression in mine systems. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach combines ET and FT modeling with a redundancy allocation technique. In this method, the concept of a top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system at either the component or the system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal accident scenarios and improve the safety of complex mine systems simultaneously.
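
    The ET/FT combination can be illustrated with independent basic events: OR/AND gates give the fault-tree top-event probability, which then feeds one branch of the event tree. The gate structure and all probabilities below are hypothetical, not taken from the paper's case study.

```python
def and_gate(*probs):
    """Probability that ALL independent basic events occur (fail)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """Probability that AT LEAST ONE independent basic event occurs."""
    out = 1.0
    for p in probs:
        out *= 1.0 - p
    return 1.0 - out

# Hypothetical fault tree for "ventilation fails":
#   fan failure OR (mains power failure AND backup power failure)
p_vent_fail = or_gate(1e-3, and_gate(5e-3, 2e-2))

# Hypothetical event-tree branch: methane inflow (initiating event),
# then failure of both safety barriers leads to an explosion.
p_initiator = 1e-2
p_ignition_control_fail = 5e-3
p_explosion = p_initiator * p_vent_fail * p_ignition_control_fail
```

    Redundancy allocation then amounts to adding parallel components (extra OR-branch survivals) wherever the top hazard probability exceeds a target.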

  12. A Bayesian Approach for Instrumental Variable Analysis with Censored Time-to-Event Outcome

    PubMed Central

    Li, Gang; Lu, Xuyang

    2014-01-01

    Instrumental variable (IV) analysis has been widely used in economics, epidemiology, and other fields to estimate the causal effects of covariates on outcomes, in the presence of unobserved confounders and/or measurement errors in covariates. However, IV methods for time-to-event outcome with censored data remain underdeveloped. This paper proposes a Bayesian approach for IV analysis with censored time-to-event outcome by using a two-stage linear model. A Markov Chain Monte Carlo sampling method is developed for parameter estimation for both normal and non-normal linear models with elliptically contoured error distributions. Performance of our method is examined by simulation studies. Our method largely reduces bias and greatly improves coverage probability of the estimated causal effect, compared to the method that ignores the unobserved confounders and measurement errors. We illustrate our method on the Women's Health Initiative Observational Study and the Atherosclerosis Risk in Communities Study. PMID:25393617

  13. Mixed Methods Analysis of Medical Error Event Reports: A Report from the ASIPS Collaborative

    DTIC Science & Technology

    2005-05-01

    … mixed methods approach to analyzing narrative error event reports. Mixed methods studies integrate one or more qualitative and quantitative techniques for … authors present a protocol for applying a mixed methods approach to the study of patient safety reporting data to inform the development of interventions … Using mixed methods to study patient safety is an effective and efficient approach to data analysis that provides both information and motivation for developing and implementing patient safety

  14. Novel data-mining methodologies for adverse drug event discovery and analysis.

    PubMed

    Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

    2012-06-01

    An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Data-mining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis.

  15. Time Series Modeling of Army Mission Command Communication Networks: An Event-Driven Analysis

    DTIC Science & Technology

    2013-06-01

    … critical events. In a detailed analysis of the email corpus of the Enron Corporation, Diesner and Carley (2005; see also Murshed et al. 2007) found that … established contacts and formal roles. The Enron crisis is instructive as a network with a critical period of failure. Other researchers have also found … Diesner, J., Frantz, T. L., & Carley, K. M. (2005). Communication networks from the Enron email corpus "It's always about the people. Enron is no

  16. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    PubMed Central

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H.; Madigan, David; Ryan, Patrick; Friedman, Carol

    2013-01-01

    Introduction: Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis. PMID:22549283

  17. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. 
However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the

  18. Analysis of 415 adverse events in dental practice in Spain from 2000 to 2010

    PubMed Central

    Perea-Pérez, Bernardo; Labajo-González, Elena; Santiago-Sáez, Andrés; Albarrán-Juan, Elena; Villa-Vigil, Alfonso

    2014-01-01

    Introduction: The effort to increase patient safety has become one of the main focal points of all health care professions, despite the fact that, in the field of dentistry, initiatives have come late and been less ambitious. The main objective of patient safety is to avoid preventable adverse events to the greatest extent possible and to limit the negative consequences of those which are unpreventable. Therefore, it is essential to ascertain what adverse events occur in each dental care activity in order to study them in depth and propose measures for prevention. Objectives: To ascertain the characteristics of the adverse events which originate from dental care, to classify them in accordance with type and origin, to determine their causes and consequences, and to detect the factors which facilitated their occurrence. Material and Methods: This study includes the general data from the series of adverse dental events of the Spanish Observatory for Dental Patient Safety (OESPO) after the study and analysis of 4,149 legal claims (both in and out of court) based on dental malpractice from the years 2000 to 2010 in Spain. Results: Implant treatments, endodontics and oral surgery display the highest frequencies of adverse events in this series (25.5%, 20.7% and 20.4% respectively). Likewise, according to the results, up to 44.3% of the adverse events which took place were due to predictable and preventable errors and complications. Conclusions: A very significant percentage were due to foreseeable and preventable errors and complications that should not have occurred. Key words: Patient safety, adverse event, medical care risk, dentistry. PMID:24880444

  19. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    SciTech Connect

    Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2016-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.

  20. Analysis of the Impact of Climate Change on Extreme Hydrological Events in California

    NASA Astrophysics Data System (ADS)

    Ashraf Vaghefi, Saeid; Abbaspour, Karim C.

    2016-04-01

    Estimating the magnitude and occurrence frequency of extreme hydrological events is required for taking preventive remedial actions against the impact of climate change on the management of water resources. Examples include: characterization of extreme rainfall events to predict urban runoff, determination of river flows, and the likely severity of drought events during the design life of a water project. In recent years California has experienced its most severe drought in recorded history, causing water stress, economic loss, and an increase in wildfires. In this paper we describe the development of a Climate Change Toolkit (CCT) and demonstrate its use in the analysis of dry and wet periods in California for the years 2020-2050, comparing the results with the historic period 1975-2005. CCT provides four modules to: i) manage big databases such as those of Global Climate Models (GCMs), ii) perform bias correction using observed local climate data, iii) interpolate gridded climate data to finer resolution, and iv) calculate continuous dry- and wet-day periods based on rainfall, temperature, and soil moisture for analysis of drought and flooding risks. We used bias-corrected meteorological data of five GCMs for the extreme CO2 emission scenario RCP8.5 for California to analyze the trend of extreme hydrological events. The findings indicate that the frequency of dry periods will increase in the central and southern parts of California. The assessment of the number of wet days and the frequency of wet periods suggests an increased risk of flooding in the northern and north-western parts of California, especially in the coastal strip. Keywords: Climate Change Toolkit (CCT), Extreme Hydrological Events, California
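
    Module (iv), the continuous dry-period calculation, amounts to finding runs of days below a rainfall threshold. A minimal sketch, where the 1 mm threshold and the toy series are assumptions rather than CCT's actual defaults:

```python
def longest_dry_spell(daily_rain_mm, dry_threshold=1.0):
    """Length (days) of the longest run of days with rainfall below threshold."""
    longest = current = 0
    for rain in daily_rain_mm:
        if rain < dry_threshold:
            current += 1
            longest = max(longest, current)
        else:
            current = 0
    return longest

# Toy series: a 3-day dry run, rain, then a 5-day dry run
series = [0.0, 0.2, 0.0, 4.5, 0.0, 0.0, 0.9, 0.0, 0.3, 6.1]
longest_dry_spell(series)  # -> 5
```

    The same run-length logic applied to wet days (rainfall above a threshold) gives the flood-risk counterpart.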

  1. Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes

    NASA Astrophysics Data System (ADS)

    Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana

    2015-04-01

    The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, a usual tool in Statistical Process Control (SPC) but an unusual one in climate studies. Here, series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to conclude whether or not there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme and normal observed rainfall days. The autocorrelation amongst maximum precipitation is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes over the remaining years under study can then be monitored by such attributes control charts. The results of applying this methodology show evidence of change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
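
    For reference, the classical limits that such a Markov-extended chart generalizes are those of a standard 3-sigma p-chart for the annual fraction of extreme days. A minimal sketch, with an assumed 5% baseline fraction:

```python
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a p-chart monitoring the annual fraction
    of extreme-rainfall days out of n days.  Assumes independent days,
    which is exactly the assumption the Markov-extended chart relaxes."""
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    lower = max(0.0, p_bar - 3.0 * sigma)
    upper = min(1.0, p_bar + 3.0 * sigma)
    return lower, upper

# Assumed baseline: 5% of days exceed the extreme-rainfall specification limit
lcl, ucl = p_chart_limits(p_bar=0.05, n=365)
```

    A year whose observed fraction of extreme days falls outside (lcl, ucl) would signal a possible change in the extreme-rainfall model.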

  2. Analysis and modeling of a hail event consequences on a building portfolio

    NASA Astrophysics Data System (ADS)

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

    North-West Switzerland was affected by a severe hail storm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 Million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 Cantons (out of 26) with a public insurance. The aim of this project is to benefit from the collected insurance data to better understand and estimate the risk of such an event. In a first step, a simple hail event simulator, which had been developed for a previous hail episode, is modified. The geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians, instead of the 1D Gaussians on profiles used in the previous version. The tool is then tested on this new event in order to establish its ability to give a fast damage estimate based on the radar image and on building values and locations. The geometrical properties are used in a further step to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, it is expected that the orientation of the roof influences the ageing of the material by controlling the frequency and amplitude of freeze-thaw cycles, thus changing the vulnerability over time. This part is addressed by calculating the hours of sunshine, which are used to derive the material temperatures. This information is then compared with insurance claims. A final part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the

  3. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    NASA Astrophysics Data System (ADS)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents for both extreme and ordinary flood events is a fundamental step in understanding the process, but also in driving proper decisions on protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with its associated sediment-graph, of a flood that on 14 October 2014 caused the formation of a small debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to note that a great part of the equations developed for the computation of bedload transport capacity, such as those of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events heavily affecting the river-bed armour. These formulas do not provide reliable results when used on events, like the one under analysis, not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations for assessing sediment transport capacity overestimate the total volume of the event by several orders of magnitude. As a consequence, the following experimental bedload transport equation has been applied (D'Agostino and Lenzi, 1999), which is valid for ordinary flood events (q: unit water discharge; qc: unit discharge of bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs ≈ 0.04 · (q - qc) · S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above equation with the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
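
    The D'Agostino and Lenzi (1999) relation quoted above is straightforward to apply once the unit discharges and slope are known. A minimal sketch; the example inputs are illustrative, not the back-calculated Rio Vanti values:

```python
def unit_bedload_rate(q, qc, slope):
    """D'Agostino and Lenzi (1999) bedload capacity for ordinary floods:
    qs = 0.04 * (q - qc) * S**1.5
    q, qc: unit water discharges (m^2/s); slope: thalweg slope (m/m).
    Returns 0 below the transport-initiation threshold qc."""
    return 0.04 * max(q - qc, 0.0) * slope ** 1.5

# Illustrative values only: unit discharge 1.2 m^2/s, threshold 0.4 m^2/s,
# thalweg slope 15% -> unit bedload rate in m^2/s
qs = unit_bedload_rate(q=1.2, qc=0.4, slope=0.15)
```

    Integrating qs over the hydrograph duration and channel width yields the event's total bedload volume, the quantity compared against the DoD estimate.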

  4. Grieving experiences amongst adolescents orphaned by AIDS: Analysis from event history calendars.

    PubMed

    Thupayagale-Tshweneagae, Gloria

    2012-09-07

    Mental health is an essential component of adolescent health and wellbeing. Mental health practitioners assess adolescents' mental health status to identify possible issues that may lead to mental health problems. However, very few of the tools used to assess the mental health status of adolescents include assessment for grieving and coping patterns. The current tools used for assessing an individual's mental health are lengthy and not comprehensive. The purpose of this study was to assess grieving patterns of adolescents orphaned by AIDS and to appraise the usefulness of an event history calendar as an assessment tool for identifying grieving experiences, in order to guide and support these adolescents through the grieving process. One hundred and two adolescents aged 14-18 years, who had been orphaned by AIDS, completed an event history calendar, reviewed it with the researcher and reported their perceptions of it. Thematic analysis of the event history calendar content revealed that it is an effective, time-efficient, adolescent-friendly tool that facilitated identification and discussion of the orphaned adolescents' grieving patterns. Crying, isolation, silence and violent outbursts were the main grieving patterns reported by adolescents orphaned by AIDS. The researcher recommends use of the event history calendar for identification of orphaned adolescents' grieving experiences. Early identification would enable mental health practitioners to support them in order to prevent the occurrence of mental illness due to maladaptive grieving.

  5. Analysis on proton fluxes during several solar events with the PAMELA experiment

    NASA Astrophysics Data System (ADS)

    Martucci, Matteo

    2015-04-01

    Charged particle production during solar events has been widely modelled in the past decades. The satellite-borne PAMELA experiment has been continuously collecting data since 2006. The apparatus is designed to study charged particles in the cosmic radiation. The combination of a permanent magnet, a silicon strip spectrometer, and a silicon-tungsten imaging calorimeter, with redundancy of instrumentation, allows very precise studies of the physics of cosmic rays over a wide energy range and with high statistics. This makes PAMELA a very suitable instrument for Solar Energetic Particle (SEP) observations. Not only does it span the energy range between ground-based neutron monitor data and observations of SEPs from space, but PAMELA also carries out the first direct measurements of the composition of the highest-energy SEP events. PAMELA has registered many SEP events in solar cycle 24, offering unique opportunities to address the question of high-energy SEP origin. A preliminary analysis of proton spectra during several events of the 24th solar cycle is presented.

  6. Identification and analysis of alternative splicing events conserved in human and mouse

    PubMed Central

    Yeo, Gene W.; Van Nostrand, Eric; Holste, Dirk; Poggio, Tomaso; Burge, Christopher B.

    2005-01-01

    Alternative pre-mRNA splicing affects a majority of human genes and plays important roles in development and disease. Alternative splicing (AS) events conserved since the divergence of human and mouse are likely of primary biological importance, but relatively few such events are known. Here we describe sequence features that distinguish exons subject to evolutionarily conserved AS, which we call alternative conserved exons (ACEs), from other orthologous human/mouse exons and integrate these features into an exon classification algorithm, acescan. Genome-wide analysis of annotated orthologous human–mouse exon pairs identified ≈2,000 predicted ACEs. Alternative splicing was verified in both human and mouse tissues by using an RT-PCR-sequencing protocol for 21 of 30 (70%) predicted ACEs tested, supporting the validity of a majority of acescan predictions. By contrast, AS was observed in mouse tissues for only 2 of 15 (13%) tested exons that had EST or cDNA evidence of AS in human but were not predicted ACEs, and AS was never observed for 11 negative control exons in human or mouse tissues. Predicted ACEs were much more likely to preserve the reading frame and less likely to disrupt protein domains than other AS events and were enriched in genes expressed in the brain and in genes involved in transcriptional regulation, RNA processing, and development. Our results also imply that the vast majority of AS events represented in the human EST database are not conserved in mouse. PMID:15708978

  7. Network meta-analysis of (individual patient) time to event data alongside (aggregate) count data

    PubMed Central

    2014-01-01

    Background Network meta-analysis methods extend the standard pair-wise framework to allow simultaneous comparison of multiple interventions in a single statistical model. Despite published work on network meta-analysis mainly focussing on the synthesis of aggregate data, methods have been developed that allow the use of individual patient-level data specifically when outcomes are dichotomous or continuous. This paper focuses on the synthesis of individual patient-level and summary time to event data, motivated by a real data example looking at the effectiveness of high compression treatments on the healing of venous leg ulcers. Methods This paper introduces a novel network meta-analysis modelling approach that allows individual patient-level (time to event with censoring) and summary-level data (event count for a given follow-up time) to be synthesised jointly by assuming an underlying, common, distribution of time to healing. Alternative model assumptions were tested within the motivating example. Model fit and adequacy measures were used to compare and select models. Results Due to the availability of individual patient-level data in our example we were able to use a Weibull distribution to describe time to healing; otherwise, we would have been limited to specifying a uniparametric distribution. Absolute effectiveness estimates were more sensitive than relative effectiveness estimates to a range of alternative specifications for the model. Conclusions The synthesis of time to event data considering individual patient-level data provides modelling flexibility, and can be particularly important when absolute effectiveness estimates, and not just relative effect estimates, are of interest. PMID:25209121
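
    The joint synthesis described above can be sketched as a shared Weibull likelihood: individual patients contribute censored time-to-healing terms, while an aggregate trial contributes a binomial count of healed patients by its follow-up time. The sketch below is illustrative only (simulated data, assumed parameter values) and omits the network/treatment-effect structure of the actual model.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_cdf(t, shape, scale):
    """P(T <= t) for a Weibull time-to-healing distribution."""
    return 1.0 - np.exp(-((t / scale) ** shape))

def joint_neg_loglik(params, ipd_times, ipd_healed, agg_n, agg_events, agg_followup):
    """Joint likelihood: censored individual times plus an aggregate count.

    Healed patients contribute the Weibull density, censored ones the
    survival function; the aggregate trial contributes a binomial count of
    healed patients by follow-up time with success probability F(t)."""
    shape, scale = np.exp(params)          # log-parameterisation keeps both positive
    t = ipd_times
    log_pdf = (np.log(shape / scale) + (shape - 1) * np.log(t / scale)
               - (t / scale) ** shape)
    log_surv = -((t / scale) ** shape)
    ll_ipd = np.sum(np.where(ipd_healed == 1, log_pdf, log_surv))
    p = weibull_cdf(agg_followup, shape, scale)
    ll_agg = np.sum(agg_events * np.log(p) + (agg_n - agg_events) * np.log(1 - p))
    return -(ll_ipd + ll_agg)

rng = np.random.default_rng(0)
true_shape, true_scale = 1.4, 20.0
times = true_scale * rng.weibull(true_shape, 300)      # individual healing times
censor = 30.0                                          # administrative censoring
healed = (times <= censor).astype(int)
obs = np.minimum(times, censor)
# One aggregate trial: 200 patients, healed count reported at week 12
n_agg, t_agg = 200, 12.0
events_agg = rng.binomial(n_agg, weibull_cdf(t_agg, true_shape, true_scale))

res = minimize(joint_neg_loglik, x0=np.log([1.0, 10.0]),
               args=(obs, healed, n_agg, events_agg, t_agg), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
```

    Both data types inform the same (shape, scale) pair, which is what lets the aggregate count be synthesised with the patient-level censored times.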

  8. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    PubMed Central

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may be misleading about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials including the recent RV144 trial exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  9. The impact of dropouts on the analysis of dose-finding studies with recurrent event data.

    PubMed

    Akacha, Mouna; Benda, Norbert

    2010-07-10

    This work is motivated by dose-finding studies, where the number of events per subject within a specified study period form the primary outcome. The aim of the considered studies is to identify the target dose for which the new drug can be shown to be as effective as a competitor medication. Given a pain-related outcome, we expect a considerable number of patients to drop out before the end of the study period. The impact of missingness on the analysis and models for the missingness process must be carefully considered. The recurrent events are modeled as over-dispersed Poisson process data, with dose as the regressor. Additional covariates may be included. Constant and time-varying rate functions are examined. Based on these models, the impact of missingness on the precision of the target dose estimation is evaluated. Diverse models for the missingness process are considered, including dependence on covariates and number of events. The performances of five different analysis methods are assessed via simulations: a complete case analysis; two analyses using different single imputation techniques; a direct-likelihood analysis and an analysis using pattern-mixture models. The target dose estimation is robust if the same missingness process holds for the target dose group and the active control group. Furthermore, we demonstrate that this robustness is lost as soon as the missingness mechanisms for the active control and the target dose differ. Of the methods explored, the direct-likelihood approach performs best, even when a missing not at random mechanism holds.
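
    The over-dispersed Poisson model for recurrent event counts with dose as regressor can be sketched as a gamma-Poisson (negative binomial) likelihood fitted by maximum likelihood. All trial parameters below are made up for illustration; the sketch ignores dropout, which is the paper's actual focus.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(1)

# Hypothetical dose-finding data: event rate falls log-linearly with dose,
# and a gamma frailty per subject induces over-dispersion.
doses = np.repeat([0.0, 1.0, 2.0, 4.0], 150)
b0_true, b1_true = np.log(5.0), -0.25
frailty = rng.gamma(shape=4.0, scale=0.25, size=doses.size)  # mean 1, alpha = 0.25
counts = rng.poisson(np.exp(b0_true + b1_true * doses) * frailty)

def nb_neg_loglik(params):
    """Negative binomial (gamma-Poisson) log-likelihood, log-linear dose effect."""
    beta0, beta1, log_alpha = params
    mu = np.exp(beta0 + beta1 * doses)
    size = 1.0 / np.exp(log_alpha)       # over-dispersion: Var = mu + alpha * mu^2
    ll = (gammaln(counts + size) - gammaln(size) - gammaln(counts + 1)
          + size * np.log(size / (size + mu)) + counts * np.log(mu / (size + mu)))
    return -ll.sum()

res = minimize(nb_neg_loglik, x0=[1.0, 0.0, -1.0], method="Nelder-Mead",
               options={"maxiter": 3000, "xatol": 1e-6, "fatol": 1e-6})
b0_hat, b1_hat, alpha_hat = res.x[0], res.x[1], np.exp(res.x[2])
```

    Fitting all subjects by full likelihood in this way is the "direct-likelihood" idea: under a missing-at-random dropout process, the observed-data likelihood remains valid without imputation.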

  10. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis of hydrological data has been developing in recent years. In particular, the copula model has been used as an effective method because it places no restriction on the choice of marginal distributions. Time-series rainfall data can be partitioned into rainfall events using an inter-event time definition, and each rainfall event has a depth and a duration. In addition, changes in rainfall depth due to climate change have been studied recently. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed, and their performance has been investigated in many studies. In the current study, bivariate frequency analysis was performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data to consider the effect of climate change. The parameter of the copula model is estimated by the inference functions for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used as marginal distributions. As a result, level curves of the copula model are obtained, and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.
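
    The IFM two-step estimation described above can be sketched on simulated data: fit each margin by maximum likelihood, transform observations to pseudo-uniforms through the fitted marginal CDFs, then maximise the copula log-density. The sketch uses a Clayton copula and Gumbel margins with made-up parameters purely as an illustration of the two steps, not the study's fitted model.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
theta_true = 2.0

# Simulate Clayton-dependent uniforms (conditional inversion), then map them
# to hypothetical Gumbel-distributed depth (mm) and duration (h) margins.
n = 2000
u = rng.uniform(size=n)
w = rng.uniform(size=n)
v = (u ** -theta_true * (w ** (-theta_true / (1 + theta_true)) - 1) + 1) ** (-1 / theta_true)
depth = stats.gumbel_r.ppf(u, loc=40.0, scale=15.0)
duration = stats.gumbel_r.ppf(v, loc=8.0, scale=3.0)

# IFM step 1: fit each margin separately by maximum likelihood.
loc_d, scale_d = stats.gumbel_r.fit(depth)
loc_t, scale_t = stats.gumbel_r.fit(duration)
uh = stats.gumbel_r.cdf(depth, loc_d, scale_d)
vh = stats.gumbel_r.cdf(duration, loc_t, scale_t)

# IFM step 2: maximise the Clayton copula log-density in theta.
def neg_copula_loglik(theta):
    s = uh ** -theta + vh ** -theta - 1
    return -np.sum(np.log(theta + 1) - (theta + 1) * (np.log(uh) + np.log(vh))
                   - (2 * theta + 1) / theta * np.log(s))

res = minimize_scalar(neg_copula_loglik, bounds=(0.05, 20.0), method="bounded")
theta_hat = res.x
```

    Level curves of the fitted joint distribution then follow by evaluating the copula at the fitted marginal CDF values on a depth-duration grid.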

  11. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    PubMed

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive or intensified passive surveillance from 2007 to 2012, descriptive analyses, national and regional rates of YFV-associated neurotropic and neurological autoimmune disease, and reporting rate ratios with their respective 95% confidence intervals were calculated for first-time vaccinees, stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND analyzed to date. Rates are similar to those of other recent studies, but in this study the age group from 5 to 9 years had the highest risk. As neurological adverse events generally have a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of the risk of infection by yellow fever virus.
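
    Reporting rates of the kind quoted above (cases per 100,000 doses) are usually accompanied by exact Poisson confidence intervals; a minimal sketch, with illustrative counts rather than the study's actual stratum data:

```python
from scipy.stats import chi2

def poisson_rate_ci(events, doses, per=100_000, conf=0.95):
    """Rate per `per` doses with an exact (Garwood) Poisson confidence interval."""
    a = 1 - conf
    lo = chi2.ppf(a / 2, 2 * events) / 2 if events > 0 else 0.0
    hi = chi2.ppf(1 - a / 2, 2 * (events + 1)) / 2
    scale = per / doses
    return events * scale, lo * scale, hi * scale

# Illustrative stratum only: 6 cases in 725,000 first doses (hypothetical
# numbers, not taken from the study).
rate, lo, hi = poisson_rate_ci(6, 725_000)
```

    Rate ratios between strata (e.g., ages 5-9 vs. all ages) can then be compared with their interval overlap or tested directly.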

  12. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that their further development is necessary to continue improving society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outdoor parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. To segment moving objects, only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
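
    The adaptive background-subtraction step can be sketched with a running-average background model on synthetic frames; the learning rate and threshold below are arbitrary illustrative choices, not the authors' detector.

```python
import numpy as np

rng = np.random.default_rng(3)

def update_background(bg, frame, alpha=0.05):
    """Exponential running average: adapts the background model to slow
    illumination changes, as in the dynamic technique described above."""
    return (1 - alpha) * bg + alpha * frame

def moving_mask(bg, frame, thresh=25.0):
    """Flag pixels that differ from the background beyond a threshold."""
    return np.abs(frame.astype(float) - bg) > thresh

# Tiny synthetic infrared-like sequence: a static scene crossed by a
# bright (warm) object eight pixels wide.
scene = rng.uniform(90, 110, size=(48, 64))
bg = scene.copy()
for t in range(5):
    frame = scene + rng.normal(0, 2, scene.shape)
    frame[20:28, 8 * t:8 * t + 8] += 80.0          # the moving object
    mask = moving_mask(bg, frame)
    # Update only where nothing moves, so the object is not absorbed.
    bg = update_background(bg, np.where(mask, bg, frame))
```

    Connected regions of the mask are then tracked and classified (person vs. vehicle) by size and velocity, as the abstract describes.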

  13. Forward flux sampling-type schemes for simulating rare events: efficiency analysis.

    PubMed

    Allen, Rosalind J; Frenkel, Daan; ten Wolde, Pieter Rein

    2006-05-21

    We analyze the efficiency of several simulation methods which we have recently proposed for calculating rate constants for rare events in stochastic dynamical systems in or out of equilibrium. We derive analytical expressions for the computational cost of using these methods and for the statistical error in the final estimate of the rate constant for a given computational cost. These expressions can be used to determine which method to use for a given problem, to optimize the choice of parameters, and to evaluate the significance of the results obtained. We apply the expressions to the two-dimensional nonequilibrium rare event problem proposed by Maier and Stein [Phys. Rev. E 48, 931 (1993)]. For this problem, our analysis gives accurate quantitative predictions for the computational efficiency of the three methods.
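
    As a toy illustration of the forward flux sampling scheme whose efficiency the paper analyzes, the sketch below estimates an escape rate for a 1D overdamped double-well system: the flux of first crossings through the first interface, multiplied by the conditional probabilities of advancing between successive interfaces. Interface placement, trial counts and dynamics parameters are arbitrary choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
dt = 1e-3
noise_amp = np.sqrt(2 * dt)            # overdamped Langevin, D = kT = 1

def force(x):
    return x - x ** 3                  # -dV/dx for V(x) = x**4/4 - x**2/2

def step(x):
    return x + force(x) * dt + noise_amp * rng.normal()

# Order parameter lambda(x) = x; basin A is x < lam_A, basin B lies past
# the last interface.
lam_A = -0.8
interfaces = [-0.7, -0.4, 0.0, 0.4, 0.7]

# Stage 1: flux of first crossings of interface 0 out of basin A.
x, crossings, seeds, in_A = -1.0, 0, [], True
n_steps = 200_000
for _ in range(n_steps):
    x_new = step(x)
    if x_new < lam_A:
        in_A = True
    if in_A and x < interfaces[0] <= x_new:
        crossings += 1
        seeds.append(x_new)
        in_A = False
    x = x_new
flux0 = crossings / (n_steps * dt)

# Stage 2: probability of reaching each next interface before falling
# back into A, started from configurations stored at the previous one.
probs = []
for i in range(len(interfaces) - 1):
    nxt, succ, new_seeds = interfaces[i + 1], 0, []
    for _ in range(200):
        y = seeds[rng.integers(len(seeds))]
        while lam_A < y < nxt:
            y = step(y)
        if y >= nxt:
            succ += 1
            new_seeds.append(y)
    probs.append(succ / 200)
    seeds = new_seeds

rate = flux0 * float(np.prod(probs))   # FFS rate estimate k_AB
```

    The efficiency question the paper addresses is precisely how to divide computational effort between the two stages (interface spacing and trials per interface) so as to minimise the variance of this rate estimate at fixed cost.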

  14. The May 17, 2012 Solar Event: Back-Tracing Analysis and Flux Reconstruction with PAMELA

    NASA Technical Reports Server (NTRS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bravar, U.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Christian, Eric R.

    2016-01-01

    The PAMELA space experiment is providing first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data from the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  15. Video analysis of dust events in full-tungsten ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Brochard, F.; Shalpegin, A.; Bardin, S.; Lunt, T.; Rohde, V.; Briançon, J. L.; Pautasso, G.; Vorpahl, C.; Neu, R.; The ASDEX Upgrade Team

    2017-03-01

    Fast video data recorded during seven consecutive operation campaigns (2008-2012) in full-tungsten ASDEX Upgrade have been analyzed with an algorithm developed to automatically detect and track dust particles. A total of 2425 discharges have been analyzed, corresponding to 12 204 s of plasma operation. The analysis aimed at precisely identifying and sorting the discharge conditions responsible for dust generation or remobilization. Dust rates are found to be significantly lower than in tokamaks with carbon PFCs. Significant dust events occur mostly during off-normal plasma phases such as disruptions, particularly those preceded by vertical displacement events (VDEs). Dust rates are also increased, but to a lesser extent, during type-I ELMy H-modes. The influences of disruption energy, heating scenario, vessel venting and vessel vibrations are also presented.

  16. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    PubMed

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  17. The May 17, 2012 solar event: back-tracing analysis and flux reconstruction with PAMELA

    NASA Astrophysics Data System (ADS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bravar, U.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; Christian, E. C.; De Donato, C.; de Nolfo, G. A.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A. M.; Karelin, A. V.; Koldashov, S. V.; Koldobskiy, S.; Krutkov, S. Y.; Kvashnin, A. N.; Lee, M.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A. G.; Menn, W.; Mergè, M.; Mikhailov, V. V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S. B.; Ryan, J. M.; Sarkar, R.; Scotti, V.; Simon, M.; Sparvoli, R.; Spillantini, P.; Stochaj, S.; Stozhkov, Y. I.; Vacchi, A.; Vannuccini, E.; Vasilyev, G. I.; Voronov, S. A.; Yurkin, Y. T.; Zampa, G.; Zampa, N.; Zverev, V. G.

    2016-02-01

    The PAMELA space experiment is providing first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low-energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data from the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  18. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables

    PubMed Central

    Yin, Yu; Yao, Dezhong

    2016-01-01

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an “event of relation” with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals. PMID:27389921

  19. Causal Inference Based on the Analysis of Events of Relations for Non-stationary Variables.

    PubMed

    Yin, Yu; Yao, Dezhong

    2016-07-08

    The main concept behind causality involves both statistical conditions and temporal relations. However, current approaches to causal inference, focusing on the probability vs. conditional probability contrast, are based on model functions or parametric estimation. These approaches are not appropriate when addressing non-stationary variables. In this work, we propose a causal inference approach based on the analysis of Events of Relations (CER). CER focuses on the temporal delay relation between cause and effect, and a binomial test is established to determine whether an "event of relation" with a non-zero delay is significantly different from one with zero delay. Because CER avoids parameter estimation of non-stationary variables per se, the method can be applied to both stationary and non-stationary signals.
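
    The delay-based binomial test at the heart of CER can be illustrated on synthetic event trains. This is a loose, minimal reading of the idea (the baseline rate and trial counts here are simplifying assumptions), not the authors' implementation:

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(5)

# Two binary event trains: y reproduces x's events 3 samples later
# (x "causes" y), plus independent background events in y.
n, delay = 2000, 3
x = rng.random(n) < 0.05
y = np.zeros(n, dtype=bool)
y[delay:] = x[:-delay]
y |= rng.random(n) < 0.01

def relation_count(x, y, d):
    """Events of relation: x-events followed by a y-event exactly d samples later."""
    if d == 0:
        return int(np.sum(x & y))
    return int(np.sum(x[:-d] & y[d:]))

n_trials = int(x[:-delay].sum())            # x-events that could be followed
k = relation_count(x, y, delay)
# Baseline success probability: the larger of the zero-delay relation rate
# and the marginal rate of y (a simplifying choice for this sketch).
base = max(relation_count(x, y, 0) / max(int(x.sum()), 1), float(y.mean()))
# Binomial test: is the non-zero-delay relation rate above the baseline?
test = binomtest(k, n_trials, base, alternative="greater")
```

    No model of the processes generating x and y is fitted at any point, which is why the approach carries over to non-stationary signals.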

  20. Seamless Level 2/Level 3 probabilistic risk assessment using dynamic event tree analysis

    NASA Astrophysics Data System (ADS)

    Osborn, Douglas Matthew

    The current approach to Level 2 and Level 3 probabilistic risk assessment (PRA) using the conventional event-tree/fault-tree methodology requires pre-specification of the order of event occurrence, which may vary significantly in the presence of uncertainties. Manual preparation of input data to evaluate the possible scenarios arising from these uncertainties may also lead to errors from faulty/incomplete input preparation, and their execution using serial runs may lead to computational challenges. A methodology has been developed for Level 2 analysis using dynamic event trees (DETs) that removes these limitations through systematic and mechanized quantification of the impact of aleatory uncertainties on possible consequences and their likelihoods. The methodology is implemented using the Analysis of Dynamic Accident Progression Trees (ADAPT) software. For the purposes of this work, aleatory uncertainties are defined as those arising from the stochastic nature of the processes under consideration, such as the variability of weather, in which the probability of weather patterns is predictable but the conditions at the time of the accident are a matter of chance. Epistemic uncertainties are regarded as those arising from the uncertainty in the model (system code) input parameters (e.g., friction or heat transfer correlation parameters). This work conducts a seamless Level 2/3 PRA using a DET analysis. The research helps to quantify and potentially reduce the magnitude of the source term uncertainty currently experienced in Level 3 PRA. Current techniques have been demonstrated with aleatory uncertainties for environmental releases of radioactive materials. This research incorporates epistemic and aleatory uncertainties in a phenomenologically consistent manner through the use of DETs. The DETs were determined using the ADAPT framework and by linking ADAPT with MELCOR, MELMACCS, and the MELCOR Accident Consequence Code System, Version 2.
Aleatory and epistemic uncertainties are thereby incorporated throughout the seamless Level 2/3 analysis.

  1. Non-parametric frequency analysis of extreme values for integrated disaster management considering probable maximum events

    NASA Astrophysics Data System (ADS)

    Takara, K. T.

    2015-12-01

    This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with computer-intensive statistics (CIS) resampling such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than the design level for flood control. Traditional parametric frequency analysis of extreme values includes the following steps: Step 1: collecting and checking extreme-value data; Step 2: enumerating probability distributions that would fit the data well; Step 3: parameter estimation; Step 4: testing goodness of fit; Step 5: checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: selection of the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years; overestimation examples are also demonstrated. Bootstrap resampling can provide bias correction for the NPM and can also give the estimation accuracy as the bootstrap standard error. The NPM thus avoids various difficulties in the above-mentioned steps of the traditional PM. Probable maximum events are also incorporated into the NPM as an upper bound of the hydrological variable; probable maximum precipitation (PMP) and probable maximum flood (PMF) can be combined with the NPM as such bounds. An approach for incorporating these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea encourages a more integrated approach by geoscientists and statisticians, and prompts practitioners to consider worst-case disasters in their disaster-management planning and practice.
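
    The non-parametric quantile estimate plus bootstrap standard error described above can be sketched in a few lines; the sample here is simulated, standing in for an observed annual-maximum record longer than 100 years.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated 120-year annual-maximum rainfall series (mm), standing in for
# an observed extreme-value sample with record length over 100 years.
sample = 40.0 + 25.0 * rng.gumbel(size=120)

def np_quantile(x, T):
    """Non-parametric T-year event: empirical (1 - 1/T) quantile,
    interpolated between order statistics -- no distribution is fitted."""
    return np.quantile(x, 1.0 - 1.0 / T)

q100 = np_quantile(sample, 100)        # 100-year event, distribution-free

# CIS step: bootstrap resampling gives the estimation accuracy (standard
# error) and a simple bias correction for the quantile estimate.
B = 2000
boot = np.array([np_quantile(rng.choice(sample, sample.size, replace=True), 100)
                 for _ in range(B)])
se = boot.std(ddof=1)
bias_corrected = 2 * q100 - boot.mean()
```

    A probable maximum value (PMP/PMF) would enter as an upper bound truncating the resampled quantiles.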

  2. Meta-Analysis of Relation of Vital Exhaustion to Cardiovascular Disease Events.

    PubMed

    Cohen, Randy; Bavishi, Chirag; Haider, Syed; Thankachen, Jincy; Rozanski, Alan

    2017-04-15

    To assess the net impact of vital exhaustion on cardiovascular events and all-cause mortality, we conducted a systematic search of PubMed, EMBASE, and PsycINFO (through April 2016) to identify all studies which investigated the relation between vital exhaustion (VE) and health outcomes. Inclusion criteria were as follows: (1) a cohort study (prospective cohort or historical cohort) consisting of adults (>18 years); (2) at least 1 self-reported or interview-based assessment of VE or exhaustion; (3) evaluated the association between vital exhaustion or exhaustion and relevant outcomes; and (4) reported adjusted risk estimates of vital exhaustion/exhaustion for outcomes. Maximally adjusted effect estimates with 95% CIs along with variables used for adjustment in multivariate analysis were also abstracted. Primary study outcome was cardiovascular events. Secondary outcomes were stroke and all-cause mortality. Seventeen studies (19 comparisons) with a total of 107,175 participants were included in the analysis. Mean follow-up was 6 years. VE was significantly associated with an increased risk for cardiovascular events (relative risk 1.53, 95% CI 1.28 to 1.83, p <0.001) and all-cause mortality (relative risk 1.48, 95% CI 1.28 to 1.72, p <0.001). VE also showed a trend for increased incident stroke (relative risk 1.46, 95% CI 0.97 to 2.21, p = 0.07). Subgroup analyses yielded similar results. VE is a significant risk factor for cardiovascular events, comparable in potency to common psychosocial risk factors. Our results imply a need to more closely study VE, and potentially related states of exhaustion, such as occupational burnout.
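
    Pooled relative risks of the kind reported above are typically obtained by inverse-variance weighting of log risk estimates, with a DerSimonian-Laird between-study variance for the random-effects model. The sketch below uses made-up study-level RRs and CIs, not the 17 studies actually pooled.

```python
import numpy as np

# Hypothetical per-study relative risks and 95% CIs (illustrative only).
rr = np.array([1.4, 1.7, 1.3, 1.9, 1.5])
ci_lo = np.array([1.1, 1.2, 0.9, 1.3, 1.1])
ci_hi = np.array([1.8, 2.4, 1.9, 2.8, 2.0])

y = np.log(rr)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE recovered from CI width
w = 1 / se ** 2                                      # fixed-effect weights

# Between-study variance tau^2: DerSimonian-Laird moment estimator.
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
df = len(y) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1 / (se ** 2 + tau2)                          # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
pooled_rr = np.exp(y_re)
ci = (np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re))
```

    With homogeneous studies (Q below its degrees of freedom) tau^2 truncates to zero and the random-effects result coincides with the fixed-effect pooling.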

  3. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    SciTech Connect

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J.

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses, which include: both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transient without scram (ATWS). Steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment-bypass nature. After being benchmarked against the APETs in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.

  4. Uncertainty Analysis of Climate Change Impact on Extreme Rainfall Events in the Apalachicola River Basin, Florida

    NASA Astrophysics Data System (ADS)

    Wang, D.; Hagen, S.; Bacopoulos, P.

    2011-12-01

    Climate change impact on rainfall patterns during the summer season (May-August) in the Apalachicola River basin (Florida Panhandle coast) is assessed using an ensemble of regional climate models (RCMs). Rainfall data for both baseline and future years (30-year periods) are obtained from the North American Regional Climate Change Assessment Program (NARCCAP), where the A2 emission scenario is used. Trend analysis is conducted based on historical rainfall data from three weather stations. Two methods are used to assess the climate change impact on the rainfall intensity-duration-frequency (IDF) curves: the maximum-intensity percentile-based method, and the sequential bias correction combined with the maximum-intensity percentile-based method. As a preliminary result from one RCM, extreme rainfall intensity is found to increase significantly, with the increase more dramatic in closer proximity to the coast. The projected changes in rainfall patterns (spatial and temporal, mean and extreme values) provide guidance for developing adaptation and mitigation strategies for water resources management and ecosystem protection. More rainfall events move from July to June during future years for all three stations; upstream, the variability of the timing of extreme rainfall increases, and more extreme events occur in June and August instead of May. These temporal shifts of extreme rainfall events will increase the probability of simultaneous heavy rainfall upstream and downstream in June, during which flooding will be enhanced. An uncertainty analysis of the climate change impact on extreme rainfall events will be presented based on the simulations from the ensemble of RCMs.

  5. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    NASA Astrophysics Data System (ADS)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time-series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths were recorded for natural mortality, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association for children younger than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and among elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at the highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had both immediate and delayed effects on mortality.
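
    The time-stratified case-crossover contrast can be sketched with a Mantel-Haenszel odds ratio over matched day strata (case day vs. the same weekday within the same 30-day block). The series, exposure rates and true OR below are synthetic illustrations, not the study's data or its conditional-logistic implementation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic 10-year daily series: haze indicator and deaths whose odds
# roughly double on haze days (true OR close to 2).
n_days = 3650
days = np.arange(n_days)
haze = rng.random(n_days) < 0.12
p_case = np.where(haze, 0.10, 0.05)
case = rng.random(n_days) < p_case

num = den = 0.0                        # Mantel-Haenszel accumulators
for d in np.flatnonzero(case):
    # Referent days: same weekday, same 30-day block, excluding the case day.
    stratum = days[(days // 30 == d // 30) & (days % 7 == d % 7) & (days != d)]
    m = 1 + stratum.size               # stratum size: case day plus referents
    exposed_ctrl = int(haze[stratum].sum())
    if haze[d]:                        # exposed case vs. unexposed referents
        num += (stratum.size - exposed_ctrl) / m
    else:                              # unexposed case vs. exposed referents
        den += exposed_ctrl / m
or_mh = num / den
```

    Because each case serves as its own control, slowly varying confounders (season, long-term trends, personal characteristics) cancel within strata; lagged effects are handled by shifting the exposure series before matching.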

  6. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L. (Sandia National Labs., Albuquerque, NM)

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs.

  7. An event-related analysis of P300 by simultaneous EEG/fMRI

    NASA Astrophysics Data System (ADS)

    Wang, Li-qun; Wang, Mingshi; Mizuhara, Hiroaki

    2006-09-01

    In this study, the P300 induced by visual stimuli was examined with simultaneous EEG/fMRI. Event-related analysis served this methodological trial, which combines the best temporal resolution (EEG) with the best spatial resolution (fMRI) to estimate brain function. A 64-channel MR-compatible EEG amplifier (BrainAmp, Brain Products GmbH, Germany) was used for measurement during fMRI scanning. The reference channel was between Fz, Cz and Pz. The raw EEG sampling rate was 5 kHz, and MRI noise reduction was performed. EEG recording was synchronized with the MRI scan by our original stimulus system, and an oddball paradigm (four-oriented Landolt ring presentation) was performed in the standard manner. After P300 segmentation, the timing of P300 was exported to an event-related analysis of the fMRI data with the SPM99 software. In a single-subject study, significant activations appeared in the left superior frontal area, Broca's area and both sides of the parietal lobule when P300 occurred. This suggests that P300 may reflect an integration carried out by top-down signals from frontal to parietal areas, regulating an attention/logical-judgment process. Compared with other current methods, event-related analysis by simultaneous EEG/fMRI excels in describing cognitive processes realistically, unifying temporal and spatial information. Further examination and demonstration of the results obtained are expected to promote this powerful method.

  8. Hydroacoustic monitoring of a salt cavity: an analysis of precursory events of the collapse

    NASA Astrophysics Data System (ADS)

    Lebert, F.; Bernardie, S.; Mainsant, G.

    2011-09-01

    One of the main features of "post-mining" research relates to available methods for monitoring mine-degradation processes that could directly threaten surface infrastructures. In this respect, GISOS, a French scientific interest group, is investigating techniques for monitoring the eventual collapse of underground cavities. One of the methods under investigation was monitoring the stability of a salt cavity by recording microseismic precursor signals that may indicate the onset of rock failure. The data were recorded in a salt mine in Lorraine (France) during the monitored, controlled collapse of 2,000,000 m3 of rocks surrounding a cavity at 130 m depth. Monitoring in the 30 Hz to 3 kHz frequency range highlights the occurrence of high-energy events during periods of macroscopic movement, once the layers had ruptured; they appear to be the consequence of post-rupture rock movements related to the intense deformation of the cavity roof. Moreover, the analysis shows the presence of some interesting precursory signals before the cavity collapsed. They occurred a few hours before the failure phases, when the rocks were being weakened and damaged, and they originated from the damaging and breaking process, in which micro-cracks appear and then coalesce. From these results we expect that deeper signal analysis and statistical analysis of the complete event time distribution (several million files) will allow us to finalize a complete typology of the signal families and their relations to the stages of cavity evolution over the five years of monitoring.

  9. Time-frequency analysis of event-related potentials: a brief tutorial.

    PubMed

    Herrmann, Christoph S; Rach, Stefan; Vosskuhl, Johannes; Strüber, Daniel

    2014-07-01

    Event-related potentials (ERPs) reflect cognitive processes and are usually analyzed in the so-called time domain. Additional information on cognitive functions can be assessed when analyzing ERPs in the frequency domain and treating them as event-related oscillations (EROs). This procedure results in frequency spectra but lacks information about the temporal dynamics of EROs. Here, we describe a method, called time-frequency analysis, that allows analyzing both the frequency of an ERO and its evolution over time. In a brief tutorial, the reader will learn how to use wavelet analysis in order to compute time-frequency transforms of ERP data. Basic steps as well as potential artifacts are described. Descriptions are given in textual form rather than formulas, with numerous figures illustrating the topics. Recommendations on how to present frequency and time-frequency data in journal articles are provided. Finally, we briefly review studies that have applied time-frequency analysis to mismatch negativity paradigms. The deviant stimulus of such a paradigm evokes an ERO in the theta frequency band that is stronger than for the standard stimulus. Conversely, the standard stimulus evokes a stronger gamma-band response than does the deviant. This is interpreted in the context of the so-called match-and-utilization model.
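
    The wavelet-based time-frequency transform such a tutorial describes can be sketched as a convolution with complex Morlet wavelets. The toy "ERP" below (a 6 Hz theta burst), the sampling rate, cycle count and frequency grid are all arbitrary choices for illustration, not the tutorial's own material.

```python
import numpy as np

def morlet_tfr(signal, fs, freqs, n_cycles=5):
    """Time-frequency power map via convolution with complex Morlet wavelets.
    Returns an array of shape (len(freqs), len(signal))."""
    tfr = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)              # wavelet width in time
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t - t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit energy
        tfr[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return tfr

# Toy "ERP": a 6 Hz theta burst between 0.8 s and 1.2 s, zero elsewhere
fs = 250
t = np.arange(0, 2, 1 / fs)
erp = np.where((t > 0.8) & (t < 1.2), np.sin(2 * np.pi * 6 * t), 0.0)
freqs = np.arange(3, 31)                                  # 3-30 Hz grid
power = morlet_tfr(erp, fs, freqs)
peak_freq = freqs[np.argmax(power.sum(axis=1))]
```

    The power map localizes the burst both in frequency (near 6 Hz) and in time, which is exactly the information a plain frequency spectrum discards.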

  10. Scientific Content Analysis (SCAN) Cannot Distinguish Between Truthful and Fabricated Accounts of a Negative Event

    PubMed Central

    Bogaard, Glynis; Meijer, Ewout H.; Vrij, Aldert; Merckelbach, Harald

    2016-01-01

    The Scientific Content Analysis (SCAN) is a verbal veracity assessment method that is currently used worldwide by investigative authorities. Yet, research investigating the accuracy of SCAN is scarce. The present study tested whether SCAN was able to accurately discriminate between true and fabricated statements. To this end, 117 participants were asked to write down one true and one fabricated statement about a recent negative event that happened in their lives. All statements were analyzed using 11 criteria derived from SCAN. Results indicated that SCAN was not able to correctly classify true and fabricated statements. Lacking empirical support, the application of SCAN in its current form should be discouraged. PMID:26941694

  11. Rare event computation in deterministic chaotic systems using genealogical particle analysis

    NASA Astrophysics Data System (ADS)

    Wouters, J.; Bouchet, F.

    2016-09-01

    In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein-Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function.
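
    As a rough sketch of a genealogical particle (selection/mutation) scheme of the kind discussed, the code below estimates a small over-threshold probability for an Ornstein-Uhlenbeck path. The tilting strength, threshold, particle count and discretization are assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def rare_event_prob(a=2.0, alpha=3.0, n=5000, T=1.0, dt=0.01, sel_every=10):
    """Genealogical particle estimate of P(X_T >= a) for the OU process
    dX = -X dt + dW, using exp(alpha * X) as the selection potential.
    Incremental weights telescope along each lineage, so the final
    reweighting by exp(-alpha * X_T) keeps the estimator unbiased."""
    steps = int(T / dt)
    x = np.zeros(n)
    log_norm = 0.0            # log of the product of mean weights
    v_prev = x.copy()         # particle values at the last selection step
    for k in range(1, steps + 1):
        # mutation: one Euler-Maruyama step of the OU dynamics
        x = x - x * dt + np.sqrt(dt) * rng.normal(size=n)
        if k % sel_every == 0:
            w = np.exp(alpha * (x - v_prev))              # incremental weights
            log_norm += np.log(w.mean())
            idx = rng.choice(n, size=n, p=w / w.sum())    # selection / cloning
            x = x[idx]
            v_prev = x.copy()
    return np.exp(log_norm) * np.mean((x >= a) * np.exp(-alpha * x))

p_hat = rare_event_prob()
```

    Selection pushes the ensemble toward high values, so the threshold is hit by a sizeable fraction of particles rather than the tiny fraction a crude Monte Carlo run would produce.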

  12. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    NASA Astrophysics Data System (ADS)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, as one possible originating cause of tsunamis, and propose to analyze Twitter messages to capture testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets, and thus for observing correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the forecast points of the tsunami scenario. We also intend to use Twitter analysis for situation picture

  13. Mechanical Engineering Safety Note: Analysis and Control of Hazards Associated with NIF Capacitor Module Events

    SciTech Connect

    Brereton, S

    2001-08-01

    the total free oil available in a capacitor (approximately 10,900 g), on the order of 5% or less. The estimates of module pressure were used to estimate the potential overpressure in the capacitor bays after an event. It was shown that the expected capacitor bay overpressure would be less than the structural tolerance of the walls. Thus, it does not appear necessary to provide any pressure relief for the capacitor bays. The ray tracing analysis showed the new module concept to be 100% effective at containing fragments generated during the events. The analysis demonstrated that all fragments would impact an energy-absorbing surface on the way out of the module. Thus, there is high confidence that energetic fragments will not escape the module. However, since the module was not tested, it was recommended that a form of secondary containment on the walls of the capacitor bays (e.g., 1.0 inch of fire-retardant plywood) be provided. Any doors to the exterior of the capacitor bays should be of equivalent thickness of steel or suitably armored with a thickness of plywood. Penetrations in the ceiling of the interior bays (leading to the mechanical equipment room) do not require additional protection to form a secondary barrier. The mezzanine and the air handling units (penetrations lead directly to the air handling units) provide a sufficient second layer of protection.

  14. Classification of persistent heavy rainfall events over South China and associated moisture source analysis

    NASA Astrophysics Data System (ADS)

    Liu, Ruixin; Sun, Jianhua; Wei, Jie; Fu, Shenming

    2016-08-01

    Persistent heavy rainfall events (PHREs) over South China during 1981-2014 were selected and classified by an objective method, based on daily precipitation data at 752 stations in China. The circulation characteristics, as well as the dry-cold air and moisture sources, of each type of PHRE were examined. The main results are as follows. A total of 32 non-typhoon-influenced PHREs in South China were identified over the study period. By correlation analysis, the PHREs are divided into three types: SC-A, with its main rainbelt located in the coastal areas and the northeast of Guangdong Province; SC-B, with its main rainbelt between Guangdong Province and Guangxi Region; and SC-C, with its main rainbelt located in the north of Guangxi Region. For the SC-A events, dry-cold air flowed to South China under the steering effect of mid-tropospheric troughs originating from the Ural Mountains and the West Siberian Plain, whereas the SC-C events were not influenced by cold air from high latitudes. There were three water vapor pathways from low-latitude areas for both the SC-A and SC-C PHREs. The tropical Indian Ocean was the main water vapor source for these two PHRE types, while the South China Sea also contributed to the SC-C PHREs. In addition, the SC-A events were also influenced by moist, cold air originating from the Yellow Sea. Generally, the SC-C PHREs belonged to a warm-sector rainfall type, whose precipitation areas were dominated by southwesterly wind; convergence in wind speed was the main cause of the precipitation.

  15. Evaluation of automated streamwater sampling during storm events for total mercury analysis.

    PubMed

    Riscassi, Ami L; Converse, Amber D; Hokanson, Kelly J; Scanlon, Todd M

    2010-10-06

    Understanding the processes by which mercury is mobilized from soil to stream is currently limited by a lack of observations during high-flow events, when the majority of this transport occurs. An automated technique to collect stream water for unfiltered total mercury (HgT) analysis was systematically evaluated in a series of laboratory experiments. Potential sources of error investigated were 1) carry-over effects associated with sequential sampling, 2) deposition of HgT into empty bottles prior to sampling, and 3) deposition to or evasion from samples prior to retrieval. Contamination from carry-over effects was minimal (<2%) and HgT deposition to open bottles was negligible. Potentially greater errors are associated with evasive losses of HgT from uncapped samples, with higher temperatures leading to greater evasion. These evasive losses were found to take place primarily within the first eight hours. HgT associated with particulate material is much less prone to evasion than HgT in dissolved form. A field test conducted during a high-flow event confirmed unfiltered HgT concentrations sampled with an automated system were comparable to those taken manually, as the mean absolute difference between automated and manual samples (10%) was similar to the mean difference between duplicate grab samples (9%). Results from this study have demonstrated that a standard automated sampler, retrofitted with appropriately cleaned fluoropolymer tubing and glass bottles, can effectively be used for collection of streamwater during high-flow events for low-level mercury analysis.

  16. Wood anatomical analysis of Alnus incana and Betula pendula injured by a debris-flow event.

    PubMed

    Arbellay, Estelle; Stoffel, Markus; Bollschweiler, Michelle

    2010-10-01

    Vessel chronologies in ring-porous species have been successfully employed in the past to extract the climate signal from tree rings. Environmental signals recorded in vessels of ring-porous species have also been used in previous studies to reconstruct discrete events of drought, flooding and insect defoliation. However, very little is known about the ability of diffuse-porous species to record environmental signals in their xylem cells. Moreover, time series of wood anatomical features have only rarely been used to reconstruct former geomorphic events. This study was therefore undertaken to characterize the wood anatomical response of diffuse-porous Alnus incana (L.) Moench and Betula pendula Roth to debris-flow-induced wounding. Tree microscopic response to wounding was assessed through the analysis of wood anatomical differences between injured rings formed in the debris-flow event year and uninjured rings formed in the previous year. The two ring types were examined close to and opposite the injury in order to determine whether wound effects on xylem cells decrease with increasing tangential distance from the injury. Image analysis was used to measure vessel parameters as well as fiber and parenchyma cell (FPC) parameters. The results of this study indicate that injured rings are characterized by smaller vessels compared with uninjured rings. By contrast, FPC parameters were not found to differ significantly between injured and uninjured rings. Vessel and FPC parameters mainly remained constant with increasing tangential distance from the injury, except for a higher proportion of vessel lumen area opposite the injury within A. incana. This study highlights the existence of anatomical tree-ring signatures, in the form of smaller vessels, related to past debris-flow activity, and presents a new methodological approach to dating injuries inflicted on trees by geomorphic processes.

  17. Analysis of the variation of the 0°C isothermal altitude during rainfall events

    NASA Astrophysics Data System (ADS)

    Zeimetz, Fränz; Garcìa, Javier; Schaefli, Bettina; Schleiss, Anton J.

    2016-04-01

    In numerous countries of the world (USA, Canada, Sweden, Switzerland,…), dam safety verifications for extreme floods refer to the so-called Probable Maximum Flood (PMF). According to the World Meteorological Organization (WMO), the PMF is determined from the PMP (Probable Maximum Precipitation); the PMF estimation is performed by routing the PMP through a hydrological simulation model. The PMP-PMF simulation is normally event-based; therefore, if no further information is available, the simulation requires assumptions about initial soil conditions such as saturation or snow cover. Temperature series are also of interest for PMP-PMF simulations. Temperature values can be deduced not only from direct measurements: using a temperature gradient, the 0°C isothermal altitude also yields temperature estimates at the ground. For practitioners, referring to temperature through the isothermal altitude is convenient and simpler, because a single value gives information over a large region under the assumption of a certain temperature gradient. The analysis presented here addresses the evolution of the 0°C isothermal altitude during rainfall events, based on meteorological soundings from the two sounding stations Payerne (CH) and Milan (I). Furthermore, hourly rainfall and temperature data are available from 110 pluviometers spread over Swiss territory. The evolution of the 0°C isothermal altitude is analyzed for different precipitation durations based on the meteorological measurements mentioned above. The results show that, on average, the isothermal altitude tends to decrease during rainfall events and that the duration of the altitude loss correlates with the duration of the rainfall. A significant difference in altitude loss appears when the soundings from Payerne and Milan are compared.
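
    The temperature-gradient reasoning in the abstract can be made concrete with a small sketch, assuming a constant standard-atmosphere lapse rate of 6.5 °C/km; the station altitude and temperature below are hypothetical values, not data from the study.

```python
def isothermal_0C_altitude(t_ground_c, z_station_m, lapse_c_per_m=0.0065):
    """0 degC isothermal altitude implied by a ground temperature,
    assuming a constant lapse rate (6.5 degC/km, standard atmosphere)."""
    return z_station_m + t_ground_c / lapse_c_per_m

def temperature_at(z_m, z0_iso_m, lapse_c_per_m=0.0065):
    """Air temperature at altitude z_m, given the 0 degC isothermal altitude."""
    return (z0_iso_m - z_m) * lapse_c_per_m

# A station at 500 m reading 13 degC puts the freezing level at 2500 m
z0 = isothermal_0C_altitude(13.0, 500.0)
```

    Inverting the same relation gives the temperature at any altitude from a single isothermal-altitude value, which is why one sounding can characterize a whole region.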

  18. Numerical analysis of seismic events distributions on the planetary scale and celestial bodies astrometrical parameters

    NASA Astrophysics Data System (ADS)

    Bulatova, Dr.

    2012-04-01

    Modern research in the Earth sciences is developing from descriptions of individual natural phenomena toward systematic, complex research in interdisciplinary areas. For numerical analysis of three-dimensional (3D) systems of this kind, the author proposes a space-time technology (STT), based on a Ptolemaic geocentric system, consisting of two modules, each with its own coordinate system: (1) a 3D model of the Earth, whose coordinates are provided by databases of the Earth's events (here, seismic), and (2) a compact model of the relative motion of celestial bodies in space-time as seen from Earth, known as the "Method of a moving source" (MDS), which was developed (Bulatova, 1998-2000) for 3D space. Module (2) was developed as a continuation of the geocentric Ptolemaic system of the world, built on the astronomical parameters of heavenly bodies. Based on aggregated data from the space and Earth sciences, their systematization, and cooperative analysis, this is an attempt to establish a cause-effect relationship between the positions of celestial bodies (Moon, Sun) and the Earth's seismic events.

  19. Analysis of Loss-of-Offsite-Power Events 1998–2013

    SciTech Connect

    Schroeder, John Alton

    2015-02-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. Risk analyses suggest that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP events and the subsequent restoration of offsite power are therefore important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants, based on operating experience during calendar years 1997 through 2013. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator failure modes considered are failure to start, failure to load and run, and failure to run for more than 1 hour. The component reliability estimates and the underlying data are trended for the most recent 10-year period, while yearly reliability estimates are provided for the entire active period. No statistically significant trends in LOOP frequencies over the 1997–2013 period are identified, although a significant trend in grid-related LOOP frequency may exist that is not easily detected by a simple analysis. Statistically significant increases in recovery times after grid- and switchyard-related LOOPs are identified.
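
    A LOOP frequency estimate of the kind trended in such reports can be sketched as a Poisson rate with an asymptotic-normal confidence interval. The event count and exposure time below are hypothetical, and the report itself uses more refined estimators; this only illustrates the shape of the calculation.

```python
import math

def loop_frequency(n_events, reactor_critical_years, z=1.645):
    """Poisson MLE of a LOOP frequency (events per reactor-critical-year)
    with an asymptotic-normal 90% confidence interval."""
    T = reactor_critical_years
    rate = n_events / T
    se = math.sqrt(n_events) / T          # SE of the Poisson rate estimate
    return rate, max(rate - z * se, 0.0), rate + z * se

# Hypothetical counts, not taken from the report:
# 12 grid-related LOOP events in 1500 reactor-critical-years
rate, lo, hi = loop_frequency(12, 1500.0)
```

    Comparing such intervals across successive time windows is the simplest way to ask whether a frequency is trending, which is the kind of question the report addresses with formal trend tests.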

  20. Detecting event-related recurrences by symbolic analysis: applications to human language processing

    PubMed Central

    beim Graben, Peter; Hutt, Axel

    2015-01-01

    Quasi-stationarity is ubiquitous in complex dynamical systems. In brain dynamics, there is ample evidence that event-related potentials (ERPs) reflect such quasi-stationary states. In order to detect them from time series, several segmentation techniques have been proposed. In this study, we elaborate a recent approach for detecting quasi-stationary states as recurrence domains by means of recurrence analysis and subsequent symbolization methods. We address two pertinent problems of contemporary recurrence analysis: optimizing the size of recurrence neighbourhoods and identifying symbols from different realizations for sequence alignment. As possible solutions for these problems, we suggest a maximum entropy criterion and a Hausdorff clustering algorithm. The resulting recurrence domains for single-subject ERPs are obtained as partition cells reflecting quasi-stationary brain states. PMID:25548270
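
    A drastically simplified reading of the recurrence-domain idea: build a recurrence matrix, treat each distinct recurrence pattern (matrix column) as one symbol, and pick the neighbourhood size that maximizes the symbol entropy. The toy signal and candidate grid are assumptions; the paper's actual criterion and clustering are more elaborate.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """R[i, j] = 1 where states i and j lie within eps of each other."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def symbol_entropy(r):
    """Shannon entropy of the symbol distribution obtained by treating
    each distinct recurrence-matrix column as one symbol."""
    _, counts = np.unique(r, axis=1, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum()

# Maximum-entropy choice of the neighbourhood size eps (1-D toy signal;
# the paper works with multichannel ERP data)
x = np.sin(np.linspace(0, 4 * np.pi, 200))
candidates = np.linspace(0.05, 1.0, 20)
best_eps = max(candidates, key=lambda e: symbol_entropy(recurrence_matrix(x, e)))
r = recurrence_matrix(x, best_eps)
```

    Runs of identical symbols then mark candidate recurrence domains, i.e. the quasi-stationary states the abstract describes.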

  1. Short Term Forecasts of Volcanic Activity Using An Event Tree Analysis System and Logistic Regression

    NASA Astrophysics Data System (ADS)

    Junek, W. N.; Jones, W. L.; Woods, M. T.

    2011-12-01

    An automated event tree analysis system for estimating the probability of short term volcanic activity is presented. The algorithm is driven by a suite of empirical statistical models that are derived through logistic regression. Each model is constructed from a multidisciplinary dataset that was assembled from a collection of historic volcanic unrest episodes. The dataset consists of monitoring measurements (e.g. InSAR, seismic), source modeling results, and historic eruption activity. This provides a simple mechanism for simultaneously accounting for the geophysical changes occurring within the volcano and the historic behavior of analog volcanoes. The algorithm is extensible and can be easily recalibrated to include new or additional monitoring, modeling, or historic information. Standard cross validation techniques are employed to optimize its forecasting capabilities. Analysis results from several recent volcanic unrest episodes are presented.
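
    The empirical models driving such an event tree are logistic regressions. As a hedged sketch, with synthetic features and labels standing in for the paper's multidisciplinary dataset, a minimal gradient-descent fit looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Logistic regression by batch gradient descent; returns weights
    (bias term appended last)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)   # gradient of the log-loss
    return w

def predict_prob(w, x):
    """Modeled probability of unrest for a new feature vector."""
    return 1 / (1 + np.exp(-np.append(x, 1.0) @ w))

# Synthetic unrest episodes: [seismicity-rate z-score, deformation z-score];
# labels follow a noisy linear rule, purely for illustration
X = rng.normal(size=(300, 2))
y = (X @ np.array([1.5, 1.0]) + 0.3 * rng.normal(size=300) > 0).astype(float)
w = fit_logistic(X, y)
```

    In the system described above, several such models (one per monitoring discipline) would feed their probabilities into the branches of the event tree.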

  2. Spousal communication and contraceptive use in rural Nepal: an event history analysis.

    PubMed

    Link, Cynthia F

    2011-06-01

    This study analyzes longitudinal data from couples in rural Nepal to investigate the influence of spousal communication about family planning on their subsequent contraceptive use. The study expands current understanding of the communication-contraception link by (a) exploiting monthly panel data to conduct an event history analysis, (b) incorporating both wives' and husbands' perceptions of communication, and (c) distinguishing effects of spousal communication on the use of four contraceptive methods. The findings provide new evidence of a strong positive impact of spousal communication on contraceptive use, even when controlling for confounding variables. Wives' reports of communication are substantial explanatory factors in couples' initiation of all contraceptive methods examined. Husbands' reports of communication predict couples' subsequent use of male-controlled methods. This analysis advances our understanding of how marital dynamics, as well as husbands' perceptions of these dynamics, influence fertility behavior, and should encourage policies to promote greater integration of men into family planning programs.

  3. 76 FR 70768 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft Report for Comment; Correction AGENCY: Nuclear Regulatory Commission. ACTION: Draft NUREG; request...

  4. Modeling propensity to move after job change using event history analysis and temporal GIS

    NASA Astrophysics Data System (ADS)

    Vandersmissen, Marie-Hélène; Séguin, Anne-Marie; Thériault, Marius; Claramunt, Christophe

    2009-03-01

    The research presented in this paper analyzes the emergent residential behaviors of individual actors in a context of profound social changes in the work sphere. It incorporates a long-term view in the analysis of the relationships between social changes in the work sphere and these behaviors. The general hypothesis is that social changes produce complex changes in the long-term dynamics of residential location behavior. More precisely, the objective of this paper is to estimate the propensity of professional workers to move house after a change of workplace. Our analysis draws on data from a biographical survey using a retrospective questionnaire that enables a posteriori reconstitution of the familial, professional and residential lifelines of professional workers since their departure from their parents’ home. The survey was conducted in 1996 in the Quebec City Metropolitan Area, which, much like other Canadian cities, has experienced a substantial increase in “unstable” work, even for professionals. The approach is based on event history analysis, a temporal geographic information system, and exploratory spatial analysis of the model’s residuals. Results indicate that 48.9% of respondents moved after a job change and that the most important factors influencing the propensity to move house after a job change are home tenure (for lone adults as well as couples) and number of children (for couples only). We also found that moving is associated with changing neighborhood for owners, while tenants or co-tenants tend to stay in the same neighborhood. The probability of moving 1 year after a job change is 0.10 for both lone adults and couples, while after 2 years the household structure seems to have an impact: the probability increases to 0.23 for lone adults and to 0.21 for couples. The outcome of this research contributes to furthering our understanding of a familial decision (to move) following a professional event (change of job), controlling for household structure

  5. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    NASA Astrophysics Data System (ADS)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research.
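
    The MLE-plus-metaheuristic pairing can be illustrated with a deliberately crude example: a pure random search maximizing the exponential-decay log-likelihood to recover a half-life from synthetic event times. The true half-life, search bounds and sample size are invented, and a real implementation would use a far stronger metaheuristic and a richer likelihood (peaks plus background).

```python
import numpy as np

rng = np.random.default_rng(7)

def log_likelihood(lam, t):
    """Log-likelihood of decay constant lam for exponential event times t."""
    return len(t) * np.log(lam) - lam * t.sum()

def random_search_mle(t, lo=1e-3, hi=1.0, iters=5000):
    """Crude metaheuristic (pure random search) maximizing the likelihood;
    a stand-in for the MLE + metaheuristic pairing described above."""
    best_lam, best_ll = None, -np.inf
    for lam in rng.uniform(lo, hi, size=iters):
        ll = log_likelihood(lam, t)
        if ll > best_ll:
            best_lam, best_ll = lam, ll
    return best_lam

# Synthetic activation-decay data: half-life 10 s -> lambda = ln(2)/10
t = rng.exponential(scale=10 / np.log(2), size=4000)
lam_hat = random_search_mle(t)
half_life = np.log(2) / lam_hat
```

    With event-mode storage, the same fit can be re-run on any post-selected energy window or time frame without touching the experimental setup.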

  6. Multi-Sensory Aerosol Data and the NRL NAAPS model for Regulatory Exceptional Event Analysis

    NASA Astrophysics Data System (ADS)

    Husar, R. B.; Hoijarvi, K.; Westphal, D. L.; Haynes, J.; Omar, A. H.; Frank, N. H.

    2013-12-01

    Beyond scientific exploration and analysis, multi-sensory observations along with models are finding increasing application in operational air quality management. EPA's Exceptional Event (EE) Rule allows the exclusion of data strongly influenced by impacts from "exceptional events," such as smoke from wildfires or dust from abnormally high winds. The EE Rule encourages the use of satellite observations and other non-standard data, along with models, as evidence in the formal documentation of EE samples for exclusion. Thus, the implementation of the EE Rule is uniquely suited to the application of integrated multi-sensory observations, both directly and indirectly through their assimilation into an aerosol simulation model. Here we report the results of a project, NASA and NAAPS Products for Air Quality Decision Making, which uses observations from multiple satellite sensors, surface-based aerosol measurements, and the NRL Aerosol Analysis and Prediction System (NAAPS) model, which assimilates key satellite observations. The satellite sensor data for detecting and documenting smoke and dust events include MODIS AOD and images; OMI aerosol index and tropospheric NO2; and AIRS CO. The surface observations include the EPA regulatory PM2.5 network; the IMPROVE/STN aerosol chemical network; the AIRNOW PM2.5 mass network; and surface meteorological data. Within this application, a crucial role is assigned to the NAAPS model for estimating the surface concentration of windblown dust and biomass smoke. The operational model assimilates quality-assured daily MODIS data using 2DVAR to adjust the model concentrations, and uses a CALIOP-based climatology to adjust the vertical profiles, at 6-hour intervals. The assimilation of data from multiple satellites significantly contributes to the usefulness of NAAPS for EE analysis. The NAAPS smoke and dust simulations were evaluated using the IMPROVE/STN chemical data. The multi-sensory observations along with the model simulations are integrated into a web

  7. BOLIVAR-tool for analysis and simulation of metocean extreme events

    NASA Astrophysics Data System (ADS)

    Lopatoukhin, Leonid; Boukhanovsky, Alexander

    2015-04-01

    Metocean extreme events are caused by the combination of multivariate and multiscale processes that depend on each other at different scales (short-term, synoptic, annual, and year-to-year variability). There is no simple method for their estimation with controllable tolerance; in practice, extreme analysis is therefore often reduced to the exploration of various methods and models with the aim of decreasing the uncertainty of the estimates. A researcher thus needs multifaceted computational tools covering the various branches of extreme analysis. BOLIVAR is multi-functional computational software for researchers and engineers who study extreme environmental conditions in order to design and build offshore structures and floating objects. In this sense BOLIVAR is a Problem Solving Environment (PSE). It contains a set of computational modules implementing the IDM, AMS, POT, MENU, and SINTEF methods for extreme analysis, and a set of modules for the stochastic and hydrodynamic simulation of metocean processes at various scales. BOLIVAR is thus a tool that simplifies the resource-consuming computational experiments needed to explore metocean extremes in both univariate and multivariate cases. It includes field ARMA models for short-term variability, a spatial-temporal random pulse model for synoptic variability (the alternation of storms and calms), and a cyclostationary model of annual and year-to-year variability. The combination of the above modules and data sources allows the estimation of: omnidirectional and directional extremes (with T-year return periods); multivariate extremes (sets of parameters) and their impacts on marine structures and floating objects; and extremes of spatial-temporal fields (including the trajectories of T-year storms). An employment of concurrent methods for
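
    As an illustration of one of the methods listed above, the POT (peaks-over-threshold) approach fits a Generalized Pareto Distribution to threshold excesses and extrapolates T-year return levels. The sketch below is not BOLIVAR code; it is a minimal Python/SciPy example on synthetic wave-height data, with all numbers (threshold quantile, record length) chosen purely for illustration.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
# Synthetic 20-year record of 3-hourly significant wave heights (illustrative only)
hs = rng.gamma(shape=2.0, scale=1.2, size=20 * 365 * 8)

threshold = np.quantile(hs, 0.98)            # POT threshold choice
excess = hs[hs > threshold] - threshold      # excesses over the threshold

# Fit the Generalized Pareto Distribution to the excesses
shape, _, scale = genpareto.fit(excess, floc=0.0)

# T-year return level: the value exceeded on average once every T years
events_per_year = len(excess) / 20.0
T = 100.0
p = 1.0 - 1.0 / (T * events_per_year)
return_level = threshold + genpareto.ppf(p, shape, loc=0.0, scale=scale)
```

    In a real analysis the threshold choice would be checked (e.g. with mean-residual-life plots) and the uncertainty of the return level quantified, for example by profile likelihood or bootstrap.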

  8. Multi-parameter Analysis and Visualization of Groundwater Quality during High River Discharge Events

    NASA Astrophysics Data System (ADS)

    Page, R. M.; Huggenberger, P.; Lischeid, G.

    2010-12-01

    The filter capacity of alluvial aquifers enables many groundwater extraction wells near rivers to provide high-quality drinking water under average flow and surface water quality conditions. However, during high river discharge events, the bacterial load of the groundwater increases and the extracted water is no longer safe for the production of drinking water without treatment. Optimal management of production wells requires well-founded knowledge of river-groundwater interaction and of the transport of microorganisms across this interface. Because of the spatial and temporal variability of river-groundwater interaction, monitoring individual parameters does not always correctly identify the actual potential risk of contamination of drinking water. Identifying situations where the quality is insufficient can be difficult in systems influenced by many factors, including natural and artificial recharge as well as extraction. As high-resolution sampling for waterborne pathogens during flood events is cost- and time-intensive, proxies are usually used in addition to short-term microbial monitoring studies. The resulting datasets are multi-dimensional and have variable temporal resolutions. For these reasons, it is necessary to apply procedures in which multivariate datasets can be considered simultaneously and inherent patterns visualized. These patterns are important for determining the governing processes and can be used to assess the actual potential risk of contamination due to infiltrating surface water. In this study, a multi-parameter dataset, including specific conductivity and faecal indicators (Escherichia coli, enterococci and aerobic mesophilic germs), was analyzed using a combination of the Self-Organizing Map (SOM) and Sammon's mapping techniques. The SOM analysis made it possible to differentiate between the effects of groundwater extraction and fluctuations of the river level on groundwater levels, electric conductivity and temperature in the well field
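
    A Self-Organizing Map of the kind used in this study can be sketched in a few lines of NumPy. The code below is a minimal, generic SOM, not the authors' implementation; the three "parameters" stand in for quantities such as electric conductivity, temperature and groundwater level, and all values are synthetic.

```python
import numpy as np

def train_som(data, grid=(6, 6), n_iter=2000, seed=0):
    """Minimal Self-Organizing Map: maps multivariate samples onto a 2-D grid."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    weights = rng.normal(size=(gx * gy, data.shape[1]))
    coords = np.array([(i, j) for i in range(gx) for j in range(gy)], float)
    for t in range(n_iter):
        lr = 0.5 * (1.0 - t / n_iter)               # decaying learning rate
        sigma = max(0.5, 3.0 * (1.0 - t / n_iter))  # shrinking neighbourhood
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
        h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights

def bmu_index(weights, x):
    """Grid unit onto which a sample is mapped."""
    return int(np.argmin(((weights - x) ** 2).sum(axis=1)))

# Synthetic two-regime dataset, e.g. baseflow vs. high-discharge conditions
rng = np.random.default_rng(1)
baseflow = rng.normal([0.0, 0.0, 0.0], 0.5, size=(100, 3))
flood = rng.normal([5.0, 5.0, 5.0], 0.5, size=(100, 3))
som = train_som(np.vstack([baseflow, flood]))
```

    After training, samples from the two regimes map to different regions of the grid, which is the property such studies exploit to visualize inherent patterns.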

  9. The Identification of Seismo and Volcanomagnetic Events Using Non-stationary Analysis of Geomagnetic Field Variations.

    NASA Astrophysics Data System (ADS)

    Fedi, M.; Gonçalves, P.; Johnston, M.; La Manna, M.

    Many studies have shown a clear correlation between volcanic and/or seismic activity and time variations of local geomagnetic fields, called seismomagnetic (SM) and/or volcanomagnetic (VM) effects. SM and VM effects can be generated by various physical processes, such as piezomagnetism, tectonomagnetism and electrokinetic phenomena. Relevant parameters are the event duration, the event magnitude and the magnetometer sample rate. Here, we present some results obtained from a non-stationary analysis of geomagnetic time series that focuses on the automatic detection of possible SM and VM events. Several approaches are considered. The first, based on the continuous wavelet transform, provides a multiresolution reading of the signal, expanded in time-scale space. The second uses a time-variant adaptive algorithm (recursive least squares, RLS) that allows the detection of time intervals in which important statistical variations of the signal occur. Finally, we investigate a third technique relying on multifractal analysis. The latter allows estimation of the local regularity of a time series path, in order to detect unusual singularities. Different multifractal models, such as multifractional Brownian motions (mBm's), were used to test the methodology before applying it to synthetic simulations of geomagnetic signals. In our simulations, we took into account theoretical SM and/or VM effects deriving from fault rupture and overpressured magma chambers. We applied these methodologies to two real-world data sets, recorded on Mt Etna (volcanic area) during the volcanic activity that occurred in 1981, and in North Palm Springs (seismic area) during the earthquake of July 8, 1986, respectively. In both cases, all techniques were effective in automatically identifying the geomagnetic time variations likely induced by volcanic and/or seismic activity, and the results are in good agreement with the indices provided by real volcanic and seismic measurements.
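
    The wavelet-based detection step can be illustrated with a NumPy-only continuous wavelet transform using the Ricker (Mexican-hat) wavelet. This is a generic sketch, not the authors' code; the "geomagnetic" record is synthetic noise with one injected SM/VM-like transient.

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet of width parameter a, unit L2 norm."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt(signal, widths):
    """Continuous wavelet transform by direct convolution at each width."""
    out = np.empty((len(widths), len(signal)))
    for i, w in enumerate(widths):
        out[i] = np.convolve(signal, ricker(min(10 * int(w), len(signal)), w),
                             mode="same")
    return out

# Synthetic record: background noise plus a short Gaussian transient near sample 1020
rng = np.random.default_rng(1)
n = 2048
sig = rng.normal(scale=0.3, size=n)
sig[1000:1040] += 3.0 * np.exp(-0.5 * ((np.arange(40) - 20) / 8.0) ** 2)

coef = cwt(sig, widths=np.arange(4, 40, 4))
t_detect = int(np.argmax(np.abs(coef)) % n)   # time index of the strongest response
```

    The largest wavelet coefficient localizes the anomaly in time while the corresponding width indicates its duration, which is how a time-scale expansion supports automatic event detection.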

  11. Analysis of a snowfall event produced by mountains waves in Guadarrama Mountains (Spain)

    NASA Astrophysics Data System (ADS)

    Gascón, Estíbaliz; Sánchez, José Luis; Fernández-González, Sergio; Merino, Andrés; López, Laura; García-Ortega, Eduardo

    2014-05-01

    Heavy snowfall events are fairly uncommon precipitation processes in the Iberian Peninsula. When large amounts of snow accumulate in large cities whose populations are unaccustomed to or unprepared for heavy snow, these events have a major impact on daily activities. On 16 January 2013, an extreme snowstorm occurred in the Guadarrama Mountains (Madrid, Spain) during an experimental winter campaign within the TECOAGUA Project. Strong northwesterly winds, high precipitation and temperatures close to 0°C were observed throughout the whole day. During this episode, it was possible to continuously measure different variables involved in the development of the convection using a multichannel microwave radiometer (MMWR). The significant increase in cloud thickness observed vertically by the MMWR, together with the 43 mm of precipitation registered in 24 hours at the Navacerrada station (Madrid), led us to consider that we were facing an episode of strong winter convection. Images from the Meteosat Second Generation (MSG) satellite suggested that the main source of the convection was the formation of mountain waves on the southern face of the Guadarrama Mountains. The event was simulated at high resolution with the WRF mesoscale model, and the analysis is based on these simulations and the observational data. Finally, the continuous measurements obtained with the MMWR allowed us to monitor the vertical structure of the atmosphere above the Guadarrama Mountains with a temporal resolution of 2 minutes. This instrument has a clear advantage for monitoring short-term episodes of this kind in comparison to radiosondes, which usually provide data only at 0000 and 1200 UTC. Acknowledgements: This study was supported by the following grants: GRANIMETRO (CGL2010-15930); MICROMETEO (IPT-310000-2010-22). The authors would like to thank the Regional Government of Castile-León for its financial support through project LE220A11-2.

  12. Social network changes and life events across the life span: a meta-analysis.

    PubMed

    Wrzus, Cornelia; Hänel, Martha; Wagner, Jenny; Neyer, Franz J

    2013-01-01

    For researchers and practitioners interested in social relationships, the question remains as to how large social networks typically are, and how their size and composition change across adulthood. On the basis of predictions of socioemotional selectivity theory and social convoy theory, we conducted a meta-analysis on age-related social network changes and the effects of life events on social networks using 277 studies with 177,635 participants from adolescence to old age. Cross-sectional as well as longitudinal studies consistently showed that (a) the global social network increased up until young adulthood and then decreased steadily, (b) both the personal network and the friendship network decreased throughout adulthood, (c) the family network was stable in size from adolescence to old age, and (d) other networks with coworkers or neighbors were important only in specific age ranges. Studies focusing on life events that occur at specific ages, such as transition to parenthood, job entry, or widowhood, demonstrated network changes similar to such age-related network changes. Moderator analyses detected that the type of network assessment affected the reported size of global, personal, and family networks. Period effects on network sizes occurred for personal and friendship networks, which have decreased in size over the last 35 years. Together the findings are consistent with the view that a portion of normative, age-related social network changes are due to normative, age-related life events. We discuss how these patterns of normative social network development inform research in social, evolutionary, cultural, and personality psychology.

  13. Radar analysis of the life cycle of Mesoscale Convective Systems during the 10 June 2000 event

    NASA Astrophysics Data System (ADS)

    Rigo, T.; Llasat, M. C.

    2005-12-01

    The 10 June 2000 event was the largest flash flood event in the northeast of Spain in the late 20th century, both as regards its meteorological features and its considerable social impact. This paper focuses on the analysis of the structures that produced the heavy rainfall, especially from the point of view of meteorological radar. Because this case is a good example of a Mediterranean flash flood event, a final objective of this paper is to describe the evolution of the rainfall structure clearly enough to be understood by an interdisciplinary forum. It could then be useful not only for improving conceptual meteorological models, but also for application in downscaling models. The main precipitation structure was a Mesoscale Convective System (MCS) that crossed the region and developed as a consequence of the merging of two previous squall lines. The paper analyses the main meteorological features that led to the development and triggering of the heavy rainfall, with special emphasis on the features of this MCS, its life cycle and its dynamics. To this end, 2-D and 3-D algorithms were applied to the imagery recorded over the complete life cycle of the structures, which lasted approximately 18 h. Mesoscale and synoptic information were also considered. The results show that it was an NS-MCS, quasi-stationary during its mature stage as a consequence of the formation of a convective train, the different displacement directions of the 2-D and 3-D structures, including the propagation of new cells, and the slow movement of the convergence line associated with the Mediterranean mesoscale low.
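
    The 2-D identification step on which such radar algorithms are built can be sketched with simple thresholding and connected-component labelling. This is a generic illustration with a synthetic reflectivity field, not the algorithm used in the paper; the 43 dBZ threshold is an assumption for the example.

```python
import numpy as np
from scipy import ndimage

# Toy radar reflectivity field (dBZ): stratiform background plus two convective cores
rng = np.random.default_rng(3)
field = rng.normal(18.0, 4.0, size=(100, 100))
field[20:35, 30:45] += 35.0       # convective structure 1
field[60:80, 60:75] += 34.0       # convective structure 2

# 2-D structure identification: threshold, then label connected components
mask = field > 43.0
labels, n_structures = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_structures + 1))
centroids = ndimage.center_of_mass(mask, labels, range(1, n_structures + 1))
```

    Tracking the labelled structures across successive radar scans, e.g. by centroid proximity or area overlap, is what yields the displacement directions and life cycles discussed above.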

  14. Monitoring As A Helpful Means In Forensic Analysis Of Dams Static Instability Events

    NASA Astrophysics Data System (ADS)

    Solimene, Pellegrino

    2013-04-01

    Monitoring is a means of controlling the behavior of a structure that, during its operational life, is subject to external actions, both ordinary loading conditions and disturbing ones; the latter occur in the random manner described by the statistical parameter of the return period. The analysis of monitoring data is crucial for forming a reasoned opinion on the reliability of the structure and its components, and it also makes it possible to identify, within the overall operational scenario, the time at which to prepare interventions aimed at maintaining optimal levels of functionality and safety. Monitoring as prevention is coupled with the activity of the forensic engineer who, appointed by the judiciary after an accident, applies his experience (the "scientific knowledge") to an "inverse analysis" in which he sums up the results of a survey, drawing also on the data sets produced by continuous monitoring of causes and effects, so as to determine the correlations between these factors. His activity aims to contribute to the identification of the typicality of an event which, together with the "causal link" between conduct and event and its unlawfulness, is among the factors for judging whether there is a hypothesis of crime, and therefore liability according to law. In Italy there are about 10,000 dams of varying sizes, but only a small portion of them are classified as "large dams" and subjected to a rigorous program of regular inspections and monitoring in application of specific rules. The rest ("small" dams, conventionally defined as such by the standard, but not small in their impact on the territory) receives a heterogeneous response from the local authorities entrusted with this task: there is therefore a scenario of high potential risk, determined by the presence of incompletely controlled structures, some of which stand above heavily populated areas. Risk can be brought back to acceptable levels if they were implemented with the

  15. Kickoff to Conflict: A Sequence Analysis of Intra-State Conflict-Preceding Event Structures

    PubMed Central

    D'Orazio, Vito; Yonamine, James E.

    2015-01-01

    While many studies have suggested or assumed that the periods preceding the onset of intra-state conflict are similar across time and space, few have empirically tested this proposition. Using the Integrated Crisis Early Warning System's domestic event data in Asia from 1998–2010, we subject this proposition to empirical analysis. We code the similarity of government-rebel interactions in sequences preceding the onset of intra-state conflict to those preceding further periods of peace using three different metrics: Euclidean, Levenshtein, and mutual information. These scores are then used as predictors in a bivariate logistic regression to forecast whether we are likely to observe conflict in neither, one, or both of the states. We find that our model accurately classifies cases where both sequences precede peace, but struggles to distinguish between cases in which one sequence escalates to conflict and where both sequences escalate to conflict. These findings empirically suggest that generalizable patterns exist between event sequences that precede peace. PMID:25951105
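
    Of the three sequence-similarity metrics the study uses, the Levenshtein (edit) distance is the easiest to sketch. The example below is a generic implementation applied to two hypothetical event-code sequences, not the ICEWS coding itself.

```python
def levenshtein(a, b):
    """Edit distance between two event sequences (any comparable items)."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (x != y)))   # substitution
        prev = cur
    return prev[-1]

# Hypothetical government-rebel interaction sequences (event codes)
pre_conflict = ["protest", "riot", "clash", "raid"]
pre_peace = ["protest", "talks", "talks", "accord"]
d = levenshtein(pre_conflict, pre_peace)
```

    Distances computed between all pairs of pre-onset windows can then feed a classifier such as the bivariate logistic regression described above.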

  16. Forecasting and nowcasting process: A case study analysis of severe precipitation event in Athens, Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, Ioannis; Nastos, Panagiotis; Avgoustoglou, Euripides; Gofa, Flora; Pytharoulis, Ioannis; Kamberakis, Nikolaos

    2016-04-01

    An early warning process is the result of the interplay between forecasting and nowcasting. Therefore, (1) accurate measurement and prediction of the spatial and temporal distribution of rainfall over an area and (2) an efficient and appropriate description of the catchment properties are important issues for atmospheric hazards (severe precipitation, floods, flash floods, etc.). In this paper, a forecasting and nowcasting analysis is presented for a severe precipitation event that took place on September 21, 2015 in Athens, Greece. The severe precipitation caused a flash flood in the suburbs of Athens, with significant impacts on the local society. Quantitative precipitation forecasts from the European Centre for Medium-Range Weather Forecasts and from the COSMO.GR atmospheric model, including ensemble precipitation forecasts and probabilistic approaches, are analyzed as tools in the forecasting process. Satellite remote sensing data close to and six hours prior to the flash flood are presented, accompanied by radar products from the Hellenic National Meteorological Service, illustrating the ability to depict the convection process.
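
    The probabilistic use of an ensemble mentioned above amounts to turning member forecasts into exceedance probabilities. A minimal sketch, with entirely hypothetical 24-h precipitation values and a hypothetical 50 mm warning threshold:

```python
import numpy as np

def exceedance_probability(ensemble_qpf, threshold_mm):
    """Fraction of ensemble members forecasting precipitation above a threshold."""
    return float((np.asarray(ensemble_qpf, float) > threshold_mm).mean())

# Hypothetical 24-h quantitative precipitation forecasts (mm) from 10 members
members = [12.0, 55.0, 80.0, 31.0, 66.0, 90.0, 8.0, 72.0, 49.0, 61.0]
p_warn = exceedance_probability(members, 50.0)   # P(24-h precip > 50 mm)
```

    Forecasters typically compare such probabilities against pre-agreed warning levels rather than relying on a single deterministic run.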

  17. Defining adverse events in manual therapy: an exploratory qualitative analysis of the patient perspective.

    PubMed

    Carlesso, Lisa C; Cairney, John; Dolovich, Lisa; Hoogenes, Jennifer

    2011-10-01

    Rare, serious, and common, benign adverse events (AE) are associated with manual therapy (MT) techniques. A proposed standard for defining AE in MT practice has been published, but it did not include the patient perspective. Research comparing clinician and patient reporting of AE demonstrates that several differences exist; for example, the reporting of objective versus subjective events. The objective of this study was to describe how patients define AE associated with MT techniques. A descriptive qualitative design was employed. Semi-structured interviews were used with a purposive sample of patients (n = 13) receiving MT from physiotherapy, chiropractic and osteopathic practices in Ontario, Canada. The interview guide was informed by existing evidence and consultation with content and methodological experts. Interviews were audiotaped and transcribed verbatim. Data were analysed by two independent team members using thematic content analysis. A key finding was that patients defined mild, moderate and major AE by pain/symptom severity, functional impact, duration, and by ruling out alternative causes. An overarching theme identified multiple factors that influence how an AE is perceived. These concepts differ from the previously proposed framework for defining AE, which did not include the patient perspective. Future processes to create standard definitions or measures should include the patient viewpoint to provide a broader, client-centred foundation.

  18. OBSERVATIONS AND ANALYSIS OF MUTUAL EVENTS BETWEEN THE URANUS MAIN SATELLITES

    SciTech Connect

    Assafin, M.; Vieira-Martins, R.; Braga-Ribas, F.; Camargo, J. I. B.; Da Silva Neto, D. N.; Andrei, A. H. E-mail: rvm@on.br

    2009-04-15

    Every 42 years, the Earth and the Sun pass through the plane of the orbits of the main satellites of Uranus. On these occasions, mutual occultations and eclipses between these bodies can be seen from the Earth. The current Uranus equinox from 2007 to 2009 offers a precious opportunity to observe these events. Here, we present the analysis of five occultations and two eclipses observed from Brazil during 2007. For the reduction of the CCD images, we developed a digital coronagraphic method that removed the planet's scattered light around the satellites. A simple geometric model of the occultation/eclipse was used to fit the observed light curves. Dynamical quantities such as the impact parameter, the relative speed, and the central time of the event were then obtained with precisions of 7.6 km, 0.18 km s⁻¹, and 2.9 s, respectively. These results can be further used to improve the parameters of the dynamical theories of the main Uranus satellites.

  19. Observations and Analysis of Mutual Events between the Uranus Main Satellites

    NASA Astrophysics Data System (ADS)

    Assafin, M.; Vieira-Martins, R.; Braga-Ribas, F.; Camargo, J. I. B.; da Silva Neto, D. N.; Andrei, A. H.

    2009-04-01

    Every 42 years, the Earth and the Sun pass through the plane of the orbits of the main satellites of Uranus. On these occasions, mutual occultations and eclipses between these bodies can be seen from the Earth. The current Uranus equinox from 2007 to 2009 offers a precious opportunity to observe these events. Here, we present the analysis of five occultations and two eclipses observed from Brazil during 2007. For the reduction of the CCD images, we developed a digital coronagraphic method that removed the planet's scattered light around the satellites. A simple geometric model of the occultation/eclipse was used to fit the observed light curves. Dynamical quantities such as the impact parameter, the relative speed, and the central time of the event were then obtained with precisions of 7.6 km, 0.18 km s⁻¹, and 2.9 s, respectively. These results can be further used to improve the parameters of the dynamical theories of the main Uranus satellites. Based on observations made at Laboratório Nacional de Astrofísica (LNA), Itajubá-MG, Brazil.
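
    A "simple geometric model" of a mutual occultation of the kind fitted above can be sketched as the overlap of two uniform disks, parameterized by the central time, relative speed and impact parameter. The radii and trajectory below are hypothetical, and real fits would also account for albedo differences and limb darkening.

```python
import numpy as np

def disk_overlap(d, r1, r2):
    """Area of intersection of two disks of radii r1, r2 with centres d apart."""
    if d >= r1 + r2:
        return 0.0
    if d <= abs(r1 - r2):
        return np.pi * min(r1, r2) ** 2
    a1 = r1 ** 2 * np.arccos((d ** 2 + r1 ** 2 - r2 ** 2) / (2 * d * r1))
    a2 = r2 ** 2 * np.arccos((d ** 2 + r2 ** 2 - r1 ** 2) / (2 * d * r2))
    tri = 0.5 * np.sqrt((-d + r1 + r2) * (d + r1 - r2) * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

def light_curve(t, t0, v, b, r1, r2):
    """Normalized flux of two equal-brightness satellites during an occultation.

    t0: central time [s]; v: relative sky-plane speed [km/s]; b: impact parameter [km].
    """
    total = np.pi * (r1 ** 2 + r2 ** 2)            # combined out-of-event flux
    d = np.hypot(v * (np.atleast_1d(t) - t0), b)   # instantaneous centre separation
    return np.array([(total - disk_overlap(di, r1, r2)) / total for di in d])

# Hypothetical event: two 580 km satellites, 2 km/s relative speed, b = 100 km
lc = light_curve(np.array([-1000.0, 0.0, 1000.0]), 0.0, 2.0, 100.0, 580.0, 580.0)
```

    Fitting such a model to an observed light curve (e.g. by least squares) yields t0, v and b, the quantities quoted above with their precisions.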

  20. Analysis of Loss-of-Offsite-Power Events 1998–2012

    SciTech Connect

    T. E. Wierman

    2013-10-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant's ability to achieve and maintain safe shutdown conditions. Risk analyses indicate that loss of all alternating current power contributes over 70% of the overall risk at some U.S. nuclear plants. LOOP events and the subsequent restoration of offsite power are therefore important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants, based on operating experience from fiscal year 1998 through 2012. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. The emergency diesel generator (EDG) failure modes considered are failure to start, failure to load and run, and failure to run for more than 1 hour. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly reliability estimates are provided for the entire active period. A statistically significant improvement in industry performance was identified for plant-centered and switchyard-centered LOOP frequencies. There is no statistically significant trend in LOOP durations.
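
    Frequency estimates of the kind reported here are typically modeled as Poisson rates per reactor-year, with confidence intervals from the chi-square distribution. The sketch below uses hypothetical counts, not the report's data.

```python
from scipy.stats import chi2

def event_rate(n_events, reactor_years, conf=0.90):
    """Poisson rate point estimate with a two-sided confidence interval."""
    rate = n_events / reactor_years
    alpha = 1.0 - conf
    lo = chi2.ppf(alpha / 2, 2 * n_events) / (2 * reactor_years) if n_events else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (n_events + 1)) / (2 * reactor_years)
    return rate, lo, hi

# Hypothetical example: 12 grid-related LOOP events in 1500 reactor-years
rate, lo, hi = event_rate(12, 1500.0)
```

    Trend significance over calendar years is then usually assessed by fitting, for example, a log-linear model to the yearly rates.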

  1. Single-event analysis of the packaging of bacteriophage T7 DNA concatemers in vitro.

    PubMed

    Sun, M; Louie, D; Serwer, P

    1999-09-01

    Bacteriophage T7 packages its double-stranded DNA genome in a preformed protein capsid (procapsid). The DNA substrate for packaging is a head-to-tail multimer (concatemer) of the mature 40-kilobase pair genome. Mature genomes are cleaved from the concatemer during packaging. In the present study, fluorescence microscopy is used to observe T7 concatemeric DNA packaging at the level of a single (microscopic) event. Metabolism-dependent cleavage to form several fragments is observed when T7 concatemers are incubated in an extract of T7-infected Escherichia coli (in vitro). The following observations indicate that the fragment-producing metabolic event is DNA packaging: 1) most fragments have the hydrodynamic radius (R(H)) of bacteriophage particles (+/-3%) when R(H) is determined by analysis of Brownian motion; 2) the fragments also have the fluorescence intensity (I) of bacteriophage particles (+/-6%); 3) as a fragment forms, a progressive decrease occurs in both R(H) and I. The decrease in I follows a pattern expected for intracapsid steric restriction of 4',6-diamidino-2-phenylindole (DAPI) binding to packaged DNA. The observed in vitro packaging of a concatemer's genomes always occurs in a synchronized cluster. Therefore, the following hypothesis is proposed: the observed packaging of concatemer-associated T7 genomes is cooperative.
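
    The hydrodynamic radius inferred from Brownian motion rests on the Stokes-Einstein relation: the diffusion coefficient D is estimated from the mean-squared displacement of a tracked particle, and R_H = kT / (6 pi eta D). The sketch below is generic, and its numbers are hypothetical but phage-scale.

```python
import numpy as np

def hydrodynamic_radius(msd_um2, dt_s, temp_k=296.0, viscosity_pa_s=1.0e-3):
    """R_H from 2-D single-particle tracking via Stokes-Einstein.

    msd_um2: mean-squared displacement per frame interval dt_s, in micrometers^2.
    For 2-D Brownian motion, MSD = 4 * D * dt.
    """
    k_b = 1.380649e-23                          # Boltzmann constant [J/K]
    d = (msd_um2 * 1e-12) / (4.0 * dt_s)        # diffusion coefficient [m^2/s]
    return k_b * temp_k / (6.0 * np.pi * viscosity_pa_s * d)

# Hypothetical phage-sized particle: MSD of 0.95 um^2 per 33 ms video frame
r_h = hydrodynamic_radius(0.95, 0.033)          # in metres (roughly 30 nm here)
```

    A progressive decrease in the measured R_H during packaging, as described above, corresponds to faster Brownian motion of the increasingly compact particle.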

  2. Error Analysis of Satellite Precipitation-Driven Modeling of Complex Terrain Flood Events

    NASA Astrophysics Data System (ADS)

    Mei, Y.; Nikolopoulos, E. I.; Anagnostou, E. N.; Zoccatelli, D.; Borga, M., Sr.

    2015-12-01

    The error characteristics of satellite precipitation-driven flood event simulations over mountainous basins are evaluated in this study for eight different global satellite products. A methodology is devised to match the observed records of the flood events with the corresponding satellite and reference rainfall and runoff simulations. The flood events are sorted according to flood type (i.e. rain flood or flash flood) and the basin's antecedent conditions, represented by the event's runoff-to-precipitation ratio. The satellite precipitation products and runoff simulations are evaluated with systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e. cumulative volume, timing and shape). Overall, satellite-driven event runoff exhibits better error metrics than the satellite precipitation. Better error metrics are also found for the rain flood events relative to the flash flood events. The event timing and shape from satellite-derived precipitation agree well with the reference; the cumulative volume is mostly underestimated. In terms of error propagation, the study shows a dampening effect in both the systematic and random error components of the satellite-driven runoff time series relative to the satellite-retrieved event precipitation. This dampening effect is less pronounced for the flash flood events and for the rain flood events with high runoff coefficients. This study provides, for the first time, an event-scale characterization of satellite precipitation error propagation in flood modeling, with implications for the application of the Global Precipitation Measurement mission in mountain flood hydrology.
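
    Systematic and random error components of the kind evaluated here are commonly separated into a multiplicative bias and a bias-adjusted residual. The sketch below uses one hypothetical matched event set and is not the study's exact metric definitions.

```python
import numpy as np

def error_components(satellite, reference):
    """Split satellite-vs-reference error into systematic and random parts."""
    sat = np.asarray(satellite, float)
    ref = np.asarray(reference, float)
    bias_ratio = sat.sum() / ref.sum()                  # systematic (multiplicative) part
    resid = sat - bias_ratio * ref                      # what the bias cannot explain
    nrand = np.sqrt(np.mean(resid ** 2)) / ref.mean()   # normalized random part
    return bias_ratio, nrand

# Hypothetical event accumulations (mm): satellite vs. gauge-adjusted reference
br, nr = error_components([35.0, 60.0, 18.0, 44.0], [50.0, 75.0, 30.0, 55.0])
```

    A bias ratio below 1 reflects the volume underestimation noted above; comparing the two components for rainfall and for simulated runoff is one way to quantify the dampening effect.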

  3. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    events dating back several decades is analysed. Precipitation thresholds varying in space and time are established using highly resolved INCA data of the Austrian weather service. Parameters possibly controlling the basic susceptibility of catchments are evaluated in a regional GIS analysis (vegetation, geology, topography, stream network, proxies for sediment availability). Similarity measures are then used to group catchments into sensitivity classes. Applying different climate scenarios, the spatiotemporal distribution of catchments sensitive towards heavier and more frequent precipitation can be determined giving valuable advice for planning and managing mountain protection zones.
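
    The abstract does not specify the similarity measures used; as one plausible stand-in (not the authors' method), catchments described by standardized susceptibility proxies can be grouped into classes with a simple k-means clustering. All feature values below are hypothetical.

```python
import numpy as np

def sensitivity_classes(features, n_classes=3, n_iter=50):
    """Group catchments by k-means with deterministic farthest-point seeding."""
    x = np.asarray(features, float)
    x = (x - x.mean(0)) / x.std(0)                  # standardize each parameter
    centers = [x[0]]
    for _ in range(n_classes - 1):                  # farthest-point seeding
        d = np.min([((x - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(x[int(np.argmax(d))])
    centers = np.array(centers)
    for _ in range(n_iter):                         # Lloyd iterations
        labels = ((x[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for k in range(n_classes):
            if np.any(labels == k):
                centers[k] = x[labels == k].mean(0)
    return labels

# Hypothetical catchments: [mean slope, drainage density, sediment-availability proxy]
features = [
    [30.0, 2.1, 0.8], [29.0, 2.0, 0.9], [31.0, 1.9, 0.8],   # steep, sediment-rich
    [10.0, 1.0, 0.2], [11.0, 0.9, 0.2], [9.0, 1.1, 0.3],    # gentle, supply-limited
    [20.0, 3.0, 0.5], [21.0, 3.1, 0.5], [19.0, 2.9, 0.6],   # intermediate
]
classes = sensitivity_classes(features)
```

    Each class can then be assigned its own precipitation threshold when assessing sensitivity to heavier and more frequent rainfall under climate scenarios.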

  4. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate for identifying ozone extremes and describing the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors, such as ENSO or the NAO, and of chemical factors, such as cold Arctic vortex ozone losses, as well as of the major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that the application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. The findings described above could also be confirmed for the total ozone records of five other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle), showing that strong influence of atmospheric
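
    Block-maxima analysis of the kind described can be sketched by fitting a Generalized Extreme Value distribution to annual maxima. The data below are synthetic "annual maxima" in Dobson units, not the Arosa record.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
# Synthetic annual maxima of total ozone (Dobson units), one value per year
annual_max = 400.0 + 25.0 * rng.gumbel(size=80)

# Fit the GEV distribution to the block maxima
c, loc, scale = genextreme.fit(annual_max)

# 50-year return level: exceeded on average once every 50 years
rl_50 = genextreme.ppf(1.0 - 1.0 / 50.0, c, loc=loc, scale=scale)
```

    For the low tail (extreme ozone minima) one would fit the negated series, and threshold-based (Generalized Pareto) fits are the usual complement to block maxima.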

  5. Robust analysis of event-related functional magnetic resonance imaging data using independent component analysis

    NASA Astrophysics Data System (ADS)

    Kadah, Yasser M.

    2002-04-01

    We propose a technique that enables robust use of blind source separation techniques in fMRI data analysis. The fMRI temporal signal is modeled as the summation of the true activation signal, a physiological baseline fluctuation component, and a random noise component. A preprocessing denoising step is used to reduce the dimensionality of the random noise component in this mixture before applying principal/independent component analysis (PCA/ICA) methods. The set of denoised time courses from a localized region is utilized to capture the region-specific activation patterns. We show a significant improvement in the convergence properties of the ICA iteration when the denoised time courses are used. We also demonstrate the advantage of using ICA over PCA to separate components due to physiological signals from those corresponding to actual activation. Moreover, we propose the use of ICA to analyze the magnitude of the Fourier transform of the time courses. This allows ICA to group signals with similar patterns and different delays together, which makes the iteration even more efficient. The proposed technique is verified using computer simulations as well as actual data from a healthy human volunteer. The results confirm the robustness of the new strategy and demonstrate its value for clinical use.

  6. Event-related EEG time-frequency analysis and the Orienting Reflex to auditory stimuli.

    PubMed

    Barry, Robert J; Steiner, Genevieve Z; De Blasio, Frances M

    2012-06-01

    Sokolov's classic works discussed electroencephalogram (EEG) alpha desynchronization as a measure of the Orienting Reflex (OR). Early studies confirmed that this reduced with repeated auditory stimulation, but without reliable stimulus-significance effects. We presented an auditory habituation series with counterbalanced indifferent and significant (counting) instructions. Time-frequency analysis of electrooculogram (EOG)-corrected EEG was used to explore prestimulus levels and the timing and amplitude of event-related increases and decreases in 4 classic EEG bands. Decrement over trials and response recovery were substantial for the transient increase (in delta, theta, and alpha) and subsequent desynchronization (in theta, alpha, and beta). There was little evidence of dishabituation and few effects of counting. Expected effects in stimulus-induced alpha desynchronization were confirmed. Two EEG response patterns over trials and conditions, distinct from the full OR pattern, warrant further research.

  7. Efficacy of forensic statement analysis in distinguishing truthful from deceptive eyewitness accounts of highly stressful events.

    PubMed

    Morgan, Charles A; Colwell, Kevin; Hazlett, Gary A

    2011-09-01

    Laboratory-based deception detection research suggests that truthful statements differ from deceptive ones. This nonlaboratory study tested whether forensic statement analysis (FSA) methods would distinguish genuine from false eyewitness accounts of exposure to a highly stressful event. A total of 35 military participants were assigned to truthful or deceptive eyewitness conditions. Genuine eyewitnesses reported truthfully about exposure to interrogation stress. Deceptive eyewitnesses studied transcripts of genuine eyewitnesses for 24 h and falsely claimed they had been interrogated. Cognitive Interviews were recorded, transcribed, and assessed by FSA raters blind to the status of participants. Genuine accounts contained more unique words, external and contextual referents, and a greater total word count than did deceptive statements. The type-token ratio was lower in genuine statements. The classification accuracy using FSA techniques was 82%. FSA methods may be effective in real-world circumstances and have relevance to professionals in law enforcement, security, and criminal justice.
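    The lexical measures named in the abstract (total word count, unique words, type-token ratio) are straightforward to compute; the sketch below is a minimal illustration with made-up statements, not the FSA coding scheme used in the study.

```python
import re

def fsa_metrics(statement: str) -> dict:
    """Simple lexical measures of the kind used in forensic statement analysis."""
    tokens = re.findall(r"[a-z']+", statement.lower())
    types = set(tokens)
    return {
        "total_words": len(tokens),
        "unique_words": len(types),
        # Type-token ratio (lexical diversity); the study found it LOWER in genuine accounts.
        "type_token_ratio": len(types) / len(tokens) if tokens else 0.0,
    }

genuine = "He sat across the table, the light was bright, he kept asking the same question about the map"
deceptive = "They asked questions and asked more questions in the room"
print(fsa_metrics(genuine)["total_words"] > fsa_metrics(deceptive)["total_words"])  # → True
```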

  8. Retrospective Analysis of Communication Events - Understanding the Dynamics of Collaborative Multi-Party Discourse

    SciTech Connect

    Cowell, Andrew J.; Haack, Jereme N.; McColgin, Dave W.

    2006-06-08

    This research is aimed at understanding the dynamics of collaborative multi-party discourse across multiple communication modalities. Before we can truly make significant strides in devising collaborative communication systems, there is a need to understand how typical users utilize computationally supported communications mechanisms such as email, instant messaging, video conferencing, chat rooms, etc., both singularly and in conjunction with traditional means of communication such as face-to-face meetings, telephone calls and postal mail. Attempting to understand an individual’s communications profile with access to only a single modality is challenging at best and often futile. Here, we discuss the development of RACE – Retrospective Analysis of Communications Events – a test-bed prototype to investigate issues relating to multi-modal multi-party discourse.

  9. Dynamics of the 1054 UT March 22, 1979, substorm event - CDAW 6. [Coordinated Data Analysis Workshop

    NASA Technical Reports Server (NTRS)

    Mcpherron, R. L.; Manka, R. H.

    1985-01-01

    The primary objective of the Coordinated Data Analysis Workshop (CDAW 6) is to trace the flow of energy from the solar wind through the magnetosphere to its ultimate dissipation in the ionosphere. An essential role in this energy transfer is played by magnetospheric substorms; however, the details are not yet completely understood. The International Magnetospheric Study (IMS) has provided an ideal data base for the study conducted by CDAW 6. The present investigation is concerned with the 1054 UT March 22, 1979, substorm event, which had been selected for detailed examination in connection with the studies performed by the CDAW 6. The observations of this substorm are discussed, taking into account solar wind conditions, ground magnetic activity on March 22, 1979, observations at synchronous orbit, observations in the near geomagnetic tail, and the onset of the 1054 UT expansion phase. Substorm development and magnetospheric dynamics are discussed on the basis of a synthesis of the observations.

  10. Localization of the event-related potential novelty response as defined by principal components analysis.

    PubMed

    Dien, Joseph; Spencer, Kevin M; Donchin, Emanuel

    2003-10-01

    Recent research indicates that novel stimuli elicit at least two distinct components, the Novelty P3 and the P300. The P300 is thought to be elicited when a context updating mechanism is activated by a wide class of deviant events. The functional significance of the Novelty P3 is uncertain. Identification of the generator sources of the two components could provide additional information about their functional significance. Previous localization efforts have yielded conflicting results. The present report demonstrates that the use of principal components analysis (PCA) results in better convergence with knowledge about functional neuroanatomy than did previous localization efforts. The results are also more convincing than those obtained by two alternative methods, MUSIC-RAP and the Minimum Norm. Source modeling on 129-channel data with BESA and BrainVoyager suggests the P300 has sources in the temporal-parietal junction whereas the Novelty P3 has sources in the anterior cingulate.

  11. System-level analysis of single event upset susceptibility in RRAM architectures

    NASA Astrophysics Data System (ADS)

    Liu, Rui; Barnaby, Hugh J.; Yu, Shimeng

    2016-12-01

    In this work, the single event upset susceptibility of a resistive random access memory (RRAM) system with 1-transistor-1-resistor (1T1R) and crossbar architectures to heavy ion strikes is investigated from the circuit-level to the system-level. From a circuit-level perspective, the 1T1R is only susceptible to single-bit-upset (SBU) due to the isolation of cells, while in the crossbar, multiple-bit-upsets may occur because ion-induced voltage spikes generated on drivers may propagate along rows or columns. Three factors are considered to evaluate system-level susceptibility: the upset rate, the sensitive area, and the vulnerable time window. Our analysis indicates that the crossbar architecture has a smaller maximum bit-error-rate per day as compared to the 1T1R architecture for a given sub-array size, I/O width and susceptible time window.

  12. Time-series analysis for rapid event-related skin conductance responses

    PubMed Central

    Bach, Dominik R.; Flandin, Guillaume; Friston, Karl J.; Dolan, Raymond J.

    2009-01-01

    Event-related skin conductance responses (SCRs) are traditionally analysed by comparing the amplitude of individual peaks against a pre-stimulus baseline. Many experimental manipulations in cognitive neuroscience dictate paradigms with short inter trial intervals, precluding accurate baseline estimation for SCR measurements. Here, we present a novel and general approach to SCR analysis, derived from methods used in neuroimaging that estimate responses using a linear convolution model. In effect, the method obviates peak-scoring and makes use of the full SCR. We demonstrate, across three experiments, that the method has face validity in analysing reactions to a loud white noise and emotional pictures, can be generalised to paradigms where the shape of the response function is unknown and can account for parametric trial-by-trial effects. We suggest our approach provides greater flexibility in analysing SCRs than existing methods. PMID:19686778
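    The convolution model described above can be sketched in a few lines: event onsets are convolved with an assumed canonical response shape to form a regressor, and the response amplitude is estimated by least squares. The response function, onsets, and noiseless toy data below are illustrative assumptions, not the authors' actual basis set.

```python
import math

def canonical_scr(t, tau=2.0):
    # Assumed gamma-like response shape peaking a few seconds after the event.
    return (t / tau) * math.exp(1 - t / tau) if t >= 0 else 0.0

dt = 0.5       # sampling interval in seconds
n = 80         # number of samples
onsets = {10, 40}                                  # sample indices of events
kernel = [canonical_scr(i * dt) for i in range(20)]

# Stick function (1 at each onset) convolved with the response kernel -> regressor x
u = [1.0 if i in onsets else 0.0 for i in range(n)]
x = [sum(u[i - k] * kernel[k] for k in range(min(i + 1, len(kernel)))) for i in range(n)]

# Toy "observed" SCR: true amplitude 3.0 on top of a constant baseline of 0.2
y = [3.0 * xi + 0.2 for xi in x]

# Least squares for y = a*x + b via the 2x2 normal equations
sx, sy = sum(x), sum(y)
sxx, sxy = sum(xi * xi for xi in x), sum(xi * yi for xi, yi in zip(x, y))
det = n * sxx - sx * sx
a = (n * sxy - sx * sy) / det
b = (sxx * sy - sx * sxy) / det
print(round(a, 3), round(b, 3))  # → 3.0 0.2
```

    With noiseless data the fit recovers the true amplitude exactly; with short inter-trial intervals the overlapping responses are disentangled by the regression rather than by baseline-to-peak scoring, which is the point of the method.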

  13. Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer

    SciTech Connect

    Anderson, Johan; Halpern, Federico D.; Ricci, Paolo; Furno, Ivo; Xanthopoulos, Pavlos

    2014-12-15

    The turbulence observed in the scrape-off-layer of a tokamak is often characterized by intermittent events of bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It appears thus necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of tails of the probability distribution functions. The method followed here is to generate statistical information from time-traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.

  14. Uncertainty Analysis for a De-pressurised Loss of Forced Cooling Event of the PBMR Reactor

    SciTech Connect

    Jansen van Rensburg, Pieter A.; Sage, Martin G.

    2006-07-01

    This paper presents an uncertainty analysis for a De-pressurised Loss of Forced Cooling (DLOFC) event that was performed with the systems CFD (Computational Fluid Dynamics) code Flownex for the PBMR reactor. An uncertainty analysis was performed to determine the variation in maximum fuel, core barrel and reactor pressure vessel (RPV) temperature due to variations in model input parameters. Some of the input parameters that were varied are: thermo-physical properties of helium and the various solid materials, decay heat, neutron and gamma heating, pebble bed pressure loss, pebble bed Nusselt number and pebble bed bypass flows. The Flownex model of the PBMR reactor is a 2-dimensional axisymmetrical model. It is simplified in terms of geometry and some other input values. However, it is believed that the model adequately indicates the effect of changes in certain input parameters on the fuel temperature and other components during a DLOFC event. Firstly, a sensitivity study was performed where input variables were varied individually according to predefined uncertainty ranges and the results were sorted according to the effect on maximum fuel temperature. In the sensitivity study, only seven variables had a significant effect on the maximum fuel temperature (greater than 5 deg. C). The most significant are power distribution profile, decay heat, reflector properties and effective pebble bed conductivity. Secondly, Monte Carlo analyses were performed in which twenty variables were varied simultaneously within predefined uncertainty ranges. For a one-tailed 95% confidence level, the conservatism that should be added to the best estimate calculation of the maximum fuel temperature for a DLOFC was determined as 53 deg. C. This value will probably increase after some model refinements in the future. Flownex was found to be a valuable tool for uncertainty analyses, facilitating both sensitivity studies and Monte Carlo analyses. (authors)
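    The Monte Carlo step described above can be sketched as follows, with a placeholder response function standing in for the Flownex reactor model; the input ranges, sensitivities, and temperatures are invented for illustration.

```python
import random

random.seed(1)

def max_fuel_temp(decay_heat, conductivity, bypass_flow):
    # Placeholder linear response surface standing in for the reactor model (deg C).
    return 1500.0 + 300.0 * (decay_heat - 1.0) - 200.0 * (conductivity - 1.0) + 50.0 * bypass_flow

best_estimate = max_fuel_temp(1.0, 1.0, 0.0)

# Sample the inputs simultaneously within assumed uncertainty ranges.
samples = []
for _ in range(10_000):
    samples.append(max_fuel_temp(
        random.uniform(0.94, 1.06),   # decay heat multiplier
        random.uniform(0.85, 1.15),   # effective conductivity multiplier
        random.uniform(0.0, 0.2),     # bypass flow fraction
    ))

# One-tailed 95% value: the conservatism to add to the best-estimate temperature.
samples.sort()
p95 = samples[int(0.95 * len(samples))]
print(round(p95 - best_estimate, 1))
```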

  15. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    PubMed Central

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
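    The intersection matrix at the core of ASSET can be illustrated with toy binned spike data; the sets of active neuron IDs per time bin below are invented, and this is a sketch of the idea rather than the ASSET package's API.

```python
# Entry (i, j) of the intersection matrix is the overlap between the sets of neurons
# active in time bins i and j; a repeated SSE shows up as a diagonal of high values.

bins = [
    {0, 1, 2}, {3, 4}, {5, 6, 7}, set(),      # first occurrence of a sequence
    {0, 1, 2}, {3, 4}, {5, 6, 7},             # repetition of the same SSE
]

n = len(bins)
intersection = [[len(bins[i] & bins[j]) for j in range(n)] for i in range(n)]

# The repeated SSE appears as an off-diagonal band of high values starting at (0, 4).
diagonal = [intersection[k][k + 4] for k in range(3)]
print(diagonal)  # → [3, 2, 3]
```

    ASSET's contribution is to replace the visual inspection of such bands with a statistical assessment of which entries are significant, followed by clustering of the significant entries into individual diagonal structures.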

  16. Analysis of factors associated with hiccups based on the Japanese Adverse Drug Event Report database

    PubMed Central

    Hosoya, Ryuichiro; Ishii-Nozawa, Reiko; Kagaya, Hajime

    2017-01-01

    Hiccups are occasionally experienced by most individuals. Although hiccups are not life-threatening, they may lead to a decline in quality of life. Previous studies showed that hiccups may occur as an adverse effect of certain medicines during chemotherapy. Furthermore, a male dominance in hiccups has been reported. However, because only a limited number of studies have examined this phenomenon, the factors influencing hiccups remain a matter of debate. The present study aimed to investigate the influence of medicines and patient characteristics on hiccups using a large adverse drug event report database, specifically the Japanese Adverse Drug Event Report (JADER) database. Cases of adverse effects associated with medications were extracted from JADER, and Fisher’s exact test was performed to assess the presence or absence of hiccups for each medication. In a multivariate analysis, we conducted a multiple logistic regression analysis using the medication and patient characteristic variables that showed significance. We also examined the role of dexamethasone in inducing hiccups during chemotherapy. Medicines associated with hiccups included dexamethasone, levofolinate, fluorouracil, oxaliplatin, carboplatin, and irinotecan. Patient characteristics associated with hiccups included male gender and greater height. The combination of an anti-cancer agent with dexamethasone was noted in more than 95% of patients in the dexamethasone-use group. Hiccups also occurred in patients in the anti-cancer agent-use group who did not use dexamethasone. Most of the medications that induce hiccups are used in chemotherapy. The results of the present study suggest that it is possible to predict a high risk of hiccups using patient characteristics. We confirmed that dexamethasone was the drug with the strongest influence on the induction of hiccups. However, the influence of anti-cancer agents on the induction of hiccups cannot be denied. We consider the results of the
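    A one-sided Fisher's exact test of the kind applied to each medication can be computed directly from the hypergeometric distribution; the 2 × 2 counts below are hypothetical, not taken from JADER.

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    probability, conditional on the margins, of observing >= a events
    in the exposed row by chance (hypergeometric tail)."""
    row1, col1, n = a + b, a + c, a + b + c + d
    k_max = min(row1, col1)
    denom = comb(n, col1)
    return sum(comb(row1, k) * comb(n - row1, col1 - k) for k in range(a, k_max + 1)) / denom

# Hypothetical counts: hiccups vs. no hiccups, with and without a given drug.
p = fisher_exact_one_sided(12, 88, 3, 197)
print(p < 0.05)  # → True
```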

  17. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains.

    PubMed

    Torre, Emiliano; Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz; Grün, Sonja

    2016-07-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity.

  18. Analysis and Prediction of West African Moist Events during the Boreal Spring of 2009

    NASA Astrophysics Data System (ADS)

    Mera, Roberto Javier

    Weather and climate in Sahelian West Africa are dominated by two major wind systems, the southwesterly West African Monsoon (WAM) and the northeasterly (Harmattan) trade winds. In addition to the agricultural benefit of the WAM, the public health sector is affected given the relationship between the onset of moisture and end of meningitis outbreaks. Knowledge and prediction of moisture distribution during the boreal spring is vital to the mitigation of meningitis by providing guidance for vaccine dissemination. The goal of the present study is to (a) develop a climatology and conceptual model of the moisture regime during the boreal spring, (b) investigate the role of extra-tropical and Convectively-coupled Equatorial Waves (CCEWs) on the modulation of westward moving synoptic waves and (c) determine the efficacy of a regional model as a tool for predicting moisture variability. Medical reports during 2009, along with continuous meteorological observations at Kano, Nigeria, showed that the advent of high humidity correlated with cessation of the disease. Further analysis of the 2009 boreal spring elucidated the presence of short-term moist events that modulated surface moisture on temporal scales relevant to the health sector. The May moist event (MME) provided insight into interplays among climate anomalies, extra-tropical systems, equatorially trapped waves and westward-propagating synoptic disturbances. The synoptic disturbance initiated 7 May and traveled westward to the coast by 12 May. There was a marked, semi-stationary moist anomaly in the precipitable water field (kg m-2) east of 10°E through late April and early May, that moved westward at the time of the MME. Further inspection revealed a mid-latitude system may have played a role in increasing the latitudinal amplitude of the MME. CCEWs were also found to have an impact on the MME. A coherent Kelvin wave propagated through the region, providing increased monsoonal flow and heightened convection. A

  19. Potential of Breastmilk Analysis to Inform Early Events in Breast Carcinogenesis: Rationale and Considerations

    PubMed Central

    Murphy, Jeanne; Sherman, Mark E.; Browne, Eva P.; Caballero, Ana I.; Punska, Elizabeth C.; Pfeiffer, Ruth M.; Yang, Hannah P.; Lee, Maxwell; Yang, Howard; Gierach, Gretchen L.; Arcaro, Kathleen F.

    2016-01-01

    This review summarizes methods related to the study of human breastmilk in etiologic and biomarkers research. Despite the importance of reproductive factors in breast carcinogenesis, factors that act early in life are difficult to study because young women rarely require breast imaging or biopsy, and analysis of critical circulating factors (e.g. hormones) is often complicated by the requirement to accurately account for menstrual cycle date. Accordingly, novel approaches are needed to understand how events such as pregnancy, breastfeeding, weaning, and post-weaning breast remodeling influence breast cancer risk. Analysis of breastmilk offers opportunities to understand mechanisms related to carcinogenesis in the breast, and to identify risk markers that may inform efforts to identify high-risk women early in the carcinogenic process. In addition, analysis of breastmilk could have value in early detection or diagnosis of breast cancer. In this article we describe the potential for using breastmilk to characterize the microenvironment of the lactating breast with the goal of advancing research on risk assessment, prevention, and detection of breast cancer. PMID:27107568

  20. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    SciTech Connect

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.; Sandia National Labs., Albuquerque, NM )

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6 with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs.

  1. Top-down and bottom-up definitions of human failure events in human reliability analysis

    SciTech Connect

    Boring, Ronald Laurids

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down—defined as a subset of the PRA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  2. Frequency analysis and its spatiotemporal characteristics of precipitation extreme events in China during 1951-2010

    NASA Astrophysics Data System (ADS)

    Shao, Yuehong; Wu, Junmei; Ye, Jinyin; Liu, Yonghe

    2015-08-01

    This study investigates the frequency and spatiotemporal characteristics of precipitation extremes based on annual maximum daily precipitation (AMP) data from 753 observation stations in China during the period 1951-2010. Several statistical methods, including L-moments, the Mann-Kendall test (MK test), Student's t test, and analysis of variance (F test), are used to study different statistical properties related to the frequency and spatiotemporal characteristics of precipitation extremes. The results indicate that the AMP series of most sites have no linear trends at the 90% confidence level, but there is a distinctive decreasing trend in the Beijing-Tianjin-Tangshan region. The analysis of abrupt changes shows that there are no significant changes at most sites, and no distinctive regional patterns among the change-point sites either. An important innovation relative to previous studies is that shifts in the mean and the variance are also examined, in order to further analyze changes in strong and weak precipitation extreme events. The shift analysis shows that more attention should be paid to drought in North China and to flood control and drought in South China, especially in regions that have no clear trend but show a significant shift in the variance. More importantly, this study conducts a comprehensive analysis of a complete set of quantile estimates and their spatiotemporal characteristics in China. The spatial distribution of quantile estimates based on the AMP series shows that values gradually increase from the Northwest to the Southeast with increasing duration and return period, while the increase in the estimates is gradual in the arid and semiarid regions and rapid in the humid region. Frequency estimates for the 50-year return period agree with the maximum observations of the AMP series at most stations, providing a more quantitative and scientific basis for decision making.
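    The L-moment statistics underlying this kind of frequency analysis can be estimated from probability-weighted moments (Hosking's unbiased estimators); the sketch below uses invented AMP values for illustration.

```python
def sample_l_moments(data):
    """First two sample L-moments and the L-skewness ratio, via
    unbiased probability-weighted moments b0, b1, b2."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    # In the sums below, index i equals the number of smaller order statistics.
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1 = b0                      # L-location (the mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # third L-moment
    return l1, l2, l3 / l2       # (l1, l2, L-skewness t3)

# Hypothetical annual maximum daily precipitation (mm) at one station.
amp = [52, 61, 48, 95, 70, 58, 120, 66, 74, 55]
l1, l2, t3 = sample_l_moments(amp)
print(round(l1, 1), round(l2, 2), round(t3, 3))
```

    In a full regional analysis these sample L-moment ratios are matched against the theoretical ratios of candidate distributions (e.g. GEV, Pearson III) to select and fit the frequency model from which quantile estimates such as the 50-year return level are derived.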

  3. Climate change impacts on extreme events in the United States: an uncertainty analysis

    EPA Science Inventory

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  4. Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor

    SciTech Connect

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L. ); Baxter, J.T. ); Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P. ); Brosseau, D.A. )

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-5 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  5. Rain-on-snow Events in Southwestern British Columbia: A Long-term Analysis of Meteorological Conditions and Snowpack Response

    NASA Astrophysics Data System (ADS)

    Trubilowicz, J. W.; Moore, D.

    2015-12-01

    Snowpack dynamics and runoff generation in coastal mountain regions are complicated by rain-on-snow (ROS) events. During major ROS events associated with warm, moist air and strong winds, turbulent heat fluxes can produce substantial melt to supplement rainfall, but previous studies suggest this may not be true for smaller, more frequent events. The internal temperature and water content of the snowpack are also expected to influence runoff generation during ROS events: a cold snowpack with no liquid water content will have the ability to store significant amounts of rainfall, whereas a 'ripe' snowpack may begin to melt and generate outflow with little rain input. However, it is not well understood how antecedent snowpack conditions and energy fluxes differ between ROS events that cause large runoff events and those that do not, in large part because major flood-producing ROS events occur infrequently, and thus are often not sampled during short-term research projects. To generate greater understanding of runoff generation over the spectrum of ROS magnitudes and frequencies, we analyzed data from Automated Snow Pillow (ASP) sites, which record hourly air temperature, precipitation and snowpack water equivalent and offer up to several decades of data at each site. We supplemented the ASP data with output from the North American Regional Reanalysis (NARR) product to support point scale snow modeling for 335 ROS event records from six ASP sites in southwestern BC from 2003 to 2013. Our analysis reconstructed the weather conditions, surface energy exchanges, internal mass and energy states of the snowpack, and generation of snow melt and water available for runoff (WAR) for each ROS event. Results indicate that WAR generation during large events is largely independent of the snowpack conditions, but for smaller events, the antecedent snow conditions play a significant role in either damping or enhancing WAR generation.

  6. Exact meta-analysis approach for discrete data and its application to 2 × 2 tables with rare events

    PubMed Central

    Liu, Dungang; Liu, Regina Y.

    2014-01-01

    This paper proposes a general exact meta-analysis approach for synthesizing inferences from multiple studies of discrete data. The approach combines the p-value functions (also known as significance functions) associated with the exact tests from individual studies. It encompasses a broad class of exact meta-analysis methods, as it permits broad choices for the combining elements, such as the tests used in individual studies, and any parameter of interest. The approach yields statements that explicitly account for the impact of individual studies on the overall inference, in terms of efficiency/power and the type I error rate. Those statements also give rise to empirical methods for further enhancing the combined inference. Although the proposed approach applies to general discrete settings, for convenience it is illustrated throughout using the setting of meta-analysis of multiple 2 × 2 tables. In the context of rare events data, such as observing few, zero, or zero total (i.e., zero events in both arms) outcomes in binomial trials or 2 × 2 tables, most existing meta-analysis methods rely on large-sample approximations, which may yield invalid inference. The corrections commonly applied to zero outcomes in rare events data, though intended to improve numerical performance, can also incur undesirable consequences. The proposed approach applies readily to any rare event setting, including even zero total event studies, without any artificial correction. While debates continue on whether or how zero total event studies should be incorporated in meta-analysis, the proposed approach has the advantage of automatically including those studies and thus making use of all available data. Through numerical studies in rare events settings, the proposed exact approach is shown to be efficient and to generally outperform commonly used meta-analysis methods, including the Mantel-Haenszel and Peto methods. PMID:25620825
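    As a toy analogue of the combining idea (the paper combines entire p-value functions, which is more general than pooling single p-values), one can compute an exact per-study p-value for each 2 × 2 table and combine them, e.g. via Fisher's combination. The tables below are invented:

```python
# Illustrative sketch (not the paper's algorithm): combine exact per-study
# p-values from rare-event 2x2 tables via Fisher's combination method.
from scipy.stats import fisher_exact, combine_pvalues

# Hypothetical rare-event trials: (events_trt, n_trt, events_ctl, n_ctl)
studies = [(1, 100, 4, 100), (0, 50, 3, 50), (2, 200, 7, 200)]

pvals = []
for e_t, n_t, e_c, n_c in studies:
    table = [[e_t, n_t - e_t], [e_c, n_c - e_c]]
    _, p = fisher_exact(table, alternative="less")  # one-sided exact test
    pvals.append(p)

stat, p_combined = combine_pvalues(pvals, method="fisher")
print(f"per-study p: {[round(p, 3) for p in pvals]}, combined p = {p_combined:.4f}")
```

Note that the exact tests handle the zero-event study directly, with no continuity correction.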

  7. Diagnostic evaluation of distributed physically based model at the REW scale (THREW) using rainfall-runoff event analysis

    NASA Astrophysics Data System (ADS)

    Tian, F.; Sivapalan, M.; Li, H.; Hu, H.

    2007-12-01

    The importance of diagnostic analysis of hydrological models is increasingly recognized by the scientific community (M. Sivapalan, et al., 2003; H. V. Gupta, et al., 2007). Model diagnosis refers to identifying model structures and parameters not only by statistical comparison of system state variables and outputs but also through process understanding in a specific watershed. Process understanding can be gained by analysis of observational data and model results at the specific watershed as well as through regionalization. Although remote sensing technology can provide valuable data about the inputs, state variables, and outputs of the hydrological system, observational rainfall-runoff data still constitute the most accurate, reliable, and direct component of hydrology-related databases. One critical question in model diagnostic analysis is, therefore, what signature characteristics can be extracted from rainfall and runoff data. To date only a few studies have focused on this question, such as Merz et al. (2006) and Lana-Renault et al. (2007), and none of them has related event analysis to model diagnosis in an explicit, rigorous, and systematic manner. Our work focuses on the identification of the dominant runoff generation mechanisms from event analysis of rainfall-runoff data, including correlation analysis and analysis of timing patterns. The correlation analysis involves identifying the complex relationships among rainfall depth, intensity, runoff coefficient, and antecedent conditions, while the timing pattern analysis aims to identify the clustering pattern of runoff events in relation to the patterns of rainfall events. Our diagnostic analysis illustrates the changing pattern of runoff generation mechanisms in the DMIP2 test watersheds located in the Oklahoma region, a pattern also reproduced by numerical simulations based on the TsingHua Representative Elementary Watershed (THREW) model. The result suggests the usefulness of

  8. Retrospective Analysis of Recent Flood Events With Persistent High Surface Runoff From Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Joshi, S.; Hakeem, K. Abdul; Raju, P. V.; Rao, V. V.; Yadav, A.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    /locations with probable flooding conditions. These thresholds were refined through an iterative process by comparing with satellite-derived flood maps of the 2013 and 2014 monsoon seasons over India. India encountered many cyclonic flood events during Oct-Dec 2013, among which Phailin, Lehar, and Madi were rated very severe cyclonic storms. The path and intensity of these cyclonic events were very well captured by the model, and areas were marked with persistent coverage of high runoff risk/flooded area. These thresholds were used to monitor floods in Jammu and Kashmir during 4-5 Sep and in Odisha during 8-9 Aug, 2014. The analysis indicated the need to vary the thresholds across space considering terrain and geographical conditions. To address this, a sub-basin-wise study was made based on terrain characteristics (slope and elevation) using the ASTER DEM. It was found that basins at higher elevations require higher thresholds than basins at lower elevations. The results show very promising correlation with the satellite-derived flood maps. Further refinement and optimization of the thresholds, varying them spatially to account for topographic/terrain conditions, would lead to estimation of high-runoff/flood-risk areas for both riverine and drainage-congested areas. Use of weather forecast data (NCMWRF, GEFS/R, etc.) would enhance the scope to develop early warning systems.

  9. The use of geoinformatic data and spatial analysis to predict faecal pollution during extreme precipitation events

    NASA Astrophysics Data System (ADS)

    Ward, Ray; Purnell, Sarah; Ebdon, James; Nnane, Daniel; Taylor, Huw

    2013-04-01

    be a major factor contributing to increased levels of FIO. This study identifies areas within the catchment that are likely to demonstrate elevated erosion rates during extreme precipitation events, which are likely to result in raised levels of FIO. The results also demonstrate that increases in the human faecal marker were associated with the discharge points of wastewater treatment works, and that levels of the marker increased whenever the works discharged untreated wastewaters during extreme precipitation. Spatial analysis also highlighted locations where human faecal pollution was present in areas away from wastewater treatment plants, highlighting the potential significance of inputs from septic tanks and other un-sewered domestic wastewater systems. Increases in the frequency of extreme precipitation events in many parts of Europe are likely to result in increased levels of water pollution from both point- and diffuse-sources, increasing the input of pathogens into surface waters, and elevating the health risks to downstream consumers of abstracted drinking water. This study suggests an approach that integrates water microbiology and geoinformatic data to support a 'prediction and prevention' approach, in place of the traditional focus on water quality monitoring. This work may therefore make a significant contribution to future European water resource management and health protection.

  10. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    SciTech Connect

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  11. 2005 Caribbean mass coral bleaching event: A sea surface temperature empirical orthogonal teleconnection analysis

    NASA Astrophysics Data System (ADS)

    Simonti, Alicia L.; Eastman, J. Ronald

    2010-11-01

    This study examined the effects of climate teleconnections on the massive Caribbean coral bleaching and mortality event of 2005. A relatively new analytical procedure known as empirical orthogonal teleconnection (EOT) analysis, based on a 26 year monthly time series of observed sea surface temperature (SST), was employed. Multiple regression analysis was then utilized to determine the relative teleconnection contributions to SST variability in the southern Caribbean. The results indicate that three independent climate teleconnections had significant impact on southern Caribbean anomalies in SST and that their interaction was a major contributor to the anomalously high temperatures in 2005. The primary and approximately equal contributors were EOT-5 and EOT-2, which correlate most strongly with the tropical North Atlantic (TNA) and Atlantic multidecadal oscillation (AMO) climate indices, respectively. The third, EOT-9, was most strongly related to the Atlantic meridional mode. However, although statistically significant, the magnitude of its contribution to southern Caribbean variability was small. While there is debate over the degree to which the recent AMO pattern represents natural variability or global ocean warming, the results presented here indicate that natural variability played a strong role in the 2005 coral bleaching conditions. They also argue for a redefinition of the geography of TNA variability.
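    The second analysis stage described here, regressing a regional SST anomaly series onto teleconnection index series to apportion variability, can be sketched on synthetic data. The series and coefficients below are invented stand-ins, not the study's EOT modes:

```python
# Hedged sketch of multiple regression of a regional SST anomaly series onto
# climate/teleconnection index series. All series here are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 312                                  # 26 years of monthly data
idx = rng.normal(size=(n, 3))            # stand-ins for three teleconnection indices
beta_true = np.array([0.5, 0.4, 0.1])    # invented "contributions"
sst = idx @ beta_true + rng.normal(0, 0.3, n)   # synthetic regional SST anomaly

X = np.column_stack([np.ones(n), idx])   # add intercept column
coef, *_ = np.linalg.lstsq(X, sst, rcond=None)
r2 = 1 - np.sum((sst - X @ coef) ** 2) / np.sum((sst - sst.mean()) ** 2)
print(coef[1:], r2)                      # recovered contributions and fit quality
```

The fitted coefficients play the role of the relative teleconnection contributions discussed in the abstract.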

  12. Spatio-Temporal Information Analysis of Event-Related BOLD Responses

    PubMed Central

    Alpert, Galit Fuhrmann; Handwerker, Dan; Sun, Felice T.; D’Esposito, Mark; Knight, Robert T.

    2009-01-01

    A new approach for the analysis of event-related fMRI (BOLD) signals is proposed. The technique is based on measures from information theory and is used both for spatial localization of task-related activity and for extracting temporal information regarding the task-dependent propagation of activation across different brain regions. This approach enables whole-brain visualization of the voxels (areas) most involved in coding a specific task condition, the time at which they are most informative about the condition, and their average amplitude at that preferred time. The approach does not require prior assumptions about the shape of the hemodynamic response function (HRF), nor about linear relations between the BOLD response and presented stimuli (or task conditions). We show that relative delays between different brain regions can also be computed without prior knowledge of the experimental design, suggesting a general method that could be applied to the analysis of differential time delays that occur during natural, uncontrolled conditions. Here we analyze BOLD signals recorded during performance of a motor learning task. We show that during motor learning, the BOLD response of unimodal motor cortical areas precedes the response in higher-order multimodal association areas, including posterior parietal cortex. Brain areas found to be associated with reduced activity during motor learning, predominantly in prefrontal brain regions, are informative about the task typically at significantly later times. PMID:17188515
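    The core information-theoretic idea, scoring a response by its mutual information with the task condition without assuming an HRF shape, can be sketched on synthetic data. The function and data below are a minimal illustration, not the authors' estimator:

```python
# Toy sketch: score each response by the mutual information between its
# amplitude and a binary task condition. Data are synthetic.
import numpy as np

def mutual_info(x, y, bins=8):
    """MI (bits) between a continuous response x and a binary condition y."""
    pxy, _, _ = np.histogram2d(x, y, bins=(bins, 2), density=False)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(4)
cond = rng.integers(0, 2, 400)                       # task condition per trial
informative = cond * 1.0 + rng.normal(0, 0.5, 400)   # responds to condition
uninformative = rng.normal(0, 0.5, 400)              # ignores condition
print(mutual_info(informative, cond), mutual_info(uninformative, cond))
```

A full analysis would evaluate this score per voxel and per time lag to find each voxel's most informative time.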

  13. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    SciTech Connect

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6 per reactor year. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and, to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.

  14. Antigen-antibody biorecognition events as discriminated by noise analysis of force spectroscopy curves

    NASA Astrophysics Data System (ADS)

    Bizzarri, Anna Rita; Cannistraro, Salvatore

    2014-08-01

    Atomic force spectroscopy is able to extract kinetic and thermodynamic parameters of biomolecular complexes provided that the registered unbinding force curves could be reliably attributed to the rupture of the specific complex interactions. To this aim, a commonly used strategy is based on the analysis of the stretching features of polymeric linkers which are suitably introduced in the biomolecule-substrate immobilization procedure. Alternatively, we present a method to select force curves corresponding to specific biorecognition events, which relies on a careful analysis of the force fluctuations of the biomolecule-functionalized cantilever tip during its approach to the partner molecules immobilized on a substrate. In the low frequency region, a characteristic 1/f^α noise with α equal to one (flickering noise) is found to replace white noise in the cantilever fluctuation power spectrum when, and only when, a specific biorecognition process between the partners occurs. The method, which has been validated on a well-characterized antigen-antibody complex, represents a fast, yet reliable alternative to the use of linkers which may involve additional surface chemistry and reproducibility concerns.
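    The white-noise versus 1/f (flicker) discrimination amounts to estimating the exponent α of the low-frequency power spectrum. A minimal sketch, with synthetic signals standing in for cantilever fluctuation records, could look like this (sampling rate and cutoff are illustrative choices):

```python
# Hedged sketch: classify a fluctuation record as white (alpha near 0) vs
# flickering 1/f (alpha near 1) by fitting the log-log slope of the
# low-frequency power spectrum. Signals and parameters are illustrative.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)

def spectral_exponent(x, fs=1000.0):
    """Fit S(f) ~ 1/f^alpha on the low-frequency part of the periodogram."""
    f, pxx = periodogram(x, fs=fs)
    mask = (f > 0) & (f < fs / 10)       # low-frequency region only
    slope, _ = np.polyfit(np.log(f[mask]), np.log(pxx[mask]), 1)
    return -slope                         # alpha

white = rng.normal(size=8192)            # white noise: alpha near 0
# Simple 1/f surrogate: shape a white-noise spectrum by 1/sqrt(f)
spec = np.fft.rfft(rng.normal(size=8192))
freqs = np.fft.rfftfreq(8192, d=1 / 1000.0)
spec[1:] /= np.sqrt(freqs[1:])
flicker = np.fft.irfft(spec)

print(spectral_exponent(white), spectral_exponent(flicker))
```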

  15. Fractal analysis of GPS time series for early detection of disastrous seismic events

    NASA Astrophysics Data System (ADS)

    Filatov, Denis M.; Lyubushin, Alexey A.

    2017-03-01

    A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start by analysing both methods from the physical point of view and demonstrate the difference between them, which results in a higher accuracy of the new method compared to conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system, the Earth's crust, we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism is already known from numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate the efficiency of the developed method in identifying critical states of the Earth's crust. Finally we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events, and we provide a detailed discussion of the numerical results, which are shown to be consistent with the outcomes of other research on the topic.
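    Conventional DFA, the baseline technique the authors modify, can be sketched compactly. Window sizes and the detrending order below are illustrative choices, not the paper's settings:

```python
# Minimal sketch of conventional detrended fluctuation analysis (DFA):
# integrate the series, detrend it window-by-window, and fit the scaling
# exponent of the RMS fluctuation vs window size.
import numpy as np

def dfa(x, scales=(16, 32, 64, 128, 256)):
    """Return the DFA scaling exponent of series x (order-1 detrending)."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        rms = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear trend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(1)
alpha = dfa(rng.normal(size=4096))   # white noise: exponent close to 0.5
print(alpha)
```

Exponents near 0.5 indicate uncorrelated noise; departures toward 1 or beyond indicate persistent correlations, which is the kind of change the authors exploit as a precursor signature.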

  16. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

    Geostationary Coastal and Air Pollution Events (GEO-CAPE) is a NASA decadal survey mission to be designed to provide surface reflectance at the high spectral, spatial, and temporal resolutions from a geostationary orbit necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to global and regional NOx emissions and to improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from it for seven observation scenarios and three instrument systems.

  17. Antigen-antibody biorecognition events as discriminated by noise analysis of force spectroscopy curves.

    PubMed

    Bizzarri, Anna Rita; Cannistraro, Salvatore

    2014-08-22

    Atomic force spectroscopy is able to extract kinetic and thermodynamic parameters of biomolecular complexes provided that the registered unbinding force curves could be reliably attributed to the rupture of the specific complex interactions. To this aim, a commonly used strategy is based on the analysis of the stretching features of polymeric linkers which are suitably introduced in the biomolecule-substrate immobilization procedure. Alternatively, we present a method to select force curves corresponding to specific biorecognition events, which relies on a careful analysis of the force fluctuations of the biomolecule-functionalized cantilever tip during its approach to the partner molecules immobilized on a substrate. In the low frequency region, a characteristic 1/f^α noise with α equal to one (flickering noise) is found to replace white noise in the cantilever fluctuation power spectrum when, and only when, a specific biorecognition process between the partners occurs. The method, which has been validated on a well-characterized antigen-antibody complex, represents a fast, yet reliable alternative to the use of linkers which may involve additional surface chemistry and reproducibility concerns.

  18. ANTARES: The Arizona-NOAO Temporal Analysis and Response to Events System

    NASA Astrophysics Data System (ADS)

    Matheson, T.; Saha, A.; Snodgrass, R.; Kececioglu, J.

    The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. The goal is to build the software infrastructure necessary to process and filter alerts produced by time-domain surveys, with the ultimate source of such alerts being the Large Synoptic Survey Telescope (LSST). ANTARES will add value to alerts by annotating them with information from external sources such as previous surveys from across the electromagnetic spectrum. In addition, the temporal history of annotated alerts will provide further annotation for analysis. These alerts will go through a cascade of filters to select interesting candidates. For the prototype, 'interesting' is defined as the rarest or most unusual alert, but future systems will accommodate multiple filtering goals. The system is designed to be flexible, allowing users to access the stream at multiple points throughout the process, and to insert custom filters where necessary. We will describe the basic architecture of ANTARES and the principles that will guide development and implementation.
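    The filtering architecture described above, a cascade of stages that successively narrow the alert stream and accept user-inserted custom filters, can be sketched generically. This is not ANTARES code; the field names and predicates are invented:

```python
# Illustrative sketch (not ANTARES code) of a filter cascade over an alert
# stream: each stage is a plain predicate, and custom stages can be inserted
# anywhere in the list.
def cascade(alerts, filters):
    """Apply each filter stage in order, keeping only alerts that pass."""
    for f in filters:
        alerts = [a for a in alerts if f(a)]
    return alerts

alerts = [
    {"id": 1, "mag_change": 0.2, "known_variable": True},
    {"id": 2, "mag_change": 1.8, "known_variable": False},
    {"id": 3, "mag_change": 2.5, "known_variable": False},
]
stages = [
    lambda a: not a["known_variable"],   # drop catalogued variable stars
    lambda a: a["mag_change"] > 1.0,     # keep large brightness changes
]
print([a["id"] for a in cascade(alerts, stages)])  # -> [2, 3]
```

In the real system the stages would annotate alerts with external-catalog and temporal-history information before filtering, and would operate on a continuous stream rather than a list.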

  19. Motion based markerless gait analysis using standard events of gait and ensemble Kalman filtering.

    PubMed

    Vishnoi, Nalini; Mitra, Anish; Duric, Zoran; Gerber, Naomi Lynn

    2014-01-01

    We present a novel approach to gait analysis using ensemble Kalman filtering which permits markerless determination of segmental movement. We use image flow analysis to reliably compute temporal and kinematic measures including the translational velocity of the torso and rotational velocities of the lower leg segments. Detecting the instances where velocity changes direction also determines the standard events of a gait cycle (double-support, toe-off, mid-swing and heel-strike). In order to determine the kinematics of the lower limbs, we model the synergies between the lower limb motions (thigh-shank, shank-foot) by building a nonlinear dynamical system using CMU's 3D motion capture database. This information is fed into the ensemble Kalman filter framework to estimate the unobserved limb (upper leg and foot) motion from the measured lower leg rotational velocity. Our approach does not require calibrated cameras or special markers to capture movement. We have tested our method on different gait sequences collected in the sagittal plane and presented the estimated kinematics overlaid on the original image frames. We have also validated our approach by manually labeling the videos and comparing our results against them.
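    The ensemble Kalman filter machinery used here (estimate an unobserved state from a measured quantity via an ensemble of samples) can be sketched with a generic update step. This is a minimal sketch of the EnKF update only; the authors' gait-specific dynamics model is not reproduced:

```python
# Generic ensemble Kalman filter measurement update: pull an ensemble of
# state samples toward an observation using sample covariances.
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_var, rng):
    """ensemble: (n_members, n_state); obs: scalar measurement."""
    n = ensemble.shape[0]
    hx = obs_op(ensemble)                         # predicted observations
    x_mean, h_mean = ensemble.mean(0), hx.mean()
    p_xh = ((ensemble - x_mean).T @ (hx - h_mean)) / (n - 1)
    p_hh = np.var(hx, ddof=1) + obs_var
    gain = p_xh / p_hh                            # Kalman gain (vector)
    perturbed = obs + rng.normal(0, np.sqrt(obs_var), n)  # perturbed obs
    return ensemble + np.outer(perturbed - hx, gain)

rng = np.random.default_rng(2)
ens = rng.normal([0.0, 0.0], 1.0, size=(200, 2))  # prior ensemble (2D state)
obs_op = lambda e: e[:, 0]                        # we observe only component 0
updated = enkf_update(ens, obs=1.5, obs_op=obs_op, obs_var=0.1, rng=rng)
print(updated[:, 0].mean())                       # pulled toward the observation
```

In the gait setting the observed component plays the role of the measured lower-leg rotational velocity and the unobserved components the thigh and foot motion.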

  20. Metamizole-Associated Adverse Events: A Systematic Review and Meta-Analysis

    PubMed Central

    Fässler, Margrit; Blozik, Eva; Linde, Klaus; Jüni, Peter; Reichenbach, Stephan; Scherer, Martin

    2015-01-01

    Background Metamizole is used to treat pain in many parts of the world. Information on the safety profile of metamizole is scarce; no conclusive summary of the literature exists. Objective To determine whether metamizole is clinically safe compared to placebo and other analgesics. Methods We searched CENTRAL, MEDLINE, EMBASE, CINAHL, and several clinical trial registries. We screened the reference lists of included trials and previous systematic reviews. We included randomized controlled trials that compared the effects of metamizole, administered to adults in any form and for any indication, to other analgesics or to placebo. Two authors extracted data regarding trial design and size, indications for pain medication, patient characteristics, treatment regimens, and methodological characteristics. Adverse events (AEs), serious adverse events (SAEs), and dropouts were assessed. We conducted separate meta-analyses for each metamizole comparator, using standard inverse-variance random effects meta-analysis to pool the estimates across trials, reported as risk ratios (RRs). We calculated the DerSimonian and Laird variance estimate τ² to measure heterogeneity between trials. The pre-specified primary end point was any AE during the trial period. Results Of the 696 potentially eligible trials, 79 trials including almost 4000 patients with short-term metamizole use of less than two weeks met our inclusion criteria. Fewer AEs were reported for metamizole compared to opioids, RR = 0.79 (confidence interval 0.79 to 0.96). We found no differences between metamizole and placebo, paracetamol, or NSAIDs. Only a few SAEs were reported, with no difference between metamizole and other analgesics. No agranulocytosis or deaths were reported. Our results were limited by the mediocre overall quality of the reports. Conclusion For short-term use in the hospital setting, metamizole seems to be a safe choice when compared to other widely used analgesics. High-quality, adequately sized
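    The pooling machinery named in the methods (inverse-variance random-effects meta-analysis with the DerSimonian and Laird heterogeneity estimate) can be sketched on invented trial counts; the numbers below are illustrative, not the review's data:

```python
# Hedged sketch of inverse-variance random-effects pooling of log risk
# ratios with the DerSimonian-Laird tau^2 estimate. Trial counts are made up.
import numpy as np

# Hypothetical trials: (events_trt, n_trt, events_ctl, n_ctl)
trials = [(10, 100, 15, 100), (8, 120, 12, 118), (20, 250, 24, 245)]

log_rr, var = [], []
for a, n1, c, n2 in trials:
    log_rr.append(np.log((a / n1) / (c / n2)))
    var.append(1/a - 1/n1 + 1/c - 1/n2)    # variance of the log risk ratio
log_rr, var = np.array(log_rr), np.array(var)

w = 1 / var                                 # fixed-effect weights
fixed = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - fixed) ** 2)       # Cochran's Q
c_ = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(trials) - 1)) / c_)   # DerSimonian-Laird tau^2

w_re = 1 / (var + tau2)                     # random-effects weights
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled RR = {np.exp(pooled):.2f}, 95% CI "
      f"({np.exp(pooled - 1.96*se):.2f}, {np.exp(pooled + 1.96*se):.2f})")
```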

  1. Analysis of an extremely dense regional fog event in Eastern China using a mesoscale model

    NASA Astrophysics Data System (ADS)

    Shi, Chune; Yang, Jun; Qiu, Mingyan; Zhang, Hao; Zhang, Su; Li, Zihua

    2010-03-01

    An unusually dense regional advection-radiation fog event over Anhui and the surrounding provinces in eastern China during Dec. 25-27, 2006, was investigated. At its mature stage, the fog covered most of Anhui and parts of the surrounding provinces, reducing visibility to 100 m or less. It lasted more than 36 consecutive hours in some places. A mesoscale meteorological model (MM5), together with back-trajectory analysis, was used to investigate this fog event. Observations from a field station as well as hundreds of routine stations, along with two visibility computing methods, were used to quantitatively and objectively validate the MM5-simulated liquid water content (LWC) and visibility. The verifications demonstrate that MM5 has better fog predictability for the first day than for the second-day forecast, and better predictability for fog than for dense fog, with regard to the probability of detection (POD) and the threat score (TS). The new visibility algorithm, which uses both LWC and the number density of fog droplets, significantly outperforms the conventional LWC-only algorithm in fog prediction in terms of the POD score, especially for dense fog. This work is the first objective verification conducted for MM5 fog prediction, and it allows a better understanding of the performance of simulated temporal and spatial fog coverage. The back-trajectory and sensitivity experiments confirm that subsidence and steady warm, moist advection from the southeast and southwest maintained the dense fog, while the northwesterly dry wind resulted in its dissipation.
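    The two visibility-diagnosis families contrasted in the abstract, an LWC-only rule versus one using both LWC and droplet number density, can be sketched with parameterizations commonly quoted in the fog literature (Kunkel-type and Gultepe-type forms). The constants below are as commonly published and may differ from the exact forms used in this study:

```python
# Hedged sketch of two fog-visibility diagnostics: LWC-only (Kunkel-type)
# vs LWC plus droplet number density (Gultepe-type). Constants are as
# commonly quoted in the literature, not necessarily this study's values.
def vis_lwc_only(lwc):
    """Visibility (km) from liquid water content (g m^-3) alone."""
    beta = 144.7 * lwc ** 0.88            # extinction coefficient, km^-1
    return 3.912 / beta                    # Koschmieder relation

def vis_lwc_nd(lwc, nd):
    """Visibility (km) from LWC (g m^-3) and droplet number (cm^-3)."""
    return 1.002 / (lwc * nd) ** 0.6473

lwc = 0.1                                  # moderately dense fog
print(vis_lwc_only(lwc), vis_lwc_nd(lwc, nd=100.0))
```

Because the second form responds to droplet number as well as water content, two fogs with the same LWC can be diagnosed with very different visibilities, which is the behaviour behind the improved dense-fog POD scores reported above.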

  2. Integrated survival analysis using an event-time approach in a Bayesian framework

    USGS Publications Warehouse

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field, where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times, which may be interval-censored, from individuals whose fates are known with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
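    The hazard family used in the application, a piecewise-constant hazard over age intervals, can be sketched with a maximum-likelihood fit on fully observed data; the Bayesian, unknown-fates machinery of the paper is not reproduced here, and the data below are invented:

```python
# Minimal sketch of a piecewise-constant hazard model: the MLE of the hazard
# in each age interval is (events in interval) / (time at risk in interval).
import numpy as np

cuts = np.array([0.0, 5.0, 15.0, np.inf])      # age intervals (days)

def exposure_and_events(times, events, cuts):
    """Per-interval total time at risk and event counts."""
    k = len(cuts) - 1
    exp_t, d = np.zeros(k), np.zeros(k)
    for t, e in zip(times, events):
        for j in range(k):
            lo, hi = cuts[j], cuts[j + 1]
            if t > lo:
                exp_t[j] += min(t, hi) - lo     # time spent at risk in [lo, hi)
                if e and lo < t <= hi:
                    d[j] += 1                   # death occurred in this interval
    return exp_t, d

# Hypothetical chick records: time (days) and 1 = died, 0 = censored
times  = np.array([2.0, 3.5, 4.0, 8.0, 12.0, 20.0, 25.0, 30.0])
events = np.array([1,   1,   0,   1,   0,    1,    0,    0])

exp_t, d = exposure_and_events(times, events, cuts)
hazard = d / exp_t     # per-day hazard in each interval
print(hazard)          # highest hazard falls in the youngest interval here
```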

  3. Integrated survival analysis using an event-time approach in a Bayesian framework

    PubMed Central

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly within the ecological field, where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times, which may be interval-censored, from individuals whose fates are known with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the

  4. Effects of Tube Rupture Modeling and Parameters on Analysis of MSGTR Event Progression in PWR

    SciTech Connect

    Jeong, Ji Hwan; Choi, Ki Yong; Chang, Keun Sun; Kweon, Young Chel

    2002-07-01

    A multiple steam generator tube rupture (MSGTR) event in APR1400 has been investigated using the best-estimate thermal hydraulic system code MARS1.4. The effects of parameters such as the number of ruptured tubes, the rupture location, and the affected steam generator on the analysis of the MSGTR event in APR1400 are examined. In particular, two tube rupture modeling methods, single tube modeling (STM) and double tube modeling (DTM), are compared. When five tubes are ruptured, the STM predicts an operator response time of 2085 seconds before the main steam safety valves (MSSVs) are lifted. The effect of rupture location on the MSSV lift time is not significant in the case of STM, but the MSSV lift time for a tube-top rupture is found to be 25.3% larger than that for a rupture at the hot-leg side tube sheet in the case of DTM. The MSSV lift time for the cases in which both steam generators are affected (4C5x, 4C23x) is found to be larger than that for the single steam generator cases (4A5x, 4B5x) due to a bifurcation of the primary leak flow. The discharge coefficient Cd is found to affect the MSSV lift time only for the smaller value of 0.5. The most dominant parameter governing the MSSV lift time is found to be the leak flow rate: whichever modeling method is used, similar MSSV lift times are obtained when the leak flow rates are close, except in the case where both steam generators are affected. Therefore, the system performance and the MSSV lift time of the APR1400 are strongly dependent on the break flow model used in the best-estimate system code. (authors)

  5. 'HESPERIA' HORIZON 2020 project: High Energy Solar Particle Events foRecastIng and Analysis

    NASA Astrophysics Data System (ADS)

    Malandraki, Olga; Klein, Karl-Ludwig; Vainio, Rami; Agueda, Neus; Nunez, Marlon; Heber, Bernd; Buetikofer, Rolf; Sarlanis, Christos; Crosby, Norma; Bindi, Veronica; Murphy, Ronald; Tyka, Allan J.; Rodriguez, Juan

    2016-04-01

    Solar energetic particles (SEPs) are of prime interest for fundamental astrophysics. However, due to their high energies, they are also a space weather concern for technology in space as well as for human space exploration, calling for reliable tools with predictive capabilities. The two-year EU HORIZON 2020 project HESPERIA (High Energy Solar Particle Events foRecastIng and Analysis, http://www.hesperia-space.eu/) will produce two novel operational SEP forecasting tools based upon proven concepts (UMASEP, REleASE). At the same time, the project will advance our understanding of the physical mechanisms that result in high-energy SEP events through the systematic exploitation of the high-energy gamma-ray observations of the FERMI mission and other novel published datasets (PAMELA, AMS), together with in situ SEP measurements near 1 AU. By using multi-frequency observations and performing simulations, the project will address the chain of processes from particle acceleration in the corona, through particle transport in the magnetically complex corona and interplanetary space, to detection near 1 AU. Furthermore, HESPERIA will explore the possibility of incorporating the derived results into future innovative space weather services. Publicly available software to invert neutron monitor observations of relativistic SEPs to physical parameters, giving information on the high-energy processes occurring at or near the Sun during solar eruptions, will be provided for the first time. The results of this inversion software will complement the space-borne measurements at adjacent higher energies. In order to achieve these goals, HESPERIA will exploit already existing large datasets that are stored in databases built under the EU FP7 projects NMDB and SEPServer. The structure of the HESPERIA project, its main objectives and operational forecasting tools, as well as its added value to SEP research, will be presented and discussed. Acknowledgement: This project has received funding from the

  6. Analysis and high-resolution modeling of a dense sea fog event over the Yellow Sea

    NASA Astrophysics Data System (ADS)

    Fu, Gang; Guo, Jingtian; Xie, Shang-Ping; Duan, Yihong; Zhang, Meigen

    2006-10-01

    A ubiquitous feature of the Yellow Sea (YS) is the frequent occurrence of sea fog in the spring and summer seasons. An extremely dense sea fog event was observed around the Shandong Peninsula on the morning of 11 April 2004. This fog patch, with a spatial scale of several hundred kilometers and a duration of about 20 h, reduced horizontal visibility to less than 20 m in some locations and caused a series of traffic collisions and 12 injuries on the coastal stretch of a major highway. In this paper, almost all available observational data, including Geostationary Operational Environmental Satellite (GOES)-9 visible satellite imagery, the objectively reanalyzed final run analysis (FNL) data issued by the National Centers for Environmental Prediction (NCEP), and the sounding data of Qingdao and Dalian, as well as the latest version (4.4) of the Regional Atmospheric Modeling System (RAMS) model, were employed to investigate this sea fog case. Its evolutionary process and the environmental conditions that led to fog formation were examined using GOES-9 visible satellite imagery and sounding observations. In order to better understand the fog formation mechanism, a high-resolution RAMS simulation with 4 km × 4 km grid spacing was designed. The simulation was initialized and validated with FNL data. A 30-h simulation starting from 1800 UTC 10 April 2004 reproduced the main characteristics of this fog event. The simulated lower-horizontal-visibility area agreed reasonably well with the sea fog region identified from the satellite imagery. An advection cooling effect appeared to play a significant role in fog formation.

  7. An analysis of high-impact, low-predictive skill severe weather events in the northeast U.S.

    NASA Astrophysics Data System (ADS)

    Vaughan, Matthew T.

    This study presents an objective evaluation of Storm Prediction Center slight-risk convective outlooks, along with a method to identify high-impact severe weather events with poor predictive skill. The objectives are to assess severe weather forecast skill over the northeast U.S. relative to the continental U.S., build a climatology of high-impact, low-predictive-skill events between 1980 and 2013, and investigate the dynamic and thermodynamic differences between severe weather events with low and high predictive skill over the northeast U.S. Severe storm reports of hail, wind, and tornadoes are used to calculate skill scores, including probability of detection (POD), false alarm ratio (FAR), and threat score (TS), for each convective outlook. Low-predictive-skill events are binned into low-POD (type 1) and high-FAR (type 2) categories to assess their temporal variability. Type 1 events were found to occur in every year of the dataset, with an average of 6 events per year. Type 2 events occur less frequently and are more common in the earlier half of the study period. An event-centered composite analysis is performed on the low-predictive-skill database using the National Centers for Environmental Prediction Climate Forecast System Reanalysis 0.5° gridded dataset to analyze the dynamic and thermodynamic conditions prior to high-impact severe weather events with varying predictive skill. Deep-layer vertical shear between 1000 and 500 hPa is found to be a significant discriminator of slight-risk forecast skill: high-impact events with less than 31 kt of shear have lower threat scores than high-impact events with higher shear values. Case study analysis of type 1 events suggests the environment over which severe weather occurs is characterized by high downdraft convective available potential energy, steep low-level lapse rates, and high lifting condensation level heights that contribute to an elevated risk of severe wind.
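
    The verification scores named above have standard 2x2 contingency-table definitions. A small sketch (generic textbook formulas, not tied to this thesis's own code; the counts are made up):

    ```python
    def forecast_skill(hits, misses, false_alarms):
        """Standard 2x2 verification scores for yes/no forecasts."""
        pod = hits / (hits + misses)                # probability of detection
        far = false_alarms / (hits + false_alarms)  # false alarm ratio
        ts = hits / (hits + misses + false_alarms)  # threat score (CSI)
        return pod, far, ts

    # 30 correctly forecast events, 10 missed, 20 false alarms.
    pod, far, ts = forecast_skill(30, 10, 20)
    ```

    A "type 1" (low-POD) event in the study's terms would show up here as a small `pod`; a "type 2" (high-FAR) event as a large `far`.
    
    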

  8. The ERP PCA Toolkit: an open source program for advanced statistical analysis of event-related potential data.

    PubMed

    Dien, Joseph

    2010-03-15

    This article presents an open source Matlab program, the ERP PCA (EP) Toolkit, for facilitating the multivariate decomposition and analysis of event-related potential data. This program is intended to supplement existing ERP analysis programs by providing functions for conducting artifact correction, robust averaging, referencing and baseline correction, data editing and visualization, principal components analysis, and robust inferential statistical analysis. The program serves three major goals: (1) optimizing the analysis of noisy data, such as clinical or developmental data; (2) facilitating the multivariate decomposition of ERP data into its constituent components; (3) increasing the transparency of analysis operations by providing direct visualization of the corresponding waveforms.
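
    Robust averaging of the kind listed above is commonly implemented as a trimmed mean across trials, which limits the pull of outlier trials on the ERP waveform. A generic numpy sketch (illustrative only, not necessarily the EP Toolkit's exact algorithm):

    ```python
    import numpy as np

    def robust_average(trials, trim=0.1):
        """Trimmed-mean ERP average over an (n_trials, n_samples) array.

        At each time point, the highest and lowest `trim` fraction of
        trial values are dropped before averaging.
        """
        trials = np.sort(np.asarray(trials, dtype=float), axis=0)
        k = int(trim * trials.shape[0])
        kept = trials[k:trials.shape[0] - k] if k else trials
        return kept.mean(axis=0)

    # Nine clean trials plus one large artifact; the artifact is trimmed away.
    avg = robust_average([[0.0]] * 9 + [[100.0]], trim=0.1)
    ```
    
    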

  9. Analysis of the observed and forecast rainfall intensity structure in a precipitation event

    NASA Astrophysics Data System (ADS)

    Bech, Joan; Molinié, Gilles; Karakasidis, Theodoros; Anquentin, Sandrine; Creutin, Jean Dominique; Pinty, Jean-Pierre; Escobar, Juan

    2014-05-01

    During the last few decades, a number of studies have examined the temporal and spatial structure of the precipitation field, given that rainfall exhibits large variability at all scales (see, for example, Ceresetti et al. 2011, 2012). The objective of this study is to examine the rainfall field structure at high temporal (15 minute) and spatial (1 km) resolution. We focus on rainfall properties such as intermittency, using the auto-correlation of precipitation time series to assess whether it can be modelled assuming fractal behaviour across different scales. Building on the results and methodology of previous studies applied to observational precipitation data such as raingauge, weather radar, and disdrometer observations (see, for example, Molinié et al., 2011, 2013), here we employ high-resolution numerical forecast data. In particular, our approach uses a transitive covariogram, given the limited number of samples available in single precipitation events. Precipitation forecasts are derived at 15 minute intervals from 1-km grid length nested simulations of Meso-NH, the non-hydrostatic mesoscale atmospheric model of the French research community, using AROME-WestMed model data as initial and boundary conditions. The analysis also draws on existing data available in the HyMeX (HYdrological cycle in the Mediterranean EXperiment) database. Results are presented for a precipitation event that took place in the Rhône Valley (France) in November 2011. This case allows us to study, with the proposed methodology, the effect of a number of factors (varying orography along the Rhône Valley, turbulence, microphysical processes, etc.) on the observed and simulated precipitation field. References Ceresetti D., E. Ursu, J. Carreau, S. Anquetin, J. D. Creutin, L. Gardes, S. Girard, and G. Molinié, 2012: Evaluation of classical spatial-analysis schemes of extreme rainfall. Natural Hazards and Earth System Sciences, 12, 3229-3240, http

  10. An Internal Evaluation of the National FFA Agricultural Mechanics Career Development Event through Analysis of Individual and Team Scores from 1996-2006

    ERIC Educational Resources Information Center

    Franklin, Edward A.; Armbruster, James

    2012-01-01

    The purpose of this study was to conduct an internal evaluation of the National FFA Agricultural Mechanics Career Development Event (CDE) through analysis of individual and team scores from 1996-2006. Data were analyzed by overall and sub-event areas scores for individual contestants and team event. To facilitate the analysis process scores were…

  11. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

    Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any real extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted at the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems which cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second of each other, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. The importance of resolving the first problem is connected with correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can be approached to some degree with cepstral methods, the second case can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is a blind source separation method that we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as with signals from strong teleseismic events (the Sumatra, April 2012, Mw=8.6, and Tohoku, March 2011, Mw=9.0 earthquakes). The data was recorded by seismic arrays of the
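
    The FastICA fixed-point scheme mentioned above is compact enough to sketch. A minimal numpy-only version (tanh contrast, symmetric decorrelation), shown separating two synthetic waveforms rather than real seismic data; this is an illustration of the technique, not the authors' processing chain:

    ```python
    import numpy as np

    def fastica(X, n_iter=200, seed=0):
        """Minimal symmetric FastICA (tanh contrast) for a mixed-signal
        matrix X of shape (n_mixtures, n_samples). Returns source estimates
        (up to sign, scale, and permutation)."""
        X = np.asarray(X, dtype=float)
        X = X - X.mean(axis=1, keepdims=True)
        # Whiten: rotate and scale so the mixtures have identity covariance.
        d, E = np.linalg.eigh(np.cov(X))
        Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
        n = Z.shape[0]
        W = np.random.default_rng(seed).standard_normal((n, n))
        for _ in range(n_iter):
            G = np.tanh(W @ Z)
            # Fixed-point update: E[g(wZ) Z^T] - E[g'(wZ)] w, per row.
            W = G @ Z.T / Z.shape[1] - np.diag((1 - G**2).mean(axis=1)) @ W
            U, _, Vt = np.linalg.svd(W)
            W = U @ Vt  # symmetric decorrelation keeps rows orthonormal
        return W @ Z

    # Two non-Gaussian sources (sine and square wave), linearly mixed.
    t = np.linspace(0, 8 * np.pi, 2000)
    S = np.vstack([np.sin(t), np.sign(np.sin(3 * t))])
    X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S
    Y = fastica(X)
    ```

    Each row of `Y` should correlate strongly with one of the original sources, which is the sense in which the "mixture" of two overlapping arrivals can be unmixed blindly.
    
    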

  12. The analysis of the events of stellar visibility in Pliny's "Natural History"

    NASA Astrophysics Data System (ADS)

    Nickiforov, M. G.

    2016-07-01

    Book XVIII of Pliny's "Natural History" contains about a hundred descriptions of events of stellar visibility, which were used for the needs of the agricultural calendar. A comparison between the calculated date of each event and the date given by Pliny shows that the actual events of stellar visibility occurred systematically about 10 days later than the specified times. This discrepancy cannot be explained by errors of the calendar.

  13. Analysis of post-blasting source mechanisms of mining-induced seismic events in Rudna copper mine, Poland.

    NASA Astrophysics Data System (ADS)

    Caputa, Alicja; Rudzinski, Lukasz; Talaga, Adam

    2016-04-01

    Copper ore exploitation in the Lower Silesian Copper District, Poland (LSCD), is connected with many specific hazards. The most hazardous are induced seismicity and the rockbursts which follow strong mining seismic events. One of the most effective methods of reducing seismic activity is blasting in potentially hazardous mining panels. In this way, small to moderate tremors are provoked and stress accumulation is substantially reduced. This work presents an analysis of post-blasting events using Full Moment Tensor (MT) inversion at the Rudna mine, Poland, based on a signal dataset recorded by the underground seismic network. We show that focal mechanisms for events that occurred after blasts exhibit common features in the MT solution. The strong isotropic and small Double Couple (DC) components of the MT indicate that these events were provoked by the detonations. On the other hand, the post-blasting MT is considerably different from the MT obtained for common strong mining events. We believe that seismological analysis of provoked and unprovoked events can be a very useful tool for confirming the effectiveness of blasting in seismic hazard reduction in mining areas.
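
    The isotropic-versus-double-couple distinction drawn here can be made quantitative by splitting the tensor into isotropic and deviatoric parts. A hedged numpy sketch using one common convention (the epsilon shape factor of the deviatoric eigenvalues); the authors' exact decomposition may differ:

    ```python
    import numpy as np

    def mt_decompose(M):
        """Split a symmetric 3x3 moment tensor into its isotropic part and
        an epsilon shape factor for the deviatoric remainder:
        epsilon = 0 for a pure double couple, +/-0.5 for a pure CLVD."""
        M = np.asarray(M, dtype=float)
        iso = np.trace(M) / 3.0
        lam = np.linalg.eigvalsh(M - iso * np.eye(3))  # deviatoric eigenvalues
        big = np.max(np.abs(lam))
        eps = 0.0 if big == 0 else -lam[np.argmin(np.abs(lam))] / big
        return iso, eps
    ```

    A blast-provoked event in the sense of the abstract would show a large `iso` relative to the deviatoric part; an ordinary shear event would have `iso` near zero and `eps` near zero.
    
    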

  14. Dealing With Major Life Events and Transitions: A Systematic Literature Review on and Occupational Analysis of Spirituality.

    PubMed

    Maley, Christine M; Pagana, Nicole K; Velenger, Christa A; Humbert, Tamera Keiter

    2016-01-01

    This systematic literature review analyzed the construct of spirituality as perceived by people who have experienced or are experiencing a major life event or transition. The researchers investigated studies that used narrative analysis or a phenomenological methodology related to the topic. Thematic analysis resulted in three major themes: (1) avenues to and through spirituality, (2) the experience of spirituality, and (3) the meaning of spirituality. The results provide insights into the intersection of spirituality, meaning, and occupational engagement as understood by people experiencing a major life event or transition and suggest further research that addresses spirituality in occupational therapy and interdisciplinary intervention.

  15. Assessment of Adverse Events in Protocols, Clinical Study Reports, and Published Papers of Trials of Orlistat: A Document Analysis

    PubMed Central

    Schroll, Jeppe Bennekou; Penninga, Elisabeth I.; Gøtzsche, Peter C.

    2016-01-01

    filters, though six of seven papers stated that “all adverse events were recorded.” For one trial, we identified an additional 1,318 adverse events that were not listed or mentioned in the CSR itself but could be identified through manually counting individual adverse events reported in an appendix. We discovered that the majority of patients had multiple episodes of the same adverse event that were only counted once, though this was not described in the CSRs. We also discovered that participants treated with orlistat experienced twice as many days with adverse events as participants treated with placebo (22.7 d versus 14.9 d, p-value < 0.0001, Student’s t test). Furthermore, compared with the placebo group, adverse events in the orlistat group were more severe. None of this was stated in the CSR or in the published paper. Our analysis was restricted to one drug tested in the mid-1990s; our results might therefore not be applicable for newer drugs. Conclusions In the orlistat trials, we identified important disparities in the reporting of adverse events between protocols, clinical study reports, and published papers. Reports of these trials seemed to have systematically understated adverse events. Based on these findings, systematic reviews of drugs might be improved by including protocols and CSRs in addition to published articles. PMID:27529343

  16. LINEBACKER: LINE-speed Bio-inspired Analysis and Characterization for Event Recognition

    SciTech Connect

    Oehmen, Christopher S.; Bruillard, Paul J.; Matzke, Brett D.; Phillips, Aaron R.; Star, Keith T.; Jensen, Jeffrey L.; Nordwall, Douglas J.; Thompson, Seth R.; Peterson, Elena S.

    2016-08-04

    The cyber world is a complex domain, with digital systems mediating a wide spectrum of human and machine behaviors. While this is enabling a revolution in the way humans interact with each other and with data, it is also exposing previously unreachable infrastructure to a worldwide set of actors. Existing signature-focused solutions for intrusion detection and prevention typically seek to detect anomalous and/or malicious activity in order to prevent or mitigate negative impacts. But a growing interest in behavior-based detection is driving new forms of analysis that move the emphasis from static indicators (e.g., rule-based alarms or tripwires) to behavioral indicators that accommodate a wider contextual perspective. Like cyber systems, biosystems have always existed in resource-constrained hostile environments where behaviors are tuned by context, so we look to biosystems as an inspiration for addressing behavior-based cyber challenges. In this paper, we introduce LINEBACKER, a behavior-model-based approach to recognizing anomalous events in network traffic, and present its design, in which bio-inspired and statistical models work in tandem to produce individualized alerting for a collection of systems. Preliminary results of these models operating on historic data are presented, along with a plugin to support real-world cyber operations.

  17. Men’s and women’s migration in coastal Ghana: An event history analysis

    PubMed Central

    Reed, Holly E.; Andrzejewski, Catherine S.; White, Michael J.

    2013-01-01

    This article uses life history calendar (LHC) data from coastal Ghana and event history statistical methods to examine inter-regional migration for men and women, focusing on four specific migration types: rural-urban, rural-rural, urban-urban, and urban-rural. Our analysis is unique because it examines how key determinants of migration— including education, employment, marital status, and childbearing—differ by sex for these four types of migration. We find that women are significantly less mobile than men overall, but that more educated women are more likely to move (particularly to urban areas) than their male counterparts. Moreover, employment in the prior year is less of a deterrent to migration among women. While childbearing has a negative effect on migration, this impact is surprisingly stronger for men than for women, perhaps because women’s search for assistance in childcare promotes migration. Meanwhile, being married or in union appears to have little effect on migration probabilities for either men or women. These results demonstrate the benefits of a LHC approach and suggest that migration research should further examine men’s and women’s mobility as it relates to both human capital and household and family dynamics, particularly in developing settings. PMID:24298203

  18. Automated detection and analysis of depolarization events in human cardiomyocytes using MaDEC.

    PubMed

    Szymanska, Agnieszka F; Heylman, Christopher; Datta, Rupsa; Gratton, Enrico; Nenadic, Zoran

    2016-08-01

    Optical imaging-based methods for assessing the membrane electrophysiology of in vitro human cardiac cells allow for non-invasive temporal assessment of the effects of drugs and other stimuli. Automated methods for detecting and analyzing the depolarization events (DEs) in image-based data allow quantitative assessment of these different treatments. In this study, we use 2-photon microscopy of fluorescent voltage-sensitive dyes (VSDs) to capture the membrane voltage of actively beating human induced pluripotent stem cell-derived cardiomyocytes (hiPS-CMs). We built custom, freely available Matlab software, called MaDEC, to detect, quantify, and compare DEs of hiPS-CMs treated with the β-adrenergic drugs propranolol and isoproterenol. The efficacy of our software is quantified by comparing detection results against manual DE detection by expert analysts, and by comparing DE analysis results to known drug-induced electrophysiological effects. The software accurately detected DEs, with true positive rates of 98-100% and false positive rates of 1-2% at signal-to-noise ratios (SNRs) of 5 and above. The MaDEC software was also able to distinguish control DEs from drug-treated DEs both immediately and 10 min after drug administration.
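
    The simplest form of automated event detection on a fluorescence trace is threshold crossing with a refractory period. A generic numpy sketch, explicitly not the MaDEC algorithm (whose detection logic is more sophisticated); the threshold rule and refractory gap are illustrative assumptions:

    ```python
    import numpy as np

    def detect_events(trace, thresh=None, refractory=5):
        """Flag depolarization-like events as upward crossings of a
        threshold (mean + 3 std by default), enforcing a minimum gap of
        `refractory` samples between detected onsets."""
        trace = np.asarray(trace, dtype=float)
        if thresh is None:
            thresh = trace.mean() + 3.0 * trace.std()
        above = trace > thresh
        # Indices where the trace rises from below to above the threshold.
        onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
        events, last = [], -np.inf
        for i in onsets:
            if i - last >= refractory:
                events.append(int(i))
                last = i
        return events
    ```

    On real VSD data, SNR (as the abstract's 98-100% true positive rates at SNR >= 5 suggest) is the main factor in how well such a threshold rule performs.
    
    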

  19. An Updated Meta-Analysis of Fatal Adverse Events Caused by Bevacizumab Therapy in Cancer Patients

    PubMed Central

    Zhu, Jianhong; Zhang, Jingjing; Chen, Huapu; Chen, Xinggui

    2014-01-01

    Background: The risk of fatal adverse events (FAEs) due to bevacizumab-based chemotherapy has not been well described; we carried out an updated meta-analysis regarding this issue. Methods: An electronic search of Medline, Embase, and the Cochrane Central Register of Controlled Trials was conducted to identify randomized controlled trials of bevacizumab treatment in cancer patients. Random- or fixed-effect meta-analytical models were used to evaluate the risk ratio (RR) of FAEs due to the use of bevacizumab. Results: Thirty-four trials were included. Allocation to bevacizumab therapy significantly increased the risk of FAEs: the RR was 1.29 (95% CI: 1.05–1.57). This association varied significantly with tumor type (P = 0.002) and chemotherapeutic agent (P = 0.005), but not with bevacizumab dose (P = 0.90). Increased risk was seen in patients with non–small cell lung cancer, pancreatic cancer, prostate cancer, and ovarian cancer. However, FAEs were lower in breast cancer patients treated with bevacizumab. In addition, bevacizumab was associated with an increased risk of FAEs in patients who received concomitant taxanes and/or platinum agents. Conclusion: Compared with chemotherapy alone, the addition of bevacizumab was associated with an increased risk of FAEs among patients with specific tumor types, particularly when combined with chemotherapeutic agents such as platinum. PMID:24599121
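
    A risk ratio with a 95% CI of the kind reported above follows the standard log-normal approximation for a 2x2 table. A small sketch with made-up counts (not the trial data from this meta-analysis):

    ```python
    import math

    def risk_ratio(events_tx, n_tx, events_ctl, n_ctl):
        """Risk ratio and 95% CI from 2x2 counts (log-normal approximation)."""
        rr = (events_tx / n_tx) / (events_ctl / n_ctl)
        se_log = math.sqrt(1 / events_tx - 1 / n_tx
                           + 1 / events_ctl - 1 / n_ctl)
        lo = math.exp(math.log(rr) - 1.96 * se_log)
        hi = math.exp(math.log(rr) + 1.96 * se_log)
        return rr, lo, hi

    # Hypothetical single trial: 20/1000 fatal events vs 10/1000 on control.
    rr, lo, hi = risk_ratio(20, 1000, 10, 1000)
    ```

    Note that here the CI spans 1.0, so this hypothetical single trial alone would not be significant; pooling many trials, as the meta-analysis does, is what narrows the interval.
    
    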

  20. Biochemical analysis of axon-specific phosphorylation events using isolated squid axoplasms.

    PubMed

    Kang, Minsu; Baker, Lisa; Song, Yuyu; Brady, Scott T; Morfini, Gerardo

    2016-01-01

    Appropriate functionality of nodes of Ranvier, presynaptic terminals, and other axonal subdomains depends on efficient and timely delivery of proteins synthesized and packaged into membrane-bound organelles (MBOs) within the neuronal cell body. MBOs are transported and delivered to their final sites of utilization within axons by a cellular process known as fast axonal transport (FAT). Conventional kinesin, the most abundant multisubunit motor protein expressed in mature neurons, is responsible for FAT of a large variety of MBOs and plays a major role in the maintenance of appropriate axonal connectivity. Consistent with the variety and large number of discrete subdomains within axons, experimental evidence revealed the identity of several protein kinases that modulate specific functional activities of conventional kinesin. Thus, methods for the analysis of kinase activity and conventional kinesin phosphorylation facilitate the study of FAT regulation in health and disease conditions. Axonal degeneration, abnormal patterns of protein phosphorylation, and deficits in FAT represent early pathological features characteristic of neurological diseases caused by unrelated neuropathogenic proteins. Interestingly, some of these proteins were shown to produce deficits in FAT by modulating the activity of specific protein kinases involved in conventional kinesin phosphorylation. However, experimental systems that facilitate an evaluation of molecular events within axons remain scarce. Using the isolated squid axoplasm preparation, we describe methods for evaluating axon-autonomous effects of neuropathogenic proteins on the activity of protein kinases. Protocols are also provided to evaluate the effect of such proteins on the phosphorylation of endogenous axonal substrates, including conventional kinesin and neurofilaments.

  1. Analysis of a vortex precipitation event over Southwest China using AIRS and in situ measurements

    NASA Astrophysics Data System (ADS)

    Ni, Chengcheng; Li, Guoping; Xiong, Xiaozhen

    2017-04-01

    A strong precipitation event caused by a southwest vortex (SWV), which affected Sichuan Province and Chongqing Municipality in Southwest China on 10-14 July 2012, is investigated. The SWV is examined using satellite observations from AIRS (Atmospheric Infrared Sounder), in situ measurements from the SWV intensive observation campaign, and MICAPS (Meteorological Information Comprehensive Analysis and Process System) data. Analysis of this precipitation process revealed that: (1) heavy rain occurred during the development phase, and cloud water content increased significantly after the dissipation of the SWV; (2) the area with low outgoing longwave radiation values from AIRS correlated well with the SWV; (3) the variation of the blackbody brightness temperature (TBB) from AIRS reflected the evolution of the SWV, and TBB values decreased significantly during the SWV's development; and (4) strong temperature and water vapor inversions were noted during the development of the SWV. The moisture profile displayed large vertical variation during the SWV's puissant phase, with the moisture inversion occurring at low levels. The moisture content during the receding phase was significantly reduced compared with that during the developing and puissant phases. The vertical flux of vapor divergence explained the variation of the moisture profile. These results also indicate the potential for using AIRS products in studying severe weather over the Tibetan Plateau and its surroundings, where in situ measurements are sparse.

  2. Big Data Mining and Adverse Event Pattern Analysis in Clinical Drug Trials

    PubMed Central

    Federer, Callie; Yoo, Minjae

    2016-01-01

    Drug adverse events (AEs) are a major health threat to patients seeking medical treatment and a significant barrier in drug discovery and development. AEs are now required to be submitted during clinical trials and can be extracted from ClinicalTrials.gov (https://clinicaltrials.gov/), a database of clinical studies around the world. By extracting drug and AE information from ClinicalTrials.gov and structuring it into a database, drug-AE relationships could be established for future drug development and repositioning. To our knowledge, current AE databases contain mainly U.S. Food and Drug Administration (FDA)-approved drugs; our database, by contrast, contains both FDA-approved and experimental compounds extracted from ClinicalTrials.gov. Our database covers 8,161 clinical trials with 3,102,675 patients and 713,103 reported AEs. We extracted the information from ClinicalTrials.gov using a set of Python scripts, and then used regular expressions and a drug dictionary to process and structure the relevant information into a relational database. We performed data mining and pattern analysis of drug-AE relationships in our database. Our database can serve as a tool to assist researchers in discovering drug-AE relationships for developing, repositioning, and repurposing drugs. PMID:27631620
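
    The extract-with-regex, load-into-a-relational-table pipeline described here can be sketched end to end with the standard library. The record format, trial IDs, drug names, and counts below are invented for illustration; real ClinicalTrials.gov records are structured XML and considerably messier:

    ```python
    import re
    import sqlite3

    # Hypothetical flattened AE lines of the kind an extraction step might emit.
    records = [
        "NCT00000001 | orlistat | adverse event: headache (n=12)",
        "NCT00000001 | orlistat | adverse event: nausea (n=7)",
        "NCT00000002 | bevacizumab | adverse event: hypertension (n=30)",
    ]

    pattern = re.compile(
        r"(?P<trial>NCT\d{8}) \| (?P<drug>[\w\- ]+) \| adverse event: "
        r"(?P<ae>[\w ]+) \(n=(?P<count>\d+)\)"
    )

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE drug_ae (trial TEXT, drug TEXT, ae TEXT, n INTEGER)")
    for line in records:
        m = pattern.match(line)
        if m:
            db.execute("INSERT INTO drug_ae VALUES (?, ?, ?, ?)",
                       (m["trial"], m["drug"], m["ae"], int(m["count"])))

    # Pattern analysis then becomes ordinary SQL over the structured table.
    rows = db.execute(
        "SELECT drug, SUM(n) FROM drug_ae GROUP BY drug ORDER BY drug"
    ).fetchall()
    ```
    
    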

  3. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    NASA Astrophysics Data System (ADS)

    Aviyente, Selin; Bernat, Edward M.; Malone, Stephen M.; Iacono, William G.

    2010-12-01

    Joint time-frequency representations offer a rich representation of event related potentials (ERPs) that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.
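
    The core MP step, picking the best-correlated dictionary atom and subtracting it from the residual, is compact enough to sketch. This is a plain Gabor-dictionary matching pursuit on a raw signal, not the PCA-Gabor variant the paper proposes (which runs MP on principal components); the dictionary parameters are illustrative:

    ```python
    import numpy as np

    def gabor_atom(n, center, freq, width):
        """Unit-norm Gabor atom: a Gaussian-windowed cosine of length n."""
        t = np.arange(n)
        g = np.exp(-0.5 * ((t - center) / width) ** 2) \
            * np.cos(2 * np.pi * freq * t)
        return g / np.linalg.norm(g)

    def matching_pursuit(signal, atoms, n_terms):
        """Greedy MP: repeatedly subtract the best-matching atom.
        `atoms` is an (n_atoms, n_samples) array of unit-norm rows."""
        residual = np.asarray(signal, dtype=float).copy()
        picks = []
        for _ in range(n_terms):
            scores = atoms @ residual            # correlation with each atom
            k = int(np.argmax(np.abs(scores)))
            picks.append((k, scores[k]))
            residual -= scores[k] * atoms[k]
        return picks, residual

    # Small dictionary over three time centers and three frequencies.
    n = 256
    atoms = np.vstack([gabor_atom(n, c, f, 12.0)
                       for c in (64, 128, 192) for f in (0.05, 0.1, 0.2)])
    signal = 2.0 * atoms[4]                      # one atom, amplitude 2
    picks, residual = matching_pursuit(signal, atoms, n_terms=1)
    ```

    Each pick is a (time, frequency, amplitude) summary of one time-frequency component, which is the sense in which MP reduces the full time-frequency surface to a few parameters.
    
    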

  4. Role of Stratospheric Air in a Severe Weather Event: Analysis of Potential Vorticity and Total Ozone

    NASA Technical Reports Server (NTRS)

    Goering, Melissa A.; Gallus, William A., Jr.; Olsen, Mark A.; Stanford, John L.

    2001-01-01

    The role of dry stratospheric air descending to low- and mid-tropospheric levels in a severe weather outbreak in the midwestern United States is examined using ACCEPT Eta model output, Rapid Update Cycle (RUC) analyses, and Earth Probe Total Ozone Mapping Spectrometer (EP/TOMS) total ozone data. While stratospheric air was not found to play a direct role in the convection, backward trajectories show that stratospheric air descended to 800 hPa just west of the convection. Damaging surface winds not associated with thunderstorms also occurred in the region of greatest stratospheric descent. Small-scale features in the high-resolution total ozone data compare favorably with geopotential height and potential vorticity fields, supporting the notion that stratospheric air descended to near the surface. Detailed vertical structure in the potential vorticity appears to be captured by small-scale total ozone variations. The capability of the total ozone data to identify mesoscale features aids model verification: the data suggest biases in the RUC analysis and the Eta forecast of this event. The total ozone is also useful in determining whether potential vorticity is of stratospheric origin or is diabatically generated in the troposphere.

  5. Big Data Mining and Adverse Event Pattern Analysis in Clinical Drug Trials.

    PubMed

    Federer, Callie; Yoo, Minjae; Tan, Aik Choon

    2016-12-01

    Drug adverse events (AEs) are a major health threat to patients seeking medical treatment and a significant barrier in drug discovery and development. AEs are now required to be submitted during clinical trials and can be extracted from ClinicalTrials.gov (https://clinicaltrials.gov/), a database of clinical studies around the world. By extracting drug and AE information from ClinicalTrials.gov and structuring it into a database, drug-AEs could be established for future drug development and repositioning. To our knowledge, current AE databases contain mainly U.S. Food and Drug Administration (FDA)-approved drugs. However, our database contains both FDA-approved and experimental compounds extracted from ClinicalTrials.gov. Our database contains 8,161 clinical trials of 3,102,675 patients and 713,103 reported AEs. We extracted the information from ClinicalTrials.gov using a set of Python scripts, and then used regular expressions and a drug dictionary to process and structure relevant information into a relational database. We performed data mining and pattern analysis of drug-AEs in our database. Our database can serve as a tool to assist researchers to discover drug-AE relationships for developing, repositioning, and repurposing drugs.

  6. Analysis of Individual Molecular Events of DNA Damage Response by Flow and Image Assisted Cytometry

    PubMed Central

    Darzynkiewicz, Zbigniew; Traganos, Frank; Zhao, Hong; Halicka, H. Dorota; Skommer, Joanna; Wlodkowic, Donald

    2010-01-01

    This chapter describes molecular mechanisms of the DNA damage response (DDR) and presents flow- and image-assisted cytometric approaches to assess these mechanisms and measure the extent of DDR in individual cells. DNA damage was induced by cell treatment with oxidizing agents, UV light, DNA topoisomerase I or II inhibitors, cisplatin, tobacco smoke, and by exogenous and endogenous oxidants. Chromatin relaxation (decondensation) is an early DDR event that involves modification of high mobility group proteins (HMGs) and histone H1; it was detected by cytometry through analysis of the susceptibility of DNA in situ to denaturation using the metachromatic fluorochrome acridine orange. Translocation of the MRN complex, consisting of Meiotic Recombination 11 Homolog A (Mre11), Rad50 homolog and Nijmegen Breakage Syndrome 1 (NBS1), into DNA damage sites was assessed by laser scanning cytometry as the increase in the maximal-pixel intensity as well as the integral value of Mre11 immunofluorescence. Examples of cytometric detection of activation of Ataxia telangiectasia mutated (ATM) and Checkpoint kinase 2 (Chk2) protein kinases using phospho-specific Abs targeting Ser1981 and Thr68 of these proteins, respectively, are also presented. We also discuss approaches to correlate activation of ATM and Chk2 with phosphorylation of p53 on Ser15 and histone H2AX on Ser139, as well as with cell cycle position and DNA replication. The capability of laser scanning cytometry to quantify individual foci of phosphorylated H2AX and/or ATM, which provides a more dependable assessment of the presence of DNA double-strand breaks, is outlined. New microfluidic Lab-on-a-Chip platforms for interrogation of individual cells offer a novel approach for DDR cytometric analysis. PMID:21722802

  7. Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance

    DTIC Science & Technology

    2012-03-01

    event correlations in combination with other techniques were used in artificial intelligence. Today, organizations value event logs for more than... using a sophisticated blend of techniques from traditional statistics, artificial intelligence, and computer graphics (Westland, 1992). Data mining is...

  8. Modeling of rainfall events and trends through multifractal analysis on the Ebro River Basin

    NASA Astrophysics Data System (ADS)

    Valencia, Jose Luis; María Tarquis, Ana; Saá-Requejo, Antonio; Villeta, María; María Gascó, Jose

    2015-04-01

    Water supplies in the Ebro River Basin present high seasonal fluctuations, with extreme rainfall events during autumn and spring, while demands are increasingly stressed during summer. At the same time, repeated anomalous annual fluctuations in recent decades have become a serious concern for regional hydrology, agriculture and several related industries in the region. In fact, they have had a devastating impact, both socially and economically, and have resulted in debate over the changing seasonal patterns of rainfall and the increasing frequency of extreme rainfall events. The aim of this work is to evaluate these challenges on the Ebro River Basin. For this purpose, 132 complete and regular spatial rainfall daily datasets (from 1931 to 2009) were analyzed. Each dataset corresponds to a grid of 25 km x 25 km and belongs to the area studied. First, classical statistical tests were applied to the series at the annual scale to check randomness and trends. No trends were found. Then, we analyzed the change in the rainfall variability pattern in the Ebro River Basin. We used universal multifractal (UM) analysis, which estimates the concentration of the data around the precipitation average (C1, codimension average), the degree of multiscaling behavior in time (α index) and the maximum probable singularity in the rainfall distribution (γs). Daily rainfall series were subdivided (1931-1975 and 1965-2009) to study the difference between the two periods in these three UM parameters, in an attempt to relate them to geographical coordinates and relative positions in the river basin. The variations observed in C1 and α in some areas of the Ebro River Basin indicate that a precipitation regime change has begun in the last few decades, and therefore, this change should be considered in terms of its potential effects on the social and economical development of the region. This confirms some postulates drawn by conservative scientists who reject a catastrophic

  9. Subtropical influence on January 2009 major sudden stratospheric warming event: diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Schneidereit, Andrea; Peters, Dieter; Grams, Christian; Wolf, Gabriel; Riemer, Michael; Gierth, Franziska; Quinting, Julian; Keller, Julia; Martius, Olivia

    2015-04-01

    In January 2009 a major sudden stratospheric warming (MSSW) event occurred, with the strongest NAM anomaly ever observed at 10 hPa. Stratospheric Eliassen-Palm flux convergence and zonal mean eddy heat fluxes of ultra-long waves at the 100 hPa layer were also unusually strong in the mid-latitudes just before and after the onset of the MSSW. Besides internal interactions between the background flow and planetary waves, and between planetary waves among themselves, the subtropical tropospheric forcing of these enhanced heat fluxes is still an open question. This study investigates in more detail the dynamical reasons for the pronounced heat fluxes based on ERA-Interim re-analysis data. Investigating the regional contributions of the eddy heat flux to the northern hemispheric zonal mean revealed a distinct spatial pattern in that period, with maxima in the Eastern Pacific/North America and the Eastern North Atlantic/Europe. The first region is associated with an almost persistent tropospheric blocking high (BH) over the Gulf of Alaska dominating the upper-level flow, and the second region with a weaker BH over Northern Europe. The evolution of the BH over the Gulf of Alaska can be explained by a chain of tropospheric weather events linked to and maintained by subtropical and tropical influences: the MJO (phase 7-8) and the developing cold phase of ENSO (La Niña), which act coherently over the Eastern Pacific and favor enhanced subtropical baroclinicity. In turn, extratropical cyclone activity increases and shifts further poleward, associated with an increase in the frequency of warm conveyor belts (WCBs). These WCBs support enhanced poleward-directed eddy heat fluxes in the Eastern Pacific/North American region. The Eastern North Atlantic/European positive heat flux anomaly is associated with a blocking high over Scandinavia. This BH is maintained by an eastward-propagating Rossby wave train emanating from the block over the Gulf of Alaska. Eddy feedback processes support this high pressure

  10. Nonparametric analysis of competing risks data with event category missing at random.

    PubMed

    Gouskova, Natalia A; Lin, Feng-Chang; Fine, Jason P

    2017-03-01

    In the competing risks setup, the data for each subject consist of the event time, censoring indicator, and event category. However, sometimes the information about the event category can be missing, as, for example, when the date of death is known but the cause of death is not available. In such situations, treating subjects with missing event category as censored leads to underestimation of the hazard functions. We suggest nonparametric estimators for the cumulative cause-specific hazards and the cumulative incidence functions which use the Nadaraya-Watson estimator to obtain the contribution of an event with missing category to each of the cause-specific hazards. We derive the properties of the proposed estimators. An optimal bandwidth is determined, which minimizes the mean integrated squared errors of the proposed estimators over time. The methodology is illustrated using data on lung infections in patients from the United States Cystic Fibrosis Foundation Patient Registry.
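
    A minimal sketch of the core idea, not the authors' estimator: a Nadaraya-Watson smoother estimates the probability that an event at time t is of cause 1 from the subjects whose categories were observed, and that probability fractionally allocates a missing-category event between the cause-specific hazards. The Gaussian kernel, the bandwidth, and the toy data below are illustrative assumptions (the paper selects the bandwidth by minimizing mean integrated squared error).

```python
import numpy as np

# Kernel-weighted proportion of cause-1 events near time t, computed from
# subjects with observed event categories. Gaussian kernel is an assumption.
def nw_cause_probability(t, obs_times, obs_is_cause1, h=1.0):
    w = np.exp(-0.5 * ((t - obs_times) / h) ** 2)   # kernel weights
    return float(np.sum(w * obs_is_cause1) / np.sum(w))

obs_times = np.array([1.0, 1.2, 2.8, 3.0, 3.1])     # invented event times
obs_is_cause1 = np.array([1, 1, 0, 0, 1])           # observed categories

p1 = nw_cause_probability(1.1, obs_times, obs_is_cause1, h=0.5)
# an event with missing category at t = 1.1 adds weight p1 to the cause-1
# hazard increment and (1 - p1) to the cause-2 increment
```

    Early times are dominated by cause-1 events in this toy data, so p1 is close to 1 there and drops near t = 3 where cause-2 events cluster.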

  11. Assessment of realistic nowcasting lead-times based on predictability analysis of Mediterranean Heavy Precipitation Events

    NASA Astrophysics Data System (ADS)

    Bech, Joan; Berenguer, Marc

    2014-05-01

    ' precipitation forecasts showed some skill (improvement over persistence) for lead times up to 60' for moderate intensities (up to 1 mm in 30') and up to 2.5 h for lower rates (above 0.1 mm). However, important event-to-event variability has been found, as illustrated by the fact that hit rates of rain-no-rain forecasts reached the 60% value at 90' in the 7 September 2005 case and at only 40' in the 2 November 2008 case. The discussion of these results provides useful information on the potential application of nowcasting systems and realistic values to be contrasted with specific end-user requirements. This work has been done in the framework of the HyMeX research programme and has been partly funded by the ProFEWS project (CGL2010-15892). References: Bech J, N Pineda, T Rigo, M Aran, J Amaro, M Gayà, J Arús, J Montanyà, O van der Velde, 2011: A Mediterranean nocturnal heavy rainfall and tornadic event. Part I: Overview, damage survey and radar analysis. Atmospheric Research 100:621-637 http://dx.doi.org/10.1016/j.atmosres.2010.12.024 Bech J, R Pascual, T Rigo, N Pineda, JM López, J Arús, and M Gayà, 2007: An observational study of the 7 September 2005 Barcelona tornado outbreak. Natural Hazards and Earth System Science 7:129-139 http://dx.doi.org/10.5194/nhess-7-129-2007 Berenguer M, C Corral, R Sánchez-Diezma, D Sempere-Torres, 2005: Hydrological validation of a radar-based nowcasting technique. Journal of Hydrometeorology 6:532-549 http://dx.doi.org/10.1175/JHM433.1 Berenguer M, D Sempere, G Pegram, 2011: SBMcast - An ensemble nowcasting technique to assess the uncertainty in rainfall forecasts by Lagrangian extrapolation. Journal of Hydrology 404:226-240 http://dx.doi.org/10.1016/j.jhydrol.2011.04.033 Pierce C, A Seed, S Ballard, D Simonin, Z Li, 2012: Nowcasting. In Doppler Radar Observations (J Bech, JL Chau, ed.) Ch. 13, 98-142. InTech, Rijeka, Croatia http://dx.doi.org/10.5772/39054

  12. Multi-Resolution Clustering Analysis and Visualization of Around One Million Synthetic Earthquake Events

    NASA Astrophysics Data System (ADS)

    Kaneko, J. Y.; Yuen, D. A.; Dzwinel, W.; Boryszko, K.; Ben-Zion, Y.; Sevre, E. O.

    2002-12-01

    The study of seismic patterns with synthetic data is important for analyzing the seismic hazard of faults because one can precisely control the spatial and temporal domains. Using modern clustering analysis from statistics and a recently introduced visualization software package, AMIRA, we have examined the multi-resolution nature of a total assemblage of 922,672 earthquake events in 4 numerically simulated models, which have different constitutive parameters, with 2 disparately different time intervals in a 3D spatial domain. The evolution of stress and slip on the fault plane was simulated with 3D elastic dislocation theory for a configuration representing the central San Andreas Fault (Ben-Zion, J. Geophys. Res., 101, 5677-5706, 1996). The 4 different models represent various levels of fault zone disorder and have the following brittle properties and names: uniform properties (model U), a Parkfield-type asperity (A), fractal properties (F), and multi-size heterogeneities (model M). We employed the MNN (mutual nearest neighbor) clustering method and developed a C program that simultaneously calculates a number of parameters related to the locations of the earthquakes and their magnitude values. Visualization was then used to look at the geometrical locations of the hypocenters and the evolution of seismic patterns. We wrote an AmiraScript that allows us to pass the parameters in an interactive format. With data sets consisting of 150-year time intervals, we have unveiled the distinctly multi-resolutional nature of the spatial-temporal pattern of small and large earthquake correlations shown previously by Eneva and Ben-Zion (J. Geophys. Res., 102, 24513-24528, 1997). In order to search for clearer possible stationary patterns and substructures within the clusters, we have also carried out the same analysis for corresponding data sets with time extending to several thousand years. The larger data sets were studied with finer and finer time intervals and multi
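
    A toy version of mutual-nearest-neighbor grouping on synthetic "hypocenter" coordinates illustrates the MNN idea; this is not the authors' C program, and the two well-separated point clouds below are invented.

```python
import numpy as np

# Two tight synthetic clusters of 3D points standing in for hypocenters.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.1, (20, 3)),      # cluster A
                 rng.normal(5, 0.1, (20, 3))])     # cluster B

# Pairwise distances; a point is never its own neighbor.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
nn = d.argmin(axis=1)                              # nearest neighbor of each point

# Mutual pairs: each point is the other's nearest neighbor. These pairs seed
# clusters in MNN-style agglomeration.
mutual_pairs = [(i, j) for i, j in enumerate(nn) if nn[j] == i and i < j]

# With this synthetic geometry, every mutual pair stays within one cluster.
same_cluster = all((i < 20) == (j < 20) for i, j in mutual_pairs)
```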

  13. Competing events influence estimated survival probability: when is Kaplan-Meier analysis appropriate?

    PubMed

    Biau, David Jean; Latouche, Aurélien; Porcher, Raphaël

    2007-09-01

    The Kaplan-Meier estimator is the current method for estimating the probability of an event occurring over time in orthopaedics. However, the Kaplan-Meier estimator was designed to estimate the probability of an event that eventually will occur for all patients, i.e., death, and this does not hold for other outcomes. For example, not all patients will experience hip arthroplasty loosening because some may die first, and some may have their implant removed to treat infection or recurrent hip dislocation. Such events that preclude the observation of the event of interest are called competing events. We suggest the Kaplan-Meier estimator is inappropriate in the presence of competing events and show that it overestimates the probability of the event of interest occurring over time. The cumulative incidence estimator is an alternative to Kaplan-Meier in situations where competing risks are likely. Three common situations include revision for implant loosening in the long-term followup of arthroplasties, and implant failure in the context of limb-salvage surgery or femoral neck fracture.
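
    The overestimation can be shown in a few lines on invented follow-up data: treating the competing event (e.g. death) as censoring inflates 1 − KM for the event of interest relative to the cumulative incidence function.

```python
import numpy as np

# Toy data (invented): event times with cause 1 = revision (event of
# interest), cause 2 = death (competing), 0 = censored. Times are sorted.
times  = np.array([1, 2, 3, 4, 5, 6])
causes = np.array([1, 2, 1, 2, 1, 0])

def one_minus_km(times, causes, cause=1):
    """1 - KM with competing events (incorrectly) treated as censoring."""
    surv = 1.0
    for t, c in zip(times, causes):
        at_risk = np.sum(times >= t)
        if c == cause:
            surv *= 1.0 - 1.0 / at_risk
    return 1.0 - surv

def cif(times, causes, cause=1):
    """Cumulative incidence: overall-survival-weighted cause-specific hazards."""
    overall, total = 1.0, 0.0
    for t, c in zip(times, causes):
        at_risk = np.sum(times >= t)
        if c == cause:
            total += overall / at_risk   # increment weighted by P(event-free)
        if c != 0:
            overall *= 1.0 - 1.0 / at_risk
    return total

naive = one_minus_km(times, causes)   # 0.6875: overestimates
proper = cif(times, causes)           # 0.5000: accounts for deaths
```

    Here 1 − KM reports a 69% revision probability while the cumulative incidence is 50%, exactly the bias the abstract describes.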

  14. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    NASA Astrophysics Data System (ADS)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking and network socialisation have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study the phenomena and disciplines of the natural sciences has prompted many specialists and scholars to innovate their research. Although politics has been integrally involved in hyperlinked-web issues since the 1990s, and the automatic assembly of different geospatial webs and distributed geospatial information systems utilizing service chaining has been explored and built recently, the information collection and data visualisation of geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpected characteristics of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology and the text analysis method. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the data collection and processing not only highlight the subject and theme vocabularies of related topics but also certain issues and problems behind them. Being able to express the time-space relationship of text information and to disseminate information regarding geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion on political events.
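
    The word-frequency step of such a pipeline reduces, at its simplest, to tokenizing crawled pages and counting terms; the two "documents" below are invented placeholders, not crawled data from the study.

```python
from collections import Counter
import re

# Invented stand-ins for crawled page texts.
documents = [
    "Huangyan Island dispute raises tension",
    "tension over Huangyan Island continues",
]

def word_frequencies(docs, stopwords=frozenset({"over", "the"})):
    """Lowercase, tokenize, drop stopwords, and count term occurrences."""
    tokens = []
    for doc in docs:
        tokens += [w for w in re.findall(r"[a-z]+", doc.lower())
                   if w not in stopwords]
    return Counter(tokens)

freq = word_frequencies(documents)
# freq feeds directly into a tag cloud or frequency map: term size/height
# is proportional to its count
```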

  15. Time-to-Event Analysis of Individual Variables Associated with Nursing Students' Academic Failure: A Longitudinal Study

    ERIC Educational Resources Information Center

    Dante, Angelo; Fabris, Stefano; Palese, Alvisa

    2013-01-01

    Empirical studies and conceptual frameworks presented in the extant literature offer a static picture of academic failure. Time-to-event analysis, which captures the dynamism of individual factors as they come to determine failure and allows timely strategies to be properly tailored, requires longitudinal studies, which are still lacking within the field. The…

  16. Arrests, Recent Life Circumstances, and Recurrent Job Loss for At-Risk Young Men: An Event-History Analysis

    ERIC Educational Resources Information Center

    Wiesner, Margit; Capaldi, Deborah M.; Kim, Hyoun K.

    2010-01-01

    This study used longitudinal data from 202 at-risk young men to examine effects of arrests, prior risk factors, and recent life circumstances on job loss across a 7-year period in early adulthood. Repeated failure-time continuous event-history analysis indicated that occurrence of job loss was primarily related to prior mental health problems,…

  17. Multi-instrumental analysis of large sprite events and their producing storm in southern France

    NASA Astrophysics Data System (ADS)

    Soula, S.; Iacovella, F.; van der Velde, O.; Montanyà, J.; Füllekrug, M.; Farges, T.; Bór, J.; Georgis, J.-F.; NaitAmor, S.; Martin, J.-M.

    2014-01-01

    During the night of 1-2 September 2009, seventeen distinct sprite events, including 3 halos, were observed above a storm over the north-western Mediterranean Sea with a video camera at Pic du Midi (42.93N; 0.14E; 2877 m). The sprites occurred at distances between 280 and 390 km, estimated from their parent CG locations. The MCS-type storm was characterized by a trailing-stratiform structure and a very circular shape, with a size of about 70,000 km2 (cloud top temperature lower than −35 °C) when the TLEs were observed. The cloud-to-ground (CG) flash rate was large (45 min⁻¹) one hour before the TLE observations and very low (< 5 min⁻¹) during them. Out of the 17 sprite events, 15 parent +CG (P+CG) strokes have been identified; their average peak current is 87 kA (67 kA for the 14 events without halo), while the associated charge moment changes (CMC) that could be determined range from 424 to 2088 ± 20% C km. Several 2-second videos contain multiple sprite events: one with four events, one with three events and three with two events. Column and carrot type sprites are identified, either together or separately. All P+CG strokes are clearly located within the stratiform region of the storm, and the second P+CG stroke of a multiple event falls back within the stratiform region. Groups of large and bright carrots reach ~70 km height and ~80 km horizontal extent. These groups are associated with a second pulse of electric field radiation in the ELF range which occurs ~5 ms after the P+CG stroke and exhibits the same polarity, which is evidence for current in the sprite body. VLF perturbations associated with the sprite events were recorded with a station in Algiers.

  18. Prospective Analysis Of Neuropsychiatric Events In An International Disease Inception Cohort of SLE Patients

    PubMed Central

    Hanly, J. G.; Urowitz, M. B.; Su, L.; Bae, S.C.; Gordon, C.; Wallace, D.J.; Clarke, A.; Bernatsky, S.; Isenberg, D.; Rahman, A.; Alarcón, G.S.; Gladman, D.D.; Fortin, P.R.; Sanchez-Guerrero, J.; Romero-Diaz, J.; Merrill, J. T.; Ginzler, E.; Bruce, I. N.; Steinsson, K.; Khamashta, M.; Petri, M.; Manzi, S.; Dooley, M.A.; Ramsey-Goldman, R.; Van Vollenhoven, R.; Nived, O.; Sturfelt, G.; Aranow, C.; Kalunian, K.; Ramos-Casals, M.; Zoma, A.; Douglas, J.; Thompson, K.; Farewell, V.

    2010-01-01

    Objectives To determine the frequency, accrual, attribution and outcome of neuropsychiatric (NP) events and their impact on quality of life over 3 years in a large inception cohort of SLE patients. Methods The study was conducted by the Systemic Lupus International Collaborating Clinics. Patients were enrolled within 15 months of SLE diagnosis. NP events were identified using the ACR case definitions, and decision rules were derived to determine the proportion of NP disease attributable to SLE. The outcome of NP events was recorded and patient-perceived impact was determined by the SF-36. Results There were 1206 patients (89.6% female) with a mean (±SD) age of 34.5±13.2 years. The mean disease duration at enrollment was 5.4±4.2 months. Over a mean follow-up of 1.9±1.2 years, 486/1206 (40.3%) patients had ≥1 NP event, attributed to SLE in 13.0%–23.6% of patients using two a priori decision rules. The frequency of individual NP events varied from 47.1% (headache) to 0% (myasthenia gravis). The outcome was significantly better for those NP events attributed to SLE, especially if they occurred within 1.5 years of the diagnosis of SLE. Patients with NP events, regardless of attribution, had significantly lower summary scores for both mental and physical health over the study. Conclusions NP events in SLE patients are variable in frequency, most commonly present early in the disease course, and adversely impact patients' quality of life over time. Events attributed to non-SLE causes are more common than those due to SLE, although the latter have a more favourable outcome. PMID:19359262

  19. Reduced-Order Modeling and Wavelet Analysis of Turbofan Engine Structural Response Due to Foreign Object Damage "FOD" Events

    NASA Technical Reports Server (NTRS)

    Turso, James A.; Lawrence, Charles; Litt, Jonathan S.

    2007-01-01

    The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite-element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
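
    The wavelet-based feature idea can be illustrated with a numpy-only sketch (not the NASA implementation): a Haar-like wavelet correlated against a noisy synthetic accelerometer trace produces a sharp response at the instant of an injected FOD-like transient, which is how precise time localization survives the noise.

```python
import numpy as np

# Synthetic accelerometer trace: background noise plus a short step
# transient standing in for an FOD event. All parameters are invented.
rng = np.random.default_rng(0)
n = 1024
signal = 0.3 * rng.standard_normal(n)       # process/sensor noise
signal[500:508] += 3.0                      # injected FOD-like transient

# Haar-like wavelet at a single scale, energy-normalized.
scale = 8
haar = np.concatenate([np.ones(scale), -np.ones(scale)]) / np.sqrt(2 * scale)

# Wavelet response; |coefficient| peaks where the transient's edge aligns
# with the wavelet's sign change.
coeffs = np.abs(np.convolve(signal, haar, mode="same"))
detected = int(np.argmax(coeffs))           # estimated event sample index
```

    A full implementation would scan several scales and feed the peak responses, together with Kalman-filter health estimates, into a fusion step.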

  20. Reduced-Order Modeling and Wavelet Analysis of Turbofan Engine Structural Response Due to Foreign Object Damage (FOD) Events

    NASA Technical Reports Server (NTRS)

    Turso, James; Lawrence, Charles; Litt, Jonathan

    2004-01-01

    The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.

  1. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm

    PubMed Central

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-01-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on acceleration signals, different algorithms have been proposed in previous studies to detect toe off (TO) and heel strike (HS) gait events. While these algorithms could achieve relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability on stair-ascent and stair-descent terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, followed by determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects while they walked on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses. PMID:27706086
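
    A simplified sketch of the jerk-plus-peak-heuristic idea (not the authors' algorithm): differentiate a synthetic acceleration trace to get the jerk signal, then pick local maxima subject to a height threshold and a minimum separation. The sampling rate, signal shape, and thresholds below are invented; the real method additionally uses a time-frequency step to adapt parameters per terrain.

```python
import numpy as np

fs = 100.0                                   # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
accel = np.sin(2 * np.pi * 1.0 * t)          # ~1 Hz stride-like oscillation
for step in (0.5, 1.5, 2.5, 3.5):            # sharp impacts once per cycle
    accel += 2.0 * np.exp(-((t - step) ** 2) / (2 * 0.01 ** 2))

jerk = np.gradient(accel, 1 / fs)            # d(accel)/dt

def find_peaks(x, height, min_sep):
    """Local maxima above `height`, at least `min_sep` samples apart."""
    idx = [i for i in range(1, len(x) - 1)
           if x[i] > height and x[i] >= x[i - 1] and x[i] >= x[i + 1]]
    kept = []
    for i in idx:
        if not kept or i - kept[-1] >= min_sep:
            kept.append(i)
    return kept

events = find_peaks(jerk, height=50.0, min_sep=int(0.5 * fs))
event_times = [t[i] for i in events]         # one detection per injected impact
```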

  2. Applying Differentially Variable Component Analysis (dVCA) to Event-related Potentials

    NASA Astrophysics Data System (ADS)

    Shah, Ankoor S.; Knuth, Kevin H.; Lakatos, Peter; Schroeder, Charles E.

    2004-04-01

    Event-related potentials (ERPs) generated in response to multiple presentations of the same sensory stimulus vary from trial to trial. Accumulating evidence suggests that this variability relates to a similar trial-to-trial variation in the perception of the stimulus. In order to understand this variability, we previously developed differentially Variable Component Analysis (dVCA) as a method for defining dynamical components that contribute to the ERP. The underlying model asserted that: (i) multiple components comprise the ERP; (ii) these components vary in amplitude and latency from trial to trial; and (iii) these components may co-vary. A Bayesian framework was used to derive maximum a posteriori solutions to estimate these components and their latency and amplitude variability. Our original goal in developing dVCA was to produce a method for automated estimation of components in ERPs. However, we discovered that it is better to apply the algorithm in stages because of the complexity of the ERP and to use the results to define interesting subsets of the data, which are further analyzed independently. This paper describes this method and illustrates its application to actual neural signals recorded in response to a visual stimulus. Interestingly, dVCA of these data suggests two distinct response modes (or states) with differing components and variability. Furthermore, analyses of residual signals obtained by subtracting the estimated components from the actual data illustrate gamma-frequency (circa 40 Hz) oscillations, which may underlie communication between various brain regions. These findings demonstrate the power of dVCA and underscore the necessity to apply this algorithm in a guided rather than a ballistic fashion. Furthermore, they highlight the need to examine the residual signals for those features of the signals that were not anticipated and not modeled in the derivation of the algorithm.
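
    dVCA itself estimates components and their per-trial amplitudes and latencies by Bayesian MAP inference; as a far simpler illustration of the latency variability it models, the sketch below applies a Woody-style cross-correlation alignment of synthetic trials against their average. This is a named stand-in technique, not dVCA, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0, 0.5, 0.002)                  # 500 ms at an assumed 500 Hz
component = np.exp(-((t - 0.25) ** 2) / (2 * 0.02 ** 2))  # one ERP-like bump

# Each "trial" is the component jittered in latency plus noise.
true_shifts = rng.integers(-10, 11, size=30)  # latency jitter in samples
trials = np.array([np.roll(component, s) for s in true_shifts])
trials += 0.1 * rng.standard_normal(trials.shape)

# Woody-style estimate: best cross-correlation lag of each trial vs average.
avg = trials.mean(axis=0)
est = []
for trial in trials:
    xc = np.correlate(trial, avg, mode="full")
    est.append(int(np.argmax(xc)) - (len(t) - 1))
est = np.array(est)
# est tracks true_shifts up to a constant offset set by the average's center
```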

  3. Single-event-related potential analysis by means of fragmentary decomposition.

    PubMed

    Melkonian, D; Gordon, E; Bahramali, H

    2001-09-01

    A recently developed fragmentary decomposition method is employed to analyse single-trial event-related potentials (ERPs), thereby extending the traditional method of averaging. Using a conventional auditory oddball paradigm with 40 target stimuli, single-trial ERPs in 40 normal subjects were analysed for midline scalp (Fz, Cz and Pz) recording sites. The normalization effect, reported in our previous study of eye blink EMGs and proposed to be a characteristic property of a wide class of non-stationary physiological processes, was found to apply to these single-trial ERPs. Fragmentary decomposition of single-trial ERPs may be regarded as re-statement of the normalization effect. This allows both pre-stimulus EEGs and post-stimulus ERPs to be regarded as overlapping generic mass potentials (GMPs), with a characteristic Gaussian amplitude spectrum. On theoretical and empirical grounds we uniquely deduce a model GMP using an introduced d" function, and physically support it by the resting and transient conditions. The model takes into account the shape of the component, which suggests a simple relationship between the peak latency and the time of the component onset. Given that GMPs may be manipulated and sorted out, we present principles of the fragmentary synthesis, i.e. probabilistic ERP reconstructions on the basis of individual and ensemble properties of its identified components. Summarizing the component quantification in the form of the dynamic model provides for the first time the opportunity to quantify all significant components in single-trial ERPs. This method of single-trial analysis opens up new possibilities of exploring the dynamical ERP changes within a recording trial, particularly in late component "cognitive" paradigms.

  4. Tracing footprints of environmental events in tree ring chemistry using neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Sahin, Dagistan

    The aim of this study is to identify environmental effects on tree-ring chemistry. It is known that industrial pollution, volcanic eruptions, dust storms, acid rain and similar events can cause substantial changes in soil chemistry. Establishing whether a particular group of trees is sensitive to these changes in the soil environment and registers them in the elemental chemistry of contemporary growth rings is the overriding goal of any dendrochemistry research. In this study, elemental concentrations were measured in tree-ring samples of eleven absolutely dated modern forest trees grown in the Mediterranean region of Turkey, collected and dated by the Malcolm and Carolyn Wiener Laboratory for Aegean and Near Eastern Dendrochronology at Cornell University. Correlations between measured elemental concentrations in the tree-ring samples were analyzed using statistical tests to answer two questions. Does the current concentration of a particular element depend on any other element within the tree? And, are there any elements showing correlated abnormal concentration changes across the majority of the trees? Based on the detailed analysis results, the low mobility of sodium and bromine, positive correlations between calcium, zinc and manganese, and positive correlations between the trace elements lanthanum, samarium, antimony, and gold within tree-rings were recognized. Moreover, zinc, lanthanum, samarium and bromine showed strong, positive correlations among the trees and were identified as possible environmental signature elements. The new dendrochemistry information found in this study would also be useful in explaining tree physiology and elemental chemistry in Pinus nigra species grown in Turkey. Elemental concentrations in tree-ring samples were measured using Neutron Activation Analysis (NAA) at the Pennsylvania State University Radiation Science and Engineering Center (RSEC). Through this study, advanced methodologies for methodological, computational and
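
    The correlation screening described above amounts, in its simplest form, to computing pairwise correlation coefficients between per-ring element concentration series; the zinc and manganese values below are made up for illustration and are not the study's measurements.

```python
import numpy as np

# Invented per-ring concentrations (ppm) for two elements across dated rings.
years = np.arange(1950, 1960)
zn = np.array([30, 31, 29, 35, 34, 36, 33, 40, 41, 39], dtype=float)
mn = np.array([12, 13, 12, 15, 14, 16, 14, 18, 19, 17], dtype=float)

# Pearson correlation between the two element series; values of |r| near 1
# flag element pairs whose ring-to-ring changes track together, candidates
# for a shared environmental signature.
r = np.corrcoef(zn, mn)[0, 1]
```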

  5. Two damaging hydrogeological events in Calabria, September 2000 and November 2015. Comparative analysis of causes and effects

    NASA Astrophysics Data System (ADS)

    Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela

    2016-04-01

    Each year, especially during the winter season, episodes of intense rain affect Calabria, the southernmost Italian peninsular region, triggering flash floods and mass movements that cause damage and fatalities. This work presents a comparative analysis of two events that affected the southeast sector of the region, in 2000 and 2015, respectively. The event that occurred between the 9th and 10th of September 2000 is known in Italy as the Soverato event, after the name of the municipality where damage was most severe. In the Soverato area, more than 200 mm of rain falling in 24 hours caused a disastrous flood that swept away a campsite at about 4 a.m., killing 13 people and injuring 45. The rain also affected a larger area, causing damage in 89 (out of 409) municipalities of the region. Flooding was the most common process, damaging housing and commercial activities; landslides mostly affected the road network, housing and cultivated land. The more recent event affected the same regional sector between 30th October and 2nd November 2015. The daily rain recorded at some rain gauges in the area approached 400 mm. Out of the 409 municipalities of Calabria, 109 suffered damage. The most frequent processes were again flash floods and landslides. The most heavily damaged element was the road network: the representative picture of the event is a railway bridge destroyed by the river flow. Housing was damaged too, and 486 people were temporarily evacuated from their homes. The event also caused one fatality, a person killed by a flood. The event-centred study approach aims to highlight differences and similarities in both the causes and the effects of the two events, which occurred 15 years apart. The comparative analysis focuses on three main aspects: the intensity of the triggering rain, the modifications of urbanised areas, and the evolution of emergency management. The comparative analysis of rain is made by comparing the return period of both daily and

  6. Analysis of Effects of Sensor Multithreading to Generate Local System Event Timelines

    DTIC Science & Technology

    2014-03-27

    [Table-of-contents excerpt: 1. Introduction; 1.1 Background; 1.2 Research Goals and Objectives; 4.1 Mean percentages of events captured while increasing the maximum number of NETSTAT processes by 1 (low load).]

  7. Incidence and relative risk of hemorrhagic events associated with ramucirumab in cancer patients: a systematic review and meta-analysis

    PubMed Central

    Zhang, Fei; Sun, Peng; Zheng, Xucai; Zhu, Yi; Wang, Qing; He, Jie

    2016-01-01

    The purpose of this study was to investigate the overall incidence and relative risk (RR) of hemorrhagic events in cancer patients treated with ramucirumab. 298 potentially relevant citations on ramucirumab from PubMed, Web of Science and the Cochrane Database, as well as abstracts presented at conferences (all up to March 2016), were identified through our initial search. Only phase II and III prospective clinical trials of ramucirumab among cancer patients with toxicity records on hemorrhagic events were selected for final analysis. Data were extracted from the original studies by two independent reviewers. The overall incidence, RR, and 95% confidence intervals (CI) were calculated using fixed- or random-effects models according to the heterogeneity of the enrolled studies. The statistical analysis was performed with STATA version 11.0 (Stata Corporation, College Station, TX). 4963 patients with a variety of solid tumors from eleven eligible studies were included in our analysis. The results demonstrated that the overall incidences of all-grade and high-grade hemorrhagic events in cancer patients were 27.6% (95% CI, 18.7-36.5%) and 2.3% (95% CI, 1.3-3.2%), respectively. The RR of hemorrhagic events with ramucirumab compared to control was significantly increased for low-grade (RR, 2.06; 95% CI, 1.85-2.29, p < 0.001), but not for high-grade (RR, 1.19; 95% CI, 0.80-1.76, p=0.39) hemorrhagic events. Hemorrhagic events associated with ramucirumab are modest and manageable, and patients can continue to receive ramucirumab treatment to achieve its maximum clinical benefit. PMID:27507055
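
    The pooling step described above can be sketched as a fixed-effect, inverse-variance meta-analysis of log relative risks. This is an illustrative re-implementation, not the authors' STATA code, and the two trials below are hypothetical counts, not data from the review.

```python
import math

def pooled_rr(studies):
    """Inverse-variance fixed-effect pooling of log relative risks.

    studies: list of (events_trt, n_trt, events_ctl, n_ctl) tuples.
    Returns (pooled RR, 95% CI lower bound, 95% CI upper bound).
    """
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        # Approximate variance of log RR for a 2x2 table (delta method).
        var = 1/a - 1/n1 + 1/c - 1/n2
        w = 1.0 / var                      # weight = inverse variance
        num += w * log_rr
        den += w
    pooled = num / den
    se = math.sqrt(1.0 / den)
    return tuple(math.exp(v) for v in
                 (pooled, pooled - 1.96 * se, pooled + 1.96 * se))

# Two hypothetical trials: (events on drug, N, events on control, N).
rr, lo, hi = pooled_rr([(50, 300, 25, 300), (40, 250, 20, 250)])
print(f"pooled RR={rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

    A random-effects variant (as used when heterogeneity is high) would additionally add a between-study variance term to each weight.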

  8. Predictors of seeking emergency medical help during overdose events in a provincial naloxone distribution programme: a retrospective analysis

    PubMed Central

    Ambrose, Graham; Amlani, Ashraf; Buxton, Jane A

    2016-01-01

    Objectives This study sought to identify factors that may be associated with help-seeking by witnesses during overdoses where naloxone is administered. Setting Overdose events occurred in and were reported from the five regional health authorities across British Columbia, Canada. Naloxone administration forms completed following overdose events were submitted to the British Columbia Take Home Naloxone programme. Participants All 182 reported naloxone administration events, reported by adult men and women and occurring between 31 August 2012 and 31 March 2015, were considered for inclusion in the analysis. Of these, 18 were excluded: 10 events which were reported by the person who overdosed, and 8 events for which completed forms did not indicate whether or not emergency medical help was sought. Primary and secondary outcome measures Seeking emergency medical help (calling 911), as reported by participants, was the sole outcome measure of this analysis. Results Medical help was sought (emergency services—911 called) in 89 (54.3%) of 164 overdoses where naloxone was administered. The majority of administration events occurred in private residences (50.6%) and on the street (23.4%), where reported rates of calling 911 were 27.5% and 81.1%, respectively. Overdoses occurring on the street (compared to private residence) were significantly associated with higher odds of calling 911 in multivariate analysis (OR=10.68; 95% CI 2.83 to 51.87; p<0.01), after adjusting for other variables. Conclusions Overdoses occurring on the street were associated with higher odds of seeking emergency medical help by responders. Further research is needed to determine if sex and stimulant use by the person who overdosed are associated with seeking emergency medical help. The results of this study will inform interventions within the British Columbia Take Home Naloxone programme and other jurisdictions to encourage seeking emergency medical help. PMID:27329442
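
    For a single 2x2 table (location by whether 911 was called), the unadjusted odds ratio and its Woolf 95% CI can be computed as below. The counts are hypothetical, not the study's data; the OR=10.68 reported above comes from a multivariate model, which a raw 2x2 calculation would not reproduce.

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = street overdoses where 911 was called,    b = street, not called,
    c = residence overdoses where 911 was called, d = residence, not called.
    """
    or_ = (a * d) / (b * c)
    # Woolf (log) standard error for the confidence interval.
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    half = 1.96 * se
    return (or_,
            math.exp(math.log(or_) - half),
            math.exp(math.log(or_) + half))

# Hypothetical counts for illustration only.
or_, lo, hi = odds_ratio(30, 7, 23, 60)
print(f"OR={or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```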

  9. Low Probability Tail Event Analysis and Mitigation in the BPA Control Area

    SciTech Connect

    Lu, Shuai; Brothers, Alan J.; McKinstry, Craig A.; Jin, Shuangshuang; Makarov, Yuri V.

    2010-10-31

    This report investigates the uncertainties in power system operations and their contributions to tail events, especially under high penetration of wind. A Bayesian network model is established to quantify the impact of these uncertainties on system imbalance. A framework is presented for a decision support tool that can help system operators better estimate the need for balancing reserves and prepare for tail events.

  10. Analysis of geohazards events along Swiss roads from autumn 2011 to present

    NASA Astrophysics Data System (ADS)

    Voumard, Jérémie; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    In Switzerland, roads and railways are threatened throughout the year by several natural hazards. Some of these events reach transport infrastructure several times per year, leading to the closure of transportation corridors, loss of access, travel detours, and sometimes infrastructure damage and loss of human life (3 fatalities during the period considered). The aim of this inventory of events is to investigate the number of natural events affecting roads and railways in Switzerland from autumn 2011 to the present. Natural hazards affecting roads and railways can be classified into five categories: rockfalls, landslides, debris flows, snow avalanches and floods. They potentially cause several important direct damages to transportation infrastructure (roads, railways), vehicles (slightly or severely damaged) or human life (slightly or seriously injured persons, deaths). These direct damages can be easily evaluated from press articles or from Swiss police press releases. Indirect damages such as detour costs are not taken into account in this work. During the last two and a half years, about 50 events affecting Swiss road and railway infrastructure were inventoried. The proportion of events due to rockfalls is 45%, to landslides 25%, to debris flows 15%, to snow avalanches 10% and to floods 5%. During this period, three people were killed and two injured, while 23 vehicles (cars, trains and a coach) and 24 roads and railways were damaged. We can see that floods occur mainly on the Swiss Plateau, whereas rockfalls, debris flows, snow avalanches and landslides are mostly located in the Alpine area. Most events occur on secondary mountain roads and railways. The events are well distributed over the whole Alpine area except for the Gotthard hotspot, where an important European north-south motorway (hit in 2003, with two fatalities) and railway (hit three times in 2012, with one fatality) are more frequently affected. According to the observed events in border regions of

  11. The "Worried Well" Response to CBRN Events: Analysis and Solutions

    DTIC Science & Technology

    2007-06-01

    to play the sick role. Group 5. Those experiencing stress disorders. Traumatic events can spur serious psychiatric illness. Acute Stress Disorder ... health threat in a WMD event. In the Aum case, the rate of Post Traumatic Stress Disorder varies depending on the study and time elapsed.

  12. A METHOD FOR SELECTING SOFTWARE FOR DYNAMIC EVENT ANALYSIS I: PROBLEM SELECTION

    SciTech Connect

    J. M. Lacy; S. R. Novascone; W. D. Richins; T. K. Larson

    2007-08-01

    New nuclear power reactor designs will require resistance to a variety of possible malevolent attacks, as well as traditional dynamic accident scenarios. The design/analysis team may be faced with a broad range of phenomena including air and ground blasts, high-velocity penetrators or shaped charges, and vehicle or aircraft impacts. With a host of software tools available to address these high-energy events, the analysis team must evaluate and select the software most appropriate for their particular set of problems. The accuracy of the selected software should then be validated with respect to the phenomena governing the interaction of the threat and structure. In this paper, we present a method for systematically comparing current high-energy physics codes for specific applications in new reactor design. Several codes are available for the study of blast, impact, and other shock phenomena. Historically, these packages were developed to study specific phenomena such as explosives performance, penetrator/target interaction, or accidental impacts. As developers generalize the capabilities of their software, legacy biases and assumptions can remain that could affect the applicability of the code to other processes and phenomena. R&D institutions generally adopt one or two software packages and use them almost exclusively, performing benchmarks on a single-problem basis. At the Idaho National Laboratory (INL), new comparative information was desired to permit researchers to select the best code for a particular application by matching its characteristics to the physics, materials, and rate scale (or scales) representing the problem at hand. A study was undertaken to investigate the comparative characteristics of a group of shock and high-strain rate physics codes including ABAQUS, LS-DYNA, CTH, ALEGRA, ALE-3D, and RADIOSS. A series of benchmark problems were identified to exercise the features and capabilities of the subject software. 
To be useful, benchmark problems

  13. A systematic review and meta-analysis assessing adverse event profile and tolerability of nicergoline

    PubMed Central

    Fioravanti, Mario; Nakashima, Taku; Xu, Jun; Garg, Amit

    2014-01-01

    Objective To evaluate the safety profile of nicergoline compared with placebo and other active agents from published randomised controlled trials. Design Systematic review and meta-analysis of nicergoline compared with placebo and other active agents across various indications. Data sources MEDLINE, Medline-in-process, Cochrane, EMBASE, EMBASE alerts, Cochrane Central Register of Controlled Trials (CENTRAL), Cochrane Database of Systematic Reviews (CDSR) and Cochrane Methodology Register (CMR) for all randomised controlled trials, open-label or blinded, in adults treated with nicergoline. Studies published until August 2013 were included. Review method 29 studies were included for data extraction. The studies included in this review were mainly from European countries, and mostly in cerebrovascular disease (n=15) and dementia (n=8). Results Treatment withdrawals were lower in the nicergoline group than in the placebo group (RR=0.92; 95% CI 0.7 to 1.21) and than with other active comparators (RR=0.45; 95% CI 0.10 to 1.95), but the differences were non-significant. The incidence of any adverse events (AEs) was slightly higher (RR=1.05; 95% CI 0.93 to 1.2), while the incidence of serious AEs was lower (RR=0.85; 95% CI 0.50 to 1.45), in the nicergoline group compared with the placebo group. The frequency of anxiety was significantly lower with nicergoline than with placebo (p=0.01). Other AEs, including diarrhoea, gastric upset, dizziness and drowsiness, were less frequent in the nicergoline group when compared with placebo/active drugs, but the difference was non-significant. The frequency of hypotension and hot flushes was slightly higher in the nicergoline group, but the difference was non-significant. None of the studies reported any incidence of fibrosis or ergotism with nicergoline treatment. Conclusions Nicergoline is an ergot derivative, but its safety profile is better than that of other ergot derivatives like ergotamine and ergotoxine. 
This systematic review and meta-analysis

  14. Random-Effects Meta-Analysis of Time-to-Event Data Using the Expectation-Maximisation Algorithm and Shrinkage Estimators

    ERIC Educational Resources Information Center

    Simmonds, Mark C.; Higgins, Julian P. T.; Stewart, Lesley A.

    2013-01-01

    Meta-analysis of time-to-event data has proved difficult in the past because consistent summary statistics often cannot be extracted from published results. The use of individual patient data allows for the re-analysis of each study in a consistent fashion and thus makes meta-analysis of time-to-event data feasible. Time-to-event data can be…

  15. Multivariate spatial analysis of a heavy rain event in a densely populated delta city

    NASA Astrophysics Data System (ADS)

    Gaitan, Santiago; ten Veldhuis, Marie-claire; Bruni, Guenda; van de Giesen, Nick

    2014-05-01

    Delta cities account for half of the world's population and host key infrastructure and services for global economic growth. Due to the characteristic geography of delta areas, these cities face high vulnerability to extreme weather and pluvial flooding risks, which are expected to increase as climate change drives heavier rain events. In addition, delta cities are subject to fast urban densification processes that progressively make them more vulnerable to pluvial flooding. Delta cities need to be adapted to better cope with this threat. The mechanism leading to damage after heavy rains is not completely understood. For instance, current research has shown that rain intensities and volumes can only partially explain the occurrence and localization of rain-related insurance claims (Spekkers et al., 2013). The goal of this paper is to provide further insights into spatial characteristics of the urban environment that can significantly be linked to pluvial flooding impacts. To that end, a case study has been selected: from 12 to 14 October 2013, a heavy rain event triggered pluvial floods in Rotterdam, a densely populated city which is undergoing multiple climate adaptation efforts and is located in the Meuse river delta. While the average yearly precipitation in this city is around 800 mm, local rain gauge measurements ranged from approximately 60 to 130 mm during these three days alone. More than 600 telephone complaints from citizens reported impacts related to the rainfall. The registry of those complaints, which comprises around 300 calls made to the municipality and another 300 to the fire brigade, was made available for research. Other accessible information about this city includes a series of rainfall measurements with up to 1 min time-step at 7 different locations around the city, ground-based radar rainfall data (1 km^2 spatial resolution and 5 min time-step), a digital elevation model (50 cm horizontal resolution), a model of overland-flow paths, cadastral

  16. On the analysis of an extreme Bora wind event over the northern Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Colucci, R. R.; Pucillo, A.

    2010-09-01

    On 10 March 2010 a severe Bora wind event affected the Friuli Venezia Giulia region, northeastern Italy, in particular the Gulf of Trieste area (northern Adriatic Sea). The event was driven by a widespread westward-moving cold pool aloft, coming from western Asia, that brought an intense potential vorticity anomaly over the western Mediterranean Sea. This led to a deep cyclogenesis involving the whole troposphere. The pressure gradient force in the lowest layers forced a northeasterly wind to blow with notable strength over the Gulf of Trieste area and the Karst region. The mean ground wind velocity exceeded 27 m/s (about 100 km/h) for several hours, and maximum gusts exceeded 42 m/s (about 150 km/h) over the town of Trieste. The northeastern sector of the Adriatic Sea is frequently affected by strong Bora events, particularly during the winter half of the year. Bora is a characteristic local wind strongly influenced by the orography of the Karst relief to the east of Trieste. The aim of this work is to assess the climatological relevance of this event by comparing it with the most representative events of the past. This was possible thanks to the long-term archive of meteorological observations at the Trieste site (I.R. Accademia di Commercio e Nautica, Regio Comitato Talassografico Italiano, Ministero dell'Agricoltura e Foreste, Consiglio Nazionale delle Ricerche): we found that this is one of the ten strongest Bora events of the 1871-2010 period. Considerations about the trend and frequency of severe Bora events are also proposed.

  17. An electronic processing system for cosmic X-ray event analysis

    NASA Astrophysics Data System (ADS)

    Dedhia, D. K.; Shah, M. R.

    1991-08-01

    An electronic logic system has been developed to evaluate and process X-ray events in the 20-100 keV energy range from multi-cell xenon-filled proportional counters used in X-ray astronomy. The electronic system consists of X-ray event selection logic, a pulse height analyzer, K-fluorescent gating and arrival time tagging. Using the K-fluorescent gating technique, improved energy resolution is achieved for incident X-ray energies above 34 keV. The X-ray event selection logic is designed to obtain higher background rejection efficiency for charged particles and Compton events. It provides a significant advantage in studying weak cosmic X-ray sources as well as in detecting spectral line features in hard X-ray spectroscopy from balloon-borne telescopes. The telemetry system formats the event location and digitized energy information with a dead time of 1.28 ms. To reduce the dead time of the system, buffer memories with appropriate time tagging are used.

  18. Towards a unified study of extreme events using universality concepts and transdisciplinary analysis methods

    NASA Astrophysics Data System (ADS)

    Balasis, George; Donner, Reik V.; Donges, Jonathan F.; Radebach, Alexander; Eftaxias, Konstantinos; Kurths, Jürgen

    2013-04-01

    The dynamics of many complex systems is characterized by the same universal principles. In particular, systems which are otherwise quite different in nature show striking similarities in their behavior near tipping points (bifurcations, phase transitions, sudden regime shifts) and associated extreme events. Such critical phenomena are frequently found in diverse fields such as climate, seismology, or financial markets. Notably, the observed similarities include a high degree of organization, persistent behavior, and accelerated energy release, which are common to (among others) phenomena related to geomagnetic variability of the terrestrial magnetosphere (intense magnetic storms), seismic activity (electromagnetic emissions prior to earthquakes), solar-terrestrial physics (solar flares), neurophysiology (epileptic seizures), and socioeconomic systems (stock market crashes). It is an open question whether the spatial and temporal complexity associated with extreme events arises from the system's structural organization (geometry) or from the chaotic behavior inherent to the nonlinear equations governing the dynamics of these phenomena. On the one hand, the presence of scaling laws associated with earthquakes and geomagnetic disturbances suggests understanding these events as generalized phase transitions similar to nucleation and critical phenomena in thermal and magnetic systems. On the other hand, because of the structural organization of the systems (e.g., as complex networks) the associated spatial geometry and/or topology of interactions plays a fundamental role in the emergence of extreme events. Here, a few aspects of the interplay between geometry and dynamics (critical phase transitions) that could result in the emergence of extreme events, which is an open problem, will be discussed.

  19. Inverse modeling of storm intensity based on grain-size analysis of hurricane-induced event beds

    NASA Astrophysics Data System (ADS)

    Castagno, K. A.; Donnelly, J. P.

    2015-12-01

    As the coastal population continues to grow in size and wealth, increased hurricane frequency and intensity present a growing threat of property damage and loss of life. Recent reconstructions of past intense-hurricane landfalls from sediment cores in southeastern New England identify a series of active intervals over the past 2,000 years, with the last few centuries among the most quiescent intervals. The frequency of intense-hurricane landfalls in southeastern New England is well constrained, but the intensity of these storms, particularly prehistoric events, is not. This study analyzes the grain sizes of major storm event beds along a transect of sediment cores in Salt Pond, Falmouth, MA. Several prehistoric events contain more coarse material than any of the deposits from the historical interval, suggesting that landfalling hurricanes in the northeastern United States may have been more intense than the historically encountered category 2 and 3 storms. The intensity of major storm events is estimated using grain-size analysis with a digital image processing, size, and shape analyzer. Since event deposits in Salt Pond result from a combination of coastal inundation and wave action, a large population of both historical and synthetic storms is used to assess the storm characteristics that could result in the wave heights inversely modeled from grain size trends. Intense-hurricane activity may be closely tied to warming in sea surface temperature. As such, the prehistoric intervals of increased frequency and intensity provide potential analogs for current and future hurricane risk in the northeastern United States.

  20. Tracking the evolution of stream DOM source during storm events using end member mixing analysis based on DOM quality

    NASA Astrophysics Data System (ADS)

    Yang, Liyang; Chang, Soon-Woong; Shin, Hyun-Sang; Hur, Jin

    2015-04-01

    The source of river dissolved organic matter (DOM) during storm events has not been well constrained, which is critical in determining the quality and reactivity of DOM. This study assessed temporal changes in the contributions of four end members (weeds, leaf litter, soil, and groundwater), which exist in a small forested watershed (the Ehwa Brook, South Korea), to the stream DOM during two storm events, using end member mixing analysis (EMMA) based on spectroscopic properties of DOM. The instantaneous export fluxes of dissolved organic carbon (DOC), chromophoric DOM (CDOM), and fluorescent components were all enhanced during peak flows. The DOC concentration increased with the flow rate, while CDOM and humic-like fluorescent components were diluted around the peak flows. Leaf litter was dominant for the DOM source in event 2 with a higher rainfall, although there were temporal variations in the contributions of the four end members to the stream DOM for both events. The contribution of leaf litter peaked while that of deeper soils decreased to minima at peak flows. Our results demonstrated that EMMA based on DOM properties could be used to trace the DOM source, which is of fundamental importance for understanding the factors responsible for river DOM dynamics during storm events.
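
    The EMMA step described above can be sketched as a small linear un-mixing problem: each DOM tracer gives one equation, and a mass-balance row forces the end-member fractions to sum to one. The tracer names and matrix values below are illustrative placeholders, not the study's measured DOM signatures, and the stream sample is synthesized from known fractions so the sketch is self-checking.

```python
import numpy as np

# Tracer signatures (rows) of four assumed end members (columns):
# weeds, leaf litter, soil, groundwater.  Values are illustrative only.
E = np.array([
    [1.2, 2.8, 0.9, 0.3],   # tracer 1, e.g. SUVA254
    [0.5, 1.9, 1.4, 0.2],   # tracer 2, e.g. humic-like fluorescence
    [2.0, 0.7, 1.1, 0.4],   # tracer 3, e.g. fluorescence index
    [1.0, 1.0, 1.0, 1.0],   # mass balance: fractions must sum to 1
])

# Synthesize one stream DOM sample from known fractions; with field
# data, y would instead be the measured spectroscopic signature.
true_f = np.array([0.1, 0.5, 0.3, 0.1])
y = E @ true_f

# Solve for end-member fractions by least squares (a full EMMA also
# enforces non-negative fractions, omitted here for brevity).
f, *_ = np.linalg.lstsq(E, y, rcond=None)
print(dict(zip(["weeds", "leaf_litter", "soil", "groundwater"],
               f.round(3))))
```

    Repeating this solve for each sample along the hydrograph yields the time series of source contributions discussed above.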

  1. Antipsychotics-Associated Serious Adverse Events in Children: An Analysis of the FAERS Database

    PubMed Central

    Kimura, Goji; Kadoyama, Kaori; Brown, J.B.; Nakamura, Tsutomu; Miki, Ikuya; Nisiguchi, Kohshi; Sakaeda, Toshiyuki; Okuno, Yasushi

    2015-01-01

    Objective: The reports submitted to the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) from 1997 to 2011 were reviewed to assess serious adverse events induced by the administration of antipsychotics to children. Methods: Following pre-processing of FAERS data by elimination of duplicated records as well as adjustments to standardize drug names, reports involving haloperidol, olanzapine, quetiapine, clozapine, ziprasidone, risperidone, and aripiprazole were analyzed in children (age 0-12). Signals in the data that signified a drug-associated adverse event were detected via quantitative data mining algorithms. The algorithms applied in this study include the empirical Bayes geometric mean, the reporting odds ratio, the proportional reporting ratio, and the information component of a Bayesian confidence propagation neural network. The analysis focused on neuroleptic malignant syndrome (NMS), QT prolongation, leukopenia, and suicide attempt as serious adverse events. Results: In regard to NMS, the signal scores for haloperidol and aripiprazole were greater than for other antipsychotics. Significant signals of the QT prolongation adverse event were detected only for ziprasidone and risperidone. With respect to leukopenia, the association with clozapine was noteworthy. In the case of suicide attempt, signals for haloperidol, olanzapine, quetiapine, risperidone, and aripiprazole were detected. Conclusions: It was suggested that there is a level of diversity in the strength of the association between various first- and second-generation antipsychotics and their associated serious adverse events, which can possibly lead to fatal outcomes. We recommend that research be continued in order to gather a large variety and quantity of related information, and that both available and newly reported data be placed in the context of multiple medical viewpoints in order to lead to improved levels of care. PMID:25589889
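
    Two of the disproportionality measures named above, the reporting odds ratio (ROR) and the proportional reporting ratio (PRR), can be sketched from a 2x2 table of report counts. This is an illustrative re-implementation, not the authors' pipeline, and the counts below are hypothetical rather than FAERS data.

```python
import math

def ror_prr(a, b, c, d):
    """Disproportionality measures from a 2x2 report-count table:
    a = target drug & target event, b = target drug & other events,
    c = other drugs & target event, d = other drugs & other events.
    Returns (ROR, PRR, lower bound of the ROR's 95% CI).
    """
    ror = (a / b) / (c / d)
    prr = (a / (a + b)) / (c / (c + d))
    # Woolf-type standard error of log(ROR) for the lower bound.
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    ror_lower = math.exp(math.log(ror) - 1.96 * se)
    return ror, prr, ror_lower

# Hypothetical counts: 20 NMS reports among 500 paediatric reports for
# one antipsychotic vs 40 among 10000 for all comparator drugs.
ror, prr, lo = ror_prr(20, 480, 40, 9960)
print(f"ROR={ror:.1f}, PRR={prr:.1f}, ROR 95% lower bound={lo:.1f}")
```

    A common screening rule flags a signal when the lower 95% bound of the ROR exceeds 1, or when PRR >= 2 with at least 3 reports.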

  2. gPhoton: Time-tagged GALEX photon events analysis tools

    NASA Astrophysics Data System (ADS)

    Million, Chase C.; Fleming, S. W.; Shiao, B.; Loyd, P.; Seibert, M.; Smith, M.

    2016-03-01

    Written in Python, gPhoton calibrates and sky-projects the ~1.1 trillion ultraviolet photon events detected by the microchannel plates on the Galaxy Evolution Explorer Spacecraft (GALEX), archives these events in a publicly accessible database at the Mikulski Archive for Space Telescopes (MAST), and provides tools for working with the database to extract scientific results, particularly over short time domains. The software includes a re-implementation of core functionality of the GALEX mission calibration pipeline to produce photon list files from raw spacecraft data as well as a suite of command line tools to generate calibrated light curves, images, and movies from the MAST database.

  3. Analysis of a sequence of energetic ion and magnetic field events upstream from the Saturnian magnetosphere

    NASA Astrophysics Data System (ADS)

    Krimigis, S. M.; Sergis, N.; Dialynas, K.; Mitchell, D. G.; Hamilton, D. C.; Krupp, N.; Dougherty, M.; Sarris, E. T.

    2009-12-01

    The existence of energetic particle events to ~200 R_S upstream and ~1300 R_S downstream of Saturn was established during the Voyager 1 and 2 flybys in 1980 and 1981, respectively. The origin of the events could not be determined with certainty because of the lack of particle charge state and species measurements at lower (<300 keV) energies, which dominate the spectra. High sensitivity observations of energetic ion directional intensities, energy spectra, and ion composition were obtained by the Ion and Neutral Camera (INCA) of the Magnetospheric IMaging Instrument (MIMI) complement, with a geometry factor of ~2.5 cm^2 sr and some capability of separating light (H, He) and heavier (C, N, O) ion groups (henceforth referred to as 'hydrogen' and 'oxygen', respectively). Charge state information was provided where possible by the Charge-Energy-Mass Spectrometer (CHEMS) over the range ~3-235 keV per charge, and interplanetary magnetic field (IMF) data by the MAG instrument on Cassini. The observations revealed the presence of distinct upstream bursts of energetic hydrogen and oxygen ions whenever the IMF connected the spacecraft to the planetary bow shock at distances >80 R_S. The events exhibited the following characteristics: (1) hydrogen ion bursts are observed in the energy range 3-220 keV (and occasionally to E>220 keV) and oxygen ion bursts in the energy range 32 to ~700 keV. (2) Pitch angle distributions are initially anisotropic, with ions moving away from the bow shock along the IMF, but tend to isotropize as the event progresses in time. (3) The duration of the ion bursts is several minutes up to 4 h. (4) The event examined in this study contains significant fluxes of singly charged oxygen. (5) Ion bursts are accompanied by distinct diamagnetic field depressions with β>10, and exhibit wave structures consistent with ion cyclotron waves for O+ and O++. Given the magnetic field configuration during the detection of the events and that energetic ions trapped within the

  4. Analysis of shallow failures triggered by the 14-16 November 2002 event in the Albaredo valley, Valtellina (Northern Italy)

    NASA Astrophysics Data System (ADS)

    Dapporto, S.; Aleotti, P.; Casagli, N.; Polloni, G.

    2005-09-01

    On 14-16 November 2002 northern Italy was affected by an intense rainfall event: in the Albaredo valley (Valtellina) more than 200 mm of rain fell, triggering about 50 shallow landslides, mainly soil slips and soil slip-debris flows. Landslides occurred above the critical rainfall thresholds computed by Cancelli and Nova (1985) and Ceriani et al. (1994) for the Italian Central Alps: in fact, the cumulative precipitation at the time of soil slip initiation was 230 mm (in two days), with a peak intensity of 15 mm/h. A coupled analysis of seepage and instability mechanisms is performed in order to evaluate the potential for slope failure during the event. Changes in positive and negative pore water pressures during the event are modelled by a finite element analysis of water flow in transient conditions, using the recorded rainfall rate as the boundary condition for the nodes along the slope surface. The slope stability analysis is conducted by applying the limit equilibrium method, using the pore water pressure distributions obtained at the different time steps of the seepage analysis as input data for the calculation of the factor of safety.
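
    A minimal sketch of the limit-equilibrium step is the classical infinite-slope factor of safety, in which rising pore water pressure on the slip surface reduces the frictional resistance. The study's seepage model supplies the pore pressure at each time step; here the pore pressures and soil parameters are hypothetical values chosen for illustration, not the Albaredo data.

```python
import math

def infinite_slope_fs(c_eff, phi_deg, gamma, z, beta_deg, u):
    """Limit-equilibrium factor of safety for an infinite slope.

    c_eff    : effective cohesion (kPa)
    phi_deg  : effective friction angle (degrees)
    gamma    : soil unit weight (kN/m^3)
    z        : depth of the slip surface (m)
    beta_deg : slope angle (degrees)
    u        : pore water pressure on the slip surface (kPa)
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # Shear resistance: cohesion + frictional term reduced by pore pressure.
    resisting = c_eff + (gamma * z * math.cos(beta)**2 - u) * math.tan(phi)
    # Downslope driving shear stress.
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# As rainfall infiltrates, u rises and the factor of safety drops below 1.
fs_dry = infinite_slope_fs(5.0, 32.0, 18.0, 1.5, 35.0, 0.0)
fs_wet = infinite_slope_fs(5.0, 32.0, 18.0, 1.5, 35.0, 12.0)
print(f"FS dry = {fs_dry:.2f}, FS wet = {fs_wet:.2f}")
```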

  5. Impacts of extreme temperature events on mortality: analysis over individual seasons

    NASA Astrophysics Data System (ADS)

    Kysely, J.; Plavcova, E.; Kyncl, J.; Kriz, B.; Pokorna, L.

    2009-04-01

    Extreme temperature events influence human society in many ways, including impacts on morbidity and mortality. While the effects of hot summer periods are relatively direct in mid-latitude regions, much less is known, and little consensus has been reached, about the possible consequences of both positive and negative temperature extremes in other parts of the year. The study examines links between spells of hot and cold temperature anomalies and daily all-cause (total) mortality and mortality due to cardiovascular diseases in the population of the Czech Republic (central Europe) in individual seasons (DJF, MAM, JJA, SON). The datasets cover the period 1986-2006. Hot (cold) spells are defined in terms of anomalies of average daily temperature from the mean annual cycle as periods of at least 2 successive days on which the anomalies are above (below) the 95% (5%) quantile of the empirical distribution of the anomalies. Excess daily mortality is established by calculating deviations of the observed number of deaths from the expected number of deaths, which takes into account the effects of long-term changes in mortality and the annual cycle. Periods when mortality is affected by influenza and acute respiratory infection outbreaks have been identified and excluded from the datasets before the analysis. The study is carried out for several population groups in order to identify the dependence of the mortality impacts on age and gender; in particular, we focus on differences in the impacts on the elderly (70+ yrs) and younger age groups (0-69 yrs). Although results for hot- and cold-related mortality are less conclusive in the seasons outside summer, significant links are found in several cases. The analysis reveals that - the largest effects of either hot or cold spells are observed for hot spells in JJA, with a 14% (16%) increase in mortality for the 1-day lag for all ages (70+ yrs); - much smaller but still significant effects are associated with hot spells in MAM; - the
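
    The spell definition above (at least 2 successive days with anomalies beyond the 95% or 5% quantile) can be sketched as follows; a minimal illustration assuming a plain anomaly array, not the study's actual code:

```python
import numpy as np

def find_spells(anom, hot=True, q=0.95, min_len=2):
    """Return (start, end) index pairs of hot or cold spells.

    A spell is at least min_len consecutive days with the anomaly above
    the q quantile (hot=True) or below the 1-q quantile (hot=False).
    """
    thresh = np.quantile(anom, q if hot else 1.0 - q)
    exceed = anom > thresh if hot else anom < thresh
    spells, start = [], None
    for i, flag in enumerate(exceed):
        if flag and start is None:
            start = i                            # a spell may begin here
        elif not flag and start is not None:
            if i - start >= min_len:
                spells.append((start, i - 1))
            start = None
    if start is not None and len(exceed) - start >= min_len:
        spells.append((start, len(exceed) - 1))  # spell runs to series end
    return spells

# Three consecutive anomalous days form a spell; the isolated day 50 does not.
anom = np.zeros(100)
anom[10:13] = 5.0
anom[50] = 5.0
hot_spells = find_spells(anom)
```

    The same function with `hot=False` and the sign of the anomalies reversed identifies cold spells.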

  6. Heat budget analysis of Northern Hemisphere high-latitude spring onset events

    NASA Astrophysics Data System (ADS)

    He, Jia; Black, Robert X.

    2016-09-01

    Regional spring onset events are identified within four high-latitude sectors: the primary (critical) region over North Siberia (CR), Greenland-North America (G-NA), East Asia (EA), and Alaska (AL). To identify the primary forcing of the rapid temperature increases observed during spring onset, the contributions to the near-surface air temperature anomaly tendency are diagnosed within the thermodynamic equation for each of the four regional event categories. For each region, anomalous eddy heat flux convergence is the primary contributor to regional warming prior to, and during the early stages of, spring onset (through day +5). Thereafter, horizontal advection of the climatological-mean temperature by the large-scale circulation anomaly field emerges as the leading contributor to regional warming (during the later stages of spring onset). A parallel diagnostic of storm track strength (using the envelope function) reveals a systematic weakening of eddy activity within each region, leading to a reduction in the northward eddy heat flux out of the domain and an accumulation of heat within the region. For CR, G-NA, and EA events, an east-west dipole structure in the sea level pressure anomaly field, with lower (higher) pressure to the west (east), generates the anomalous southerly flow linked to late period linear warm advection. Our results indicate that anomalous dynamical processes associated with synoptic eddy activity and stationary wave patterns are the primary contributors to rapid temperature increase during Arctic spring onset events, with minimal contributions from anomalous diabatic processes.
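
    The horizontal advection term diagnosed in the thermodynamic equation can be illustrated with a simple finite-difference sketch; grid values and spacing are illustrative, not the study's reanalysis data:

```python
import numpy as np

def horizontal_temp_advection(u, v, T, dx, dy):
    """Horizontal temperature advection -(u dT/dx + v dT/dy), in K/s.

    u, v : wind components (m/s), arrays of shape (ny, nx)
    T    : temperature (K), same shape
    dx, dy : grid spacing (m)
    Positive values indicate warm advection (a local warming tendency).
    """
    dTdy, dTdx = np.gradient(T, dy, dx)   # gradients along y (axis 0), x (axis 1)
    return -(u * dTdx + v * dTdy)

# Southerly flow (v > 0) across a field that cools northward yields warm
# advection everywhere: -(v * dT/dy) = -(10 * -1e-5) = 1e-4 K/s.
ny, nx = 4, 5
y = np.arange(ny) * 1.0e5                           # 100 km grid spacing
T = 300.0 - 1.0e-5 * y[:, None] * np.ones((ny, nx))
u = np.zeros((ny, nx))
v = 10.0 * np.ones((ny, nx))
adv = horizontal_temp_advection(u, v, T, 1.0e5, 1.0e5)
```

    Splitting T and the winds into climatological-mean and anomaly parts, as in the abstract, isolates terms such as advection of the mean temperature by the anomalous circulation.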

  7. Transition Region Explosive Events in He II 304Å: Observation and Analysis

    NASA Astrophysics Data System (ADS)

    Rust, Thomas; Kankelborg, Charles C.

    2016-05-01

    We present examples of transition region explosive events observed in the He II 304Å spectral line with the Multi Order Solar EUV Spectrograph (MOSES). With small (<5000 km) spatial scales and large non-thermal (100-150 km/s) velocities, these events satisfy the observational signatures of transition region explosive events. Derived line profiles show distinct blue and red velocity components with very little broadening of either component. We observe little to no emission from low velocity plasma, making the plasmoid instability reconnection model unlikely as the plasma acceleration mechanism for these events. Rather, the single speed, bi-directional jet characteristics suggested by these data are consistent with acceleration via Petschek reconnection. Observations were made during the first sounding rocket flight of MOSES in 2006. MOSES forms images in 3 orders of a concave diffraction grating. Multilayer coatings largely restrict the passband to the He II 303.8Å and Si XI 303.3Å spectral lines. The angular field of view is about 8.5'x17', or about 20% of the solar disk. These images constitute projections of the volume I(x,y,λ), the intensity as a function of sky plane position and wavelength. Spectral line profiles are recovered via tomographic inversion of these projections. Inversion is carried out using a multiplicative algebraic reconstruction technique.
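
    The inversion step names a multiplicative algebraic reconstruction technique (MART). A minimal generic MART solver on a toy system is sketched below; the matrix and relaxation value are illustrative, not the MOSES pipeline:

```python
import numpy as np

def mart(A, b, n_iter=500, lam=0.5):
    """Multiplicative algebraic reconstruction technique (MART).

    A : (m, n) nonnegative projection matrix; b : (m,) measured projections.
    Each ray i rescales the unknowns it touches by (b_i / predicted_i)**lam,
    so iterates stay positive; converges for consistent nonnegative systems.
    """
    m, n = A.shape
    x = np.ones(n)                       # positive starting image
    for _ in range(n_iter):
        for i in range(m):
            pred = A[i] @ x              # current prediction for ray i
            if pred > 0 and b[i] > 0:
                x *= (b[i] / pred) ** (lam * A[i])
    return x

# Recover a known nonnegative "image" from three overlapping projections.
x_true = np.array([1.0, 2.0, 0.5])
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_rec = mart(A, A @ x_true)
```

    In the MOSES problem the unknown vector is the spectral cube I(x,y,λ) and each spectral order supplies one set of projection rays.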

  8. An Observational Analysis of Coaching Behaviors for Career Development Event Teams: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Ball, Anna L.; Bowling, Amanda M.; Sharpless, Justin D.

    2016-01-01

    School Based Agricultural Education (SBAE) teachers can use coaching behaviors, along with their agricultural content knowledge to help their Career Development Event (CDE) teams succeed. This mixed methods, collective case study observed three SBAE teachers preparing multiple CDEs throughout the CDE season. The teachers observed had a previous…

  9. Parental Separation and Child Aggressive and Internalizing Behavior: An Event History Calendar Analysis

    ERIC Educational Resources Information Center

    Averdijk, Margit; Malti, Tina; Eisner, Manuel; Ribeaud, Denis

    2012-01-01

    This study investigated the relationship between parental separation and aggressive and internalizing behavior in a large sample of Swiss children drawn from the ongoing Zurich Project on the Social Development of Children and Youths. Parents retrospectively reported life events and problem behavior for the first 7 years of the child's life on a…

  10. An analysis of extreme intraseasonal rainfall events during January-March 2010 over eastern China

    NASA Astrophysics Data System (ADS)

    Yao, Suxiang; Huang, Qian

    2016-09-01

    The precipitation over eastern China during January-March 2010 exhibited a marked intraseasonal oscillation (ISO) with a dominant period of 10-60 days. There were two active intraseasonal rainfall periods. The physical mechanisms responsible for the onset of the two rainfall events were investigated using ERA-Interim data. In the first ISO event, anomalous ascending motion was triggered by vertically integrated (1000-300 hPa) warm temperature advection. In addition to southerly anomalies on the intraseasonal (10-60-day) timescale, synoptic-scale southeasterly winds helped advect warm air from the South China Sea and western Pacific into the rainfall region. In the second ISO event, anomalous convection was triggered by a convectively unstable stratification, which was caused primarily by anomalous moisture advection in the lower troposphere (1000-850 hPa) from the Bay of Bengal and the Indo-China Peninsula. Both the intraseasonal and the synoptic winds contributed to the anomalous moisture advection. Therefore, winter intraseasonal rainfall events over East Asia could be affected not only by intraseasonal activity but also by higher frequency disturbances.

  11. Further Analysis of Variables That Affect Self-Control with Aversive Events

    ERIC Educational Resources Information Center

    Perrin, Christopher J.; Neef, Nancy A.

    2012-01-01

    The purpose of this study was to examine variables that affect self-control in the context of academic task completion by elementary school children with autism. In the baseline assessment of Study 1, mathematics problem completion was shown to be an aversive event, and sensitivity to task magnitude, task difficulty, and delay to task completion…

  12. An Analysis of Personal Event Narratives Produced by School-Age Children.

    ERIC Educational Resources Information Center

    Crow, Kristina M.; Ward-Lonergan, Jeannene M.

    This study compared and analyzed the language capabilities of 10 school-age children raised in either single parent homes resulting from divorce or in two parent families. More specifically, it compared the context and complexity of oral personal event narratives produced by both groups of children. The study also investigated the usefulness and…

  13. Large solar flares - Analysis of the events recorded by the Mont Blanc neutrino detector

    NASA Astrophysics Data System (ADS)

    Aglietta, M.; Badino, G.; Bologna, G.; Castagnoli, C.; Castellina, A.; Dadykin, V. L.; Fulgione, W.; Galeotti, P.; Kalchukov, F. F.; Korolkova, I. V.; Kortchaguin, P. V.; Kudryavtsev, V. A.; Malguin, A. S.; Periale, L.; Ryassny, V. G.; Ryazhskaya, O. G.; Saavedra, O.; Trinchero, G.; Vernetto, S.; Yakushev, V. F.; Zatsepin, G. T.

    1991-11-01

    Analytical results are discussed from events recorded by the Mont Blanc neutrino detector during 19 large solar flares from August 1988 to March 1990, including the powerful flares of September 29 and October 19, 1989. It is found that no significant neutrino signal coincides temporally with solar flares. Upper limits are obtained for the integral neutrino and antineutrino flux of different flavors.

  14. The Successful Resolution of Armed Hostage/Barricade Events in Schools: A Qualitative Analysis

    ERIC Educational Resources Information Center

    Daniels, Jeffrey A.; Bradley, Mary C.; Cramer, Daniel P.; Winkler, Amy J.; Kinebrew, Kisha; Crockett, Deleska

    2007-01-01

    This article explores the perceptions and reactions of school and law enforcement personnel in the successful resolution of armed hostage and barricade events in schools. A total of 12 individuals from three schools were interviewed to determine (1) their salient roles related to the situations, (2) facilitative systemic conditions, (3) to what…

  15. Meta-Analysis of Suicide-Related Behavior Events in Patients Treated with Atomoxetine

    ERIC Educational Resources Information Center

    Bangs, Mark E.; Tauscher-Wisniewski, Sitra; Polzer, John; Zhang, Shuyu; Acharya, Nayan; Desaiah, Durisala; Trzepacz, Paula T.; Allen, Albert J.

    2008-01-01

    A study to examine suicide-related events in acute, double-blind, and placebo controlled trials with atomoxetine is conducted. Results conclude that the incidences of suicide were more frequent in children suffering from ADHD treated with atomoxetine as compared to those treated with placebo.

  16. Plasma properties from the multi-wavelength analysis of the November 1st 2003 CME/shock event.

    PubMed

    Benna, Carlo; Mancuso, Salvatore; Giordano, Silvio; Gioannini, Lorenzo

    2013-05-01

    The analysis of the spectral properties and dynamic evolution of a CME/shock event observed on November 1st 2003 in white light by the LASCO coronagraph and in the ultraviolet by the UVCS instrument operating aboard SOHO has been performed to compute some important plasma parameters in the middle corona below about 2 R⊙. Simultaneous observations obtained with the MLSO/Mk4 white-light coronagraph, providing both the early evolution of the CME expansion in the corona and the pre-shock electron density profile along the CME front, were also used to study this event. By combining the above information with the analysis of the metric type II radio emission detected by ground-based radio spectrographs, we finally derive estimates of the local Alfvén speed and magnetic field strength in the solar corona.
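
    The link between the two derived quantities is the relation v_A = B / sqrt(μ0 ρ): given an electron density profile, an Alfvén speed estimate fixes the field strength and vice versa. A small sketch with illustrative coronal values (the density, field, and mean-mass values are examples, not the paper's results):

```python
import math

MU0 = 4.0e-7 * math.pi     # vacuum permeability (H/m)
M_P = 1.6726e-27           # proton mass (kg)

def alfven_speed(b_gauss, n_e_cm3, mu=1.27):
    """Alfven speed v_A = B / sqrt(mu0 * rho).

    b_gauss  : magnetic field strength (G)
    n_e_cm3  : electron number density (cm^-3)
    mu       : mean mass per electron in proton masses (coronal abundances)
    Returns v_A in m/s.
    """
    b_tesla = b_gauss * 1.0e-4               # G -> T
    rho = mu * M_P * n_e_cm3 * 1.0e6         # mass density (kg/m^3)
    return b_tesla / math.sqrt(MU0 * rho)

# Illustrative mid-corona values: B = 1 G, n_e = 1e7 cm^-3 -> roughly 600 km/s.
v_a = alfven_speed(1.0, 1.0e7)
```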

  17. Geohazard assessment through the analysis of historical alluvial events in Southern Italy

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo

    2015-04-01

    The risk associated with extreme water events such as flash floods results from a combination of overflow and landslide hazards. A multi-hazard approach has been utilized to analyze the 1773 flood, which occurred in conjunction with heavy rainfall, causing major damage in terms of lost lives and economic cost over an area of 200 km2 that includes both the coastal strip between Salerno and Maiori and the Apennine hinterland, Campania region - Southern Italy. This area has been affected by a total of 40 flood events over the last five centuries, 26 of which occurred between 1900 and 2000. Streamflow events have produced severe impacts on Cava de' Tirreni (SA) and its territory, and in particular four catastrophic floods, in 1581, 1773, 1899 and 1954, caused a pervasive pattern of destruction. In the study area, rainstorm events typically occur in small and medium-sized fluvial systems, characterized by small catchment areas and high-elevation drainage basins, causing the detachment of large amounts of volcaniclastic and siliciclastic cover from the carbonate bedrock. The mobilization of these deposits (slope debris), mixed with rising floodwaters along the water paths, can produce fast-moving streamflows of large proportions with significant hazardous implications (Violante et al., 2009). In this context, the study of the 1773 historical flood allows the detection and definition of those areas where catastrophic events repeatedly took place over time. Moreover, it improves the understanding of the phenomena themselves, including some key elements in the management of risk mitigation, such as the restoration of the damage suffered by buildings and/or the environmental effects caused by the floods.

  18. Development and application of a multi-targeting reference plasmid as calibrator for analysis of five genetically modified soybean events.

    PubMed

    Pi, Liqun; Li, Xiang; Cao, Yiwei; Wang, Canhua; Pan, Liangwen; Yang, Litao

    2015-04-01

    Reference materials are important for accurate analysis of genetically modified organism (GMO) contents in foods/feeds, and the development of novel reference plasmids is a new trend in the research on GMO reference materials. Herein, we constructed a novel multi-targeting plasmid, pSOY, which contained seven event-specific sequences of five GM soybeans (MON89788-5', A2704-12-3', A5547-127-3', DP356043-5', DP305423-3', A2704-12-5', and A5547-127-5') and the sequence of the soybean endogenous reference gene Lectin. We evaluated the specificity, limit of detection and quantification, and applicability of pSOY in both qualitative and quantitative PCR analyses. The limit of detection (LOD) was as low as 20 copies in qualitative PCR, and the limit of quantification (LOQ) in quantitative PCR was 10 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and Lectin assays were higher than 90%, and the squared regression coefficients (R(2)) were more than 0.999. The quantification bias varied from 0.21% to 19.29%, and the relative standard deviations ranged from 1.08% to 9.84% in simulated sample analysis. All the results demonstrated that the developed multi-targeting plasmid, pSOY, is a credible substitute for matrix-based reference materials, and could be used as a reliable reference calibrator in the identification and quantification of multiple GM soybean events.
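
    The reported PCR efficiencies and R² values follow from a standard-curve fit of Cq against log10(copy number), with efficiency E = 10^(-1/slope) - 1. A generic sketch with an idealized dilution series, not the paper's data:

```python
import numpy as np

def pcr_efficiency(copies, cq):
    """Amplification efficiency from a qPCR standard curve.

    Fits Cq = slope * log10(copies) + intercept; an ideal assay has a
    slope near -3.32 (100% efficiency). Returns (efficiency, r_squared).
    """
    x = np.log10(copies)
    slope, intercept = np.polyfit(x, cq, 1)
    eff = 10.0 ** (-1.0 / slope) - 1.0
    cq_fit = slope * x + intercept
    ss_res = np.sum((cq - cq_fit) ** 2)
    ss_tot = np.sum((cq - np.mean(cq)) ** 2)
    return eff, 1.0 - ss_res / ss_tot

# Idealized 10-fold dilution series: Cq falls by ~3.32 per decade of copies.
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
cq = np.array([33.3, 30.0, 26.7, 23.4, 20.1])
eff, r2 = pcr_efficiency(copies, cq)
```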

  19. The Use of Qualitative Comparative Analysis for Critical Event Research in Alcohol and HIV in Mumbai, India

    PubMed Central

    Chandran, Devyani; Singh, S. K.; Berg, Marlene; Singh, Sharad; Gupta, Kamla

    2010-01-01

    In this paper we use Qualitative Comparative Analysis (QCA) in critical event analysis to identify under what conditions alcohol is necessary in contributing to unprotected sex. The paper is based on a set of in-depth interviews with 84 men aged 18-29 from three typical low-income communities in Mumbai who reported using alcohol and having sex with at least one nonspousal partner once or more in the 30 days prior to the interview. The interviews included narratives of critical events, defined as recent (past 30-60 day) events involving sexual behavior with or without alcohol. The paper identifies themes related to alcohol, sexuality and condom use, uses QCA to identify and explain configurations leading to protected and unprotected sex, and explains the differences. The analysis shows that alcohol alone is not sufficient to explain any cases involving unprotected sex, but alcohol in combination with partner type and contextual factors does explain unprotected sex for subsets of married and unmarried men. PMID:20563636
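
    The core of crisp-set QCA is a truth table scoring how consistently each configuration of conditions produces the outcome. A minimal sketch with hypothetical codings (the condition names and cases below are invented for illustration, not the study's data):

```python
from collections import defaultdict

def truth_table(cases, conditions, outcome):
    """Build a crisp-set QCA truth table.

    cases : list of dicts with 0/1 values for each condition and the outcome.
    Returns {configuration tuple: consistency}, where consistency is the
    share of cases with that configuration that show the outcome.
    """
    counts = defaultdict(lambda: [0, 0])   # config -> [n_cases, n_with_outcome]
    for case in cases:
        config = tuple(case[c] for c in conditions)
        counts[config][0] += 1
        counts[config][1] += case[outcome]
    return {cfg: n_out / n for cfg, (n, n_out) in counts.items()}

# Hypothetical coding: alcohol alone does not account for unprotected sex,
# but alcohol combined with a casual partner does.
cases = [
    {"alcohol": 1, "casual_partner": 1, "unprotected": 1},
    {"alcohol": 1, "casual_partner": 1, "unprotected": 1},
    {"alcohol": 1, "casual_partner": 0, "unprotected": 0},
    {"alcohol": 0, "casual_partner": 1, "unprotected": 0},
]
table = truth_table(cases, ["alcohol", "casual_partner"], "unprotected")
```

    Configurations with consistency 1.0 are the candidates for "sufficient combinations"; a condition that never appears alone in such a configuration, as with alcohol here, is not sufficient by itself.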

  20. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    SciTech Connect

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

    Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images, which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video, including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on the statistic(s) for each data type.
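
    One simple way to turn a per-frame statistic into event detections, in the spirit described above, is a robust z-score against a trailing baseline. This is a generic sketch, not the authors' software; it assumes the statistics have already been parsed from the encoder log, and the window and threshold are illustrative:

```python
import numpy as np

def detect_events(stat, window=25, z_thresh=4.0):
    """Flag frames whose statistic jumps away from a trailing baseline.

    stat : per-frame encoder statistic (e.g. intra-texture bits).
    Uses a robust z-score (median/MAD over the previous `window` frames)
    and returns the indices of frames exceeding z_thresh.
    """
    stat = np.asarray(stat, dtype=float)
    events = []
    for i in range(window, len(stat)):
        base = stat[i - window:i]
        med = np.median(base)
        mad = np.median(np.abs(base - med)) or 1.0   # guard against flat baselines
        if abs(stat[i] - med) / (1.4826 * mad) > z_thresh:
            events.append(i)
    return events

# A single anomalous frame in an otherwise steady statistic is flagged.
stat = np.ones(200)
stat[60] = 50.0
frames = detect_events(stat)
```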

  1. Analysis of Microphysics Mechanisms in Icing Aircraft Events: A Case Study

    NASA Astrophysics Data System (ADS)

    Sanchez, Jose Luis; Fernández, Sergio; Gascón, Estibaliz; Weigand, Roberto; Hermida, Lucia; Lopez, Laura; García-Ortega, Eduardo

    2013-04-01

    The presence of Supercooled Large Drops (SLD) can give rise to aircraft icing. In these cases, atmospheric icing causes an unusual loss of lift due to the rapid accumulation of ice on the wings or measurement instruments. There are two possible ways that SLD can be formed. The first is through a process called "warm nose", followed by resupercooling; this process is usually associated with the entrance of warm fronts. The second possibility is that drops are formed by condensation and grow to sizes of at least 50 µm through collision-coalescence, in environments where the temperature remains below 0°C at all times but freezing does not occur. Some authors point out that approximately 75% of freezing precipitation events are produced as a consequence of this second situation. Within the framework of the TECOAGUA Project, a series of scientific flights was performed in order to collect data in cloud systems capable of producing precipitation during the winter period and of creating environments favorable to aircraft icing. These flights were carried out using a C 212-200 aircraft, belonging to the National Institute of Aerospace Technology (INTA), with a CAPS probe installed. On 1 February 2012, the C 212-200 took off from the airport in Torrejón de Ardoz (Madrid) and flew about 70 km to position itself over the northern side of the Central System, where, at a flight level of 3500 m, it encountered a high concentration of SLD at temperatures around -12°C, with liquid water content up to 0.44 g/m3, which caused ice to accumulate along the profile of the aircraft's wings and forced the flight to be aborted. A microwave radiometer (MWR) was installed in the area surrounding the flight. An area of instability between 750 hPa and 600 hPa was identified in the vertical MWR profiles of temperature and humidity during the hour of the flight. It is mainly in this

  2. From event analysis to global lessons: disaster forensics for building resilience

    NASA Astrophysics Data System (ADS)

    Keating, Adriana; Venkateswaran, Kanmani; Szoenyi, Michael; MacClune, Karen; Mechler, Reinhard

    2016-07-01

    With unprecedented growth in disaster risk, there is an urgent need for enhanced learning and understanding of disasters, particularly in relation to the trends in drivers of increasing risk. Building on the disaster forensics field, we introduce the post-event review capability (PERC) methodology for systematically and holistically analysing disaster events, and identifying actionable recommendations. PERC responds to a need for learning about the successes and failures in disaster risk management and resilience, and uncovers the underlying drivers of increasing risk. We draw generalisable insights identified from seven applications of the methodology to date, where we find that across the globe policy makers and practitioners in disaster risk management face strikingly similar challenges despite variations in context, indicating encouraging potential for mutual learning. These lessons highlight the importance of integrated risk reduction strategies. We invite others to utilise the freely available PERC approach and contribute to building a repository of learning on disaster risk management and resilience.

  3. From event analysis to global lessons: disaster forensics for building resilience

    NASA Astrophysics Data System (ADS)

    Keating, Adriana; Venkateswaran, Kanmani; Szoenyi, Michael; MacClune, Karen; Mechler, Reinhard

    2016-04-01

    With unprecedented growth in disaster risk, there is an urgent need for enhanced learning about and understanding disasters, particularly in relation to the trends in the drivers of increasing risk. Building on the disaster forensics field, we introduce the Post Event Review Capability (PERC) methodology for systematically and holistically analyzing disaster events, and identifying actionable recommendations. PERC responds to a need for learning about the successes and failures in disaster risk management and resilience, and uncovers the underlying drivers of increasing risk. We draw generalizable insights identified from seven applications of the methodology to date, where we find that across the globe policy makers and practitioners in disaster risk management face strikingly similar challenges despite variations in context, indicating encouraging potential for mutual learning. These lessons highlight the importance of integrated risk reduction strategies. We invite others to utilize the freely available PERC approach and contribute to building a repository of learnings on disaster risk management and resilience.

  4. Parental separation and child aggressive and internalizing behavior: an event history calendar analysis.

    PubMed

    Averdijk, Margit; Malti, Tina; Eisner, Manuel; Ribeaud, Denis

    2012-04-01

    This study investigated the relationship between parental separation and aggressive and internalizing behavior in a large sample of Swiss children drawn from the ongoing Zurich Project on the Social Development of Children and Youths. Parents retrospectively reported life events and problem behavior for the first 7 years of the child's life on a quarterly basis (N = 995; 28,096 time points) using an Event History Calendar. The time sequences of separation and child problem behavior were analyzed. Parental separation affected both aggressive and internalizing behavior even when maternal depression, financial difficulties, and parental conflict were included. Parental separation exerted a direct effect on child problem behavior as well as an indirect effect via maternal depression.

  5. The South American rainfall dipole: A complex network analysis of extreme events

    NASA Astrophysics Data System (ADS)

    Boers, Niklas; Rheinwalt, Aljoscha; Bookhagen, Bodo; Barbosa, Henrique M. J.; Marwan, Norbert; Marengo, José; Kurths, Jürgen

    2014-10-01

    Intraseasonal rainfall variability of the South American monsoon system is characterized by a pronounced dipole between southeastern South America and southeastern Brazil. Here we analyze the dynamical properties of extreme rainfall events associated with this dipole by combining a nonlinear synchronization measure with complex networks. We make the following main observations: (i) Our approach reveals the dominant synchronization pathways of extreme events for the two dipole phases, (ii) while extreme rainfall synchronization in the tropics is directly driven by the trade winds and their deflection by the Andes mountains, extreme rainfall propagation in the subtropics is mainly dictated by frontal systems, and (iii) the well-known rainfall dipole is, in fact, only the most prominent mode of an oscillatory pattern that extends over the entire continent. This provides further evidence that the influence of Rossby waves, which cause frontal systems over South America and impact large-scale circulation patterns, extends beyond the equator.
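
    The nonlinear synchronization measure referred to is event synchronization (after Quiroga et al.). A simplified variant with a fixed coincidence tolerance can be sketched as follows; the full measure used in such network studies adapts the tolerance to the local inter-event intervals, so this is an illustration only:

```python
import numpy as np

def event_sync(tx, ty, tau):
    """Simplified event synchronization with a fixed tolerance tau.

    tx, ty : arrays of event times at two locations.
    Q = 1 means every event at one site has a partner within tau at the
    other site; Q = 0 means no near-coincident events at all.
    """
    def count(a, b):
        c = 0.0
        for t in a:
            d = np.min(np.abs(b - t))
            if d == 0:
                c += 0.5          # simultaneous events are shared between sites
            elif d <= tau:
                c += 1.0
        return c
    return (count(tx, ty) + count(ty, tx)) / np.sqrt(len(tx) * len(ty))

tx = np.array([1.0, 5.0, 9.0])
ty = np.array([3.0, 7.0, 11.0])
q_same = event_sync(tx, tx, tau=0.5)   # identical series: perfect synchrony
q_far = event_sync(tx, ty, tau=0.5)    # no events within the tolerance
```

    Computing Q for every pair of grid cells and thresholding the values yields the adjacency matrix of the climate network analyzed in the abstract.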

  6. a Database for On-Line Event Analysis on a Distributed Memory Machine

    NASA Astrophysics Data System (ADS)

    Argante, E.; Meesters, M. R. J.; van der Stok, P.; Willers, I.

    Parallel in-memory databases can enhance the structuring and parallelization of programs used in High Energy Physics (HEP). Efficient database access routines are used as communication primitives which hide the communication topology, in contrast to more explicit communication libraries like PVM or MPI. A parallel in-memory database, called SPIDER, has been implemented on a 32 node Meiko CS-2 distributed memory machine. The SPIDER primitives generate lower overhead than that generated by PVM or MPI. The event reconstruction program, CPREAD, of the CPLEAR experiment has been used as a test case. Performance measurements showed that CPREAD interfaced to SPIDER can easily cope with the event rate generated by CPLEAR.

  7. Event Analysis in Nuclear Emulsion for the E07 Experiment at J-PARC

    NASA Astrophysics Data System (ADS)

    Soe, Myint K.; Endo, Yoko; Hoshino, Kaoru; Ito, Hiroki; Itonaga, Kazunori; Kobayashi, Hidetaka; Tint, Khin T.; Kinbara, Shinji; Mishina, Akihiro; Yoshida, Junya; Nakazawa, Kazuma

    Hammer track events in nuclear emulsion were analyzed to measure the excitation energy of the 8Be* (2+) nucleus. The kinetic energies of the two alpha particles of hammer track events were obtained from their ranges using a range-energy relation. The range-energy relation was calibrated by measuring the alpha particle tracks emitted from 212Po of the thorium decay series in the emulsion. From this calibration, we obtained the density and shrinkage factor of the emulsion. The excitation energy and width of the 8Be* (2+) nucleus were measured to be 3.39 ± 0.24 MeV and Γ = 1.22 ± 0.60 MeV, respectively.
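
    Converting a measured track range to a kinetic energy via a calibrated power-law range-energy relation (Geiger's rule, R ∝ E^3/2) can be sketched as follows. The calibration range and exponent below are illustrative placeholders, not measured emulsion constants; only the 8.785 MeV 212Po alpha energy is a physical reference value:

```python
def energy_from_range(r_um, r_cal_um=49.0, e_cal_mev=8.785, b=1.5):
    """Invert a power-law range-energy relation R = a * E**b (Geiger's rule).

    Anchored at a single calibration point: the 8.785 MeV alpha from 212Po
    of the thorium decay series. r_cal_um (calibration range in micrometers)
    and the exponent b are illustrative, not measured emulsion constants.
    """
    return e_cal_mev * (r_um / r_cal_um) ** (1.0 / b)

# The calibration point maps back to the calibration energy exactly;
# shorter tracks map to lower energies.
e_cal = energy_from_range(49.0)
e_half = energy_from_range(24.5)
```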

  8. ANALYSIS ON RECENT FLOOD EVENTS AND TREE VEGETATION COLLAPSES IN KAKO RIVER

    NASA Astrophysics Data System (ADS)

    Michioku, Kohji; Miyamoto, Hitoshi; Kanda, Keiichi; Ohchi, Yohei; Aga, Kazuho; Morioka, Jyunji; Uotani, Takuya; Yoshida, Kazuaki; Yoshimura, Satoshi

    Forestation of flood plains is a world-wide engineering issue in the middle to downstream reaches of many rivers. It brings not only degradation of flow conveyance capacity but also irreversible changes to the ecological system of rivers. In order to obtain information on tree vegetation behavior during flood events, field data on flow fields and tree vegetation collapse were collected in Kako River, where willows are heavily vegetated on the flood plain. After an H-ADCP flow measurement was started in 2009, small to medium size flood events occurred frequently, which enabled us not only to verify an analytical model that reproduces flow fields in and out of vegetation but also to examine tree vegetation collapses after flooding. The analytical solutions for the velocity profiles and the flow force acting on trees were in good agreement with the H-ADCP measurements and the observed tree damage, respectively.

  9. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required to develop that model is then determined; we conclude that six hours of training are required to teach these skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
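
    The fundamental functionality mentioned, an event list, a simulation clock, and state updates, fits in a few lines. Below is a minimal single-server queue in Python rather than Excel, with deterministic times for clarity:

```python
import heapq

def queue_sim(arrivals, service_times):
    """Event-scheduling simulation of a single-server FIFO queue.

    arrivals : arrival time of each job; service_times : its service duration.
    Returns the departure time of each job, in job order.
    """
    events = [(t, "arrival", i) for i, t in enumerate(arrivals)]
    heapq.heapify(events)                # future-event list keyed by time
    waiting, busy, departures = [], False, {}
    while events:
        t, kind, i = heapq.heappop(events)   # advance the clock to next event
        if kind == "arrival":
            waiting.append(i)
        else:                                # departure frees the server
            departures[i] = t
            busy = False
        if not busy and waiting:             # start next job, schedule departure
            j = waiting.pop(0)
            busy = True
            heapq.heappush(events, (t + service_times[j], "departure", j))
    return [departures[i] for i in range(len(arrivals))]

# Job 0 occupies the server for 5 time units, so jobs 1 and 2 queue behind it.
departures = queue_sim([0.0, 1.0, 2.0], [5.0, 1.0, 1.0])
```

    Replacing the deterministic times with random draws turns this into a stochastic simulation; the event-list logic is unchanged, which is the sense in which the paper's result generalizes.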

  10. Factor analysis of safety for visitors to a mega-event.

    PubMed

    Kwon, Young Guk; Park, Hyun Jee

    2002-01-01

    This paper investigated the safety factors considered by visitors to the Kwangju Biennale 2000 and analyzed the correlation between the safety factors and the demographic characteristics of the visitors. Global tourism increased throughout the 1990s, with the biggest surge occurring in the Asia-Pacific region. Long-distance travel is also increasing, at a rate faster than the global average. The opportunities for event tourism appear to be strong almost everywhere, even though recessions may have an impact on these destinations. Along with this upward trend, competition for the most desirable tourists is also surging (Getz, 1997). Event tourism is therefore emerging as a powerful tool amid the fierce competition within the tourism industry.

  11. On the identification of piston slap events in internal combustion engines using tribodynamic analysis

    NASA Astrophysics Data System (ADS)

    Dolatabadi, N.; Theodossiades, S.; Rothberg, S. J.

    2015-06-01

    Piston slap is a major source of vibration and noise in internal combustion engines. Therefore, better understanding of the conditions favouring piston slap can be beneficial for the reduction of engine Noise, Vibration and Harshness (NVH). Past research has attempted to determine the exact position of piston slap events during the engine cycle and correlate them to the engine block vibration response. Validated numerical/analytical models of the piston assembly can be very useful towards this aim, since extracting the relevant information from experimental measurements can be a tedious and complicated process. In the present work, a coupled simulation of piston dynamics and engine tribology (tribodynamics) has been performed using quasi-static and transient numerical codes. Thus, the inertia and reaction forces developed in the piston are calculated. The occurrence of piston slap events in the engine cycle is monitored by introducing six alternative concepts: (i) the quasi-static lateral force, (ii) the transient lateral force, (iii) the minimum film thickness occurrence, (iv) the maximum energy transfer, (v) the lubricant squeeze velocity and (vi) the piston-impact angular duration. The validation of the proposed methods is achieved using experimental measurements taken from a single cylinder petrol engine in laboratory conditions. The surface acceleration of the engine block is measured at the thrust- and anti-thrust side locations. The correlation between the theoretically predicted events and the measured acceleration signals has been satisfactory in determining piston slap incidents, using the aforementioned concepts. The results also exhibit good repeatability throughout the set of measurements obtained in terms of the number of events occurring and their locations during the engine cycle.

  12. Analysis of a Mesoscale Model for Depicting Rain-on-Snow Flooding Events in Mountainous Terrain

    NASA Astrophysics Data System (ADS)

    Morehead, M. D.; Dawson, P.; Seyfried, M. S.

    2002-12-01

    Cold-season rain-on-snow events are one of the major sources of flooding in the Pacific Northwest. Accurate modeling of the atmospheric fields forcing these events is leading to a better understanding of the atmospheric conditions behind them and to better prediction of the resulting floods. A mesoscale atmospheric model (RAMS) with nested grids is being used for high-resolution simulations of winter precipitation and other climate variables in the Owyhee mountains of southwestern Idaho. The Reynolds Creek Experimental Watershed (RCEW) contains a dense array of meteorologic and hydrologic instrumentation with which to test the spatial and temporal hydrologic and atmospheric models. The RCEW's numerous precipitation gauges cover the wide range of precipitation zones found in mountainous terrain and allow a thorough assessment of the areal distribution and timing of modeled versus measured precipitation and temperature. A comparison of the modeled and measured data from two winter storms associated with rain-on-snow events shows close agreement in the spatial and temporal distributions of precipitation, temperature and other variables. The model correctly predicts the spatial distribution of precipitation and the temporal conversion from snow to rain-on-snow in the lower elevations of the watershed. The modeled precipitation is typically slightly lower than the measured values. Some of the high-frequency (hourly) weather variability was not captured by the model, presumably due to a lack of sufficient data in the initialization process. The longer-term goal is to develop a tool for generating detailed weather information for wintertime hydrologic studies, including cold-season flooding processes, and to better understand the processes controlling winter flooding.
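    The modeled-versus-measured comparison described above reduces, at its simplest, to bias and RMSE over matched gauge and grid-point series; the numbers below are hypothetical:

```python
import numpy as np

def bias_rmse(modeled, measured):
    """Mean bias and root-mean-square error of a modeled vs measured series."""
    modeled = np.asarray(modeled, dtype=float)
    measured = np.asarray(measured, dtype=float)
    diff = modeled - measured
    return diff.mean(), np.sqrt((diff ** 2).mean())

# hypothetical hourly precipitation (mm) at one gauge
measured = [0.0, 1.2, 3.4, 2.8, 0.6, 0.0]
modeled  = [0.0, 1.0, 3.0, 2.5, 0.8, 0.1]
bias, rmse = bias_rmse(modeled, measured)  # negative bias = model too dry
```

    A negative bias here matches the abstract's finding that modeled precipitation runs slightly below the measured values.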

  13. Analysis of extreme wave events in the southern coast of Brazil

    NASA Astrophysics Data System (ADS)

    Guimarães, P. V.; Farina, L.; Toldo, E.

    2014-06-01

    Using the model SWAN, high waves on the Southwestern Atlantic generated by extra-tropical cyclones are simulated from 2000 to 2010 and their impact on the Rio Grande do Sul coast is studied. The modeled waves are compared with buoy data and good agreement is found. The six extreme events in the period that presented significant wave heights above 5 m, at a particular point of interest, are investigated in detail. It is found that the cyclogenetic pattern between the latitudes 31.5 and 34° S is the most favorable for developing high waves. Hovmöller diagrams for deep water show that the region from the south of Rio Grande do Sul up to latitude 31.5° S is the most energetic during a cyclone's passage, although the event of May 2008 indicates that the location of this region can vary, depending on the cyclone's displacement. On the other hand, the Hovmöller diagrams for shallow water show that the different shoreface morphologies were responsible for focusing or dissipating the waves' energy; the regions found are in agreement with the observations of erosion and progradation regions. It can be concluded that some of the urban areas of the beaches of Hermenegildo, Cidreira, Pinhal, Tramandaí, Imbé and Torres have been more exposed during the extreme wave events on the Rio Grande do Sul coast, and are more vulnerable to this natural hazard.

  14. Analysis of extreme wave events on the southern coast of Brazil

    NASA Astrophysics Data System (ADS)

    Guimarães, P. V.; Farina, L.; Toldo, E. E., Jr.

    2014-12-01

    Using the wave model SWAN (simulating waves nearshore), high waves on the southwestern Atlantic generated by extra-tropical cyclones are simulated from 2000 to 2010, and their impact on the Rio Grande do Sul (RS) coast is studied. The modeled waves are compared with buoy data and good agreement is found. The six extreme events in the period that presented significant wave heights above 5 m, at a particular point of interest, are investigated in detail. It is found that the cyclogenetic pattern between the latitudes 31.5 and 34° S is the most favorable for developing high waves. Hovmöller diagrams for deep water show that the region from the south of Rio Grande do Sul up to a latitude of 31.5° S is the most energetic during a cyclone's passage, although the event of May 2008 indicates that the location of this region can vary, depending on the cyclone's displacement. On the other hand, the Hovmöller diagrams for shallow water show that the different shoreface morphologies were responsible for focusing or dissipating the waves' energy; the regions found are in agreement with the observations of erosion and progradation regions. It can be concluded that some of the urban areas of the beaches of Hermenegildo, Cidreira, Pinhal, Tramandaí, Imbé and Torres have been more exposed during the extreme wave events on the Rio Grande do Sul coast, and are more vulnerable to this natural hazard.
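    The event-selection step (significant wave height above 5 m at a point of interest) can be sketched as a peaks-over-threshold extraction with declustering; the synthetic record and separation window are illustrative assumptions, not the study's data:

```python
import numpy as np

def extract_events(hs, threshold=5.0, min_separation=8):
    """(index, peak Hs) for each exceedance cluster of a significant wave
    height series; clusters closer than min_separation samples are merged."""
    hs = np.asarray(hs, dtype=float)
    events, cluster = [], []
    for i in np.flatnonzero(hs > threshold):
        if cluster and i - cluster[-1] > min_separation:
            peak = max(cluster, key=lambda j: hs[j])
            events.append((int(peak), hs[peak]))
            cluster = []
        cluster.append(i)
    if cluster:
        peak = max(cluster, key=lambda j: hs[j])
        events.append((int(peak), hs[peak]))
    return events

# synthetic 3-hourly Hs record (m) with two storms exceeding 5 m
hs = np.full(200, 2.0)
hs[40:45] = [5.2, 6.1, 5.8, 5.3, 4.9]
hs[120:123] = [5.1, 5.6, 5.0]
storms = extract_events(hs)
```

    Declustering keeps consecutive exceedances within one storm from being counted as separate extreme events.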

  15. Analysis of a localized flash-flood event over the central Mediterranean

    NASA Astrophysics Data System (ADS)

    Gascón, E.; Laviola, S.; Merino, A.; Miglietta, M. M.

    2016-12-01

    On 3 July 2006, exceptionally heavy convective rainfall affected a small area in Calabria, Italy. A rainfall amount of 202 mm was recorded in 2.5 h, producing considerable damage and causing a localized flash flood. The Weather Research and Forecasting (WRF) model was used to analyze the instability present in the event and the related triggering mechanisms. The high-resolution simulation is able to correctly identify the position of the precipitation peak and to clarify the mesoscale processes involved, although it significantly underestimates the total amount of precipitation. Sensitivity experiments confirm the importance of the choice of planetary boundary layer and microphysics parameterization schemes for a correct simulation of the event, showing strong sensitivity in these numerical tests. The need for high horizontal resolution also emerges clearly: an accurate representation of the orography at small scales is required to simulate the event in its correct location. Instability indices identified an extremely favorable environment for convection development, with very high values of CAPE and high moisture content at low levels. The low mountains near the rainfall peak play an important role in triggering the release of instability and controlling the location of rainfall; in particular, the peculiar morphology of the orography creates low-level wind convergence and provides the uplift necessary for the air parcels to reach the level of free convection. In this framework, nondimensional parameters, such as the Froude number, have been calculated to better understand the interaction of the flow with the orography.
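    The Froude number mentioned above has the standard flow-over-obstacle form Fr = U/(N h); the flow values below are illustrative, not taken from the simulation:

```python
def froude_number(wind_speed, brunt_vaisala, mountain_height):
    """Nondimensional Froude number Fr = U / (N h) for stratified flow
    against an obstacle: Fr >> 1 suggests flow passes over the barrier,
    Fr << 1 suggests low-level blocking and flow around it."""
    return wind_speed / (brunt_vaisala * mountain_height)

# illustrative low-level flow against ~1 km orography:
# U = 10 m/s, N = 0.01 s^-1, h = 1000 m
fr = froude_number(10.0, 0.01, 1000.0)
```

    Values near unity, as in this example, mark the transitional regime in which orographically forced uplift can trigger convection at the barrier.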

  16. Analysis of inter-event times for avalanches on a conical bead pile with cohesion

    NASA Astrophysics Data System (ADS)

    Lehman, Susan; Johnson, Nathan; Tieman, Catherine; Wainwright, Elliot

    2015-03-01

    We investigate the critical behavior of a 3D conical bead pile built from uniform 3 mm steel spheres. Beads are added to the pile by dropping them onto the apex one at a time; avalanches are measured through changes in pile mass. We investigate the dynamic response of the pile by recording avalanches from the pile over tens of thousands of bead drops. We have previously shown that the avalanche size distribution follows a power law for beads dropped onto the pile apex from a low drop height. We are now tuning the critical behavior of the system by adding cohesion from a uniform magnetic field and find an increase in both the size and number of very large avalanches and a decrease in mid-size avalanches. The resulting bump in the avalanche distribution moves to larger avalanche size as the cohesion in the system is increased. We compare the experimental inter-event time distribution to both the Brownian passage-time and Weibull distributions, and observe a shift from Weibull to Brownian passage-time as we raise the threshold from measuring the time between events of all sizes to the time between only the largest, system-spanning events. Both results are consistent with those from a mean-field model of slip avalanches in a shear system [Dahmen, Nat Phys 7, 554 (2011)].
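    The threshold-raising analysis (time between all events versus time between only the largest) can be sketched with a simple inter-event statistic. The synthetic catalog below, and the use of the coefficient of variation instead of full Weibull/Brownian passage-time fits, are illustrative assumptions:

```python
import numpy as np

def interevent_cv(times, sizes, threshold):
    """Coefficient of variation of waiting times between events whose size
    is at least `threshold`; CV near 1 suggests Poisson-like (exponential)
    waiting times, CV well below 1 suggests quasi-periodic recurrence."""
    t = np.asarray(times)[np.asarray(sizes) >= threshold]
    dt = np.diff(t)
    return dt.std() / dt.mean()

rng = np.random.default_rng(1)
# hypothetical catalog: 500 small avalanches at random times, plus 19 large
# ones recurring quasi-periodically every ~50 time units
small = np.column_stack([rng.uniform(0, 1000, 500), rng.uniform(1, 10, 500)])
big = np.column_stack([np.arange(50, 1000, 50) + rng.normal(0, 2, 19),
                       rng.uniform(100, 200, 19)])
ev = np.vstack([small, big])
ev = ev[np.argsort(ev[:, 0])]

cv_all = interevent_cv(ev[:, 0], ev[:, 1], threshold=1)    # every event
cv_big = interevent_cv(ev[:, 0], ev[:, 1], threshold=100)  # largest only
```

    Raising the threshold isolates the quasi-periodic large events, collapsing the CV well below the value for the full catalog.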

  17. Analysis of Severe Weather Events by Integration of Civil Protection Operation Data

    NASA Astrophysics Data System (ADS)

    Heisterkamp, Tobias; Kox, Thomas

    2015-04-01

    In Germany, winter storms are among the natural hazards responsible for the largest damages (GDV 2014). This is a major challenge for civil protection, especially in metropolitan areas like Berlin. Nowadays, large-scale storm events are generally well predictable, but detailed forecasts at the urban-district or even street level are still out of range. Fire brigades, as major stakeholders dealing with severe weather consequences, operate at this small scale across the whole area under their jurisdiction. For the forensic investigation of disasters, this presentation offers an additional approach that uses the documentation of fire brigade operations as a new data source. Hazard dimensions and consequences of severe weather events are reconstructed via GIS-based analyses of these operations. Local case studies of recent storms are used as a comparison and as additional information complementing the three WMO weather stations in Berlin. Thus, hot spots of the selected events can be identified through accumulations of operation sites. Further indicators for Berlin are added to detect aspects that determine vulnerabilities. The conclusion discusses the potential of this approach as well as the possible benefits of integration into warning systems.

  18. A Spectral Analysis of a Rare "Dwarf Eat Dwarf" Cannibalism Event

    NASA Astrophysics Data System (ADS)

    Theakanath, Kuriakose; Toloba, E.; Guhathakurta, P.; Romanowsky, A. J.; Ramachandran, N.; Arnold, J.

    2014-01-01

    We have used Keck/DEIMOS to conduct the first detailed spectroscopic study of the recently discovered stellar stream in the Large Magellanic Cloud analog NGC 4449. Martinez-Delgado et al. (2012), using the tip of the red giant branch (TRGB), found that both objects, the stream and NGC 4449, are at the same distance, which suggests that this stream is the remnant of the first known ongoing dwarf-dwarf cannibalism event. Learning about the orbital properties of this event is a powerful tool to constrain the physical conditions involved in dwarf-dwarf merger events. The low surface brightness of this structure makes it impossible to obtain integrated-light spectroscopic measurements, and its distance (3.8 Mpc) is too large to observe stars individually. In the color-magnitude diagram of the stellar stream there is an excess of objects brighter than the TRGB that are potential star blends. We designed our DEIMOS mask to contain as many of these objects as possible and, while some of them turned out to be background galaxies, a handful happened to be star blends in the stream. Our velocity measurements along the stream prove that it is gravitationally bound to NGC 4449 and put strong constraints on the orbital properties of the infall. This research was carried out under the auspices of UCSC's Science Internship Program. We thank the National Science Foundation for funding support. ET was supported by a Fulbright fellowship.

  19. Association between hormone replacement therapy and subsequent arterial and venous vascular events: a meta-analysis

    PubMed Central

    Sare, Gillian M.; Gray, Laura J.; Bath, Philip M.W.

    2008-01-01

    Aims Randomized controlled trials (RCTs) have shown that the risk of stroke and venous thromboembolism (VTE) is increased with hormone replacement therapy (HRT); the effect on coronary heart disease (CHD) remains unclear. Methods and results RCTs of HRT were identified. Event rates for cerebrovascular disease [stroke, TIA (transient ischaemic attack)], CHD (myocardial infarction, unstable angina, sudden cardiac death), and VTE (pulmonary embolism, deep vein thrombosis) were analysed. Sensitivity analyses were performed by type of HRT (mono vs. dual) and subject age. 31 trials (44 113 subjects) were identified. HRT was associated with increases in stroke (odds ratio, OR, 1.32, 95% confidence intervals, CI, 1.14–1.53) and VTE (OR 2.05, 95% CI 1.44–2.92). In contrast, CHD events were not increased (OR 1.02, 95% CI 0.90–1.11). Ordinal analyses confirmed that stroke severity was increased with HRT (OR 1.31, 95% CI 1.12–1.54). Although most trials included older subjects, age did not significantly affect risk. The addition of progesterone to oestrogen doubled the risk of VTE. Conclusion HRT is associated with an increased risk of stroke, stroke severity, and VTE, but not of CHD events. Although most trials studied older patients, increased risk was not related to age. Combined HRT increases the risk of VTE compared with oestrogen monotherapy. PMID:18599555
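    The odds ratios with confidence intervals reported above follow from 2×2 event counts via the standard error of the log odds ratio; a minimal sketch with hypothetical counts (not the meta-analysis's actual data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table, via the standard error of
    the log odds ratio: a/b = events/non-events on treatment,
    c/d = events/non-events on control."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: 120 strokes among 22,000 on HRT vs 90 among
# 22,000 controls
or_, lo, hi = odds_ratio_ci(120, 22000 - 120, 90, 22000 - 90)
```

    A pooled estimate across trials would combine the per-trial log odds ratios weighted by their inverse variances; the single-table case above is the building block.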

  20. [Characteristic analysis of a multi-day pollution event in Chang-Zhu-Tan Metropolitan Area during October 2013].

    PubMed

    Liao, Zhi-heng; Fan, Shao-jia; Huang, Juan; Sun, Jia-ren

    2014-11-01

    The Chang-Zhu-Tan Metropolitan Area experienced a typical multi-day pollution event in October 2013. Based on the air pollution index, conventional pollutant observations, surface meteorological observations and sounding data, the relationships among air pollution, large-scale circulation and boundary layer meteorology during this event were comprehensively analyzed. Additionally, the sources and transport paths of the pollutants were investigated by applying satellite remote sensing data and the HYSPLIT4 model. The results showed that pollutants gradually accumulated in the earlier stage of the event (October 21st to 26th), while in the later stage (October 27th to 31st) the characteristic pollutants of crop residue burning (PM2.5, CO, NO2) sharply increased. The deterioration of air quality in the later stage was mainly related to the remote transport of pollutants caused by straw burning. Analysis of HYSPLIT4 simulations and fire spots showed that the air currents mainly came from Anhui and Hubei Provinces in the earlier stage, while in the later stage they came mainly from Jiangxi Province, where fire spots were concentrated. Stable atmospheric stratification, caused by a steady uniform high-pressure field, and light winds, due to the confrontation of cold and warm currents, greatly contributed to the development, maintenance and reinforcement of the pollution event. The remote transport of pollutants had a significant impact on the ambient air quality of the Chang-Zhu-Tan Metropolitan Area.

  1. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    NASA Technical Reports Server (NTRS)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
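    The conjugate Beta-binomial update underlying this kind of Bayesian inference can be sketched in a few lines; the prior and observation counts below are hypothetical, not IMM data:

```python
def beta_binomial_update(alpha, beta, events, trials):
    """Conjugate update of a Beta(alpha, beta) prior on an event
    probability after observing `events` occurrences in `trials`
    opportunities; returns posterior parameters and posterior mean."""
    a = alpha + events
    b = beta + trials - events
    return a, b, a / (a + b)

# hypothetical: prior equivalent to 2 events in 1000 person-years from an
# analog population, updated with 0 in-flight events over 500 person-years
a, b, p = beta_binomial_update(2.0, 998.0, 0, 500)
```

    The posterior mean shrinks toward zero as event-free exposure accumulates, which is how sparse astronaut data can still sharpen an analog-population prior.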

  2. Spatiotemporal characteristics of form analysis in the human visual cortex revealed by rapid event-related fMRI adaptation.

    PubMed

    Kourtzi, Zoe; Huberle, Elisabeth

    2005-11-01

    The integration of local elements to coherent forms is at the core of understanding visual perception. Accumulating evidence suggests that both early retinotopic and higher occipitotemporal areas contribute to the integration of local elements to global forms. However, the spatiotemporal characteristics of form analysis in the human visual cortex remain largely unknown. The aim of this study was to investigate form analysis at different spatial (global vs. local structure) and temporal (different stimulus presentation rates) scales across stages of visual analysis (from V1 to the lateral occipital complex-LOC) in the human brain. We used closed contours rendered by Gabor elements and manipulated either the global contour structure or the orientation of the local Gabor elements. Our rapid event-related fMRI adaptation studies suggest that contour integration and form processing in early visual areas is transient and limited within the local neighborhood of their cells' receptive field. In contrast, higher visual areas appear to process the perceived global form in a more sustained manner. Finally, we demonstrate that these spatiotemporal properties of form processing in the visual cortex are modulated by attention. Attention to the global form maintains sustained processing in occipitotemporal areas, whereas attention to local elements enhances their integration in early visual areas. These findings provide novel neuroimaging evidence for form analysis at different spatiotemporal scales across human visual areas and validate the use of rapid event-related fMRI adaptation for investigating processing across stages of visual analysis in the human brain.

  3. Forensic Analysis of Seismic Events in the Water; Submarines, Explosions and Impacts

    NASA Astrophysics Data System (ADS)

    Wallace, T. C.; Koper, K. D.

    2002-12-01

    Sudden pressure changes in a water column can generate significant seismic energy that may be recorded on land-based seismometers. In recent years a number of accidents and chemical explosions in the ocean or large lakes have been recorded at teleseismic distances, affording the opportunity to investigate the seismic source. The August 2000 sinking of the Russian attack submarine Kursk is the most famous example of an accident at sea that seismology played a role in understanding, but many others exist, including: (1) the 1989 sinking and apparent implosion of the Soviet submarine Komsomolets, (2) the sudden sinking of a large oil drilling platform in the North Sea in 1991, (3) the 1972 explosion and sinking of a 700-ton cargo ship off the coast of southwestern England, and (4) the crash of Swissair Flight 111 off the coast of Nova Scotia in 1998. Enough empirical information has been collected to accurately characterize the size of most of these underwater (or on the surface of the water) events. Further, many of the seismic signals contain a spectral scalloping that can be interpreted either as reverberation of seismic energy in the water column or as bubble pulses from underwater explosions. This information can be used to constrain the details of the seismic source. For example, the Kursk explosion had a pronounced spectral scalloping with a 1.45 Hz banding. Using a relationship between bubble pulse frequency, explosive yield and depth of detonation (the relationship was developed and verified using a large population of chemical explosions in the 1940s), the Kursk detonation is estimated to be at a depth of 85-100 m, with a yield of 3-5 tonnes equivalent TNT. This seismic result was confirmed almost exactly by the Russian government with the release of the official accident report on the Kursk in August 2002. Seismic events in the water column can be rich sources of information about the details of the source.
Events as small as magnitude 1.2 are routinely recorded by
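    The bubble-pulse relationship invoked for the Kursk can be sketched with a Willis-type period formula from that 1940s-era work; the constant k = 2.11 (period in s, TNT-equivalent yield in kg, depth in m) is an assumption of this illustration, not a value quoted in the abstract:

```python
def bubble_pulse_depth(freq_hz, yield_kg, k=2.11):
    """Invert a Willis-type bubble-pulse period relation,
    T = k * W**(1/3) / (d + 10.1)**(5/6)   (T in s, W in kg TNT, d in m),
    for the detonation depth d; k = 2.11 is an assumed constant for TNT."""
    period = 1.0 / freq_hz
    return (k * yield_kg ** (1 / 3) / period) ** (6 / 5) - 10.1

# the observed 1.45 Hz scalloping with a 3-5 tonne yield brackets the depth
d_lo = bubble_pulse_depth(1.45, 3000.0)
d_hi = bubble_pulse_depth(1.45, 5000.0)
```

    With these assumptions the bracket comes out at roughly 84-105 m, consistent with the 85-100 m estimate given above.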

  4. Effect of Statins on Venous Thromboembolic Events: A Meta-analysis of Published and Unpublished Evidence from Randomised Controlled Trials

    PubMed Central

    Rahimi, Kazem; Bhala, Neeraj; Kamphuisen, Pieter; Emberson, Jonathan; Biere-Rafi, Sara; Krane, Vera; Robertson, Michele; Wikstrand, John; McMurray, John

    2012-01-01

    Background It has been suggested that statins substantially reduce the risk of venous thromboembolic events. We sought to test this hypothesis by performing a meta-analysis of both published and unpublished results from randomised trials of statins. Methods and Findings We searched MEDLINE, EMBASE, and Cochrane CENTRAL up to March 2012 for randomised controlled trials comparing statin with no statin, or comparing high dose versus standard dose statin, with 100 or more randomised participants and at least 6 months' follow-up. Investigators were contacted for unpublished information about venous thromboembolic events during follow-up. Twenty-two trials of statin versus control (105,759 participants) and seven trials of an intensive versus a standard dose statin regimen (40,594 participants) were included. In trials of statin versus control, allocation to statin therapy did not significantly reduce the risk of venous thromboembolic events (465 [0.9%] statin versus 521 [1.0%] control, odds ratio [OR] = 0.89, 95% CI 0.78–1.01, p = 0.08) with no evidence of heterogeneity between effects on deep vein thrombosis (266 versus 311, OR 0.85, 95% CI 0.72–1.01) and effects on pulmonary embolism (205 versus 222, OR 0.92, 95% CI 0.76–1.12). Exclusion of the trial result that provided the motivation for our meta-analysis (JUPITER) had little impact on the findings for venous thromboembolic events (431 [0.9%] versus 461 [1.0%], OR = 0.93 [95% CI 0.82–1.07], p = 0.32 among the other 21 trials). There was no evidence that higher dose statin therapy reduced the risk of venous thromboembolic events compared with standard dose statin therapy (198 [1.0%] versus 202 [1.0%], OR = 0.98, 95% CI 0.80–1.20, p = 0.87). Risk of bias overall was small but a certain degree of effect underestimation due to random error cannot be ruled out. Please see later in the article for the Editors' Summary. Conclusions The findings from this meta-analysis do not support the

  5. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique. [CETAT computer program

    SciTech Connect

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time-consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program.
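    The core arithmetic such a program automates is multiplying branch probabilities along each path of an event tree; a minimal sketch with an illustrative toy tree (not CETAT's data model):

```python
def path_probabilities(tree, prob=1.0, path=()):
    """Walk a nested event tree, yielding (path, outcome, probability) for
    every end state; branch probabilities multiply along each path."""
    if isinstance(tree, str):          # a leaf is an outcome label
        yield path, tree, prob
        return
    for branch, (p, subtree) in tree.items():
        yield from path_probabilities(subtree, prob * p, path + (branch,))

# toy two-stage operator-response tree with illustrative probabilities
tree = {
    "detects alarm": (0.95, {"responds correctly":   (0.90, "safe"),
                             "responds incorrectly": (0.10, "unsafe")}),
    "misses alarm":  (0.05, "unsafe"),
}
outcomes = {}
for _, label, p in path_probabilities(tree):
    outcomes[label] = outcomes.get(label, 0.0) + p
```

    Summing path probabilities by end state gives the aggregate reliability figures an analyst would compare across alternative designs.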

  6. Bayesian inference on risk differences: an application to multivariate meta-analysis of adverse events in clinical trials.

    PubMed

    Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng

    2013-05-01

    Multivariate meta-analysis is useful for combining evidence from independent studies that involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models, which assume risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis in which the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including a simpler likelihood function, no need to specify a link function, and a closed-form expression of the distribution functions for study-specific risk differences. We investigate the finite-sample performance of this model by simulation studies and illustrate its use with an application to a multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials.
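    A simplified sketch of a Bayesian risk-difference computation on beta-distributed risks: it uses independent Jeffreys Beta priors rather than the correlated Sarmanov beta model of the paper, and the counts are hypothetical:

```python
import numpy as np

def risk_difference_posterior(e1, n1, e0, n0, draws=100_000, seed=0):
    """Posterior draws of the risk difference p1 - p0 under independent
    Jeffreys Beta(0.5, 0.5) priors; the paper's correlated Sarmanov beta
    model is more involved, so independence is an illustrative shortcut."""
    rng = np.random.default_rng(seed)
    p1 = rng.beta(0.5 + e1, 0.5 + n1 - e1, draws)
    p0 = rng.beta(0.5 + e0, 0.5 + n0 - e0, draws)
    return p1 - p0

# hypothetical adverse-event counts: 30/200 on treatment, 15/200 on placebo
rd = risk_difference_posterior(30, 200, 15, 200)
mean_rd = rd.mean()
ci = np.percentile(rd, [2.5, 97.5])
```

    The Sarmanov construction adds a correlation term between the beta margins, which the paper exploits to get the risk-difference distribution in closed form instead of by sampling.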

  7. Joint Utility of Event-Dependent and Environmental Crime Analysis Techniques for Violent Crime Forecasting

    ERIC Educational Resources Information Center

    Caplan, Joel M.; Kennedy, Leslie W.; Piza, Eric L.

    2013-01-01

    Violent crime incidents occurring in Irvington, New Jersey, in 2007 and 2008 are used to assess the joint analytical capabilities of point pattern analysis, hotspot mapping, near-repeat analysis, and risk terrain modeling. One approach to crime analysis suggests that the best way to predict future crime occurrence is to use past behavior, such as…

  8. Remote sensing analysis of the Tiber River sediment plume (Tyrrhenian Sea): spectral signature of erratic vs. persistent events

    NASA Astrophysics Data System (ADS)

    Falcini, Federico; Di Cicco, Annalisa; Pitarch, Jaime; Marullo, Salvatore; Colella, Simone; Volpe, Gianluca; Nardin, William; Margiotta, Francesca; Santoleri, Rosalia

    2016-04-01

    During the last decade, several regions along the western Tyrrhenian coast have been dramatically affected by intense river runoff, which delivered a significant amount of sediment offshore and alongshore. A crucial question that coastal geomorphologists and marine scientists need to face concerns the fate and impact of this impulsive sediment load, especially with respect to the historical trend, seasonal variability, and persistent events. A satellite-based analysis of these sediment discharges is a key ingredient for such a study, since it represents the primary dataset for the recognition of coastal patterns of Total Suspended Matter (TSM) that may reflect erosional or depositional processes along the coast. In this regard, we developed and implemented a regional TSM product from remote sensing, which was calibrated and validated with in situ measurements collected in the Tyrrhenian Sea. We discuss the spatial patterns and spectral signature of the TSM that we observe during the 2012 high-discharge event of the Tiber River. Our analysis gives some insight into the main differences in the geomorphological impacts of erratic vs. persistent events.
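    Calibrating a regional TSM product against in situ match-ups can be sketched as a regression of satellite reflectance on measured TSM; the single-band log-log form and all numbers below are illustrative assumptions, not the product's actual algorithm:

```python
import numpy as np

# hypothetical match-ups: satellite reflectance in a red band vs in situ
# TSM (g m^-3); a log-log single-band regression is one of the simplest
# regional calibrations and stands in here for the product's real algorithm
rrs_665 = np.array([0.002, 0.004, 0.006, 0.009, 0.013, 0.018])
tsm_situ = np.array([0.8, 1.7, 2.9, 4.8, 7.5, 11.0])

slope, intercept = np.polyfit(np.log10(rrs_665), np.log10(tsm_situ), 1)

def tsm_from_rrs(rrs):
    """Regional TSM estimate from the fitted log-log relation."""
    return 10 ** (intercept + slope * np.log10(rrs))
```

    Validation would hold out part of the match-up set and compare predicted against measured TSM, as the abstract describes for the Tyrrhenian campaign data.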

  9. Synoptic-mesoscale analysis and numerical modeling of a tornado event on 12 February 2010 in northern Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, I. T.; Nastos, P. T.; Pytharoulis, I.

    2011-07-01

    Tornadoes are violent convective weather phenomena, with their maximum frequency over Greece during the cold period (autumn, winter). This study analyzes the tornado event that occurred on 12 February 2010 near Vrastama village, Chalkidiki prefecture, a non-urban area 45 km southeast of Thessaloniki in northern Greece. The tornado developed approximately between 17:10 and 17:35 UTC and was characterized as F2 (Fujita Scale). The tornado caused damage to an industrial building and to several olive-tree farms. A synoptic survey is presented along with satellite images, radar products and a vertical profile of the atmosphere. Additionally, the nonhydrostatic WRF-ARW atmospheric numerical model (version 3.2.0) was utilized in analysis and forecast mode at very high horizontal resolution (1.333 km × 1.333 km) in order to represent the ambient atmospheric conditions. A comparison of statistical errors between WRF-ARW forecasts and the ECMWF analysis is presented, accompanied by LGTS (Thessaloniki Airport) 12:00 UTC soundings and forecast soundings in order to verify the WRF-ARW model. Additionally, a comparison between WRF-ARW and ECMWF thermodynamic indices is presented. The high-resolution WRF-ARW model appeared to simulate the severe convective event with significant accuracy at a lead time of 18 h.

  10. Identification of synoptic precursors to extreme precipitation events in the Swiss Alps by the analysis of backward trajectories

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Horton, P.; Jaboyedoff, M.

    2015-12-01

    Floods related to heavy precipitation are among the most expensive natural disasters in Switzerland. Heavy rainfall may also induce landslides and debris flows, as observed during three major precipitation events that occurred recently in the Swiss Alps (August 1987, September 1993 and October 2000). Even though all these events took place under a southerly circulation, especially in autumn, not all southerly circulations lead to heavy precipitation. Although many studies have tried to understand such events, they are still difficult to forecast. Therefore, this work aims to identify synoptic precursors to these events through backward trajectory analysis. Part one tests as many combinations of tools, datasets and methods as possible in order to compare the trajectories in cases of heavy precipitation in the Alps, and to reduce the number of models to be assessed in the second part by selecting the most relevant ones. Models yielding similar results were removed using an absolute horizontal transport deviation measure (ATEH). The trajectories were processed with tools (HYSPLIT, a Matlab script developed at the University of Lausanne, and METEX) based on different reanalyses (NCEP/NCAR, ECMWF, Japanese and NASA). Moreover, different types of trajectories (3D, isobaric, isentropic, isosigma, and constant density) were used. In total, 21 trajectory models were compared, and 9 were selected. Results show that most of the differences between trajectories are related to the dataset rather than to the model. In part two, the 9 selected models were used to search for precursors leading to heavy precipitation. Ten-day backward trajectories were processed for the Binn station at different pressure levels, for all autumn days between 1961 and 2014 characterized by a southerly circulation. Based on these trajectories, two analyses for the identification of precursors were conducted. First, the ATEH was used to assess
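    A trajectory-deviation measure of the kind used to compare models (the ATEH) can be sketched as the mean great-circle separation between paired trajectory positions at matching times; the coordinates below are hypothetical:

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Haversine distance in km between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def mean_transport_deviation(traj_a, traj_b):
    """Mean horizontal separation between two trajectories, given as
    equal-length lists of (lat, lon) positions at matching time steps."""
    d = [great_circle_km(a[0], a[1], b[0], b[1]) for a, b in zip(traj_a, traj_b)]
    return sum(d) / len(d)

# two hypothetical 3-point back-trajectories ending near Binn (~46.4N, 8.2E)
t1 = [(46.4, 8.2), (45.0, 6.0), (43.5, 3.5)]
t2 = [(46.4, 8.2), (44.8, 6.4), (42.9, 4.6)]
deviation = mean_transport_deviation(t1, t2)
```

    Models whose pairwise deviation stays small by such a measure are redundant, which is how the 21 candidate trajectory models could be reduced to 9.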

  11. Analysis of the March 30, 2011 Hail Event at Shuttle Launch Pad 39A

    NASA Technical Reports Server (NTRS)

    Lane, John E.; Doesken, Nolan J.; Kasparis, Takis C.; Sharp, David W.

    2012-01-01

    The Kennedy Space Center (KSC) Hail Monitor System, a joint effort of the NASA KSC Physics Lab and the KSC Engineering Services Contract (ESC) Applied Technology Lab, was first deployed for operational testing in the fall of 2006. Volunteers from the Community Collaborative Rain, Hail, and Snow Network (CoCoRaHS) in conjunction with Colorado State University have been instrumental in validation testing using duplicate hail monitor systems at sites in the hail-prone high plains of Colorado. The KSC Hail Monitor System (HMS), consisting of three stations positioned approximately 500 ft from the launch pad and forming an approximate equilateral triangle, as shown in Figure 1, was first deployed to Pad 39B for support of STS-115. Two months later, the HMS was deployed to Pad 39A for support of STS-116. During support of STS-117 in late February 2007, an unusually intense (by Florida standards) hail event occurred in the immediate vicinity of the exposed space shuttle and launch pad. Hail data from this event were collected by the HMS and analyzed. Support of STS-118 revealed another important application of the hail monitor system. Ground Instrumentation personnel check the hail monitors daily when a vehicle is on the launch pad, with special attention after any storm suspected of containing hail. If no hail is recorded by the HMS, the vehicle and pad inspection team has no need to conduct a thorough inspection of the vehicle immediately following a storm. On the afternoon of July 13, 2007, hail on the ground was reported by observers at the Vertical Assembly Building (VAB) and Launch Control Center (LCC), about three miles west of Pad 39A, as well as at several other locations at KSC. The HMS showed no impact detections, indicating that the shuttle had not been damaged by any of the numerous hail events which occurred on that day.

  12. Likely frost events at Gale crater: Analysis from MSL/REMS measurements

    NASA Astrophysics Data System (ADS)

    Martínez, G. M.; Fischer, E.; Rennó, N. O.; Sebastián, E.; Kemppinen, O.; Bridges, N.; Borlina, C. S.; Meslin, P.-Y.; Genzer, M.; Harri, A.-H.; Vicente-Retortillo, A.; Ramos, M.; de la Torre Juárez, M.; Gómez, F.; Gómez-Elvira, J.

    2016-12-01

    We provide indirect evidence for the formation of frost at the surface of Gale crater by analyzing the highest confidence data from simultaneous measurements of relative humidity and ground temperature during the first 1000 sols of the Mars Science Laboratory (MSL) mission. We find that except for sol 44, frost events could have occurred only between sols 400 and 710, corresponding to the most humid and coldest time of the year (from early fall to late winter). In particular, measurements at Dingo Gap during sols 529-535, at an unnamed place during sols 554-560, at Kimberley during sols 609-617 and at an unnamed place during sols 673-676 showed the largest likelihood of the occurrence of frost events. At these four locations, the terrain is composed of fine-grained and loosely packed material with thermal inertia values of ∼200 SI units, much lower than the 365 ± 50 SI units value found at the landing ellipse. This is important because terrains with exceptionally low thermal inertia favor the formation of frost by lowering minimum daily ground temperatures. An order-of-magnitude calculation to determine the thickness of the frost layer at these four locations results in values of tenths of μm, while the precipitable water content is a few pr-μm. Therefore, surface frost events can have important implications for the local water cycle at Gale crater. In addition, frost is the most likely type of water that can be temporarily found in bulk amounts on the surface of Mars at low latitudes and therefore can cause weathering, influencing the geology of Gale crater.
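The frost inference rests on comparing ground temperature with the frost point implied by the relative humidity and air temperature measurements. A minimal sketch of that comparison, using a Magnus-type saturation vapor pressure curve over ice; the coefficients and the example readings are illustrative assumptions, not REMS values or the paper's actual retrieval:

```python
import math

def frost_point_c(air_temp_c, rh_pct):
    """Frost point (deg C) from air temperature and relative humidity,
    using a Magnus-type saturation vapor pressure curve over ice.
    The coefficients below are one published Magnus set (an assumption)."""
    a, b = 22.587, 273.86
    gamma = (a * air_temp_c) / (b + air_temp_c) + math.log(rh_pct / 100.0)
    return b * gamma / (a - gamma)

def frost_likely(ground_temp_c, air_temp_c, rh_pct):
    """Frost can form when the ground cools to or below the frost point."""
    return ground_temp_c <= frost_point_c(air_temp_c, rh_pct)

# Hypothetical nighttime readings: -75 C air at 60% RH, -80 C ground
print(frost_likely(-80.0, -75.0, 60.0))  # → True
```

At 100% relative humidity the frost point equals the air temperature, which is a useful sanity check on the formula.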

  13. Spatial analysis of a large magnitude erosion event following a Sierran wildfire.

    PubMed

    Carroll, Erin M; Miller, Wally W; Johnson, Dale W; Saito, Laurel; Qualls, Robert G; Walker, Roger F

    2007-01-01

    High intensity wildfire due to long-term fire suppression and heavy fuels buildup can render watersheds highly susceptible to wind and water erosion. The 2002 "Gondola" wildfire, located just southeast of Lake Tahoe, NV-CA, was followed 2 wk later by a severe hail and rainfall event that deposited 7.6 to 15.2 mm of precipitation over a 3 to 5 h time period. This resulted in a substantive upland ash and sediment flow with subsequent down-gradient riparian zone deposition. Point measurements and ESRI ArcView were applied to spatially assess source area contributions and the extent of ash and sediment flow deposition in the riparian zone. A deposition mass of 380 Mg of ash and sediment over 0.82 ha and pre-wildfire surface bulk density measurements were used in conjunction with two source area assessments to generate an estimation of 10.1 mm as the average depth of surface material eroded from the upland source area. Compared to previous measurements of erosion during rainfall simulation studies, the erosion of 1800 to 6700 g m(-2) mm(-1) determined from this study was as much as four orders of magnitude larger. Wildfire, followed by the single event documented in this investigation, enhanced soil water repellency and contributed 17 to 67% of the reported 15 to 60 mm ky(-1) of non-glacial, baseline erosion rates occurring in mountainous, granitic terrain sites in the Sierra Nevada. High fuel loads now common to the Lake Tahoe Basin increase the risk that similar erosion events will become more commonplace, potentially contributing to the accelerated degradation of Lake Tahoe's water clarity.

  14. Depth Analysis of Historic Seismicity Using Intensity Data With Special Reference to Arizona Events

    NASA Astrophysics Data System (ADS)

    Brumbaugh, D. S.

    2002-12-01

Intensity-derived focal depths can be a useful parameter for historic tremors because of the lack of constraint from instrumental data. The well-known Bath equation, I0 = 3 log[(r^2 + h^2)/h^2] + 2, tends to give foci that are too deep. A newer modified equation, I0 = 2.8 log[(2.79*Av/pi + h^2)/h^2] + 2 - 6/(1.2h) + (6 - I0)/6, was empirically developed from a database of 24 earthquakes that had well-constrained instrumental depths and intensity maps. The average difference between instrumental depths and modified-equation depths was 3 kilometers. Half of the events differed from the instrumental depth by 2 kilometers or less. The modified equation was applied to tremors of M5.0 or greater in Arizona that occurred between 1906 and 1959. These tremors had no depth estimates, or ones that were inaccurate. A calibration test of the modified equation was performed on a 1976 Arizona event that had instrumental depth estimates of 10-15 kilometers. The intensity depth from the modified equation was 11-12 kilometers. The modified equation was then used to calculate depths for the following earthquakes in northern Arizona: 1-25-1906, M6.2, depth = 30 ± 3 kilometers; 8-18-1912, M6.2, depth = 20 ± 3 kilometers; 7-25-1959, M5.75, depth = 11 ± 3 kilometers. The 1959 tremor had a published depth of 0.5 kilometers from instrumental data. The intensity-derived depth of 11 kilometers appears more reasonable. The 1906 and 1912 tremors did not have instrumentally determined depths due to lack of data. The mid- to lower-crustal depths of these two events suggest a thick seismogenic layer, unusual in continental areas. A cold, thick Colorado Plateau crust and upper mantle is documented by other tremors of more than 40 kilometers depth and relatively low heat-flow values.
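The depth inversion the abstract describes can be illustrated with the classical Bath relation, which inverts in closed form for focal depth h given the epicentral intensity I0 and an isoseismal radius r. A minimal sketch under that simplification (the modified equation's extra empirical terms are omitted, and the example isoseismal values are hypothetical):

```python
import math

def bath_depth_km(i0, r_km):
    """Invert the Bath relation I0 = 3*log10((r^2 + h^2)/h^2) + 2
    for focal depth h (km), given epicentral intensity I0 and the
    radius r (km) of the isoseismal used in the fit."""
    ratio = 10.0 ** ((i0 - 2.0) / 3.0)   # equals (r^2 + h^2) / h^2
    if ratio <= 1.0:
        raise ValueError("I0 must exceed 2 for a finite depth")
    return r_km / math.sqrt(ratio - 1.0)

# Hypothetical isoseismal: epicentral intensity VIII (I0 = 8), r = 40 km
print(round(bath_depth_km(8, 40.0), 1))  # → 4.0
```

Because the bracketed ratio grows exponentially with I0, small intensity errors translate into large depth errors, which is consistent with the abstract's remark that the unmodified relation tends to give foci that are too deep.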

  15. MARSIS Data Bad Time Stamp: Analysis and Solution of an Anomaly Event in a Space Mission

    NASA Astrophysics Data System (ADS)

    Giuppi, S.; Cartacci, M.; Cicchetti, A.; Frigeri, A.; Noschese, R.; Orosei, R.

    2012-04-01

Mars Express is Europe's first spacecraft to the Red Planet. The spacecraft has been orbiting Mars since December 2003, carrying a suite of instruments that are investigating many scientific aspects of the planet in unprecedented detail. The observations are particularly focused on the martian atmosphere, surface and subsurface. The most innovative instrument on board Mars Express is MARSIS, a subsurface radar sounder with a 40-meter antenna. The main objective of MARSIS is to look for water from the martian surface down to about 5 kilometers below the surface. It provides the first opportunity to detect liquid water directly. It is also able to characterize the surface elevation, roughness, and radar reflectivity of the planet and to study the interaction of the atmosphere and solar wind in the red planet's ionosphere. MARSIS data are stored in the on-board memory and periodically sent to Earth ground stations. Spacecraft Event Time (SCET) is the time an event occurs in relation to a spacecraft as measured by the spacecraft clock. Since it takes time for a radio transmission to reach the spacecraft from the Earth, the usual operation of a spacecraft is done via an uploaded commanding script containing SCET markers to ensure a certain timeline of events. Occasionally, the generation time (SCET) of the MARSIS science packets recorded during an observation gets corrupted. This means that while some of the data have the correct SCET, other data have a SCET not compliant with the effective generation time. For this reason, the standard procedure can retrieve only partial data. In this paper we describe the cause of the anomaly and the procedures to be applied depending on the circumstances that arise. The application of these procedures has been successful and has allowed us to circumvent the problem.

  16. Recurrent event analysis of lapse and recovery in a smoking cessation clinical trial using bupropion.

    PubMed

    Wileyto, E Paul; Patterson, Freda; Niaura, Raymond; Epstein, Leonard H; Brown, Richard A; Audrain-McGovern, Janet; Hawk, Larry W; Lerman, Caryn

    2005-04-01

    We report a reanalysis of data from a prior study describing the event history of quitting smoking aided by bupropion, using recurrent-event models to determine the effect of the drug on occurrence of lapses and recoveries from lapse (resumption of abstinence). Data were collected on 1,070 subjects across two similar double-blind randomized clinical trials of bupropion versus placebo and fitted with separate Cox regression models for lapse and recovery. Analyses were split using discrete time-varying covariates between the treatment (weeks 1-10) and follow-up phases (end of treatment to 12 months). Bupropion was associated with slower lapse during treatment for both sexes, and being female was associated with faster lapse across both phases. Drug did not affect time to recovery for males but was associated with faster recovery among females, allowing women to recover as quickly as men. High levels of nicotine dependence did not affect time to lapse but were associated with slower recovery from lapse across treatment and follow-up phases. During the treatment phase, higher levels of baseline depression symptoms had no effect on time to lapse but were associated with slower recovery from lapse. Results highlight the asymmetry in factors preventing lapse versus promoting recovery. Specifically, dependence, depression symptoms, and a sex x drug interaction were found to affect recovery but not lapse. Further research disentangling lapse and recovery events from summary abstinence measures is needed to help us develop interventions that take advantage of bupropion at its best and that compensate where it is weak.

  17. Markovian Statistical Data Analysis of Single-Event Upsets Triggered by High Intensity Neutrons

    NASA Technical Reports Server (NTRS)

    Lakdawala, Anushka V.; Zhang, Hong; Gonzalex, Oscar R.; Gray, W. Steven

    2006-01-01

This paper analyzes data from a single-event upset experiment conducted at the Los Alamos National Laboratory. Statistical tools, based on well-known χ² hypothesis-testing theory, are used to determine whether sequences of upsets can be modeled as a homogeneous Markov chain of a specific order. The experiment consisted of irradiating a new experimental flight control computer (FCC) with a high intensity neutron beam while the FCC controlled a simulation of a Boeing 737. The analyzed data are a sequence of states that indicates when the FCC is under an upset condition.
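An order test of the kind the abstract describes can be sketched with a likelihood-ratio (G²) statistic on transition counts, comparing a first-order Markov chain against independence (order 0). This is one standard member of the χ² family, shown here on a hypothetical upset/no-upset state sequence, not the Los Alamos data:

```python
import math
from collections import Counter

def markov_order_test(seq):
    """Likelihood-ratio (G^2) test of first-order dependence against
    independence. Returns (G2, degrees of freedom = (k-1)^2 for k states)."""
    pairs = Counter(zip(seq, seq[1:]))          # observed transition counts
    row = Counter(s for s, _ in pairs.elements())
    col = Counter(t for _, t in pairs.elements())
    n = sum(pairs.values())
    g2 = 0.0
    for (s, t), obs in pairs.items():
        exp = row[s] * col[t] / n               # expected under independence
        g2 += 2.0 * obs * math.log(obs / exp)
    k = len(set(seq))
    return g2, (k - 1) ** 2

# Hypothetical upset/no-upset sequence with strong persistence
seq = [0] * 20 + [1] * 20 + [0] * 20 + [1] * 20
g2, df = markov_order_test(seq)
print(g2 > 3.841)  # exceeds the 95% chi-square critical value for df = 1
```

The same counting scheme extends to higher orders by treating tuples of past states as the "row" symbol; the abstract's homogeneity assumption corresponds to pooling counts over the whole sequence.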

  18. The Strong Wind event of 24th January 2009 in Catalonia: a social impact analysis

    NASA Astrophysics Data System (ADS)

    Amaro, J.; Aran, M.; Barberia, L.; Llasat, M. C.

    2009-09-01

Although strong winds are frequent in Catalonia, one of the events with the strongest impact in recent years occurred on January 24th 2009. An explosive cyclogenesis process took place in the Atlantic: pressure fell 30 hPa in less than 24 hours. The strong wind storm pounded the north of Spain and the south of France, causing some fatalities and important economic losses in these regions. Several automatic weather stations recorded wind gusts higher than 100 km/h in Catalonia. Emergency services received more than 20,000 calls in 24 hours and there were 497 interventions in only 12 hours. As a consequence of fallen and uprooted trees, railway and road infrastructure was damaged and more than 30,000 customers had no electricity for 24 hours. Unfortunately, there were a total of 6 fatalities, two of them caused by fallen trees and the others occurring when a sports centre collapsed over a group of children. In Spain, insurance policies cover damages due to strong winds when fixed thresholds are exceeded and, according to the Royal Decree 300/2004 of 20th February, extraordinary risks are assumed by the Consorcio de Compensación de Seguros. Subsequently, Public Weather Services (PWS) saw an increase in the number of requests received from people affected by this event and from insurance companies regarding the corresponding indemnities. As an example, during the first month after the event, the Servei Meteorològic de Catalunya (SMC) received more than 600 requests related to these damages alone (on average, the PWS of the SMC receives a total of 400 requests per month). Following the research started by the Social Impact Research Group of the MEDEX project, the number of requests reported can be a good vulnerability indicator of a meteorological risk. This study uses the information received by the PWS of the SMC during the six months after the event, according to the criteria and methodology established in Gayà et al. (2008). The objective is to compare the vulnerability with the

  19. Advanced Geospatial Hydrodynamic Signals Analysis for Tsunami Event Detection and Warning

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, Banafshe; Sabeur, Zoheir

    2013-04-01

    Current early tsunami warning can be issued upon the detection of a seismic event which may occur at a given location offshore. This also provides an opportunity to predict the tsunami wave propagation and run-ups at potentially affected coastal zones by selecting the best matching seismic event from a database of pre-computed tsunami scenarios. Nevertheless, it remains difficult and challenging to obtain the rupture parameters of the tsunamigenic earthquakes in real time and simulate the tsunami propagation with high accuracy. In this study, we propose a supporting approach, in which the hydrodynamic signal is systematically analysed for traces of a tsunamigenic signal. The combination of relatively low amplitudes of a tsunami signal at deep waters and the frequent occurrence of background signals and noise contributes to a generally low signal to noise ratio for the tsunami signal; which in turn makes the detection of this signal difficult. In order to improve the accuracy and confidence of detection, a re-identification framework in which a tsunamigenic signal is detected via the scan of a network of hydrodynamic stations with water level sensing is performed. The aim is to attempt the re-identification of the same signatures as the tsunami wave spatially propagates through the hydrodynamic stations sensing network. The re-identification of the tsunamigenic signal is technically possible since the tsunami signal at the open ocean itself conserves its birthmarks relating it to the source event. As well as supporting the initial detection and improving the confidence of detection, a re-identified signal is indicative of the spatial range of the signal, and thereby it can be used to facilitate the identification of certain background signals such as wind waves which do not have as large a spatial reach as tsunamis. 
In this paper, the proposed methodology for the automatic detection of tsunamigenic signals has been achieved using open data from NOAA with a recorded

  20. Bayesian analysis of recurrent event with dependent termination: an application to a heart transplant study.

    PubMed

    Ouyang, Bichun; Sinha, Debajyoti; Slate, Elizabeth H; Van Bakel, Adrian B

    2013-07-10

For a heart transplant patient, the risk of graft rejection and the risk of death are likely to be associated. Two fully specified Bayesian models for recurrent events with dependent termination are applied to investigate the potential relationships between these two types of risk as well as their association with risk factors. We particularly focus on the choice of priors, selection of the appropriate prediction model, and prediction methods for these two types of risk for an individual patient. Our prediction tools can be easily implemented and are helpful to physicians in setting heart transplant patients' biopsy schedules.

  1. Data analysis of MOA for Gravitational Microlensing events with durations Less than 2 days by using brown dwarf population

    NASA Astrophysics Data System (ADS)

    Hassani, Sh.

    2016-12-01

Gravitational microlensing is one of the most powerful methods of detecting very low mass objects such as exoplanets and brown dwarfs. The most important parameter that we can extract from a microlensing event is the Einstein radius crossing time tE. In this work, by performing a Monte-Carlo simulation, we obtain the tE distribution for the brown dwarf population. We then show that this population can be a good candidate for very short microlensing events with tE < 2 days. The data set used in this analysis was taken in the 2006 and 2007 seasons by the MOA-II survey, using the 1.8-m MOA-II telescope located at the Mt. John University Observatory, New Zealand.
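A Monte-Carlo construction of a tE distribution can be sketched by drawing lens mass, lens distance, and transverse velocity and evaluating tE = R_E / v. The mass range, distances, and velocity distribution below are illustrative assumptions, not the paper's actual Galactic model:

```python
import math
import random

# Physical constants (SI units)
G, C = 6.674e-11, 2.998e8
MSUN, KPC = 1.989e30, 3.086e19

def einstein_time_days(m_sun, d_l_kpc, d_s_kpc, v_kms):
    """tE = R_E / v_perp, with the Einstein radius
    R_E = sqrt(4 G M D_l (D_s - D_l) / (c^2 D_s))."""
    m = m_sun * MSUN
    d_l, d_s = d_l_kpc * KPC, d_s_kpc * KPC
    r_e = math.sqrt(4.0 * G * m * d_l * (d_s - d_l) / (C * C * d_s))
    return r_e / (v_kms * 1e3) / 86400.0

random.seed(1)
D_S = 8.0  # source distance in kpc (bulge sources; an assumption)
samples = sorted(
    einstein_time_days(
        random.uniform(0.01, 0.08),      # brown dwarf mass range in M_sun
        random.uniform(0.5, 7.5),        # lens distance in kpc
        D_S,
        abs(random.gauss(200.0, 60.0)),  # transverse speed in km/s
    )
    for _ in range(10000)
)
median_te = samples[len(samples) // 2]
short_fraction = sum(te < 2.0 for te in samples) / len(samples)
print(round(median_te, 1), round(short_fraction, 3))
```

The short-duration tail (tE < 2 days) is populated by the lowest-mass, fastest lenses, which is the regime the abstract associates with brown dwarfs.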

  2. Incidence of adverse events in paediatric procedural sedation in the emergency department: a systematic review and meta-analysis

    PubMed Central

    Bellolio, M Fernanda; Puls, Henrique A; Anderson, Jana L; Gilani, Waqas I; Murad, M Hassan; Barrionuevo, Patricia; Erwin, Patricia J; Wang, Zhen; Hess, Erik P

    2016-01-01

    Objective and design We conducted a systematic review and meta-analysis to evaluate the incidence of adverse events in the emergency department (ED) during procedural sedation in the paediatric population. Randomised controlled trials and observational studies from the past 10 years were included. We adhere to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Setting ED. Participants Children. Interventions Procedural sedation. Outcomes Adverse events like vomiting, agitation, hypoxia and apnoea. Meta-analysis was performed with random-effects model and reported as incidence rates with 95% CIs. Results A total of 1177 studies were retrieved for screening and 258 were selected for full-text review. 41 studies reporting on 13 883 procedural sedations in 13 876 children (≤18 years) were included. The most common adverse events (all reported per 1000 sedations) were: vomiting 55.5 (CI 45.2 to 65.8), agitation 17.9 (CI 12.2 to 23.7), hypoxia 14.8 (CI 10.2 to 19.3) and apnoea 7.1 (CI 3.2 to 11.0). The need to intervene with either bag valve mask, oral airway or positive pressure ventilation occurred in 5.0 per 1000 sedations (CI 2.3 to 7.6). The incidences of severe respiratory events were: 34 cases of laryngospasm among 8687 sedations (2.9 per 1000 sedations, CI 1.1 to 4.7; absolute rate 3.9 per 1000 sedations), 4 intubations among 9136 sedations and 0 cases of aspiration among 3326 sedations. 33 of the 34 cases of laryngospasm occurred in patients who received ketamine. Conclusions Serious adverse respiratory events are very rare in paediatric procedural sedation in the ED. Emesis and agitation are the most frequent adverse events. Hypoxia, a late indicator of respiratory depression, occurs in 1.5% of sedations. Laryngospasm, though rare, happens most frequently with ketamine. The results of this study provide quantitative risk estimates to facilitate shared decision-making, risk communication, informed consent and

  3. Work stress and the risk of recurrent coronary heart disease events: A systematic review and meta-analysis.

    PubMed

    Li, Jian; Zhang, Min; Loerbroks, Adrian; Angerer, Peter; Siegrist, Johannes

    2015-01-01

Though much evidence indicates that work stress increases the risk of incident coronary heart disease (CHD), little is known about the role of work stress in the development of recurrent CHD events. The objective of this study was to review and synthesize the existing epidemiological evidence on whether work stress increases the risk of recurrent CHD events in patients with a first CHD event. A systematic literature search in the PubMed database (January 1990 - December 2013) for prospective studies was performed. Inclusion criteria included: peer-reviewed English papers with original data, studies with substantial follow-up (> 3 years), end points defined as cardiac death or nonfatal myocardial infarction, as well as work stress assessed with reliable and valid instruments. Meta-analysis using random-effects modeling was conducted in order to synthesize the observed effects across the studies. Five papers derived from 4 prospective studies conducted in Sweden and Canada were included in this systematic review. The measurement of work stress was based on the Demand-Control model (4 papers) or the Effort-Reward Imbalance model (1 paper). According to the estimation by meta-analysis based on 4 papers, a significant effect of work stress on the risk of recurrent CHD events (hazard ratio: 1.65, 95% confidence interval: 1.23-2.22) was observed. Our findings suggest that, in patients with a first CHD event, work stress is associated with a 65% increase in the relative risk of recurrent CHD events. Due to the limited literature, more well-designed prospective research is needed to examine this association, in particular from regions of the world other than the West.
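Random-effects pooling of hazard ratios is commonly done with the DerSimonian-Laird estimator; a minimal sketch on hypothetical per-study values (not the reviewed papers' actual hazard ratios), assuming log-HRs and their standard errors are available from each study:

```python
import math

def dersimonian_laird(log_hrs, ses):
    """DerSimonian-Laird random-effects pooling of per-study log hazard
    ratios (with standard errors). Returns (pooled HR, 95% CI low, high)."""
    w = [1.0 / se ** 2 for se in ses]
    fixed = sum(wi * y for wi, y in zip(w, log_hrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_hrs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_hrs) - 1)) / c)   # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * y for wi, y in zip(w_re, log_hrs)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical per-study hazard ratios and standard errors of log(HR)
hrs, ses = [1.8, 1.4, 2.0, 1.5], [0.25, 0.30, 0.35, 0.20]
hr, lo, hi = dersimonian_laird([math.log(h) for h in hrs], ses)
print(round(hr, 2), round(lo, 2), round(hi, 2))
```

When the heterogeneity statistic Q falls below its degrees of freedom, tau² truncates to zero and the estimate reduces to fixed-effect inverse-variance pooling.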

  4. Analysis of damaging hydrogeological events: the case of the Calabria Region (Southern Italy).

    PubMed

    Petrucci, O; Polemio, M; Pasqua, A A

    2009-03-01

    A period of bad weather conditions due to prolonged intense rainfall and strong winds can trigger landslides, floods, secondary floods (accumulation of rain on surfaces with low permeability), and sea storms, causing damage to humans and infrastructure. As a whole, these periods of bad weather and triggered phenomena can be defined as damaging hydrogeological events (DHEs). We define a methodological approach based on seven simple indexes to analyze such events. The indexes describe the return period (T) and trend of rainfall, the extent of hit areas, and the level of damages; they can be considered attributes of georeferenced features and analyzed with GIS techniques. We tested our method in an Italian region frequently hit by DHEs. In a period of 10 years, 747 damaging phenomena (landslides, 43%; floods, 38%) and 94 DHEs have been classified. The road network and housing areas are the most frequently damaged elements, threatened by all types of damaging phenomena. T classes are almost in accordance with the level of damage. These results can be used to outline warning levels for civil protection purposes, to forecast the areas most likely to be hit and the potential ensuing damage, to disseminate information concerning vulnerable areas, and to increase people's awareness of risk.
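The rainfall return period T that the indexes rely on can be estimated empirically from a series of annual maxima. A minimal sketch using the Weibull plotting position, one of several standard choices; the rainfall values are hypothetical:

```python
def return_periods(annual_maxima):
    """Empirical return period T (years) for each annual maximum via the
    Weibull plotting position T = (n + 1) / rank, with rank 1 assigned to
    the largest value. Assumes distinct values (no ties)."""
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)
    return {x: (n + 1) / (i + 1) for i, x in enumerate(ranked)}

# Hypothetical annual maximum daily rainfall (mm) over 9 years
maxima = [210, 95, 130, 310, 88, 150, 120, 99, 175]
t = return_periods(maxima)
print(t[310])  # largest of 9 years -> T = 10.0 years
```

In a GIS workflow like the one described, each georeferenced event feature would carry its T class as an attribute alongside the damage-level indexes.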

  5. Landscape-scale analysis of wetland sediment deposition from four tropical cyclone events.

    PubMed

    Tweel, Andrew W; Turner, R Eugene

    2012-01-01

Hurricanes Katrina, Rita, Gustav, and Ike deposited large quantities of sediment on coastal wetlands after making landfall in the northern Gulf of Mexico. We sampled sediments deposited on the wetland surface throughout the entire Louisiana and Texas depositional surfaces of Hurricanes Katrina, Rita, Gustav, and the Louisiana portion of Hurricane Ike. We used spatial interpolation to model the total amount and spatial distribution of inorganic sediment deposition from each storm. The sediment deposition on coastal wetlands was an estimated 68, 48, and 21 million metric tons from Hurricanes Katrina, Rita, and Gustav, respectively. The spatial distribution decreased in a similar manner with distance from the coast for all hurricanes, but the relationship with distance from the storm track was more variable between events. The southeast-facing Breton Sound estuary had significant storm-derived sediment deposition west of the storm track, whereas sediment deposition along the south-facing coastline occurred primarily east of the storm track. Sediment organic content, bulk density, and grain size also decreased significantly with distance from the coast, but were also more variable with respect to distance from the track. On average, 80% of the mineral deposition occurred within 20 km of the coast, and 58% was within 50 km of the track. These results highlight an important link between tropical cyclone events and coastal wetland sedimentation, and are useful in identifying a more complete sediment budget for coastal wetland soils.

  6. An analysis of strong wind events simulated in a GCM near Casey in the Antarctic

    SciTech Connect

Murphy, B.F.; Simmonds, I.

    1993-02-01

    Strong wind events occurring near Casey (Antarctica) in a long July GCM simulation have been studied to determine the relative roles played by the synoptic situation and the katabatic flow in producing these episodes. It has been found that the events are associated with strong katabatic and strong gradient flow operating together. Both components are found to increase threefold on average for these strong winds, and although the geostrophic flow is the stronger, it rarely produces strong winds without katabatic flow becoming stronger than it is in the mean. The two wind components do not flow in the same direction; indeed, there is some cancellation between them, since katabatic flow acts in a predominant downslope direction, while the geostrophic wind acts across slope. The stronger geostrophic flow is associated with higher-than-average pressures over the continent and the approach of a strong cyclonic system toward the coast and a blocking system downstream. The anomalous synoptic patterns leading up to the occasions display a strong wavenumber 4 structure. The very strong katabatic flow appears to be related to the production of a supply of cold air inland from Casey by the stronger-than-average surface temperature inversions inland a few days before the strong winds occur. The acceleration of this negatively buoyant air mass down the steep, ice-sheet escarpment results in strong katabatic flow near the coast. 24 refs., 11 figs.

  7. The Link Between Alcohol Use and Aggression Toward Sexual Minorities: An Event-Based Analysis

    PubMed Central

    Parrott, Dominic J.; Gallagher, Kathryn E.; Vincent, Wilson; Bakeman, Roger

    2010-01-01

    The current study used an event-based assessment approach to examine the day-to-day relationship between heterosexual men’s alcohol consumption and perpetration of aggression toward sexual minorities. Participants were 199 heterosexual drinking men between the ages of 18–30 who completed (1) separate timeline followback interviews to assess alcohol use and aggression toward sexual minorities during the past year, and (2) written self-report measures of risk factors for aggression toward sexual minorities. Results indicated that aggression toward sexual minorities was twice as likely on a day when drinking was reported than on non-drinking days, with over 80% of alcohol-related aggressive acts perpetrated within the group context. Patterns of alcohol use (i.e., number of drinking days, mean drinks per drinking day, number of heavy drinking days) were not associated with perpetration after controlling for demographic variables and pertinent risk factors. Results suggest that it is the acute effects of alcohol, and not men’s patterns of alcohol consumption, that facilitate aggression toward sexual minorities. More importantly, these data are the first to support an event-based link between alcohol use and aggression toward sexual minorities (or any minority group), and provide the impetus for future research to examine risk factors and mechanisms for intoxicated aggression toward sexual minorities and other stigmatized groups. PMID:20853937

  8. Wavelet entropy analysis of event-related potentials indicates modality-independent theta dominance.

    PubMed

    Yordanova, Juliana; Kolev, Vasil; Rosso, Osvaldo A; Schürmann, Martin; Sakowitz, Oliver W; Ozgören, Murat; Basar, Erol

    2002-05-30

    Sensory/cognitive stimulation elicits multiple electroencephalogram (EEG)-oscillations that may be partly or fully overlapping over the time axis. To evaluate co-existent multi-frequency oscillations, EEG responses to unimodal (auditory or visual) and bimodal (combined auditory and visual) stimuli were analyzed by applying a new method called wavelet entropy (WE). The method is based on the wavelet transform (WT) and quantifies entropy of short segments of the event-related brain potentials (ERPs). For each modality, a significant transient decrease of WE emerged in the post-stimulus EEG epoch indicating a highly-ordered state in the ERP. WE minimum was always determined by a prominent dominance of theta (4-8 Hz) ERP components over other frequency bands. Event-related 'transition to order' was most pronounced and stable at anterior electrodes, and after bimodal stimulation. Being consistently observed across different modalities, a transient theta-dominated state may reflect a processing stage that is obligatory for stimulus evaluation, during which interfering activations from other frequency networks are minimized.
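Wavelet entropy as described here is the Shannon entropy of the relative wavelet energies per frequency band; a low value marks the "ordered", theta-dominated state the abstract reports. A minimal sketch using a plain Haar decomposition (the original work uses other wavelet bases; this stand-in only illustrates the entropy calculation):

```python
import math
import random

def haar_details(signal):
    """Haar wavelet decomposition (length must be a power of two):
    returns the detail coefficients at each scale level."""
    details, approx = [], list(signal)
    while len(approx) > 1:
        pairs = [(approx[2 * i], approx[2 * i + 1])
                 for i in range(len(approx) // 2)]
        details.append([(a - b) / math.sqrt(2) for a, b in pairs])
        approx = [(a + b) / math.sqrt(2) for a, b in pairs]
    return details

def wavelet_entropy(signal):
    """WE = -sum(p_j ln p_j) over relative band energies p_j = E_j/E_tot.
    Low WE means energy concentrated in one band (a highly ordered state)."""
    energies = [sum(c * c for c in d) for d in haar_details(signal)]
    total = sum(energies)
    return -sum((e / total) * math.log(e / total) for e in energies if e > 0)

# All energy of an alternating signal sits in the finest band -> WE = 0,
# while broadband noise spreads energy across bands -> higher WE.
alternating = [1.0, -1.0] * 32
random.seed(0)
noise = [random.gauss(0, 1) for _ in range(64)]
print(wavelet_entropy(alternating) < wavelet_entropy(noise))  # → True
```

Applied to short post-stimulus ERP segments, the transient WE minimum corresponds to one band (theta in the study) briefly dominating the energy distribution.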

  9. Climate change impact and uncertainty analysis of extreme rainfall events in the Apalachicola River basin, Florida

    NASA Astrophysics Data System (ADS)

    Wang, Dingbao; Hagen, Scott C.; Alizad, Karim

    2013-02-01

Climate change impact on rainfall intensity-duration-frequency (IDF) curves at the Apalachicola River basin (Florida Panhandle coast) is assessed using an ensemble of regional climate models (RCMs) obtained from the North American Regional Climate Change Assessment Program. The suitability of seven RCMs for simulating the temporal variation of rainfall at the fine scale is assessed for the case study region. Two RCMs, HRM3-HADCM3 and RCM3-GFDL, are found to have good skill scores in generating high intensity events in the mid-afternoon (2:00-4:00 PM). These two RCMs are selected for assessing potential climate change impact on IDF curves. Two methods are used to conduct bias correction on future rainfall IDF curves: a maximum-intensity percentile-based method, and a sequential bias correction combined with the maximum-intensity percentile-based method. Based on the projection by HRM3-HADCM3, there is no significant change in rainfall intensity at the upstream and middle stream stations but higher intensity at the downstream station. RCM3-GFDL projects increased rainfall intensity from upstream to downstream, particularly at the downstream station. The potential temporal shift of extreme rainfall events coupled with overall increased intensities may exacerbate flood magnitudes and lead to increased sediment and nutrient loadings to the estuary, especially in light of sea level change.

  10. Femtomolar detection of single mismatches by discriminant analysis of DNA hybridization events using gold nanoparticles.

    PubMed

    Ma, Xingyi; Sim, Sang Jun

    2013-03-21

    Even though DNA-based nanosensors have been demonstrated for quantitative detection of analytes and diseases, hybridization events have never been numerically investigated for further understanding of DNA mediated interactions. Here, we developed a nanoscale platform with well-designed capture and detection gold nanoprobes to precisely evaluate the hybridization events. The capture gold nanoprobes were mono-laid on glass and the detection probes were fabricated via a novel competitive conjugation method. The two kinds of probes combined in a suitable orientation following the hybridization with the target. We found that hybridization efficiency was markedly dependent on electrostatic interactions between DNA strands, which can be tailored by adjusting the salt concentration of the incubation solution. Due to the much lower stability of the double helix formed by mismatches, the hybridization efficiencies of single mismatched (MMT) and perfectly matched DNA (PMT) were different. Therefore, we obtained an optimized salt concentration that allowed for discrimination of MMT from PMT without stringent control of temperature or pH. The results indicated this to be an ultrasensitive and precise nanosensor for the diagnosis of genetic diseases.

  11. The link between alcohol use and aggression toward sexual minorities: an event-based analysis.

    PubMed

    Parrott, Dominic J; Gallagher, Kathryn E; Vincent, Wilson; Bakeman, Roger

    2010-09-01

    The current study used an event-based assessment approach to examine the day-to-day relationship between heterosexual men's alcohol consumption and perpetration of aggression toward sexual minorities. Participants were 199 heterosexual drinking men between the ages of 18 and 30 who completed (1) separate timeline followback interviews to assess alcohol use and aggression toward sexual minorities during the past year, and (2) written self-report measures of risk factors for aggression toward sexual minorities. Results indicated that aggression toward sexual minorities was twice as likely on days when drinking was reported as on nondrinking days, with over 80% of alcohol-related aggressive acts perpetrated within a group context. Patterns of alcohol use (i.e., number of drinking days, mean drinks per drinking day, number of heavy drinking days) were not associated with perpetration after controlling for demographic variables and pertinent risk factors. Results suggest that it is the acute effects of alcohol, and not men's patterns of alcohol consumption, that facilitate aggression toward sexual minorities. More importantly, these data are the first to support an event-based link between alcohol use and aggression toward sexual minorities (or any minority group), and provide the impetus for future research to examine risk factors and mechanisms for intoxicated aggression toward sexual minorities and other stigmatized groups.

  12. Words Analysis of Online Chinese News Headlines about Trending Events: A Complex Network Perspective

    PubMed Central

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

    Because the volume of information available online is growing at breakneck speed, keeping up with the meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies that must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headlines’ keywords and word relationships in online Chinese news using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and adjacency relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we developed an integrated mechanism to analyze the features of word networks based on news headlines that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly. PMID:25807376
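The network construction described here (words as nodes, adjacency within a headline as edges) can be sketched in a few lines. Whitespace splitting stands in for the Chinese word segmenter used in the paper, and the English headline sample is invented for illustration.

```python
from collections import Counter

def headline_word_network(headlines):
    """Words as nodes, adjacent word pairs as undirected edges,
    with occurrence counts as node/edge weights (a sketch of the
    construction described in the abstract)."""
    nodes, edges = Counter(), Counter()
    for headline in headlines:
        words = headline.split()  # stand-in for Chinese word segmentation
        nodes.update(words)
        for a, b in zip(words, words[1:]):       # adjacency relation
            edges[tuple(sorted((a, b)))] += 1    # undirected edge
    return nodes, edges

# invented toy sample
headlines = ["oil spill cleanup", "gulf oil spill", "spill cleanup crews"]
nodes, edges = headline_word_network(headlines)
```

The resulting weighted edge list can be handed to any graph library for the centrality and visualization steps.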

  13. Analysis of the 4-year IceCube high-energy starting events

    NASA Astrophysics Data System (ADS)

    Vincent, Aaron C.; Palomares-Ruiz, Sergio; Mena, Olga

    2016-07-01

    After four years of data taking, the IceCube neutrino telescope has detected 54 high-energy starting events (HESE, or contained-vertex events) with deposited energies above 20 TeV. They represent the first detection of high-energy extraterrestrial neutrinos and, therefore, the first step in neutrino astronomy. To study the energy, flavor, and isotropy of the astrophysical neutrino flux arriving at Earth, we perform different analyses of two different deposited energy intervals, [10 TeV-10 PeV] and [60 TeV-10 PeV]. We first consider an isotropic unbroken power-law spectrum and constrain its shape, normalization, and flavor composition. Our results are in agreement with the preliminary IceCube results, although we obtain a slightly softer spectrum. We also find that current data are not sensitive to a possible neutrino-antineutrino asymmetry in the astrophysical flux. Then, we show that although a two-component power-law model leads to a slightly better fit, it does not represent a significant improvement with respect to a single power-law flux. Finally, we analyze the possible existence of a north-south asymmetry, hinted at by the combination of the HESE sample with the throughgoing muon data. If we use only HESE data, the scarce statistics from the Northern Hemisphere does not allow us to reach any conclusive answer, which indicates that the HESE sample alone is not driving the potential north-south asymmetry.

  14. Words analysis of online Chinese news headlines about trending events: a complex network perspective.

    PubMed

    Li, Huajiao; Fang, Wei; An, Haizhong; Huang, Xuan

    2015-01-01

    Because the volume of information available online is growing at breakneck speed, keeping up with the meaning and information communicated by the media and netizens is a new challenge both for scholars and for companies that must address public relations crises. Most current theories and tools are directed at identifying one website or one piece of online news and do not attempt to develop a rapid understanding of all websites and all news covering one topic. This paper represents an effort to integrate statistics, word segmentation, complex networks and visualization to analyze headlines' keywords and word relationships in online Chinese news using two samples: the 2011 Bohai Bay oil spill and the 2010 Gulf of Mexico oil spill. We gathered all the news headlines concerning the two trending events in the search results from Baidu, the most popular Chinese search engine. We used Simple Chinese Word Segmentation to segment all the headlines into words and then took words as nodes and adjacency relations as edges to construct word networks, both for the whole sample and at the monthly level. Finally, we developed an integrated mechanism to analyze the features of word networks based on news headlines that can account for all the keywords in the news about a particular event and therefore track the evolution of news deeply and rapidly.

  15. Landscape-Scale Analysis of Wetland Sediment Deposition from Four Tropical Cyclone Events

    PubMed Central

    Tweel, Andrew W.; Turner, R. Eugene

    2012-01-01

    Hurricanes Katrina, Rita, Gustav, and Ike deposited large quantities of sediment on coastal wetlands after making landfall in the northern Gulf of Mexico. We sampled sediments deposited on the wetland surface throughout the entire Louisiana and Texas depositional surfaces of Hurricanes Katrina, Rita, Gustav, and the Louisiana portion of Hurricane Ike. We used spatial interpolation to model the total amount and spatial distribution of inorganic sediment deposition from each storm. The sediment deposition on coastal wetlands was an estimated 68, 48, and 21 million metric tons from Hurricanes Katrina, Rita, and Gustav, respectively. The spatial distribution decreased in a similar manner with distance from the coast for all hurricanes, but the relationship with distance from the storm track was more variable between events. The southeast-facing Breton Sound estuary had significant storm-derived sediment deposition west of the storm track, whereas sediment deposition along the south-facing coastline occurred primarily east of the storm track. Sediment organic content, bulk density, and grain size also decreased significantly with distance from the coast, but were also more variable with respect to distance from the track. On average, eighty percent of the mineral deposition occurred within 20 km from the coast, and 58% was within 50 km of the track. These results highlight an important link between tropical cyclone events and coastal wetland sedimentation, and are useful in identifying a more complete sediment budget for coastal wetland soils. PMID:23185635

  16. Analysis of extrinsic and intrinsic factors affecting event related desynchronization production.

    PubMed

    Takata, Yohei; Kondo, Toshiyuki; Saeki, Midori; Izawa, Jun; Takeda, Kotaro; Otaka, Yohei; It, Koji

    2012-01-01

    Recently there has been an increase in the number of stroke patients with motor paralysis. Appropriate re-afferent sensory feedback synchronized with voluntary motor intention would be effective for promoting neural plasticity in stroke rehabilitation. Therefore, BCI technology is considered a promising approach in neuro-rehabilitation. To estimate human motor intention, event-related desynchronization (ERD), a feature of the electroencephalogram (EEG) evoked by motor execution or motor imagery, is usually used. However, various factors affect ERD production, and its neural mechanism remains an open question. As a preliminary step, we evaluated the mutual effects of intrinsic (voluntary motor imagery) and extrinsic (visual and somatosensory stimuli) factors on ERD production. Experimental results indicate that these three factors do not always interact additively in affecting ERD production.

  17. High Cadence Observations and Analysis of Spicular-type Events Using CRISP Onboard SST

    NASA Astrophysics Data System (ADS)

    Shetye, J.; Doyle, J. G.; Scullion, E.; Nelson, C. J.; Kuridze, D.

    2016-04-01

    We present spectroscopic and imaging observations of apparent ultra-fast spicule-like features observed with the CRisp Imaging SpectroPolarimeter (CRISP) at the Swedish 1-m Solar Telescope (SST). The data show spicules with apparent velocities above 500 km s-1, very short lifetimes of up to 20 s, and lengths/heights of around 3500 km. The spicules are seen as dark absorption structures in the Hα wings at ±516 mÅ, ±774 mÅ and ±1032 mÅ which suddenly appear in and disappear from the FOV. These features show a time delay of 3-5 s between their appearance in the blue and red wings. We suggest that their appearance/disappearance is due to their Doppler motion in and out of the 60 mÅ filter. See Fig. 1 for the evolution of the event at two line positions.

  18. Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.

    PubMed

    Groppe, David M; Urbach, Thomas P; Kutas, Marta

    2011-12-01

    Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation.
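Of the four corrections reviewed, false discovery rate control is the simplest to sketch. Below is a minimal Benjamini-Hochberg step-up procedure applied to a vector of p-values (one per channel-timepoint test); this is a generic sketch of the method, not code from the free MATLAB software the review introduces.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up FDR control: reject the largest set of
    ordered p-values p_(1..k) satisfying p_(i) <= i*q/m."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m        # i*q/m for i = 1..m
    passed = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()          # largest i with p_(i) <= i*q/m
        reject[order[: k + 1]] = True            # reject all smaller p-values too
    return reject

# invented p-values for six tests
pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
rej = benjamini_hochberg(pvals, q=0.05)
```

Note the step-up logic: 0.039 and 0.041 survive a naive per-test cutoff of 0.05 but fail their rank-scaled thresholds, so only the two smallest p-values are rejected.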

  19. [Analysis of policies in activating the Infectious Disease Specialist Network (IDSN) for bioterrorism events].

    PubMed

    Kim, Yang Soo

    2008-07-01

    Bioterrorism events have worldwide impacts, not only on security and public health policy, but also on other related sectors. Many countries, including Korea, have set up new administrative and operational structures and adapted their preparedness and response plans in order to deal with new kinds of threats. Korea has dual surveillance systems for the early detection of bioterrorism. The first is syndromic surveillance, which typically monitors non-specific clinical information that may indicate possible bioterrorism-associated diseases before specific diagnoses are made. The other is the infectious disease specialist network, which diagnoses and responds to specific illnesses caused by the intentional release of biologic agents. Infectious disease physicians, clinical microbiologists, and infection control professionals play critical and complementary roles in these networks. Infectious disease specialists should develop practical and realistic response plans for their institutions in partnership with local and state health departments, in preparation for a real or suspected bioterrorism attack.

  20. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    PubMed

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-04-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on capture and fusion features. Hotelling's T² statistic is calculated and used to detect outliers. A contamination-event alarm is triggered by sequential Bayesian analysis when outliers appear continuously over several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
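The PCA/Hotelling stage of a procedure like this can be sketched with NumPy alone: fit PCA on baseline (clean-water) spectra, project a new spectrum onto the leading components, and score it with Hotelling's T². The wavelet preprocessing and the sequential Bayesian alarm are omitted, and the synthetic "spectra" below are placeholders, not real UV data.

```python
import numpy as np

def hotelling_t2(baseline, new_obs, n_components=3):
    """Score a new spectrum against a baseline via PCA + Hotelling's T^2
    (a sketch of one stage of the detection procedure described above)."""
    mu = baseline.mean(axis=0)
    X = baseline - mu
    _, s, Vt = np.linalg.svd(X, full_matrices=False)   # PCA via SVD
    V = Vt[:n_components].T                            # leading loadings
    lam = (s[:n_components] ** 2) / (len(baseline) - 1)  # component variances
    t = (new_obs - mu) @ V                             # scores of new spectrum
    return float(np.sum(t ** 2 / lam))                 # Hotelling's T^2

rng = np.random.default_rng(0)
clean = rng.normal(size=(200, 20))
clean[:, :3] *= 5.0            # most spectral variance in three latent bands
normal_obs = rng.normal(size=20)
normal_obs[:3] *= 5.0
contaminated = normal_obs.copy()
contaminated[0] += 40.0        # strong absorbance shift from a contaminant
t2_normal = hotelling_t2(clean, normal_obs)
t2_event = hotelling_t2(clean, contaminated)
```

A threshold on T² (e.g., from an F-distribution quantile) would separate the two scores; the contaminated spectrum receives a far larger value.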

  1. Assessing potential impacts associated with contamination events in water distribution systems : a sensitivity analysis.

    SciTech Connect

    Davis, M. J.; Janke, R.; Taxon, T. N.

    2010-11-01

    An understanding of the nature of the adverse effects that could be associated with contamination events in water distribution systems is necessary for carrying out vulnerability analyses and designing contamination warning systems. This study examines the adverse effects of contamination events using models for 12 actual water systems that serve populations ranging from about 10^4 to over 10^6 persons. The measure of adverse effects that we use is the number of people who are exposed to a contaminant above some dose level due to ingestion of contaminated tap water. For this study the number of such people defines the impact associated with an event. We consider a wide range of dose levels in order to accommodate a wide range of potential contaminants. For a particular contaminant, dose level can be related to a health effects level. For example, a dose level could correspond to the median lethal dose, i.e., the dose that would be fatal to 50% of the exposed population. Highly toxic contaminants may be associated with a particular response at a very low dose level, whereas contaminants with low toxicity may only be associated with the same response at a much higher dose level. This report focuses on the sensitivity of impacts to five factors that either define the nature of a contamination event or involve assumptions that are used in assessing exposure to the contaminant: (1) duration of contaminant injection, (2) time of contaminant injection, (3) quantity or mass of contaminant injected, (4) population distribution in the water distribution system, and (5) the ingestion pattern of the potentially exposed population. For each of these factors, the sensitivities of impacts to injection location and contaminant toxicity are also examined. For all the factors considered, sensitivity tends to increase with dose level (i.e., decreasing toxicity) of the contaminant, with considerable inter-network variability. 
With the exception of the population distribution (factor 4

  2. Glucagon-Like Peptide-1 Receptor Agonists and Cardiovascular Events: A Meta-Analysis of Randomized Clinical Trials

    PubMed Central

    Monami, Matteo; Cremasco, Francesco; Lamanna, Caterina; Colombi, Claudia; Desideri, Carla Maria; Iacomelli, Iacopo; Marchionni, Niccolò; Mannucci, Edoardo

    2011-01-01

    Objective. Data from randomized clinical trials with metabolic outcomes can be used to address concerns about potential issues of cardiovascular safety for newer drugs for type 2 diabetes. This meta-analysis was designed to assess cardiovascular safety of GLP-1 receptor agonists. Design and Methods. MEDLINE, Embase, and Cochrane databases were searched for randomized trials of GLP-1 receptor agonists (versus placebo or other comparators) with a duration ≥12 weeks, performed in type 2 diabetic patients. Mantel-Haenszel odds ratio with 95% confidence interval (MH-OR) was calculated for major cardiovascular events (MACE), on an intention-to-treat basis, excluding trials with zero events. Results. Out of 36 trials, 20 reported at least one MACE. The MH-OR for all GLP-1 receptor agonists was 0.74 (0.50–1.08), P = .12 (0.85 (0.50–1.45), P = .55, and 0.69 (0.40–1.22), P = .20, for exenatide and liraglutide, resp.). Corresponding figures for placebo-controlled and active comparator studies were 0.46 (0.25–0.83), P = .009, and 1.05 (0.63–1.76), P = .84, respectively. Conclusions. To date, results of randomized trials do not suggest any detrimental effect of GLP-1 receptor agonists on cardiovascular events. Specifically designed longer-term trials are needed to verify the possibility of a beneficial effect. PMID:21584276
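The Mantel-Haenszel pooled odds ratio used here is a short computation over per-trial 2×2 tables. This sketch gives only the point estimate (the 95% CI requires a variance estimator such as Robins-Breslow-Greenland, not shown), and the two tables are invented for illustration.

```python
def mantel_haenszel_or(tables):
    """Fixed-effect Mantel-Haenszel pooled odds ratio from 2x2 tables
    (a, b, c, d) = (events_treated, non-events_treated,
                    events_control, non-events_control).
    Tables with zero events in both arms contribute nothing to either sum,
    mirroring the exclusion of zero-event trials described above."""
    R = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    S = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return R / S

# invented example: two trials, fewer events in the treated arm
or_mh = mantel_haenszel_or([(10, 90, 20, 80), (5, 95, 10, 90)])
```

An MH-OR below 1 indicates fewer major cardiovascular events in the treated arm, as in the placebo-controlled subset reported above.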

  3. Metagenomic Analysis of Airborne Bacterial Community and Diversity in Seoul, Korea, during December 2014, Asian Dust Event

    PubMed Central

    Cha, Seho; Srinivasan, Sathiyaraj; Jang, Jun Hyeong; Lee, Dongwook; Lim, Sora; Kim, Kyung Sang; Jheong, Weonhwa; Lee, Dong-Won; Park, Eung-Roh; Chung, Hyun-Mi; Choe, Joonho; Kim, Myung Kyum; Seo, Taegun

    2017-01-01

    Asian dust or yellow sand events in East Asia are a major environmental contamination and human health issue, causing increasing concern. A large amount of dust particles, known as particulate matter 10 (PM10), is transported by the wind from arid and semi-arid tracts to the Korean peninsula, bringing a bacterial population that alters the terrestrial and atmospheric microbial communities. In this study, we aimed to explore the bacterial populations of Asian dust samples collected during November–December 2014. The dust samples were collected using the impinger method, and the hypervariable regions of the 16S rRNA gene were amplified using PCR followed by pyrosequencing. Analysis of the sequencing data was performed using the Mothur software. The data showed that the number of operational taxonomic units and the diversity index during Asian dust events were higher than those during non-Asian dust events. At the phylum level, the proportions of Proteobacteria, Actinobacteria, and Firmicutes differed between Asian dust and non-Asian dust samples. At the genus level, the proportions of the genera Bacillus (6.9%), Arthrobacter (3.6%), Blastocatella (2%), and Planomicrobium (1.4%) were increased during Asian dust events compared to those in non-Asian dust samples. This study showed a significant relationship between the bacterial populations of Asian dust and non-Asian dust samples in Korea, which could significantly affect the microbial population in the environment. PMID:28122054

  4. A methodology for interactive mining and visual analysis of clinical event patterns using electronic health record data.

    PubMed

    Gotz, David; Wang, Fei; Perer, Adam

    2014-04-01

    Patients' medical conditions often evolve in complex and seemingly unpredictable ways. Even within a relatively narrow and well-defined episode of care, variations between patients in both their progression and eventual outcome can be dramatic. Understanding the patterns of events observed within a population that most correlate with differences in outcome is therefore an important task in many types of studies using retrospective electronic health data. In this paper, we present a method for interactive pattern mining and analysis that supports ad hoc visual exploration of patterns mined from retrospective clinical patient data. Our approach combines (1) visual query capabilities to interactively specify episode definitions, (2) pattern mining techniques to help discover important intermediate events within an episode, and (3) interactive visualization techniques that help uncover event patterns that most impact outcome and how those associations change over time. In addition to presenting our methodology, we describe a prototype implementation and present use cases highlighting the types of insights or hypotheses that our approach can help uncover.

  5. A super-jupiter orbiting a late-type star: A refined analysis of microlensing event OGLE-2012-BLG-0406

    SciTech Connect

    Tsapras, Y.; Street, R. A.; Choi, J.-Y.; Han, C.; Bozza, V.; Gould, A.; Dominik, M.; Browne, P.; Horne, K.; Hundertmark, M.; Beaulieu, J.-P.; Udalski, A.; Jørgensen, U. G.; Sumi, T.; Bramich, D. M.; Kains, N.; Ipatov, S.; Alsubai, K. A.; Snodgrass, C.; Steele, I. A.; Collaboration: RoboNet Collaboration; MiNDSTEp Collaboration; OGLE Collaboration; PLANET Collaboration; μFUN Collaboration; MOA Collaboration; and others

    2014-02-10

    We present a detailed analysis of survey and follow-up observations of microlensing event OGLE-2012-BLG-0406 based on data obtained from 10 different observatories. Intensive coverage of the light curve, especially the perturbation part, allowed us to accurately measure the parallax effect and lens orbital motion. Combining our measurement of the lens parallax with the angular Einstein radius determined from finite-source effects, we estimate the physical parameters of the lens system. We find that the event was caused by a 2.73 ± 0.43 M_J planet orbiting a 0.44 ± 0.07 M_☉ early M-type star. The distance to the lens is 4.97 ± 0.29 kpc and the projected separation between the host star and its planet at the time of the event is 3.45 ± 0.26 AU. We find that the additional coverage provided by follow-up observations, especially during the planetary perturbation, leads to a more accurate determination of the physical parameters of the lens.

  6. The impact of economic austerity and prosperity events on suicide in Greece: a 30-year interrupted time-series analysis

    PubMed Central

    Branas, Charles C; Kastanaki, Anastasia E; Michalodimitrakis, Manolis; Tzougas, John; Kranioti, Elena F; Theodorakis, Pavlos N; Carr, Brendan G; Wiebe, Douglas J

    2015-01-01

    Objectives To complete a 30-year interrupted time-series analysis of the impact of austerity-related and prosperity-related events on the occurrence of suicide across Greece. Setting Greece from 1 January 1983 to 31 December 2012. Participants A total of 11 505 suicides, 9079 by men and 2426 by women, occurring in Greece over the study period. Primary and secondary outcomes National data from the Hellenic Statistical Authority assembled as 360 monthly counts of: all suicides, male suicides, female suicides and all suicides plus potentially misclassified suicides. Results In 30 years, the highest months of suicide in Greece occurred in 2012. The passage of new austerity measures in June 2011 marked the beginning of significant, abrupt and sustained increases in total suicides (+35.7%, p<0.001) and male suicides (+18.5%, p<0.01). Sensitivity analyses that figured in undercounting of suicides also found a significant, abrupt and sustained increase in June 2011 (+20.5%, p<0.001). Suicides by men in Greece also underwent a significant, abrupt and sustained increase in October 2008 when the Greek recession began (+13.1%, p<0.01), and an abrupt but temporary increase in April 2012 following a public suicide committed in response to austerity conditions (+29.7%, p<0.05). Suicides by women in Greece also underwent an abrupt and sustained increase in May 2011 following austerity-related events (+35.8%, p<0.05). One prosperity-related event, the January 2002 launch of the Euro in Greece, marked an abrupt but temporary decrease in male suicides (−27.1%, p<0.05). Conclusions This is the first multidecade, national analysis of suicide in Greece using monthly data. Select austerity-related events in Greece corresponded to statistically significant increases for suicides overall, as well as for suicides among men and women. The consideration of future austerity measures should give greater weight to the unintended mental health consequences that may follow and the public
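An interrupted time-series fit of the kind described, an abrupt and sustained level change at a known event month, can be sketched as a segmented regression on monthly counts. The function name, the 48-month toy series, and the +14 jump are assumptions for illustration; the study's actual models also handle seasonality and multiple interruption points.

```python
import numpy as np

def its_level_shift(y, t0):
    """Segmented-regression sketch of an interrupted time-series model:
    pre-event level, underlying linear trend, and an abrupt sustained
    level change at time index t0."""
    n = len(y)
    t = np.arange(n, dtype=float)
    X = np.column_stack([np.ones(n), t, (t >= t0).astype(float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, trend, level change at t0]

# toy monthly counts with a sustained jump at month 24
rng = np.random.default_rng(1)
y = 40.0 + 0.05 * np.arange(48) + rng.normal(0, 1.5, 48)
y[24:] += 14.0
intercept, trend, jump = its_level_shift(y, 24)
```

The estimated `jump` recovers the imposed level change; in practice its standard error (and autocorrelation-robust inference) determines whether the shift is statistically significant.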

  7. Comparative Analysis of Three Brevetoxin-Associated Bottlenose Dolphin (Tursiops truncatus) Mortality Events in the Florida Panhandle Region (USA)

    PubMed Central

    Twiner, Michael J.; Flewelling, Leanne J.; Fire, Spencer E.; Bowen-Stevens, Sabrina R.; Gaydos, Joseph K.; Johnson, Christine K.; Landsberg, Jan H.; Leighfield, Tod A.; Mase-Guthrie, Blair; Schwacke, Lori; Van Dolah, Frances M.; Wang, Zhihong; Rowles, Teresa K.

    2012-01-01

    In the Florida Panhandle region, bottlenose dolphins (Tursiops truncatus) have been highly susceptible to large-scale unusual mortality events (UMEs) that may have been the result of exposure to blooms of the dinoflagellate Karenia brevis and its neurotoxin, brevetoxin (PbTx). Between 1999 and 2006, three bottlenose dolphin UMEs occurred in the Florida Panhandle region. The primary objective of this study was to determine if these mortality events were due to brevetoxicosis. Analysis of over 850 samples from 105 bottlenose dolphins and associated prey items were analyzed for algal toxins and have provided details on tissue distribution, pathways of trophic transfer, and spatial-temporal trends for each mortality event. In 1999/2000, 152 dolphins died following extensive K. brevis blooms and brevetoxin was detected in 52% of animals tested at concentrations up to 500 ng/g. In 2004, 105 bottlenose dolphins died in the absence of an identifiable K. brevis bloom; however, 100% of the tested animals were positive for brevetoxin at concentrations up to 29,126 ng/mL. Dolphin stomach contents frequently consisted of brevetoxin-contaminated menhaden. In addition, another potentially toxigenic algal species, Pseudo-nitzschia, was present and low levels of the neurotoxin domoic acid (DA) were detected in nearly all tested animals (89%). In 2005/2006, 90 bottlenose dolphins died that were initially coincident with high densities of K. brevis. Most (93%) of the tested animals were positive for brevetoxin at concentrations up to 2,724 ng/mL. No DA was detected in these animals despite the presence of an intense DA-producing Pseudo-nitzschia bloom. In contrast to the absence or very low levels of brevetoxins measured in live dolphins, and those stranding in the absence of a K. brevis bloom, these data, taken together with the absence of any other obvious pathology, provide strong evidence that brevetoxin was the causative agent involved in these bottlenose dolphin mortality

  8. Quantitative analysis on windblown dust concentrations of PM10 (PM2.5) during dust events in Mongolia

    NASA Astrophysics Data System (ADS)

    Jugder, Dulam; Shinoda, Masato; Kimura, Reiji; Batbold, Altangerel; Amarjargal, Danzansambuu

    2014-09-01

    Dust concentration, wind speed and visibility, measured at four sites in the Gobi Desert and at a site in the steppe zone of Mongolia over a period of 4.5 years (January 2009 to May 2013), were analyzed for their relationships, their effects on visibility, and for an estimate of the threshold wind speed necessary for dust emission in the region. Based on quantitative analysis of the measurements, we determined that dust concentrations of 41-61 (20-24) μg m-3 of PM10 (PM2.5) serve as the criterion between normal and hazy atmospheric conditions. With the arrival of dust events, wind-borne soil particulate matter (PM10, PM2.5) originating in the Gobi Desert changes dramatically. PM10 (PM2.5) concentrations increase by at least double, or by several tens of times, during severe dust events in comparison with normal atmospheric conditions. The ratio (PM2.5/PM10) between monthly means of PM10 and PM2.5 concentrations showed that anthropogenic particles were dominant in the ambient air of province centers in the cool months (November to February). Threshold values for the onset of dust events were determined for PM10 (PM2.5) concentrations. Following the definition of dust storms, dust concentrations of PM10 corresponding to visibility of 1 km or less were determined at sites in the Gobi Desert and the steppe region. The threshold wind speeds during days with dust events were estimated at the four sites in the Gobi Desert and compared with each other. The threshold wind was higher at Sainshand, likely due to smaller silt and clay fractions in the soil.

  9. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria)

    PubMed Central

    Mayaud, C.; Wagner, T.; Benischke, R.; Birk, S.

    2014-01-01

    The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and

  10. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria).

    PubMed

    Mayaud, C; Wagner, T; Benischke, R; Birk, S

    2014-04-16

    The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the
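The cross-correlation step used in both versions of this study can be sketched with NumPy: correlate standardized recharge input against spring discharge over a range of lags, and read the system response time off the peak of the cross-correlogram. The synthetic series and the 5-step delay below are assumptions for illustration, not the Lurbach data.

```python
import numpy as np

def cross_correlogram(x, y, max_lag):
    """Normalized cross-correlation of input x (e.g., allogenic recharge)
    vs. output y (spring discharge) for lags 0..max_lag; the lag of the
    peak approximates the response time of the system."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    r = [np.mean(x[: n - k] * y[k:]) for k in range(max_lag + 1)]
    return np.array(r)

# synthetic event: discharge responds to pulsed recharge with a 5-step delay
rng = np.random.default_rng(2)
recharge = np.maximum(rng.normal(0, 1, 300), 0)         # pulsed input
discharge = np.roll(recharge, 5) + rng.normal(0, 0.1, 300)
r = cross_correlogram(recharge, discharge, 20)
lag = int(np.argmax(r))
```

Comparing such correlograms between observed and model-generated discharge is essentially the consistency check the study performs against its MODFLOW aquifer.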

  11. Subclinical Hypothyroidism and the Risk of Stroke Events and Fatal Stroke: An Individual Participant Data Analysis

    PubMed Central

    Chaker, Layal; Baumgartner, Christine; den Elzen, Wendy P. J.; Ikram, M. Arfan; Blum, Manuel R.; Collet, Tinh-Hai; Bakker, Stephan J. L.; Dehghan, Abbas; Drechsler, Christiane; Luben, Robert N.; Hofman, Albert; Portegies, Marileen L. P.; Medici, Marco; Iervasi, Giorgio; Stott, David J.; Ford, Ian; Bremner, Alexandra; Wanner, Christoph; Ferrucci, Luigi; Newman, Anne B.; Dullaart, Robin P.; Sgarbi, José A.; Ceresini, Graziano; Maciel, Rui M. B.; Westendorp, Rudi G.; Jukema, J. Wouter; Imaizumi, Misa; Franklyn, Jayne A.; Bauer, Douglas C.; Walsh, John P.; Razvi, Salman; Khaw, Kay-Tee; Cappola, Anne R.; Völzke, Henry; Franco, Oscar H.; Gussekloo, Jacobijn; Rodondi, Nicolas

    2015-01-01

    Objective: The objective was to determine the risk of stroke associated with subclinical hypothyroidism. Data Sources and Study Selection: Published prospective cohort studies were identified through a systematic search through November 2013 without restrictions in several databases. Unpublished studies were identified through the Thyroid Studies Collaboration. We collected individual participant data on thyroid function and stroke outcome. Euthyroidism was defined as TSH levels of 0.45–4.49 mIU/L, and subclinical hypothyroidism was defined as TSH levels of 4.5–19.9 mIU/L with normal T4 levels. Data Extraction and Synthesis: We collected individual participant data on 47 573 adults (3451 subclinical hypothyroidism) from 17 cohorts and followed up from 1972–2014 (489 192 person-years). Age- and sex-adjusted pooled hazard ratios (HRs) for participants with subclinical hypothyroidism compared to euthyroidism were 1.05 (95% confidence interval [CI], 0.91–1.21) for stroke events (combined fatal and nonfatal stroke) and 1.07 (95% CI, 0.80–1.42) for fatal stroke. Stratified by age, the HR for stroke events was 3.32 (95% CI, 1.25–8.80) for individuals aged 18–49 years. There was an increased risk of fatal stroke in the age groups 18–49 and 50–64 years, with a HR of 4.22 (95% CI, 1.08–16.55) and 2.86 (95% CI, 1.31–6.26), respectively (p trend 0.04). We found no increased risk for those 65–79 years old (HR, 1.00; 95% CI, 0.86–1.18) or ≥80 years old (HR, 1.31; 95% CI, 0.79–2.18). There was a pattern of increased risk of fatal stroke with higher TSH concentrations. Conclusions: Although no overall effect of subclinical hypothyroidism on stroke could be demonstrated, an increased risk in subjects younger than 65 years and those with higher TSH concentrations was observed. PMID:25856213

  12. Analysis of a creeping marls event in the coastal cliffs of Bessin, Basse-Normandie, France

    NASA Astrophysics Data System (ADS)

    Vioget, Alizée; Michoud, Clément; Jaboyedoff, Michel; Maquaire, Olivier; Costa, Stéphane; Davidson, Robert; Derron, Marc-Henri

    2015-04-01

Cliff retreat is a major issue for the management of coastal territories. Two coastal areas in "Calvados" and "Pays de Caux", French Normandy, are studied. The Bessin cliff is about 4.3 km long and lies between the World War II artillery batteries of Longues-sur-Mer and Arromanches-les-Bains. Along the coastline, the cliff height varies between 10 and 75 meters above sea level. The site's lithology is mainly composed of two formations: the Bessin limestones lie on top of the Port marls, which act as an aquitard. Water outflows of varying importance are therefore observed at the contact between the marls and the limestone. This communication focuses on a complex landslide that occurred in May 2013 near Cape Manvieux, estimating volumes and modelling the landslide kinematics. For that purpose, field observations and measurements were made in order to construct a realistic profile and to understand the steps that led to this 27 m high and 110 m wide event. In addition, a terrestrial LiDAR (Optech Ilris3D) acquisition of the instability was performed in July 2013 and compared with the Litto3D dataset (a continuous DEM over land and sea) acquired in 2011 by the IGN. This comparison shows a maximum cliff retreat of about 27 m and 30'000 m3, with a deposit accumulation about 8 m high. In addition, a limestone rock column of 2'000 m3 and 18 m height within the toppled deposits could still collapse in the short term. These site-specific investigations, set in the context of instabilities across the entire study area, suggest that the current state of the instability was created by multiple successive events. The landslide was hence likely caused by a complex combination of marl creep, conditioned by water content and the pressure of overlying formations, and toppling of limestone destabilised by the formation of subvertical back cracks due to debuttressing after limestone exhumation.

  13. Extraversion and short-term memory for chromatic stimuli: an event-related potential analysis.

    PubMed

    Stauffer, Corinne C; Indermühle, Rebekka; Troche, Stefan J; Rammsayer, Thomas H

    2012-10-01

The present study investigated extraversion-related individual differences in visual short-term memory (VSTM) functioning. Event-related potentials were recorded from 50 introverts and 50 extraverts while they performed a VSTM task based on a color-change detection paradigm with three different set sizes. Although introverts and extraverts showed almost identical hit rates and reaction times, introverts displayed larger N1 amplitudes than extraverts, independent of color change or set size. Extraverts also showed larger P3 amplitudes than introverts when there was a color change, whereas no extraversion-related difference in P3 amplitude was found in the no-change condition. Our findings provided the first experimental evidence that introverts' greater reactivity to punctate physical stimulation, as indicated by larger N1 amplitude, also holds for complex visual stimulus patterns. Furthermore, P3 amplitude in the change condition was larger for extraverts than for introverts, suggesting higher sensitivity to context change. Finally, there were no extraversion-related differences in P3 amplitude dependent on set size. This latter finding does not support the resource allocation explanation as a source of differences between introverts and extraverts.

  14. Image analysis of single event transient effects on charge coupled devices irradiated by protons

    NASA Astrophysics Data System (ADS)

    Wang, Zujun; Xue, Yuanyuan; Liu, Jing; He, Baoping; Yao, Zhibin; Ma, Wuying

    2016-10-01

Experiments on single event transient (SET) effects in charge coupled devices (CCDs) irradiated by protons are presented. The radiation experiments were carried out at an accelerator with proton energies of 200 MeV and 60 MeV. The protons were incident at 30° and 90° to the plane of the CCDs to obtain images for inclined and perpendicular incidence. The experimental results show that the typical signature of SET effects in a CCD irradiated by protons is the generation of a large number of dark signal spikes (hot pixels) randomly distributed in the "pepper" images. The characteristics of the SET effects were investigated by observing the same imaging area at different times during proton irradiation to verify their transient nature. The experimental results also show that the number of dark signal spikes increases with increasing integration time during proton irradiation. The CCDs were tested both on-line and off-line to distinguish radiation damage induced by the SET effects from that induced by displacement damage (DD) effects. The mechanisms of dark signal spike generation induced by the SET effects and the DD effects, respectively, are demonstrated.
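
As a rough illustration of how such dark signal spikes can be flagged in a frame, here is a minimal robust-threshold sketch; the threshold rule and all numbers are assumptions for illustration, not the authors' procedure:

```python
import numpy as np

def find_dark_spikes(frame, k=5.0):
    """Flag pixels exceeding the frame median by k robust standard
    deviations, with sigma estimated from the median absolute deviation."""
    frame = np.asarray(frame, dtype=float)
    med = np.median(frame)
    mad = np.median(np.abs(frame - med))
    sigma = 1.4826 * mad  # MAD -> std conversion for Gaussian noise
    return np.argwhere(frame > med + k * sigma)

rng = np.random.default_rng(1)
dark = rng.normal(100.0, 2.0, size=(64, 64))   # quiet synthetic dark frame
dark[10, 20] += 50.0                            # injected "hot pixels"
dark[33, 5] += 80.0
spikes = find_dark_spikes(dark)
# spikes -> [[10, 20], [33, 5]] (row-major order)
```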

  15. Algorithms for the analysis and characterization of convective structures relative to extreme rainfall events

    NASA Astrophysics Data System (ADS)

    Sabatino, Pietro; Fedele, Giuseppe; Procopio, Antonio; Chiaravalloti, Francesco; Gabriele, Salvatore

    2016-10-01

Among weather phenomena, convective storms are among the most dangerous, since they can cause great damage within a relatively small time window. Convective precipitation is characterized by relatively small spatial and temporal scales, and as a consequence, forecasting such phenomena is an elusive task. Nonetheless, given their dangerousness, the identification and tracking of convective meteorological systems are of paramount importance and the subject of several studies. In particular, the early detection of areas where deep convection is about to appear, and the prediction of the development and path of existing convective thunderstorms, represent two focal research topics. The aim of the present work is to outline a framework employing various techniques for the monitoring and characterization of convective clouds. We analyze meteorological satellite images and data in order to evaluate the potential occurrence of strong precipitation. The techniques considered include numerical methods, machine learning, and image processing. They are tested on data from real convective events captured in recent years over the Italian peninsula by the Meteosat meteorological satellites and weather radar.
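
A minimal sketch of the identification step for convective structures, assuming a simple cold-cloud-top threshold on infrared brightness temperature (the threshold and minimum size are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np
from scipy import ndimage

def convective_cells(bt, threshold_k=220.0, min_pixels=4):
    """Label connected regions of cold cloud-top brightness temperature
    and keep those larger than a minimum pixel count."""
    cold = bt < threshold_k
    labels, n = ndimage.label(cold)
    sizes = ndimage.sum(cold, labels, index=range(1, n + 1))
    return [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]

bt = np.full((50, 50), 280.0)       # warm background (K)
bt[10:14, 10:14] = 210.0            # one compact convective core
bt[30, 40] = 205.0                  # isolated cold pixel (noise)
cells = convective_cells(bt)
# -> one cell kept; the single-pixel feature is discarded
```

Tracking across successive satellite images would then match labeled cells between frames, e.g. by centroid distance or overlap.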

  16. Pharmacovigilance and drug safety in Calabria (Italy): 2012 adverse events analysis

    PubMed Central

    Giofrè, Chiara; Scicchitano, Francesca; Palleria, Caterina; Mazzitello, Carmela; Ciriaco, Miriam; Gallelli, Luca; Paletta, Laura; Marrazzo, Giuseppina; Leporini, Christian; Ventrice, Pasquale; Carbone, Claudia; Saullo, Francesca; Rende, Pierandrea; Menniti, Michele; Mumoli, Laura; Chimirri, Serafina; Patanè, Marinella; Esposito, Stefania; Cilurzo, Felisa; Staltari, Orietta; Russo, Emilio; De Sarro, Giovambattista

    2013-01-01

Introduction: Pharmacovigilance (PV) is designed to monitor drugs continuously after their commercialization, assessing and improving their safety profile. The main objective is to increase the spontaneous reporting of adverse drug reactions (ADRs), in order to have a wide variety of information. The Italian Drug Agency (Agenzia Italiana del Farmaco [AIFA]) is financing several projects to increase reporting. In Calabria, a PV information center was created in 2010. Materials and Methods: We obtained data from the AIFA National Health Information System database for Italy and Calabria for the year 2012. Descriptive statistics were performed to analyze the ADRs. Results: A total of 461 ADRs were reported in 2012, an increase of 234% compared with 2011 (138 reports). Hospital doctors were the main source of reporting (51.62%). Sorafenib (Nexavar®), the combination of amoxicillin/clavulanic acid, and ketoprofen were the drugs most frequently reported as causing adverse reactions. Adverse events were more frequently reported in female patients (61.83%), and the age groups "41-65" (39.07%) and "over 65" (27.9%) were the most affected. Conclusions: Calabria has seen a positive increase in the number of ADRs reported. Although it has not yet reached the gold standard set by the World Health Organization (about 600 reports), the data show that PV culture is making inroads in this region and that PV projects stimulating and increasing PV knowledge are needed. PMID:24347984

  17. Metagenomic analysis of the coral holobiont during a natural bleaching event on the Great Barrier Reef.

    PubMed

    Littman, Raechel; Willis, Bette L; Bourne, David G

    2011-12-01

Understanding the effects of elevated seawater temperatures on each member of the coral holobiont (the complex comprising coral polyps and associated symbiotic microorganisms, including Bacteria, viruses, Fungi, Archaea and endolithic algae) is becoming increasingly important as evidence accumulates that microbial members contribute to overall coral health, particularly during thermal stress. Here we use a metagenomic approach to identify metabolic and taxonomic shifts in microbial communities associated with the hard coral Acropora millepora throughout a natural thermal bleaching event at Magnetic Island (Great Barrier Reef). A direct comparison of metagenomic data sets from healthy versus bleached corals indicated major shifts in microbial associates during heat stress, including Bacteria, Archaea, viruses, Fungi and micro-algae. Overall, the metabolism of the microbial community shifted from autotrophy to heterotrophy, including increases in genes associated with the metabolism of fatty acids, proteins, simple carbohydrates, phosphorus and sulfur. In addition, the proportion of virulence genes was higher in the bleached library, indicating an increase in microorganisms capable of pathogenesis following bleaching. These results demonstrate that thermal stress results in shifts in coral-associated microbial communities that may lead to deteriorating coral health.

  18. Person perception precedes theory of mind: an event related potential analysis.

    PubMed

    Wang, Y W; Lin, C D; Yuan, B; Huang, L; Zhang, W X; Shen, D L

    2010-09-29

Prior to developing an understanding of another person's mental state, an ability termed "theory of mind" (ToM), a perception of that person's appearance and actions is required. However, the relationship between this "person perception" and ToM is unclear. To investigate the time course of ToM and person perception, event-related potentials (ERPs) were recorded while 17 normal adults viewed three kinds of visual stimuli: cartoons involving people (person perception cartoons), cartoons involving people that also required ToM for comprehension (ToM cartoons), and scene cartoons. We hypothesized that the respective patterns of brain activation would differ across these three stimulus types at different stages in time. Our findings supported this proposal: the peak amplitudes of P200 for scene cartoons were significantly lower than for person perception or ToM cartoons, while there were no significant P200 differences between the latter two. During the 1000-1300 ms epoch, the mean amplitudes of the late positive components (LPC) for person perception were more positive than for scene representation, while the mean LPC amplitudes for ToM were more positive than for person perception. The present study provides preliminary evidence of the neural dynamics that underlie the dissociation between person perception and ToM.

  19. Analysis of locality-sensitive hashing for fast critical event prediction on physiological time series.

    PubMed

    Kim, Yongwook Bryce; O'Reilly, Una-May

    2016-08-01

We apply sublinear-time, scalable locality-sensitive hashing (LSH) and majority discrimination to the problem of predicting critical events from physiological waveform time series. Compared with a linear exhaustive k-nearest neighbor search, our proposed method speeds up prediction by up to 25 times while sacrificing only 1% of accuracy, as demonstrated on an arterial blood pressure dataset extracted from the MIMIC2 database. We compare two widely used variants of LSH, the bit-sampling based (L1LSH) and the random-projection based (E2LSH) methods, to measure their direct impact on retrieval and prediction accuracy. We show experimentally that the more sophisticated E2LSH performs worse than L1LSH in terms of accuracy, correlation, and the ability to detect false negatives. We attribute this to E2LSH's simultaneous integration of all dimensions when hashing the data, which makes it more susceptible to common noise sources such as data misalignment. We also demonstrate that the deterioration of accuracy due to approximation at the retrieval step of LSH has a diminishing impact on prediction accuracy as the speed-up gain increases.
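
A minimal sketch of the bit-sampling (L1LSH) variant discussed above, over binary vectors: each hash reads the values at a few randomly chosen coordinates, so near-identical vectors collide in most tables. Table counts, bit counts, and data here are illustrative assumptions:

```python
import random

def make_hash(dim, bits, seed):
    """Bit-sampling hash: project a binary vector onto `bits` random coordinates."""
    idx = random.Random(seed).sample(range(dim), bits)
    return lambda v: tuple(v[i] for i in idx)

def build_tables(vectors, dim, bits=8, tables=6):
    hashers = [make_hash(dim, bits, s) for s in range(tables)]
    buckets = [dict() for _ in hashers]
    for vid, v in enumerate(vectors):
        for h, b in zip(hashers, buckets):
            b.setdefault(h(v), set()).add(vid)
    return hashers, buckets

def query(q, hashers, buckets):
    """Union of all buckets the query falls into: candidates for exact re-ranking."""
    cand = set()
    for h, b in zip(hashers, buckets):
        cand |= b.get(h(q), set())
    return cand

rng = random.Random(0)
base = [rng.randint(0, 1) for _ in range(64)]
near = base[:]; near[3] ^= 1                    # differs in 1 of 64 bits
far = [rng.randint(0, 1) for _ in range(64)]    # unrelated vector
hashers, buckets = build_tables([base, near, far], dim=64)
cands = query(base, hashers, buckets)
# base (id 0) always collides with itself; near (id 1) very likely does
```

The speed-up comes from re-ranking only the candidate set instead of scanning the whole database, which is the trade-off the abstract quantifies.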

  20. Sensitivity analysis of potential events affecting the double-shell tank system and fallback actions

    SciTech Connect

    Knutson, B.J.

    1996-09-27

Sensitivity analyses were performed for fall-back positions (i.e., management actions) to accommodate potential off-normal and programmatic change events overlaid on the waste volume projections and their uncertainties. These sensitivity analyses allowed determining and ranking tank-system high-risk parameters and the fall-back positions that will accommodate the respective impacts. This quantification of tank system impacts shows periods in which tank capacity is sensitive to certain variables that must be carefully managed and/or evaluated. Identifying these sensitive variables and quantifying their impact will allow decision makers to prepare fall-back positions and focus available resources on the highest-impact parameters where technical data are needed to reduce waste projection uncertainties. For noncomplexed waste, the period of capacity vulnerability occurs during the years of single-shell tank (SST) retrieval (after approximately 2009) due to sensitivity to several variables; ranked by importance, these are the pretreatment rate and the 200-East SST solids transfer volume. For complexed waste, the period of capacity vulnerability occurs after approximately 2005 due to sensitivity to several variables; ranked by importance, these are the pretreatment rate, the 200-East SST solids transfer volume, the complexed waste reduction factor using evaporation, and the 200-West saltwell liquid porosity.

  1. Political Imprisonment and Adult Functioning: A Life Event History Analysis of Palestinians.

    PubMed

    McNeely, Clea; Barber, Brian K; Spellings, Carolyn; Belli, Robert; Giacaman, Rita; Arafat, Cairo; Daher, Mahmoud; El Sarraj, Eyad; Mallouh, Mohammed Abu

    2015-06-01

    Political imprisonment is a traumatic event, often accompanied by torture and deprivation. This study explores the association of political imprisonment between 1987 and 2011 with political, economic, community, psychological, physical, and family functioning in a population-based sample of Palestinian men ages 32-43 years (N = 884) derived from a dataset collected in 2011. Twenty-six percent (n = 233) had been politically imprisoned. Men imprisoned between 1987 and 2005 reported functioning as well as never-imprisoned men in most domains, suggesting that men imprisoned as youth have moved forward with their lives in ways similar to their nonimprisoned counterparts. In an exception to this pattern, men imprisoned during the Oslo Accords period (1994-1999) reported higher levels of trauma-related stress (B = 0.24, p = .027) compared to never-imprisoned men. Men imprisoned since 2006 reported lower functioning in multiple domains: human insecurity (B = 0.33, p = .023), freedom of public expression (B = -0.48, p = .017), perceived government stability (B = -0.38, p = .009), feeling broken or destroyed (B = 0.59, p = .001), physical limitations (B = 0.55, p = .002), and community belonging (B = -0.33, p = .048). Findings pointed to the value of examining the effects of imprisonment on functioning in multiple domains.

  2. Specific deterrence, community context, and drunk driving: an event history analysis.

    PubMed

    Lee, Chang-Bae; Teske, Raymond H C

    2015-03-01

    Previous studies about recidivism of offenders have focused primarily on the nature of the sanctions and factors specific to the individual offender. This study addressed both individual and community factors, using a cohort of felony-level, driving while intoxicated (DWI) probationers (N = 370) charged in Harris County, Texas. The study investigated specific deterrent effects of sanctions on success or failure of probationers while controlling for the community contexts to observe how informal social control processes contextualize individual-level predictors. Results of a series of event history analyses tracking probationers for a period of 8 years indicated that severity of punishment, swiftness of punishment, criminal history, and completion of DWI education programs significantly affected the probationer's survival time, whereas no significant influence of community contexts on survival time or success was observed. Reducing the felony charge to a misdemeanor, a shorter period of probation, and past criminal history, combined with an almost immediate guilty plea, were significantly associated with short-term failure on probation.

  3. Optimization of Magnet Strength for Event Reconstruction and Analysis at FNAL SeaQuest

    NASA Astrophysics Data System (ADS)

    Carstens, Paul; SeaQuest Collaboration

    2016-09-01

The Fermilab E906/SeaQuest experiment primarily aims to study the nucleon sea and its antiquark distribution. The experiment collides a 120 GeV proton beam with one of several fixed targets. E906/SeaQuest probes the quark sea via the Drell-Yan process, in which a quark from the beam annihilates an antiquark from the target, producing a virtual photon that decays into a pair of muons. Two magnets focus the muons through four detector stations in the spectrometer. The first is a solid iron magnet, which also serves as the beam dump and absorber. The second, an open-aperture magnet positioned between the first two detector stations, is the momentum-analyzing magnet. A tracking program reconstructs the trajectories of the particles in the detector to discern their kinematics. In order to correctly analyze the data, the magnetic field strength must be accurately known, since it affects the momentum of particles passing through the field. This poster focuses on how the magnet's effect on the transverse momentum of the muons affects kinematic reconstruction of both simulated and real events. This research was supported by US DOE MENP Grant DE-FG02-03ER41243.

  4. Pan-cancer transcriptomic analysis associates long non-coding RNAs with key mutational driver events

    PubMed Central

    Ashouri, Arghavan; Sayin, Volkan I.; Van den Eynden, Jimmy; Singh, Simranjit X.; Papagiannakopoulos, Thales; Larsson, Erik

    2016-01-01

    Thousands of long non-coding RNAs (lncRNAs) lie interspersed with coding genes across the genome, and a small subset has been implicated as downstream effectors in oncogenic pathways. Here we make use of transcriptome and exome sequencing data from thousands of tumours across 19 cancer types, to identify lncRNAs that are induced or repressed in relation to somatic mutations in key oncogenic driver genes. Our screen confirms known coding and non-coding effectors and also associates many new lncRNAs to relevant pathways. The associations are often highly reproducible across cancer types, and while many lncRNAs are co-expressed with their protein-coding hosts or neighbours, some are intergenic and independent. We highlight lncRNAs with possible functions downstream of the tumour suppressor TP53 and the master antioxidant transcription factor NFE2L2. Our study provides a comprehensive overview of lncRNA transcriptional alterations in relation to key driver mutational events in human cancers.

  5. Joint Ne/O and Fe/O Analysis to Diagnose Large Solar Energetic Particle Events during Solar Cycle 23

    NASA Astrophysics Data System (ADS)

    Tan, Lun C.; Malandraki, Olga E.; Shao, Xi

    2017-02-01

We have examined 29 large solar energetic particle (SEP) events with peak proton intensity Jpp(>60 MeV) > 1 pfu during solar cycle 23. Our examination emphasizes a joint analysis of Ne/O and Fe/O data in the energy range (3–40 MeV nucleon⁻¹) covered by the Wind/Low-Energy Matrix Telescope and ACE/Solar Isotope Spectrometer sensors, in order to differentiate between the Fe-poor and Fe-rich events that emerged from the coronal mass ejection driven shock acceleration process. An improved ion ratio calculation is carried out by rebinning ion intensity data into equal bin widths on the logarithmic energy scale. Through the analysis we find that the variability of the Ne/O and Fe/O ratios can be used to investigate the properties of the accelerating shock. In particular, the high-energy Ne/O ratio is well correlated with the source plasma temperature of the SEPs.
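
The rebinning step can be sketched as follows; the bin count, energies, and intensities here are hypothetical, and only the equal-width-in-log-energy idea comes from the abstract:

```python
import numpy as np

def rebin_log(energy, intensity, e_min=3.0, e_max=40.0, n_bins=6):
    """Average intensities into bins of equal width in log10(energy),
    so ratios of two species are computed over matched energy intervals."""
    edges = np.logspace(np.log10(e_min), np.log10(e_max), n_bins + 1)
    idx = np.digitize(energy, edges) - 1
    out = np.full(n_bins, np.nan)
    for b in range(n_bins):
        sel = idx == b
        if sel.any():
            out[b] = intensity[sel].mean()
    return edges, out

energy = np.array([3.5, 5.0, 8.0, 12.0, 20.0, 35.0])   # MeV/nucleon
ne = np.array([4.0, 2.5, 1.4, 0.8, 0.4, 0.15])          # hypothetical Ne
o = np.array([25.0, 16.0, 9.0, 5.0, 2.6, 1.0])          # hypothetical O
edges, ne_b = rebin_log(energy, ne)
_, o_b = rebin_log(energy, o)
ratio = ne_b / o_b   # Ne/O per log-energy bin
```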

  6. Statistical and Ontological Analysis of Adverse Events Associated with Monovalent and Combination Vaccines against Hepatitis A and B Diseases

    PubMed Central

    Xie, Jiangan; Zhao, Lili; Zhou, Shangbo; He, Yongqun

    2016-01-01

    Vaccinations often induce various adverse events (AEs), and sometimes serious AEs (SAEs). While many vaccines are used in combination, the effects of vaccine-vaccine interactions (VVIs) on vaccine AEs are rarely studied. In this study, AE profiles induced by hepatitis A vaccine (Havrix), hepatitis B vaccine (Engerix-B), and hepatitis A and B combination vaccine (Twinrix) were studied using the VAERS data. From May 2001 to January 2015, VAERS recorded 941, 3,885, and 1,624 AE case reports where patients aged at least 18 years old were vaccinated with only Havrix, Engerix-B, and Twinrix, respectively. Using these data, our statistical analysis identified 46, 69, and 82 AEs significantly associated with Havrix, Engerix-B, and Twinrix, respectively. Based on the Ontology of Adverse Events (OAE) hierarchical classification, these AEs were enriched in the AEs related to behavioral and neurological conditions, immune system, and investigation results. Twenty-nine AEs were classified as SAEs and mainly related to immune conditions. Using a logistic regression model accompanied with MCMC sampling, 13 AEs (e.g., hepatosplenomegaly) were identified to result from VVI synergistic effects. Classifications of these 13 AEs using OAE and MedDRA hierarchies confirmed the advantages of the OAE-based method over MedDRA in AE term hierarchical analysis. PMID:27694888

  7. Statistical and Ontological Analysis of Adverse Events Associated with Monovalent and Combination Vaccines against Hepatitis A and B Diseases.

    PubMed

    Xie, Jiangan; Zhao, Lili; Zhou, Shangbo; He, Yongqun

    2016-10-03

    Vaccinations often induce various adverse events (AEs), and sometimes serious AEs (SAEs). While many vaccines are used in combination, the effects of vaccine-vaccine interactions (VVIs) on vaccine AEs are rarely studied. In this study, AE profiles induced by hepatitis A vaccine (Havrix), hepatitis B vaccine (Engerix-B), and hepatitis A and B combination vaccine (Twinrix) were studied using the VAERS data. From May 2001 to January 2015, VAERS recorded 941, 3,885, and 1,624 AE case reports where patients aged at least 18 years old were vaccinated with only Havrix, Engerix-B, and Twinrix, respectively. Using these data, our statistical analysis identified 46, 69, and 82 AEs significantly associated with Havrix, Engerix-B, and Twinrix, respectively. Based on the Ontology of Adverse Events (OAE) hierarchical classification, these AEs were enriched in the AEs related to behavioral and neurological conditions, immune system, and investigation results. Twenty-nine AEs were classified as SAEs and mainly related to immune conditions. Using a logistic regression model accompanied with MCMC sampling, 13 AEs (e.g., hepatosplenomegaly) were identified to result from VVI synergistic effects. Classifications of these 13 AEs using OAE and MedDRA hierarchies confirmed the advantages of the OAE-based method over MedDRA in AE term hierarchical analysis.

  8. Synoptic-mesoscale analysis and numerical modeling of a tornado event on 12 February 2010 in northern Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, J. T.; Nastos, P. T.; Pytharoulis, I.

    2010-09-01

    Tornadoes are furious convective weather phenomena, having increased frequency, particularly in the cool season, attributed to the higher moisture content of the atmosphere due to global warming. Tornadoes' source regions are more likely shallow waters, which are easily warmed, such as Gulf of Mexico or Mediterranean Sea. This study analyzes the tornado event, that occurred on 12 February 2010 in Vrastera, Chalkidiki, a non urban area 45 km southeastern of Thessaloniki in northern Greece. The tornado was developed approximately between 17:15 and 18:45 UTC and characterized as F2 (Fujita Scale). The tornado caused several damages to an industrial building and an olive-tree farm. A synoptic analysis based on the ECMWF charts is presented along with an extended dataset of satellite images, radar products and vertical profile of the atmosphere. Additionaly, the nonhydrostatic WRF-ARW atmospheric numerical model (version 3.2) was utilized in analysis and forecast mode using very high horizontal resolution (1 km x 1 km) in order to represent the ambient atmospheric conditions and examine the prediction of the event. Sensitivity experiments look into the model performance in the choice of microphysical and boundary layer parameterization schemes.

  9. Image Processing, Computer Vision, and Deep Learning: new approaches to the analysis and physics interpretation of LHC events

    NASA Astrophysics Data System (ADS)

    Schwartzman, A.; Kagan, M.; Mackey, L.; Nachman, B.; De Oliveira, L.

    2016-10-01

This review introduces recent developments in the application of image processing, computer vision, and deep neural networks to the analysis and interpretation of particle collision events at the Large Hadron Collider (LHC). The link between LHC data analysis and computer vision techniques relies on the concept of jet-images, building on the notion of a particle physics detector as a digital camera and the particles it measures as images. We show that state-of-the-art image classification techniques based on deep neural network architectures significantly improve the identification of highly boosted electroweak particles with respect to existing methods. Furthermore, we introduce new methods to visualize and interpret the high level features learned by deep neural networks that provide discrimination beyond physics-derived variables, adding a new capability to understand physics and to design more powerful classification methods at the LHC.
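
The jet-image idea amounts to pixelating particle transverse momentum on the eta-phi plane around the jet axis; a minimal sketch with toy particles (not LHC data):

```python
import numpy as np

def jet_image(eta, phi, pt, half_width=1.0, pixels=25):
    """Build a normalized 2D histogram of particle pT over eta-phi,
    treating the detector as a pixelated camera."""
    bins = np.linspace(-half_width, half_width, pixels + 1)
    img, _, _ = np.histogram2d(eta, phi, bins=[bins, bins], weights=pt)
    return img / img.sum()  # total intensity normalized to 1

rng = np.random.default_rng(2)
n = 200
eta = rng.normal(0.0, 0.3, n)   # particles clustered near the jet axis
phi = rng.normal(0.0, 0.3, n)
pt = rng.exponential(5.0, n)
img = jet_image(eta, phi, pt)
# img is a 25x25 array, ready as input to a convolutional classifier
```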

  10. A Simple Engineering Analysis of Solar Particle Event High Energy Tails and Their Impact on Vehicle Design

    NASA Technical Reports Server (NTRS)

    Singleterry, Robert C., Jr.; Walker, Steven A.; Clowdsley, Martha S.

    2016-01-01

The mathematical models for Solar Particle Event (SPE) high energy tails are constructed with several different algorithms. Since limited measured data exist above energies around 400 MeV, this paper arbitrarily defines the high energy tail as any proton with an energy above 400 MeV. In order to better understand the importance of accurately modeling the high energy tail for SPE spectra, the contribution to astronaut whole body effective dose equivalent of the high energy portions of three different SPE models has been evaluated. To ensure completeness of this analysis, simple and complex geometries were used. This analysis showed that the high energy tail of certain SPEs can be relevant to astronaut exposure and hence safety. Therefore, models of high energy tails for SPEs should be well analyzed and based on data if possible.

  11. [Extraction of single-trial event-related potentials by means of ARX modeling and independent component analysis].

    PubMed

    Wang, Rongchang; Du, Sidan

    2006-12-01

The present paper focuses on the extraction of event-related potentials from a single sweep under an extremely low S/N ratio. Two methods that can efficiently remove spontaneous EEG, ocular artifacts, and power-line interference are presented, based on ARX modeling and independent component analysis (ICA). The former applies an ARX model to the measured compound signal, which extensively contains the three kinds of ordinary noise mentioned above, and uses an ARX algorithm for parametric identification. The latter decomposes the signal by means of independent component analysis. In addition, some important decomposition properties of ICA and its intrinsic causality are pointed out explicitly. For the practical situation, some modifications of the FastICA algorithm are also given, so as to implement auto-adaptive mapping of the decomposed results to ERP components. Through simulation, both methods are shown to be highly capable of signal extraction and S/N ratio improvement.
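
As a generic illustration of the ICA step (a compact textbook FastICA with a tanh contrast and symmetric decorrelation, not the authors' modified algorithm), here is a sketch separating two synthetic sources from mixed channels; the signals and mixing matrix are made up:

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Estimate independent sources from X (channels x samples)."""
    # Center and whiten
    X = X - X.mean(axis=1, keepdims=True)
    cov = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(cov)
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    n = Z.shape[0]
    W = np.random.default_rng(seed).normal(size=(n, n))
    for _ in range(n_iter):
        Y = W @ Z
        G = np.tanh(Y)
        # Fixed-point update: E[z g(y)] - E[g'(y)] w, for each row
        W_new = (G @ Z.T) / Z.shape[1] - np.diag((1 - G**2).mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W_new W_new^T)^(-1/2) W_new
        u, s, vt = np.linalg.svd(W_new)
        W = u @ vt
    return W @ Z  # estimated sources, up to sign and order

t = np.linspace(0, 1, 2000)
s1 = np.sin(2 * np.pi * 7 * t)               # "ERP-like" rhythm
s2 = np.sign(np.sin(2 * np.pi * 50 * t))     # square "line noise"
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # mixing matrix
X = A @ np.vstack([s1, s2])
S_hat = fastica(X)  # rows approximate s1/s2 up to sign and order
```

In the ERP setting described above, the artifact-dominated components would then be discarded and the remaining ones mapped back to the ERP of interest.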

  12. Social Network Changes and Life Events across the Life Span: A Meta-Analysis

    ERIC Educational Resources Information Center

    Wrzus, Cornelia; Hanel, Martha; Wagner, Jenny; Neyer, Franz J.

    2013-01-01

    For researchers and practitioners interested in social relationships, the question remains as to how large social networks typically are, and how their size and composition change across adulthood. On the basis of predictions of socioemotional selectivity theory and social convoy theory, we conducted a meta-analysis on age-related social network…

  13. A Statistical Analysis of Infrequent Events on Multiple-Choice Tests that Indicate Probable Cheating

    ERIC Educational Resources Information Center

    Sundermann, Michael J.

    2008-01-01

    A statistical analysis of multiple-choice answers is performed to identify anomalies that can be used as evidence of student cheating. The ratio of exact errors in common (EEIC: two students put the same wrong answer for a question) to differences (D: two students get different answers) was found to be a good indicator of cheating under a wide…
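
    The EEIC-to-D ratio is straightforward to compute from two answer strings and the key; the strings below are a made-up toy example:

```python
def eeic_to_d_ratio(answers_a, answers_b, key):
    """Ratio of exact errors in common (EEIC: both students give the same
    wrong answer) to differences (D: the students answer differently)."""
    eeic = d = 0
    for a, b, k in zip(answers_a, answers_b, key):
        if a != b:
            d += 1
        elif a != k:          # same answer, and it is wrong
            eeic += 1
    return eeic / d if d else float("inf")

ratio = eeic_to_d_ratio("ACDDBBCDAB", "ACDABBDDAB", "ABCDABCDAB")
print(ratio)   # -> 1.5 (3 shared wrong answers, 2 differences)
```

    Large ratios across many question pairs are the anomaly the paper treats as evidence of copying; shared correct answers are deliberately ignored, since they are expected between honest students.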

  14. Cure models for the analysis of time-to-event data in cancer studies.

    PubMed

    Jia, Xiaoyu; Sima, Camelia S; Brennan, Murray F; Panageas, Katherine S

    2013-11-01

    In settings where it is biologically plausible that some patients are cured after definitive treatment, cure models present an alternative to conventional survival analysis. Cure models can characterize the cured group by estimating the probability of cure and identifying factors that influence it, while simultaneously focusing on time to recurrence and associated factors for the remaining patients.
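
    The mixture cure model described here can be sketched as a population survival function that plateaus at the cure probability. The parameter values below are illustrative, not estimates from any study:

```python
import math

def mixture_cure_survival(t, cure_prob, hazard):
    """Population survival under a mixture cure model:
    S(t) = pi + (1 - pi) * exp(-lambda * t),
    i.e. a cured fraction pi plus an exponential latency for the rest."""
    return cure_prob + (1.0 - cure_prob) * math.exp(-hazard * t)

# Illustrative: 30% cured, median time-to-recurrence of 2 years if uncured.
lam = math.log(2) / 2.0
print(round(mixture_cure_survival(0.0, 0.3, lam), 3))    # 1.0
print(round(mixture_cure_survival(20.0, 0.3, lam), 3))   # plateaus near 0.3
```

    The plateau is the feature conventional survival analysis misses: a standard exponential or Cox model forces S(t) toward zero, while the cure model lets the curve level off at the cured fraction.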

  15. Rare-event Analysis and Computational Methods for Stochastic Systems Driven by Random Fields

    DTIC Science & Technology

    2014-12-29

    We perform risk analysis for stochastic systems in areas as diverse as material science, fluid dynamics, neuroscience, fiber optics, astronomy, civil engineering, engineering design, ocean-earth sciences, and so forth.

  16. Naturalizing the Future in Factual Discourse: A Critical Linguistic Analysis of a Projected Event.

    ERIC Educational Resources Information Center

    Dunmire, Patricia L.

    1997-01-01

    Examines the linguistic processes through which a projected effect is constructed within factual discourse. Applies critical linguistic analysis to coverage of the 1990 Gulf War in the "New York Times" and "Washington Post." Expands on work in critical linguistics and demonstrates how political interests underlying newspaper…

  17. Enhancing the Effectiveness of Significant Event Analysis: Exploring Personal Impact and Applying Systems Thinking in Primary Care

    PubMed Central

    McNaughton, Elaine; Bruce, David; Holly, Deirdre; Forrest, Eleanor; Macleod, Marion; Kennedy, Susan; Power, Ailsa; Toppin, Denis; Black, Irene; Pooley, Janet; Taylor, Audrey; Swanson, Vivien; Kelly, Moya; Ferguson, Julie; Stirling, Suzanne; Wakeling, Judy; Inglis, Angela; McKay, John; Sargeant, Joan

    2016-01-01

    Introduction: Significant event analysis (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking in establishing why events happen and formulating improvements. To enhance SEA effectiveness, we developed and tested “guiding tools” based on human factors principles. Methods: Mixed-methods development of guiding tools (Personal Booklet—to help with emotional demands and apply a human factors analysis at the individual level; Desk Pad—to guide a team-based systems analysis; and a written Report Format) by a multiprofessional “expert” group and testing with Scottish primary care practitioners who submitted completed enhanced SEA reports. Evaluation data were collected through questionnaire, telephone interviews, and thematic analysis of SEA reports. Results: Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention (P < .001), while most agreed that the Personal Booklet was practical (88/123, 71.5%) and relevant to dealing with related emotions (93/123, 75.6%). The Desk Pad tool helped focus the SEA on systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%). Discussion: Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with related emotional demands and with more effective learning and improvement. PMID:27583996

  18. Analysis of strong rainfall events at Cividale del Friuli (Northeastern Italy) from 1920 to 2007

    NASA Astrophysics Data System (ADS)

    Colucci, R. R.; Cucchi, F.; Stravisi, F.; Zini, L.

    2009-04-01

    Strong rainfall events have been selected from a detailed statistical investigation of the rainfall regime recorded at Cividale del Friuli (a town situated in the piedmont area of the Italian Prealpi Giulie), by applying suitable thresholds to the set of monthly and daily rainfall. On the basis of these data the following aspects have been explored: the time evolution of the location of the rainiest month of each year within the meteorological seasons; the return period, in years, of the maximum monthly precipitation, for each year; the return period, in years, of the maximum daily precipitation, for each year; the return period, in years, of the maximum precipitation during two consecutive days, for each year; the trend of the series of rainy days above a given threshold, for different thresholds; the long term evolution of the intra-annual precipitation, obtained by averaging rainfall over four 21-year periods (1924-44, 1945-65, 1966-86, 1987-2007); and a comparison of the above rainfall series with that of Trieste, a coastal town at the northeasternmost part of the Adriatic Sea, about 70 km from Cividale del Friuli (which is usually much rainier than Trieste). Unlike the almost constant linear trend for Trieste, that of Cividale del Friuli is markedly decreasing over the whole period 1920-2007. This phenomenon is ascribed to a recent climatic change in the mesoscale wind regime, in which winds from NW-N-NE have become more frequent than zonal winds. Therefore, areas closer to the mountains record less and less rain relative to areas far enough from the mountains, such as Trieste. The different behaviour is most evident during spring and summer, the seasons with the highest occurrence of thunderstorms. The winter period shows a different behaviour between the years before and after 1980; in the last 30 years the decrease in the differences has been very pronounced.
Further conclusions are the following: Total yearly rainfalls
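
    The return periods explored above can be estimated from a series of annual maxima. A minimal sketch using the Weibull plotting position T = (n + 1) / rank; the rainfall values are hypothetical, and the choice of plotting position is an assumption (the paper does not state its estimator):

```python
def return_periods(annual_maxima):
    """Empirical return period (years) of each annual maximum using the
    Weibull plotting position T = (n + 1) / rank, with rank 1 = largest."""
    n = len(annual_maxima)
    order = sorted(range(n), key=lambda i: annual_maxima[i], reverse=True)
    T = [0.0] * n
    for rank, i in enumerate(order, start=1):
        T[i] = (n + 1) / rank
    return T

maxima = [182, 240, 210, 305, 198]   # hypothetical annual max daily rain [mm]
print(return_periods(maxima))        # [1.2, 3.0, 2.0, 6.0, 1.5]
```

    The largest value in an n-year record gets a return period of n + 1 years; tracking how these ranks drift over the decades is what reveals the kind of trend the paper reports.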

  19. Analysis of contributing factors influencing thromboembolic events after total knee arthroplasty

    PubMed Central

    Plante, Sylvie; Belzile, Etienne L.; Fréchette, Dominique; Lefebvre, Jean

    2017-01-01

    Background Venous thromboembolic events (VTE) are a known and well-described complication following total knee arthroplasty (TKA). We sought to validate the American College of Chest Physicians thromboprophylaxis recommendations after elective TKA, paying special attention to our dose adjustments for weight, and their impact on VTE in our population. Methods We retrospectively investigated risk factors in patients undergoing TKA, focusing mainly on symptomatic VTE occurrence rates from deep vein thrombosis (DVT) or pulmonary embolism (PE). The anticoagulation protocol consisted of starting low molecular weight heparin (LMWH) therapy, with dalteparin administered 12 h after surgery in patients who received general anesthesia or 24 h later in patients who received single-dose regional anesthesia. Results Data from 346 patients (mean age 66.8 [range 24–91] yr) who underwent primary or revision TKA depicted an overall symptomatic VTE rate of 15%. The proximal DVT rate was 1.7%, and the nonfatal PE rate was 0.9%. The mean time to VTE diagnosis was 5.6 days. The first dalteparin dose was administered 19.5 (range 10–48) h after surgery in patients without VTE and 22.6 (range 11.5–52) h after surgery in patients with VTE (p = 0.003). With a first dose of dalteparin administered 12 h postoperatively, patients presented significantly lower DVT and PE rates than if it was administered 24 h postoperatively (8.5% v. 16.3%, p = 0.048). Conclusion Delayed administration of LMWH has deleteriously impacted the VTE rate after TKA at our institution. Prompt initiation of LMWH (≤ 12 h after surgery) is appropriate, without increasing the risk of major bleeding. PMID:28234587
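
    The 8.5% vs. 16.3% comparison above can be illustrated with a pooled two-proportion z-test. The group counts below are hypothetical (the abstract reports percentages and an overall n, not the per-group sizes), so the resulting p-value only demonstrates the method:

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled variance estimate.
    Returns (z, p_value); normal CDF is computed via math.erf."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts chosen to mirror the 8.5% vs 16.3% event rates.
z, p = two_proportion_ztest(15, 176, 28, 172)
print(f"z = {z:.2f}, p = {p:.3f}")
```

    With roughly balanced groups of this size the difference reaches conventional significance, consistent with the p = 0.048 the authors report for their actual data.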

  20. Attention effects on auditory scene analysis: insights from event-related brain potentials.

    PubMed

    Spielmann, Mona Isabel; Schröger, Erich; Kotz, Sonja A; Bendixen, Alexandra

    2014-01-01

    Sounds emitted by different sources arrive at our ears as a mixture that must be disentangled before meaningful information can be retrieved. It is still a matter of debate whether this decomposition happens automatically or requires the listener's attention. These opposite positions partly stem from different methodological approaches to the problem. We propose an integrative approach that combines the logic of previous measurements targeting either auditory stream segregation (interpreting a mixture as coming from two separate sources) or integration (interpreting a mixture as originating from only one source). By means of combined behavioral and event-related potential (ERP) measures, our paradigm has the potential to measure stream segregation and integration at the same time, providing the opportunity to obtain positive evidence of either one. This reduces the reliance on zero findings (i.e., the occurrence of stream integration in a given condition can be demonstrated directly, rather than indirectly based on the absence of empirical evidence for stream segregation, and vice versa). With this two-way approach, we systematically manipulate attention devoted to the auditory stimuli (by varying their task relevance) and to their underlying structure (by delivering perceptual tasks that require segregated or integrated percepts). ERP results based on the mismatch negativity (MMN) show no evidence for a modulation of stream integration by attention, while stream segregation results were less clear due to overlapping attention-related components in the MMN latency range. We suggest future studies combining the proposed two-way approach with some improvements in the ERP measurement of sequential stream segregation.

  1. Statistical analysis of diffuse ion events upstream of the Earth's bow shock

    NASA Technical Reports Server (NTRS)

    Trattner, K. J.; Mobius, E.; Scholer, M.; Klecker, B.; Hilchenbach, M.; Luehr, H.

    1994-01-01

    A statistical study of diffuse energetic ion events and their related waves upstream of the Earth's bow shock was performed using data from the Active Magnetospheric Particle Tracer Explorers/Ion Release Module (AMPTE/IRM) satellite over two 5-month periods in 1984 and 1985. The data set was used to test the assumption in the self-consistent model of the upstream wave and particle populations by Lee (1982) that the particle acceleration through hydromagnetic waves and the wave generation are directly coupled. The comparison between the observed wave power and the wave power predicted from the observed energetic particle energy density and solar wind parameters results in a high correlation coefficient of about 0.89. The intensity of diffuse ions falls off approximately exponentially with the distance upstream from the bow shock parallel to the magnetic field with e-folding distances which vary from approximately 3.3 R(sub E) to approximately 11.7 R(sub E) over the energy range from 10 keV/e to 67.3 keV/e for both protons and alpha particles. After normalizing the upstream particle densities to zero bow shock distance by using these exponential variations, a good correlation (0.7) of the density of the diffuse ions with the solar wind density was found. This supports the suggestion that the solar wind is the source of the diffuse ions. Furthermore, the spectral slope of the diffuse ions correlates well with the solar wind velocity component in the direction of the interplanetary magnetic field (0.68 and 0.66 for protons and alpha particles) which concurs with the notion that the solar wind plays an important role in the acceleration of the upstream particles.
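
    The quoted e-folding distances come from fitting an exponential fall-off. A minimal log-linear least-squares sketch on synthetic, noiseless intensity data:

```python
import numpy as np

def efold_distance(x, intensity):
    """Estimate the e-folding distance L of I(x) = I0 * exp(-x / L)
    by least-squares fitting log(I) against x."""
    slope, _ = np.polyfit(x, np.log(intensity), 1)
    return -1.0 / slope

x = np.array([0.0, 2.0, 4.0, 6.0, 8.0])   # distance upstream [R_E]
I = 100.0 * np.exp(-x / 3.3)              # synthetic diffuse-ion intensity
print(round(efold_distance(x, I), 2))     # -> 3.3
```

    With real, noisy measurements the same log-linear fit returns the best-fit L, and the residuals indicate how well the exponential assumption holds at each energy channel.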

  2. Analysis of STS-134 Hail Event at Pad 39A, March 30, 2011

    NASA Technical Reports Server (NTRS)

    Lane, John E.

    2011-01-01

    During the late afternoon of March 30, 2011 at approximately 21:25 - 21:30 GMT, hail monitor stations at Pad 39A recorded rice to pea size hail. The duration of the event was approximately 5 minutes. The maximum size detected by the three hail monitors was 10 - 12 mm. The 12 mm marble size value was measured by the active impact sensor at site #2, which experienced high winds. This 12 mm measurement may be artificially higher by one or two mm due to the extra hail kinetic energy resulting from the extreme horizontal winds. High winds from the west produced a few notable long streak-like dents in the hail pads. High winds were also responsible for damage to facilities near hail monitor site #2 on the west side of pad A (a dumpster was overturned, and a picnic table roof was demolished). NWS radar volume scan (see Figure 1) showed 60-65 dBZ reflectivity values in the lowest 4 scan elevations around and over the pad 39A area. Since the lowest 0.5 degree scan showed a definite 65 dBZ signature, it is unlikely that hail had an opportunity to melt before reaching the ground. Some of the larger passive hail pad dents were shallower than what would be expected from solid frozen ice hydrometeor dents. Therefore, it is possible that the larger pea size hail may have been softer than the smaller rice size hail. This would be consistent with some melting before reaching the ground.
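
    The remark about wind-enhanced impact energy can be made concrete. A rough sketch assuming a spherical stone falling at terminal velocity, with assumed ice density, air density, and drag coefficient (none of these values come from the report):

```python
import math

RHO_ICE, RHO_AIR, G, CD = 900.0, 1.2, 9.81, 0.6  # SI units; assumed values

def hail_kinetic_energy(d_mm, wind=0.0):
    """Impact kinetic energy [J] of a spherical hailstone of diameter d_mm
    falling at terminal velocity, optionally combined with a horizontal
    wind speed [m/s] added in quadrature."""
    d = d_mm / 1000.0
    v_t = math.sqrt(4.0 * RHO_ICE * G * d / (3.0 * CD * RHO_AIR))  # terminal
    m = RHO_ICE * math.pi * d ** 3 / 6.0                           # mass
    return 0.5 * m * (v_t ** 2 + wind ** 2)

calm = hail_kinetic_energy(12.0)
windy = hail_kinetic_energy(12.0, wind=15.0)
print(f"{calm:.3f} J calm, {windy:.3f} J with 15 m/s wind")
```

    A strong horizontal wind roughly doubles the impact energy of a 12 mm stone under these assumptions, which is why the sensor reading could overstate the true diameter by a millimeter or two.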

  3. Dynamical Analysis of Blocking Events: Spatial and Temporal Fluctuations of Covariant Lyapunov Vectors

    NASA Astrophysics Data System (ADS)

    Schubert, Sebastian; Lucarini, Valerio

    2016-04-01

    One of the most relevant weather regimes in the mid-latitude atmosphere is the persistent deviation of the flow from the approximately zonally symmetric jet stream into so-called blocking patterns. Such configurations are usually connected to exceptional local stability properties of the flow, which come along with improved local forecast skill during the phenomenon. It is instead extremely hard to predict the onset and decay of blockings. Covariant Lyapunov Vectors (CLVs) offer a suitable characterization of the linear stability of a chaotic flow, since they represent the full tangent linear dynamics by a covariant basis which explores linear perturbations at all time scales. Therefore, we will test whether CLVs feature a signature of the blockings. We examine the CLVs for a quasi-geostrophic beta-plane two-layer model in a periodic channel baroclinically driven by a meridional temperature gradient ΔT. An orographic forcing enhances the emergence of localized blocked regimes. We detect the blocking events of the channel flow with a Tibaldi-Molteni scheme adapted to the periodic channel. When blocking occurs, the global growth rates of the fastest growing CLVs are significantly higher. Hence, counterintuitively, the circulation is globally more unstable during blocked phases. Such an increase in the finite time Lyapunov exponents with respect to the long term average is attributed to stronger barotropic and baroclinic conversion in the case of high temperature gradients, while for low values of ΔT, the effect is only due to stronger barotropic instability. For the localization of the CLVs, we compare the meridionally averaged variance of the CLVs during blocked and unblocked phases. We find that on average the variance of the CLVs is clustered around the center of blocking. These results show that the blocked flow affects all time scales and processes described by the CLVs.
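
    The growth-rate comparison rests on finite-time Lyapunov exponents. As a one-dimensional stand-in for the two-layer QG model (not the authors' setup), a Benettin-style estimate for the logistic map shows the standard recipe: average the log of the local stretching factor along an orbit:

```python
import math

def largest_lyapunov(r=4.0, x0=0.2, n=100000, burn=1000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated by averaging log|f'(x)| = log|r*(1-2x)| along the orbit."""
    x = x0
    for _ in range(burn):          # discard transient
        x = r * x * (1 - x)
    s = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        s += math.log(abs(r * (1 - 2 * x)))
    return s / n

print(round(largest_lyapunov(), 3))   # close to ln 2 ≈ 0.693
```

    In the full model the same averaging is done over the tangent-linear growth of each covariant vector, and restricting the average to blocked intervals yields the conditional (finite-time) exponents compared in the study.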

  4. Subjective Well-Being and Adaptation to Life Events: A Meta-Analysis on Differences Between Cognitive and Affective Well-Being

    PubMed Central

    Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E.

    2012-01-01

    Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on cognitive and affective well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to four family events (marriage, divorce, bereavement, child birth) and four work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being, and that for most events the effects of life events on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given. PMID:22059843

  5. Proteomic analysis of the Cyanophora paradoxa muroplast provides clues on early events in plastid endosymbiosis.

    PubMed

    Facchinelli, Fabio; Pribil, Mathias; Oster, Ulrike; Ebert, Nina J; Bhattacharya, Debashish; Leister, Dario; Weber, Andreas P M

    2013-02-01

    Glaucophytes represent the first lineage of photosynthetic eukaryotes of primary endosymbiotic origin that diverged after plastid establishment. The muroplast of Cyanophora paradoxa represents a primitive plastid that resembles its cyanobacterial ancestor in pigment composition and the presence of a peptidoglycan wall. To attain insights into the evolutionary history of cyanobiont integration and plastid development, it would thus be highly desirable to obtain knowledge on the composition of the glaucophyte plastid proteome. Here, we provide the first proteomic analysis of the muroplast of C. paradoxa. Mass spectrometric analysis of the muroplast proteome identified 510 proteins with high confidence. The protein repertoire of the muroplast revealed novel paths for reduced carbon flow and export to the cytosol through a sugar phosphate transporter of chlamydial origin. We propose that C. paradoxa possesses a primordial plastid mirroring the situation in the early protoalga.

  6. Analysis, Estimation, and Control for Perturbed and Singular Systems and for Systems Subject to Discrete Events

    DTIC Science & Technology

    1988-10-01

    linear systems. Opportunities for studying problems of this sort arise in the context of electrical machines and power electronics. Deferring a...discussion of modeling and analysis of switched power electronic circuits to Section IV, we mention here some recent work of ours in stability studies for... electrical machines. The most commonly considered nominal operating condition for the nonlinear model of an electrical machine system is constant speed

  7. State Event Models for the Formal Analysis of Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles

    2014-01-01

    The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "fullcontrol" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.
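
    The full-control idea can be sketched over toy LTSs. The dictionaries, state names, and the "c_"/"o_" prefix convention below are hypothetical simplifications (the paper's formal definition is richer): at every reachable system/model state pair, the model must enable exactly the system's commands and at least its observations:

```python
# Minimal deterministic LTSs as {state: {action: next_state}}; a toy
# autopilot with commands ("c_") and observations ("o_").
system = {
    "off":   {"c_engage": "armed"},
    "armed": {"c_disengage": "off", "o_capture": "hold"},
    "hold":  {"c_disengage": "off"},
}
model = {
    "OFF": {"c_engage": "ON"},
    "ON":  {"c_disengage": "OFF", "o_capture": "ON"},
}

def full_control(system, model, s0, m0):
    """Check a simplified 'full-control' relation by a synchronous walk:
    at each reachable pair, model commands == system commands and model
    observations >= system observations."""
    seen, stack = set(), [(s0, m0)]
    while stack:
        s, m = stack.pop()
        if (s, m) in seen:
            continue
        seen.add((s, m))
        sc = {a for a in system[s] if a.startswith("c_")}
        mc = {a for a in model[m] if a.startswith("c_")}
        so = {a for a in system[s] if a.startswith("o_")}
        mo = {a for a in model[m] if a.startswith("o_")}
        if sc != mc or not so <= mo:
            return False
        for a in system[s]:           # every system action is in the model
            stack.append((system[s][a], model[m][a]))
    return True

print(full_control(system, model, "off", "OFF"))  # -> True
```

    Here the two-state mental model safely operates the three-state system because the extra system state ("hold") is indistinguishable to the operator: it enables the same commands.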

  8. Accident progression event tree analysis for postulated severe accidents at N Reactor

    SciTech Connect

    Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M. ); Medford, G.T. )

    1990-06-01

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
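
    Latin Hypercube sampling of the kind used for the uncertainty assessment can be sketched in a few lines: each dimension is split into equal-probability strata, one draw is taken per stratum, and the strata are shuffled independently across dimensions. The sample sizes below are illustrative:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin Hypercube sample on [0, 1)^d: each dimension is divided into
    n_samples equal strata, one point per stratum, strata permuted
    independently per dimension."""
    rng = np.random.default_rng(rng)
    u = rng.random((n_samples, n_dims))   # position within each stratum
    strata = np.array([rng.permutation(n_samples) for _ in range(n_dims)]).T
    return (strata + u) / n_samples

pts = latin_hypercube(100, 2, rng=0)
# Each dimension gets exactly one point per stratum [k/100, (k+1)/100):
print(sorted(set((pts[:, 0] * 100).astype(int))) == list(range(100)))  # True
```

    Compared with plain Monte Carlo, this stratification covers each uncertain parameter's range evenly, which is why NUREG-1150-style analyses can propagate many uncertain phenomenological inputs with relatively few model runs.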

  9. Study of the tornado event in Greece on March 25, 2009: Synoptic analysis and numerical modeling using modified topography

    NASA Astrophysics Data System (ADS)

    Matsangouras, I. T.; Nastos, P. T.; Pytharoulis, I.

    2016-03-01

    Recent research revealed that western Greece and NW Peloponnese are regions that favor prefrontal tornadic incidence. On March 25, 2009 a tornado developed at approximately 10:30 UTC near Varda village (NW Peloponnese). Tornado intensity was T4-T5 (TORRO scale), and the tornado caused an economic impact of €350,000 on the local community. The goals of this study are: (i) to analyze synoptic and remote sensing features regarding the tornado event over NW Peloponnese and (ii) to investigate the role of topography in tornadogenesis triggered under strong synoptic scale forcing over that area. Synoptic analysis was based on the European Centre for Medium-Range Weather Forecasts (ECMWF) data sets. The analysis of the daily anomaly of synoptic conditions with respect to a 30-year climatology (1981-2010) was based on the National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR) reanalysis data sets. In addition, numerous remote sensing data sets were obtained from the Hellenic National Meteorological Service (HNMS) weather station network in order to better interpret the examined tornado event. Finally, numerical modeling was performed using the non-hydrostatic Weather Research and Forecasting model (WRF), initialized by ECMWF gridded analyses, with telescoping nested grids that allow the representation of atmospheric circulations ranging from the synoptic scale down to the meso-scale. Two numerical experiments were performed on the basis of: (a) the presence and (b) the absence of topography (landscape), so as to determine whether the occurrence of a tornado - identified by diagno