Sample records for key events analysis

  1. Toxicogenomics and cancer risk assessment: a framework for key event analysis and dose-response assessment for nongenotoxic carcinogens.

    PubMed

    Bercu, Joel P; Jolly, Robert A; Flagella, Kelly M; Baker, Thomas K; Romero, Pedro; Stevens, James L

    2010-12-01

    In order to determine a threshold for nongenotoxic carcinogens, the traditional risk assessment approach has been to identify a mode of action (MOA) with a nonlinear dose-response. The dose-response for one or more key event(s) linked to the MOA for carcinogenicity allows a point of departure (POD) to be selected from the most sensitive effect dose or no-effect dose. However, this can be challenging because multiple MOAs and key events may exist for carcinogenicity and oftentimes extensive research is required to elucidate the MOA. In the present study, a microarray analysis was conducted to determine if a POD could be identified following short-term oral rat exposure with two nongenotoxic rodent carcinogens, fenofibrate and methapyrilene, using a benchmark dose analysis of genes aggregated in Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways and Gene Ontology (GO) biological processes, which likely encompass key event(s) for carcinogenicity. The gene expression response for fenofibrate given to rats for 2 days was consistent with its MOA and known key events linked to PPARα activation. The temporal response from daily dosing with methapyrilene demonstrated biological complexity with waves of pathways/biological processes occurring over 1, 3, and 7 days; nonetheless, the benchmark dose values were consistent over time. When comparing the dose-response of toxicogenomic data to tumorigenesis or precursor events, the toxicogenomics POD was slightly below any effect level. Our results suggest that toxicogenomic analysis using short-term studies can be used to identify a threshold for nongenotoxic carcinogens based on evaluation of potential key event(s) which then can be used within a risk assessment framework. Copyright © 2010 Elsevier Inc. All rights reserved.
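
    The benchmark dose concept used in this record can be illustrated with a short curve-fit. The sketch below is a minimal stand-in, not the authors' pipeline: it assumes a Hill-type dose-response for a summarized pathway score, defines the benchmark response as one control standard deviation above the fitted control response, and solves for the dose that reaches it; all data and parameters are hypothetical.

    ```python
    # Minimal benchmark dose (BMD) sketch: fit a Hill curve to a hypothetical
    # pathway-level response and solve for the dose giving a benchmark response
    # (BMR) of one control standard deviation. Not the authors' pipeline.
    import numpy as np
    from scipy.optimize import brentq, curve_fit

    def hill(dose, bottom, top, ec50, n):
        """Hill-type dose-response curve."""
        return bottom + (top - bottom) * dose**n / (ec50**n + dose**n)

    doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0, 100.0])   # mg/kg-d (hypothetical)
    resp = np.array([1.00, 1.05, 1.20, 1.80, 2.60, 2.90])  # pathway score
    ctrl_sd = 0.1                                          # control-group SD

    bounds = ([0.0, 0.0, 1e-3, 0.1], [10.0, 10.0, 1e3, 10.0])
    params, _ = curve_fit(hill, doses, resp, p0=[1.0, 3.0, 10.0, 1.0], bounds=bounds)

    bmr_level = hill(0.0, *params) + ctrl_sd   # response level defining the BMD
    bmd = brentq(lambda d: hill(d, *params) - bmr_level, 1e-6, doses.max())
    print(f"BMD estimate: {bmd:.1f} mg/kg-d (illustrative)")
    ```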

  2. Decision Trajectories in Dementia Care Networks: Decisions and Related Key Events.

    PubMed

    Groen-van de Ven, Leontine; Smits, Carolien; Oldewarris, Karen; Span, Marijke; Jukema, Jan; Eefsting, Jan; Vernooij-Dassen, Myrra

    2017-10-01

    This prospective multiperspective study provides insight into the decision trajectories of people with dementia by studying the decisions made and related key events. This study includes three waves of interviews, conducted between July 2010 and July 2012, with 113 purposefully selected respondents (people with beginning to advanced stages of dementia and their informal and professional caregivers) completed in 12 months (285 interviews). Our multilayered qualitative analysis consists of content analysis, timeline methods, and constant comparison. Four decision themes emerged: managing daily life, arranging support, community living, and preparing for the future. Eight key events delineate the decision trajectories of people with dementia. Decisions and key events differ between people with dementia living alone and those living with a caregiver. Our study clarifies that decisions relate not only to the disease but also to living with dementia. Individual differences in decision content and sequence may affect shared decision-making and advance care planning.

  3. Predicting Key Events in the Popularity Evolution of Online Information.

    PubMed

    Hu, Ying; Hu, Changjun; Fu, Shushen; Fang, Mingzhe; Xu, Wenwen

    2017-01-01

    The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task-predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, it is very challenging to solve this new prediction task due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then a universal method is presented for different patterns to identify the key events in popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then a correlation analysis is conducted in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric which considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution.
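
    The smoothing step named in this abstract is simple to sketch. The following is a hedged illustration, not the paper's code: it applies a moving average to a popularity series and marks "burst", "peak", and "fade" with naive threshold rules; the window and ratios are illustrative assumptions.

    ```python
    # Smooth a popularity time series with a moving average, then mark "burst",
    # "peak", and "fade" with naive threshold rules. Window and ratios are
    # illustrative assumptions, not the paper's values.
    import numpy as np

    def moving_average(x, window=5):
        return np.convolve(x, np.ones(window) / window, mode="same")

    def key_events(popularity, window=5, burst_ratio=2.0, fade_ratio=0.2):
        s = moving_average(np.asarray(popularity, dtype=float), window)
        peak = int(np.argmax(s))
        baseline = max(float(s[: max(peak, 1)].min()), 1e-9)
        burst_hits = np.flatnonzero(s > burst_ratio * baseline)      # first strong rise
        fade_hits = np.flatnonzero(s[peak:] < fade_ratio * s[peak])  # decay after peak
        burst = int(burst_hits[0]) if burst_hits.size else None
        fade = peak + int(fade_hits[0]) if fade_hits.size else None
        return burst, peak, fade

    hourly_shares = [1, 2, 2, 3, 8, 20, 35, 30, 18, 9, 4, 2, 1, 1]
    print(key_events(hourly_shares))  # (burst_index, peak_index, fade_index)
    ```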

  4. Predicting Key Events in the Popularity Evolution of Online Information

    PubMed Central

    Fu, Shushen; Fang, Mingzhe; Xu, Wenwen

    2017-01-01

    The popularity of online information generally experiences a rising and falling evolution. This paper considers the “burst”, “peak”, and “fade” key events together as a representative summary of popularity evolution. We propose a novel prediction task—predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, it is very challenging to solve this new prediction task due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify “burst”, “peak”, and “fade” in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then a universal method is presented for different patterns to identify the key events in popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then a correlation analysis is conducted in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric which considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution. PMID:28046121

  5. Discrete Event Simulation Modeling and Analysis of Key Leader Engagements

    DTIC Science & Technology

    2012-06-01

    to offer. GreenPlayer agents require four parameters, pC, pKLK, pTK, and pRK, which give probabilities for being corrupt, having key leader...HandleMessageRequest component. The same parameter constraints apply to these four parameters. The parameter pRK is the same parameter from the CreatePlayers component...whether the local Green player has resource critical knowledge by using the parameter pRK. It schedules an EndResourceKnowledgeRequest event, passing

  6. Design of virtual simulation experiment based on key events

    NASA Astrophysics Data System (ADS)

    Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu

    2018-06-01

    Considering the complex content of, and lack of guidance in, virtual simulation experiments, the key event technique from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment of "margin value of bees foraging", based on biological morphology, was taken as an example; many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure the experimental process is complete and effective.

  7. Negated bio-events: analysis and identification

    PubMed Central

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. …

  8. Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier

    PubMed Central

    Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus

    2014-01-01

    National security has gained vital importance due to the increasing number of suspicious and terrorist events across the globe. The use of different subfields of information technology has also attracted much attention from researchers and practitioners for designing systems which can detect the main members who are actually responsible for such events. In this paper, we present a novel method to predict key players from a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies a novel hybrid classifier for detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies including two publicly available datasets and one local network. PMID:25136674

  9. Timing and documentation of key events in neonatal resuscitation.

    PubMed

    Heathcote, Adam Charles; Jones, Jacqueline; Clarke, Paul

    2018-04-30

    Only a minority of babies require extended resuscitation at birth. Resuscitations concerning babies who die or who survive with adverse outcomes are increasingly subject to medicolegal scrutiny. Our aim was to describe real-life timings of key resuscitation events observed in a historical series of newborns who required full resuscitation at birth. Twenty-seven babies born in our centre over a 10-year period had an Apgar score of 0 at 1 min and required full resuscitation. The median (95% confidence interval) postnatal ages at achieving key events were: commencing cardiac compressions, 2.0 (1.5-4.0) min; endotracheal intubation, 3.8 (2.0-6.0) min; umbilical venous catheterisation, 9.0 (7.5-12.0) min; and administration of the first adrenaline dose, 10.0 (8.0-14.0) min. The wide range of timings presented from real-life cases may prove useful to clinicians involved in medical negligence claims and provide a baseline for quality improvements in resuscitation training. What is Known: • Only a minority of babies require extended resuscitation at birth; these cases are often subject to medicolegal interrogation • Timings of key resuscitation events are poorly described and documentation of resuscitation events is often lacking yet is open to medicolegal scrutiny What is New: • We present a wide range of real-life timings of key resuscitation events during the era of routine newborn life support training • These timings may prove useful to clinicians involved in medical negligence claims and provide a baseline for quality improvements in resuscitation training.

  10. Genetic Stratigraphy of Key Demographic Events in Arabia

    PubMed Central

    Fernandes, Verónica; Triska, Petr; Pereira, Joana B.; Alshamali, Farida; Rito, Teresa; Machado, Alison; Fajkošová, Zuzana; Cavadas, Bruno; Černý, Viktor; Soares, Pedro

    2015-01-01

    At the crossroads between Africa and Eurasia, Arabia is necessarily a melting pot, its peoples enriched by successive gene flow over the generations. Estimating the timing and impact of these multiple migrations is an important step in reconstructing the key demographic events in human history. However, current methods based on genome-wide information identify admixture events inefficiently, tending to estimate only the more recent ages, as here in the case of admixture events across the Red Sea (∼8–37 generations for African input into Arabia, and 30–90 generations for “back-to-Africa” migrations). An mtDNA-based founder analysis, corroborated by detailed analysis of the whole-mtDNA genome, affords an alternative means by which to identify, date and quantify multiple migration events at greater time depths, across the full range of modern human history, albeit for the maternal line of descent only. In Arabia, this approach enables us to infer several major pulses of dispersal between the Near East and Arabia, most likely via the Gulf corridor. Although some relict lineages survive in Arabia from the time of the out-of-Africa dispersal, 60 ka, the major episodes in the peopling of the Peninsula took place from north to south in the Late Glacial and, to a lesser extent, the immediate post-glacial/Neolithic. Exchanges across the Red Sea were mainly due to the Arab slave trade and maritime dominance (from ∼2.5 ka to very recent times), but had already begun by the early Holocene, fuelled by the establishment of maritime networks since ∼8 ka. The main “back-to-Africa” migrations, again undetected by genome-wide dating analyses, occurred in the Late Glacial period for introductions into eastern Africa, whilst the Neolithic was more significant for migrations towards North Africa. PMID:25738654

  11. Genetic stratigraphy of key demographic events in Arabia.

    PubMed

    Fernandes, Verónica; Triska, Petr; Pereira, Joana B; Alshamali, Farida; Rito, Teresa; Machado, Alison; Fajkošová, Zuzana; Cavadas, Bruno; Černý, Viktor; Soares, Pedro; Richards, Martin B; Pereira, Luísa

    2015-01-01

    At the crossroads between Africa and Eurasia, Arabia is necessarily a melting pot, its peoples enriched by successive gene flow over the generations. Estimating the timing and impact of these multiple migrations is an important step in reconstructing the key demographic events in human history. However, current methods based on genome-wide information identify admixture events inefficiently, tending to estimate only the more recent ages, as here in the case of admixture events across the Red Sea (~8-37 generations for African input into Arabia, and 30-90 generations for "back-to-Africa" migrations). An mtDNA-based founder analysis, corroborated by detailed analysis of the whole-mtDNA genome, affords an alternative means by which to identify, date and quantify multiple migration events at greater time depths, across the full range of modern human history, albeit for the maternal line of descent only. In Arabia, this approach enables us to infer several major pulses of dispersal between the Near East and Arabia, most likely via the Gulf corridor. Although some relict lineages survive in Arabia from the time of the out-of-Africa dispersal, 60 ka, the major episodes in the peopling of the Peninsula took place from north to south in the Late Glacial and, to a lesser extent, the immediate post-glacial/Neolithic. Exchanges across the Red Sea were mainly due to the Arab slave trade and maritime dominance (from ~2.5 ka to very recent times), but had already begun by the early Holocene, fuelled by the establishment of maritime networks since ~8 ka. The main "back-to-Africa" migrations, again undetected by genome-wide dating analyses, occurred in the Late Glacial period for introductions into eastern Africa, whilst the Neolithic was more significant for migrations towards North Africa.

  12. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans; and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.

  13. Key Events in Student Leaders' Lives and Lessons Learned from Them

    ERIC Educational Resources Information Center

    Sessa, Valerie I.; Morgan, Brett V.; Kalenderli, Selin; Hammond, Fanny E.

    2014-01-01

    This descriptive study used an interview protocol developed by the Center for Creative Leadership with 50 college student leaders to determine what key developmental events young college leaders experience and the leadership lessons learned from these events. Students discussed 180 events and 734 lessons learned from them. Most events defined by…

  14. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    NASA Astrophysics Data System (ADS)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to

  15. Pan-cancer transcriptomic analysis associates long non-coding RNAs with key mutational driver events

    PubMed Central

    Ashouri, Arghavan; Sayin, Volkan I.; Van den Eynden, Jimmy; Singh, Simranjit X.; Papagiannakopoulos, Thales; Larsson, Erik

    2016-01-01

    Thousands of long non-coding RNAs (lncRNAs) lie interspersed with coding genes across the genome, and a small subset has been implicated as downstream effectors in oncogenic pathways. Here we make use of transcriptome and exome sequencing data from thousands of tumours across 19 cancer types, to identify lncRNAs that are induced or repressed in relation to somatic mutations in key oncogenic driver genes. Our screen confirms known coding and non-coding effectors and also associates many new lncRNAs to relevant pathways. The associations are often highly reproducible across cancer types, and while many lncRNAs are co-expressed with their protein-coding hosts or neighbours, some are intergenic and independent. We highlight lncRNAs with possible functions downstream of the tumour suppressor TP53 and the master antioxidant transcription factor NFE2L2. Our study provides a comprehensive overview of lncRNA transcriptional alterations in relation to key driver mutational events in human cancers. PMID:28959951

  16. Platelet activation is a key event in the pathogenesis of streptococcal infections.

    PubMed

    Jia, Ming; Xiong, Yuling; Lu, Hua; Li, Ruqing; Wang, Tiantian; Ye, Yanyao; Song, Min; Li, Bing; Jiang, Tianlun; Zhao, Shuming

    2015-06-01

    Diverse Streptococcus species, including S. pneumoniae, S. sanguis, S. gordonii, S. mitis and S. mutans, cause life-threatening conditions including pneumonia, bacteremia and meningitis. These diseases carry high morbidity and mortality, and for this reason understanding the key events in the pathogenesis of these infections has great significance for their prevention and/or treatment. Here, we describe how the activation of platelets and their affinity for binding bacterial proteins act as early key events in the pathogenesis of streptococcal infections.

  17. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

    …and full-scale experimental verifications towards ground-satellite quantum key distribution… Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. DISSERTATION. Jeffrey D. Morris… QUANTUM KEY DISTRIBUTION SIMULATION FRAMEWORK USING THE DISCRETE EVENT SYSTEM SPECIFICATION. DISSERTATION. Presented to the Faculty, Department of Systems…

  18. Preparedness of newly qualified midwives to deliver clinical care: an evaluation of pre-registration midwifery education through an analysis of key events.

    PubMed

    Skirton, Heather; Stephen, Nicole; Doris, Faye; Cooper, Maggie; Avis, Mark; Fraser, Diane M

    2012-10-01

    This study was part of a larger project commissioned to ascertain whether midwife teachers bring a unique contribution to the preparation of midwives for practice. The aim of this phase was to determine whether the student midwives' educational programme had equipped them to practise competently after entry to the professional register. This was a prospective, longitudinal qualitative study, using participant diaries to collect data. Data were collected from newly qualified midwives during the initial six months after they commenced their first post as a qualified midwife. The potential participants were all student midwives who were completing their education at one of six universities (three in England, one in Scotland, one in Wales and one in Northern Ireland). Diary data were submitted by 35 newly qualified midwives; 28 were graduates of the three-year programme and seven of the shortened programme. Diary entries were analysed using thematic analysis (Braun and Clarke, 2006), with a focus on identification of key events in the working lives of the newly qualified midwives. A total of 263 key events were identified, under three main themes: (1) impact of the event on confidence, (2) gaps in knowledge or experience and (3) articulated frustration, conflict or distress. Essentially, pre-registration education, delivered largely by midwife teachers and supported by clinical mentors, has been shown to equip newly qualified midwives to work effectively as autonomous practitioners caring for mothers and babies. While newly qualified midwives are able to cope with a range of challenging clinical situations in a safe manner, they lack confidence in key areas. Positive reinforcement by supportive colleagues plays a significant role in enabling them to develop as practitioners. Whilst acknowledging the importance of normality in childbearing, there is a need within the curriculum to enable midwives to recognise and respond to complex care situations by providing theory

  19. Poisson-event-based analysis of cell proliferation.

    PubMed

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
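
    The inter-event analysis described here can be sketched in a few lines. This is a toy reconstruction under stated assumptions, not the study's protocol: it simulates a nonhomogeneous Poisson process with an exponentially rising rate (as reported above) by thinning, then applies a Kolmogorov-Smirnov test of the inter-event times against the exponential law that a homogeneous Poisson process would satisfy.

    ```python
    # Simulate a nonhomogeneous Poisson process with an exponentially rising
    # rate (thinning algorithm), then KS-test the inter-event times against an
    # exponential law, as a homogeneous Poisson process would require.
    # Parameters are hypothetical stand-ins for the imaging data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    t_max, lam0, tau = 48.0, 0.5, 20.0      # hours; rate(t) = lam0 * exp(t / tau)
    lam_max = lam0 * np.exp(t_max / tau)

    # Thinning: draw candidates at the maximal rate, keep with prob rate(t)/lam_max.
    n_cand = rng.poisson(lam_max * t_max)
    candidates = np.sort(rng.uniform(0.0, t_max, n_cand))
    keep = rng.uniform(size=n_cand) < lam0 * np.exp(candidates / tau) / lam_max
    event_times = candidates[keep]

    intervals = np.diff(event_times)
    stat, p = stats.kstest(intervals, "expon", args=(0.0, intervals.mean()))
    print(f"{event_times.size} events; KS={stat:.3f}, p={p:.3g} "
          "(small p rejects a homogeneous Poisson process)")
    ```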

  20. Mining the key predictors for event outbreaks in social networks

    NASA Astrophysics Data System (ADS)

    Yi, Chengqi; Bao, Yuanyuan; Xue, Yibo

    2016-04-01

    It will be beneficial to devise a method to predict a so-called event outbreak. Existing works mainly focus on exploring effective methods for improving the accuracy of predictions, while ignoring the underlying causes: What makes an event go viral? What factors significantly influence the prediction of an event outbreak in social networks? In this paper, we proposed a novel definition for an event outbreak, taking into account the structural changes to a network during the propagation of content. In addition, we investigated features that were sensitive for predicting an event outbreak. In order to investigate the universality of these features at different stages of an event, we split the entire lifecycle of an event into 20 equal segments according to the proportion of the propagation time. We extracted 44 features, including features related to content, users, structure, and time, from each segment of the event. Based on these features, we proposed a prediction method using supervised classification algorithms to predict event outbreaks. Experimental results indicate that, as time goes by, our method is highly accurate, with a precision rate ranging from 79% to 97% and a recall rate ranging from 74% to 97%. In addition, after applying a feature-selection algorithm, the top five selected features can considerably improve the accuracy of the prediction. Data-driven experimental results show that the entropy of the eigenvector centrality, the entropy of the PageRank, the standard deviation of the betweenness centrality, the proportion of re-shares without content, and the average path length are the key predictors for an event outbreak. Our findings are especially useful for further exploring the intrinsic characteristics of outbreak prediction.
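
    The structural predictors the authors single out (entropy of eigenvector centrality, entropy of PageRank, standard deviation of betweenness centrality, average path length) are straightforward to compute with networkx. The sketch below uses a synthetic graph as a stand-in for a reshare network; the Shannon-entropy feature definition is our assumption, not the paper's code.

    ```python
    # Compute the reported key predictors on a synthetic stand-in for a reshare
    # network: entropy of eigenvector centrality, entropy of PageRank, standard
    # deviation of betweenness centrality, and average path length.
    import math
    from statistics import pstdev

    import networkx as nx

    def shannon_entropy(values):
        total = sum(values)
        probs = [v / total for v in values if v > 0]
        return -sum(p * math.log(p) for p in probs)

    G = nx.barabasi_albert_graph(200, 2, seed=1)   # hypothetical propagation graph

    features = {
        "entropy_eigenvector": shannon_entropy(nx.eigenvector_centrality(G, max_iter=1000).values()),
        "entropy_pagerank": shannon_entropy(nx.pagerank(G).values()),
        "std_betweenness": pstdev(nx.betweenness_centrality(G).values()),
        "avg_path_length": nx.average_shortest_path_length(G),
    }
    for name, value in features.items():
        print(f"{name}: {value:.4f}")
    ```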

  1. Markerless identification of key events in gait cycle using image flow.

    PubMed

    Vishnoi, Nalini; Duric, Zoran; Gerber, Naomi Lynn

    2012-01-01

    Gait analysis has been an interesting area of research for several decades. In this paper, we propose image-flow-based methods to compute the motion and velocities of different body segments automatically, using a single inexpensive video camera. We then identify and extract different events of the gait cycle (double-support, mid-swing, toe-off and heel-strike) from video images. Experiments were conducted in which four walking subjects were captured from the sagittal plane. Automatic segmentation was performed to isolate the moving body from the background. The head excursion and the shank motion were then computed to identify the key frames corresponding to different events in the gait cycle. Our approach does not require calibrated cameras or special markers to capture movement. We have also compared our method with the Optotrak 3D motion capture system and found our results in good agreement with the Optotrak results. The development of our method has potential use in the markerless and unencumbered video capture of human locomotion. Monitoring gait in homes and communities provides a useful application for the aged and the disabled. Our method could potentially be used as an assessment tool to determine gait symmetry or to establish the normal gait pattern of an individual.
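
    The image-flow ingredient of this method can be illustrated with OpenCV's dense optical flow. The sketch below only computes per-frame flow magnitude between consecutive grayscale frames; segmenting the flow into head and shank motion and mapping its extrema to gait events is the paper's contribution and is not reproduced here. The input file name is hypothetical.

    ```python
    # Dense optical flow between consecutive frames with OpenCV. Mapping flow
    # patterns of individual body segments to double-support, mid-swing,
    # toe-off, and heel-strike is the paper's method and is not reproduced
    # here. "gait.mp4" is a hypothetical sagittal-plane recording.
    import cv2

    cap = cv2.VideoCapture("gait.mp4")
    ok, frame = cap.read()
    if not ok:
        raise SystemExit("could not read video")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, _ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        # Near-zero whole-body flow suggests double support; swing phases show
        # a rising-then-falling flow magnitude on the moving shank.
        print(f"mean flow magnitude: {mag.mean():.3f}")
        prev_gray = gray

    cap.release()
    ```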

  2. Artist concept illustrating key events on day by day basis during Apollo 9

    NASA Technical Reports Server (NTRS)

    1969-01-01

    Artist concept illustrating key events on day by day basis during Apollo 9 mission. First photograph illustrates activities on the first day of the mission, including flight crew preparation, orbital insertion, 103 nautical mile orbit, separations, docking and docked Service Propulsion System Burn (19792); Second day events include landmark tracking, pitch maneuver, yaw-roll maneuver, and high apogee orbits (19793); Third day events include crew transfer and Lunar Module system evaluation (19794); Fourth day events include use of camera, day-night extravehicular activity, use of golden slippers, and television over Texas and Louisiana (19795); Fifth day events include vehicles undocked, Lunar Module burns for rendezvous, maximum separation, ascent propulsion system burn, formation flying and docking, and Lunar Module jettison ascent burn (19796); Sixth thru ninth day events include service propulsion system burns and landmark sightings, photograph special tests (19797); Tenth day events i

  3. EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.

    PubMed

    Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan

    2018-01-01

    Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.

  4. The Unfolding of LGBT Lives: Key Events Associated With Health and Well-being in Later Life

    PubMed Central

    Fredriksen-Goldsen, Karen I.; Bryan, Amanda E. B.; Jen, Sarah; Goldsen, Jayn; Kim, Hyun-Jun; Muraco, Anna

    2017-01-01

    Purpose of the Study: Life events are associated with the health and well-being of older adults. Using the Health Equity Promotion Model, this article explores historical and environmental context as it frames life experiences and adaptation of lesbian, gay, bisexual, and transgender (LGBT) older adults. Design and Methods: This was the largest study to date of LGBT older adults to identify life events related to identity development, work, and kin relationships and their associations with health and quality of life (QOL). Using latent profile analysis (LPA), clusters of life events were identified and associations between life event clusters were tested. Results: On average, LGBT older adults first disclosed their identities in their 20s; many experienced job-related discrimination. More had been in opposite-sex marriage than in same-sex marriage. Four clusters emerged: “Retired Survivors” were the oldest and one of the most prevalent groups; “Midlife Bloomers” first disclosed their LGBT identities in mid-40s, on average; “Beleaguered At-Risk” had high rates of job-related discrimination and few social resources; and “Visibly Resourced” had a high degree of identity visibility and were socially and economically advantaged. Clusters differed significantly in mental and physical health and QOL, with the Visibly Resourced faring best and Beleaguered At-Risk faring worst on most indicators; Retired Survivors and Midlife Bloomers showed similar health and QOL. Implications: Historical and environmental contexts frame normative and non-normative life events. Future research will benefit from the use of longitudinal data and an assessment of timing and sequencing of key life events in the lives of LGBT older adults. PMID:28087792
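
    Latent profile analysis of the kind used here is commonly approximated with a Gaussian mixture model. The sketch below clusters simulated life-event features into four profiles, mirroring the four-cluster result above; the feature choices and all data are simulated stand-ins for illustration only.

    ```python
    # Gaussian mixture model as a common stand-in for latent profile analysis:
    # cluster simulated life-event features into four profiles. All data are
    # simulated; features loosely mirror those named in the abstract.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(42)
    n = 500
    X = np.column_stack([
        rng.normal(35, 12, n),    # age at first identity disclosure
        rng.poisson(1.5, n),      # job-related discrimination events
        rng.normal(0, 1, n),      # social/economic resources (standardized)
    ])

    gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
    profile = gmm.fit_predict(X)
    for k in range(4):
        members = X[profile == k]
        print(f"profile {k}: n={len(members)}, feature means={members.mean(axis=0).round(2)}")
    ```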

  5. Web Video Event Recognition by Semantic Analysis From Ubiquitous Documents.

    PubMed

    Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng Tao

    2016-12-01

    In recent years, the task of event recognition from videos has attracted increasing interest in the multimedia area. While most of the existing research was mainly focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyze video content without any prior knowledge. Therefore, synthesizing both the visual and semantic analysis is a natural way for video event understanding. In this paper, we study the problem of Web video event recognition, where Web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from Web videos. In order to compensate for the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous Web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model, which explores the intrinsic correlation between the visual and textual cues of the videos to learn reliable classifiers. Extensive experiments on two real-world video data sets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of Web video event recognition.

  6. Fine-Scale Event Location and Error Analysis in NET-VISA

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.

    2016-12-01

    NET-VISA is a generative probabilistic model for the occurrence of seismic, hydro, and atmospheric events, and the propagation of energy from these events through various mediums and phases before being detected, or misdetected, by IMS stations. It is built on top of the basic station and arrival detection processing at the IDC, and is currently being tested in the IDC network processing pipelines. A key distinguishing feature of NET-VISA is that it is easy to incorporate prior scientific knowledge and historical data into the probabilistic model. The model accounts for both detections and mis-detections when forming events, and this allows it to make more accurate event hypotheses. It has been continuously evaluated since 2012, and in each year it achieves roughly a 60% reduction in the number of missed events without increasing the false event rate as compared to the existing GA algorithm. More importantly, the model finds large numbers of events that have been confirmed by regional seismic bulletins but missed by the IDC analysts using the same data. In this work we focus on enhancements to the model to improve the location accuracy and error ellipses. We will present a new version of the model that focuses on the fine scale around the event location, and present error ellipses and analysis of recent important events.

  7. Dose and Effect Thresholds for Early Key Events in a Mode of PPARa-Mediated Action

    EPA Science Inventory

    ABSTRACT Strategies for predicting adverse health outcomes of environmental chemicals are centered on early key events in toxicity pathways. However, quantitative relationships between early molecular changes in a given pathway and later health effects are often poorly defined. T...

  8. Dose and Effect Thresholds for Early Key Events in a Mode of ...

    EPA Pesticide Factsheets

    ABSTRACT Strategies for predicting adverse health outcomes of environmental chemicals are centered on early key events in toxicity pathways. However, quantitative relationships between early molecular changes in a given pathway and later health effects are often poorly defined. The goal of this study was to evaluate short-term key event indicators using qualitative and quantitative methods in an established pathway of mouse liver tumorigenesis mediated by peroxisome proliferator-activated receptor-alpha (PPARα). Male B6C3F1 mice were exposed for 7 days to di(2-ethylhexyl) phthalate (DEHP), di-n-octyl phthalate (DNOP), and n-butyl benzyl phthalate (BBP), which vary in PPARα activity and liver tumorigenicity. Each phthalate increased expression of select PPARα target genes at 7 days, while only DEHP significantly increased liver cell proliferation labeling index (LI). Transcriptional benchmark dose (BMDT) estimates for dose-related genomic markers stratified phthalates according to hypothetical tumorigenic potencies, unlike BMDs for non-genomic endpoints (liver weights or proliferation). The 7-day BMDT values for Acot1 as a surrogate measure for PPARα activation were 29, 370, and 676 mg/kg-d for DEHP, DNOP, and BBP, respectively, distinguishing DEHP (liver tumor BMD of 35 mg/kg-d) from non-tumorigenic DNOP and BBP. Effect thresholds were generated using linear regression of DEHP effects at 7 days and 2-year tumor incidence values to anchor early response molec

  9. The Unfolding of LGBT Lives: Key Events Associated With Health and Well-being in Later Life.

    PubMed

    Fredriksen-Goldsen, Karen I; Bryan, Amanda E B; Jen, Sarah; Goldsen, Jayn; Kim, Hyun-Jun; Muraco, Anna

    2017-02-01

    Life events are associated with the health and well-being of older adults. Using the Health Equity Promotion Model, this article explores historical and environmental context as it frames life experiences and adaptation of lesbian, gay, bisexual, and transgender (LGBT) older adults. This was the largest study to date of LGBT older adults to identify life events related to identity development, work, and kin relationships and their associations with health and quality of life (QOL). Using latent profile analysis (LPA), clusters of life events were identified and associations between life event clusters were tested. On average, LGBT older adults first disclosed their identities in their 20s; many experienced job-related discrimination. More had been in opposite-sex marriage than in same-sex marriage. Four clusters emerged: "Retired Survivors" were the oldest and one of the most prevalent groups; "Midlife Bloomers" first disclosed their LGBT identities in mid-40s, on average; "Beleaguered At-Risk" had high rates of job-related discrimination and few social resources; and "Visibly Resourced" had a high degree of identity visibility and were socially and economically advantaged. Clusters differed significantly in mental and physical health and QOL, with the Visibly Resourced faring best and Beleaguered At-Risk faring worst on most indicators; Retired Survivors and Midlife Bloomers showed similar health and QOL. Historical and environmental contexts frame normative and non-normative life events. Future research will benefit from the use of longitudinal data and an assessment of timing and sequencing of key life events in the lives of LGBT older adults. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Bayesian analysis of rare events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.

  11. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
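
    The BUS idea, recasting Bayesian updating as rejection sampling so that rare-event estimators can be reused, can be shown in a few lines of brute-force Monte Carlo. In practice BUS pairs this with FORM, importance sampling, or Subset Simulation rather than plain sampling; the prior, likelihood, and rare-event threshold below are toy assumptions.

    ```python
    # BUS in its simplest form: Bayesian updating as rejection sampling,
    # followed by a rare-event probability estimate on the accepted (posterior)
    # samples. Real BUS implementations replace the brute-force loop with FORM,
    # importance sampling, or Subset Simulation. All quantities are toys.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 1_000_000

    theta = rng.normal(0.0, 1.0, n)                 # prior samples of a parameter
    obs, sigma = 0.8, 0.5                           # one noisy observation of theta
    likelihood = stats.norm.pdf(obs, loc=theta, scale=sigma)
    c = stats.norm.pdf(0.0, loc=0.0, scale=sigma)   # constant with c >= max likelihood

    accept = rng.uniform(size=n) < likelihood / c   # BUS rejection step
    posterior = theta[accept]

    threshold = 2.0                                 # "failure" when theta exceeds this
    p_failure = np.mean(posterior > threshold)
    print(f"acceptance rate: {accept.mean():.3f}")
    print(f"posterior P(theta > {threshold}) ~ {p_failure:.2e} "
          f"from {posterior.size} posterior samples")
    ```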

  12. 14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...

  13. 14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...

  14. 14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...

  15. 14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...

  16. 14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...

  17. Survival analysis: Part I — analysis of time-to-event

    PubMed Central

    2018-01-01

    Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
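
    The Kaplan-Meier method named in this review reduces to a short product-limit computation. The sketch below writes the estimator out directly on hypothetical durations and censoring indicators.

    ```python
    # Product-limit (Kaplan-Meier) estimator written out directly.
    # Durations and event flags (1 = event, 0 = censored) are hypothetical.
    import numpy as np

    def kaplan_meier(durations, observed):
        durations = np.asarray(durations, dtype=float)
        observed = np.asarray(observed, dtype=bool)
        event_times = np.unique(durations[observed])
        surv, s = [], 1.0
        for t in event_times:
            at_risk = np.sum(durations >= t)            # still under follow-up at t
            events = np.sum((durations == t) & observed)
            s *= 1.0 - events / at_risk                 # product-limit update
            surv.append(s)
        return event_times, np.array(surv)

    durations = [5, 8, 8, 12, 15, 17, 21, 21, 28, 30]
    observed = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
    for t, s in zip(*kaplan_meier(durations, observed)):
        print(f"t = {t:4.0f}   S(t) = {s:.3f}")
    ```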

  18. Twitter data analysis: temporal and term frequency analysis with real-time event

    NASA Astrophysics Data System (ADS)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (WWW) has become a prominent and huge source of user-generated content and opinionative data. Among the various social media, Twitter has gained popularity as it offers a fast and effective way of sharing users’ perspectives on critical and other issues in different domains, such as the ‘Political’, ‘Entertainment’ and ‘Business’ domains. As this data is generated at scale in the cloud, it has opened doors for researchers in the field of data science and analysis. Twitter provides several APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collection of user profiles, friends and followers; and 3) the Streaming API, which collects details such as tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically unfolding event. For this we focus on the ‘Entertainment’ domain, especially ‘Sports’, as IPL-T20 is currently the trending ongoing event. We collect this large volume of tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term-frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the temporal data of the event and of ranking players or teams based on popularity, which helps people understand the key influencers on the social media platform.
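
    The storage and term-frequency steps described here can be sketched against a MongoDB collection of tweet documents. The database, collection, and field names below are assumptions about the schema, not the authors' implementation; the timestamp format is Twitter's classic created_at layout.

    ```python
    # Bucket stored tweets by minute and count hashtags per bucket. Database,
    # collection, and field names are assumed ("tweets"/"ipl", "created_at",
    # "text"), as is Twitter's classic created_at timestamp layout.
    from collections import Counter, defaultdict
    from datetime import datetime

    from pymongo import MongoClient

    coll = MongoClient("mongodb://localhost:27017")["tweets"]["ipl"]

    per_minute = defaultdict(Counter)
    for doc in coll.find({}, {"created_at": 1, "text": 1}):
        ts = datetime.strptime(doc["created_at"], "%a %b %d %H:%M:%S %z %Y")
        bucket = ts.strftime("%Y-%m-%d %H:%M")
        hashtags = [w.lower() for w in doc["text"].split() if w.startswith("#")]
        per_minute[bucket].update(hashtags)

    # Spikes in per-bucket tweet volume flag "interesting moments"; aggregate
    # hashtag counts rank players/teams by popularity.
    for bucket in sorted(per_minute)[:10]:
        print(bucket, per_minute[bucket].most_common(3))
    ```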

  19. Event-based analysis of free-living behaviour.

    PubMed

    Granat, Malcolm H

    2012-11-01

    The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also on how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, dependent on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis on the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.

  20. Evaluation of key events in the mode of action for a carry-over carcinogen in mice

    EPA Science Inventory

    Evaluation of key events in the mode of action for a carry-over carcinogen in mice. Charles E. Wood, April D. Lake, Greg Olson, Michael H. George, Susan D. Hester, Anthony B. DeAngelo. Introduction: Early life environmental exposures are established determinants for adverse health...

  1. Key terms for the assessment of the safety of vaccines in pregnancy: Results of a global consultative process to initiate harmonization of adverse event definitions.

    PubMed

    Munoz, Flor M; Eckert, Linda O; Katz, Mark A; Lambach, Philipp; Ortiz, Justin R; Bauwens, Jorgen; Bonhoeffer, Jan

    2015-11-25

    The variability of terms and definitions of Adverse Events Following Immunization (AEFI) represents a missed opportunity for optimal monitoring of safety of immunization in pregnancy. In 2014, the Brighton Collaboration Foundation and the World Health Organization (WHO) collaborated to address this gap. Two Brighton Collaboration interdisciplinary taskforces were formed. A landscape analysis included: (1) a systematic literature review of adverse event definitions used in vaccine studies during pregnancy; (2) a worldwide stakeholder survey of available terms and definitions; and (3) a series of taskforce meetings. Based on available evidence, the taskforces proposed key terms and concept definitions to be refined, prioritized, and endorsed by a global expert consultation convened by WHO in Geneva, Switzerland in July 2014. Using pre-specified criteria, 45 maternal and 62 fetal/neonatal events were prioritized, and key terms and concept definitions were endorsed. In addition, recommendations to further improve safety monitoring of immunization in pregnancy programs were specified. This includes elaboration of disease concepts into standardized case definitions with sufficient applicability and positive predictive value to be of use for monitoring the safety of immunization in pregnancy globally, as well as the development of guidance, tools, and datasets in support of a globally concerted approach. There is a need to improve the safety monitoring of immunization in pregnancy programs. A consensus list of terms and concept definitions of key events for monitoring immunization in pregnancy is available. Immediate actions to further strengthen monitoring of immunization in pregnancy programs are identified and recommended. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    PubMed

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents-food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework-the Key Events Dose-Response Framework (KEDRF)-for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  3. A formal framework of scenario creation and analysis of extreme hydrological events

    NASA Astrophysics Data System (ADS)

    Lohmann, D.

    2007-12-01

    We present a formal framework for hydrological risk analysis. Different measures of risk are introduced, such as average annual loss and occurrence exceedance probability; these are important measures for, e.g., insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation and also have correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
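
    The two risk measures named above can be illustrated directly. A minimal sketch, assuming a hypothetical simulated event catalogue rather than actual RMS model output: average annual loss (AAL) is the mean total loss per simulated year, and occurrence exceedance probability (OEP) is the chance that the largest single event loss in a year exceeds a threshold.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical catalogue: 100,000 simulated years, each with 0..n loss events.
        n_years = 100_000
        events_per_year = rng.poisson(0.2, size=n_years)
        year_losses = [rng.lognormal(14.0, 1.5, size=k) for k in events_per_year]

        # Average annual loss (AAL): mean total loss per simulated year.
        aal = np.mean([losses.sum() for losses in year_losses])

        # Occurrence exceedance probability (OEP): chance that the largest
        # single event loss in a year exceeds a threshold.
        max_event = np.array([losses.max() if losses.size else 0.0
                              for losses in year_losses])
        threshold = 50e6
        oep = np.mean(max_event > threshold)

        print(f"AAL: {aal:,.0f}   OEP(> {threshold:,.0f}): {oep:.4%}")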

  4. Research on Visual Analysis Methods of Terrorism Events

    NASA Astrophysics Data System (ADS)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    As terrorism events occur with increasing frequency throughout the world, improving the capability to respond to social security incidents has become an important test of a government's ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results demonstrate the validity of the methods.
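
    A "5W" event table of the kind described (who, what, when, where, why) can be sketched as follows; the schema and sample row are hypothetical, not the authors' actual design:

        import sqlite3

        # Hypothetical "5W" event table: who, what, when, where, why per incident.
        con = sqlite3.connect(":memory:")
        con.execute("""
            CREATE TABLE terror_event (
                event_id  INTEGER PRIMARY KEY,
                who       TEXT,     -- perpetrator group
                what      TEXT,     -- attack type
                happened  TEXT,     -- date (ISO 8601); 'when' is a reserved word
                where_lat REAL,
                where_lon REAL,
                why       TEXT      -- stated motive, if known
            )""")
        con.execute("INSERT INTO terror_event VALUES (1, 'unknown', 'bombing', "
                    "'2014-06-01', 33.3, 44.4, NULL)")

        # Spatio-temporal slice, e.g. the point layer for a thematic map:
        rows = con.execute("SELECT where_lat, where_lon FROM terror_event "
                           "WHERE happened BETWEEN '2014-01-01' AND '2014-12-31'").fetchall()
        print(rows)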

  5. Joint Attributes and Event Analysis for Multimedia Event Detection.

    PubMed

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry the dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm built on a correlation vector that correlates them to a target event. Consequently, we can incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.
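
    As a rough illustration of the correlation-vector idea only (a simplified stand-in for the authors' joint framework, with synthetic attribute scores and an arbitrary late-fusion rule):

        import numpy as np

        rng = np.random.default_rng(0)
        n_videos, n_attrs = 200, 50

        # Synthetic attribute scores and binary event labels for training videos.
        A = rng.random((n_videos, n_attrs))
        y = (A[:, 3] + A[:, 17] + rng.normal(0, 0.2, n_videos) > 1.2).astype(float)

        # Correlation vector: per-attribute Pearson correlation with the event.
        A_c = A - A.mean(axis=0)
        y_c = y - y.mean()
        corr = (A_c * y_c[:, None]).sum(axis=0) / (
            np.linalg.norm(A_c, axis=0) * np.linalg.norm(y_c))

        def fused_score(event_score, attr_scores, alpha=0.3):
            """Blend the base detector score with attribute evidence weighted
            by the correlation vector (arbitrary fusion rule for illustration)."""
            return (1 - alpha) * event_score + \
                   alpha * attr_scores @ corr / np.abs(corr).sum()

        print(corr[[3, 17]].round(2))        # informative attributes stand out
        print(round(fused_score(0.4, A[0]), 3))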

  6. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    PubMed

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve ones, "electrical medical equipment" with four ones, and "diagnosis, surgery" with four ones. The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new
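
    The described standardization-plus-regression procedure can be sketched as follows, using hypothetical per-field application counts; the "key field" rule encodes the abstract's criterion of a high relative amount or an increasing trend:

        import numpy as np

        years = np.arange(2006, 2013)
        # Hypothetical application counts: rows = fields, columns = years.
        counts = np.array([
            [120, 150, 180, 210, 260, 300, 340],   # field A: growing
            [400, 390, 410, 405, 395, 400, 398],   # field B: flat
        ])

        # Standardize across fields within each year (z-score per column).
        z = (counts - counts.mean(axis=0)) / counts.std(axis=0)

        for row, name in zip(z, ["field A", "field B"]):
            relative_amount = row.mean()            # mean standardized value
            slope = np.polyfit(years, row, 1)[0]    # trend over 2006-2012
            key = relative_amount > 0 or slope > 0  # quadrant criterion
            print(name, round(relative_amount, 2), round(slope, 3),
                  "key" if key else "non-key")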

  7. Crop Damage by Primates: Quantifying the Key Parameters of Crop-Raiding Events

    PubMed Central

    Wallace, Graham E.; Hill, Catherine M.

    2012-01-01

    Human-wildlife conflict often arises from crop-raiding, and insights regarding which aspects of raiding events determine crop loss are essential when developing and evaluating deterrents. However, because accounts of crop-raiding behaviour are frequently indirect, these parameters are rarely quantified or explicitly linked to crop damage. Using systematic observations of the behaviour of non-human primates on farms in western Uganda, this research identifies number of individuals raiding and duration of raid as the primary parameters determining crop loss. Secondary factors include distance travelled onto farm, age composition of the raiding group, and whether raids are in series. Regression models accounted for greater proportions of variation in crop loss when increasingly crop and species specific. Parameter values varied across primate species, probably reflecting differences in raiding tactics or perceptions of risk, and thereby providing indices of how comfortable primates are on-farm. Median raiding-group sizes were markedly smaller than the typical sizes of social groups. The research suggests that key parameters of raiding events can be used to measure the behavioural impacts of deterrents to raiding. Furthermore, farmers will benefit most from methods that discourage raiding by multiple individuals, reduce the size of raiding groups, or decrease the amount of time primates are on-farm. This study demonstrates the importance of directly relating crop loss to the parameters of raiding events, using systematic observations of the behaviour of multiple primate species. PMID:23056378

  8. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
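
    As a minimal illustration of the modeling style described (not the authors' model), a discrete event simulation of service requests contending for a fixed pool of servers, with a probability distribution for demand and a resource constraint:

        import heapq, random

        def simulate(n_servers=4, n_requests=1000, seed=1):
            """Minimal discrete event simulation of requests and servers."""
            random.seed(seed)
            free_at = [0.0] * n_servers          # next time each server is free
            heapq.heapify(free_at)
            t, waits = 0.0, []
            for _ in range(n_requests):
                t += random.expovariate(2.0)              # request arrival
                service = random.expovariate(1.0 / 0.4)   # service duration
                start = max(t, heapq.heappop(free_at))    # earliest free server
                waits.append(start - t)
                heapq.heappush(free_at, start + service)
            return sum(waits) / len(waits)

        print(f"mean wait: {simulate():.3f} time units")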

  9. Key components of financial-analysis education for clinical nurses.

    PubMed

    Lim, Ji Young; Noh, Wonjung

    2015-09-01

    In this study, we identified key components of financial-analysis education for clinical nurses. We used a literature review, focus group discussions, and a content validity index survey to develop the key components of financial-analysis education. First, a wide range of references were reviewed, and 55 financial-analysis education components were gathered. Second, two focus group discussions were performed; the participants were 11 nurses who had worked for more than 3 years in a hospital, and nine components were agreed upon. Third, 12 professionals, including professors, a nurse executive, nurse managers, and an accountant, participated in the content validity index survey. Finally, six key components of financial-analysis education were selected. These key components were as follows: understanding the need for financial analysis, introduction to financial analysis, reading and implementing balance sheets, reading and implementing income statements, understanding the concepts of financial ratios, and interpretation and practice of financial ratio analysis. The results of this study will be used to develop an education program to increase financial-management competency among clinical nurses. © 2015 Wiley Publishing Asia Pty Ltd.

  10. Gait Event Detection in Real-World Environment for Long-Term Applications: Incorporating Domain Knowledge Into Time-Frequency Analysis.

    PubMed

    Khandelwal, Siddhartha; Wickstrom, Nicholas

    2016-12-01

    Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment, which involves different gait speeds, changing walking terrains, varying surface inclinations, and regular turns, among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.

  11. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis

    PubMed Central

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    Background This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Methodology/Principal Findings Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006–2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006–2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including “natural products and polymers” with nine key technical points, “fermentation industry” with twelve ones, “electrical medical equipment” with four ones, and “diagnosis, surgery” with four ones. Conclusions/Significance The results of this study could provide guidance on the development

  12. Protein analysis: key to the future.

    PubMed

    Boodhun, Nawsheen

    2018-05-01

    Protein analysis is crucial to elucidating the function of proteins and understanding the impact of their presence, absence and alteration. This is key to advancing knowledge about diseases, providing the opportunity for biomarker discovery and development of therapeutics. In this issue of Tech News, Nawsheen Boodhun explores the various means of protein analysis.

  13. Screening key candidate genes and pathways involved in insulinoma by microarray analysis.

    PubMed

    Zhou, Wuhua; Gong, Li; Li, Xuefeng; Wan, Yunyan; Wang, Xiangfei; Li, Huili; Jiang, Bin

    2018-06-01

    Insulinoma is a rare type of tumor and its genetic features remain largely unknown. This study aimed to search for potential key genes and relevant enriched pathways of insulinoma. The gene expression data from GSE73338 were downloaded from the Gene Expression Omnibus database. Differentially expressed genes (DEGs) were identified between insulinoma tissues and normal pancreas tissues, followed by pathway enrichment analysis, protein-protein interaction (PPI) network construction, and module analysis. The expressions of candidate key genes were validated by quantitative real-time polymerase chain reaction (RT-PCR) in insulinoma tissues. A total of 1632 DEGs were obtained, including 1117 upregulated genes and 514 downregulated genes. Pathway enrichment results showed that upregulated DEGs were significantly implicated in insulin secretion, and downregulated DEGs were mainly enriched in pancreatic secretion. PPI network analysis revealed 7 hub genes with degrees greater than 10, including GCG (glucagon), GCGR (glucagon receptor), PLCB1 (phospholipase C, beta 1), CASR (calcium sensing receptor), F2R (coagulation factor II thrombin receptor), GRM1 (glutamate metabotropic receptor 1), and GRM5 (glutamate metabotropic receptor 5). DEGs involved in the significant modules were enriched in the calcium signaling pathway, protein ubiquitination, and platelet degranulation. Quantitative RT-PCR data confirmed that the expression trends of these hub genes were similar to the results of the bioinformatic analysis. The present study demonstrated that the candidate DEGs and enriched pathways were potential critical molecular events involved in the development of insulinoma, and these findings are useful for a better understanding of insulinoma genesis.

  14. Root cause analysis of critical events in neurosurgery, New South Wales.

    PubMed

    Perotti, Vanessa; Sheridan, Mark M P

    2015-09-01

    Adverse events reportedly occur in 5% to 10% of health care episodes. Not all adverse events are the result of error; they may arise from systemic faults in the delivery of health care. Catastrophic events are not only physically devastating to patients, but they also attract medical liability and increase health care costs. Root cause analysis (RCA) has become a key tool for health care services to understand those adverse events. This study is a review of all the RCA case reports involving neurosurgical patients in New South Wales between 2008 and 2013. The case reports and data were obtained from the Clinical Excellence Commission database. The data was then categorized by the root causes identified and the recommendations suggested by the RCA committees. Thirty-two case reports were identified in the RCA database. Breaches in policy account for the majority of root causes identified, for example, delays in transfer of patients or wrong-site surgery, which always involved poor adherence to correct patient and site identification procedures. The RCA committees' recommendations included education for staff, and improvements in rostering and procedural guidelines. RCAs have improved the patient safety profile; however, the RCA committees have no power to enforce any recommendation or ensure compliance. A single RCA may provide little learning beyond the unit and staff involved. However, through aggregation of RCA data and dissemination strategies, health care workers can learn from adverse events and prevent future events from occurring. © 2015 Royal Australasian College of Surgeons.

  15. Textual analysis of tobacco editorials: how are key media gatekeepers framing the issues?

    PubMed

    Smith, Katherine Clegg; Wakefield, Melanie

    2005-01-01

    The news media's potential to promote awareness of health issues is established, and media advocacy is now an important tool in combating tobacco use. This study examines newspaper editors' perspectives on tobacco-related issues through a textual analysis of tobacco-related editorials. The data consist of editorials on tobacco from a sample of 310 U.S. daily newspapers over the course of 1 year (2001). Data were sampled from a random one-third of the days per month, yielding 162 editorials for analysis. A qualitative textual analysis was conducted, with each editorial coded for theme, position, and frame. We analyzed the topics gaining editorial attention and the arguments made to support various perspectives. Editorials discussed a variety of both positive and negative news events, largely conveying support for tobacco-control objectives. Various organizing frames were used: supporting policy interventions, condemning the industry, highlighting individual rights, and expressing general cynicism were most prevalent. Editors largely promoted tobacco-control efforts, particularly policy advances. There was, however, little coverage of key issues such as health effects and addiction, perhaps because they are no longer perceived to be contentious. Advocates should seek to address this area and minimize the cynicism of key media gatekeepers to avoid undermining policy and individual change efforts.

  16. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality, and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
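
    A minimal sketch of precursor-type event coincidence analysis, with synthetic flood and outbreak times and a uniform-shuffle surrogate test standing in for the Poisson null described above:

        import numpy as np

        def coincidence_rate(a_times, b_times, delta=7.0, tau=0.0):
            """Fraction of events in series A followed by at least one event
            in series B within the window (t + tau, t + tau + delta]."""
            a = np.asarray(a_times, float)
            b = np.sort(np.asarray(b_times, float))
            lo = np.searchsorted(b, a + tau, side="right")
            hi = np.searchsorted(b, a + tau + delta, side="right")
            return np.mean(hi > lo)

        # Hypothetical daily event times: floods (A) and outbreaks (B).
        rng = np.random.default_rng(42)
        floods = np.sort(rng.uniform(0, 3650, 40))
        outbreaks = np.concatenate([floods[:15] + rng.uniform(1, 7, 15),
                                    rng.uniform(0, 3650, 25)])
        obs = coincidence_rate(floods, outbreaks, delta=7.0)

        # Surrogate null: redistribute B uniformly over the observation period.
        null = [coincidence_rate(floods, rng.uniform(0, 3650, outbreaks.size))
                for _ in range(2000)]
        p = np.mean(np.asarray(null) >= obs)
        print(f"coincidence rate {obs:.2f}, surrogate p-value {p:.4f}")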

  17. Event reweighting with the NuWro neutrino interaction generator

    NASA Astrophysics Data System (ADS)

    Pickering, Luke; Stowell, Patrick; Sobczyk, Jan

    2017-09-01

    Event reweighting has been implemented in the NuWro neutrino event generator for a number of free theory parameters in the interaction model. Event reweighting is a key analysis technique, used to efficiently study the effect of neutrino interaction model uncertainties. This opens up the possibility for NuWro to be used as a primary event generator by experimental analysis groups. A preliminary model tuning to ANL and BNL data of quasi-elastic and single pion production events was performed to validate the reweighting engine.
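
    The reweighting idea itself is a per-event ratio of model predictions under new and generated parameter values. A toy sketch with a dipole-form-factor stand-in (this is illustrative only, not NuWro's interaction model or API):

        import numpy as np

        def dipole(q2, m_a):
            """Toy dipole form factor; a stand-in for a real cross-section model."""
            return 1.0 / (1.0 + q2 / m_a**2) ** 2

        def reweight(q2, m_a_gen=1.0, m_a_new=1.2):
            """Per-event weight: prediction under the new parameter divided by
            the prediction under the value used at generation (sigma ~ |F|^2 here)."""
            return (dipole(q2, m_a_new) / dipole(q2, m_a_gen)) ** 2

        q2 = np.random.default_rng(0).exponential(0.5, size=10_000)  # toy Q^2 sample
        weights = reweight(q2)
        # Histograms filled with these weights reflect the shifted parameter
        # without regenerating events.
        print(weights.mean())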

  18. Interpretation Analysis as a Competitive Event.

    ERIC Educational Resources Information Center

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  19. 76 FR 68314 - Special Local Regulations; Key West World Championship, Atlantic Ocean; Key West, FL

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-04

    ...-AA08 Special Local Regulations; Key West World Championship, Atlantic Ocean; Key West, FL AGENCY: Coast... World Championship, a series of high-speed boat races. The event is scheduled to take place on Wednesday... Key West World Championship, a series of high-speed boat races. The event will be held on the waters...

  20. Event-scale power law recession analysis: quantifying methodological uncertainty

    NASA Astrophysics Data System (ADS)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship
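
    A minimal sketch of the underlying model fit: the power law recession -dQ/dt = aQ^b estimated by log-log regression on finite differences of a single recession event, one of the fitting choices whose methodological sensitivity the paper examines (synthetic data; the paper compares several such definitions and techniques):

        import numpy as np

        def fit_recession(q, dt=1.0):
            """Fit -dQ/dt = a * Q^b to one recession event by log-log regression.
            q: discharge values of a single declining segment."""
            q = np.asarray(q, float)
            dq_dt = np.diff(q) / dt
            q_mid = 0.5 * (q[1:] + q[:-1])      # midpoint discharge
            mask = dq_dt < 0                    # keep strictly receding steps
            b, log_a = np.polyfit(np.log(q_mid[mask]), np.log(-dq_dt[mask]), 1)
            return np.exp(log_a), b

        # Synthetic test: exact power law recession with a = 0.01, b = 1.5.
        t = np.arange(0, 30.0)
        a_true, b_true, q0 = 0.01, 1.5, 20.0
        q = (q0 ** (1 - b_true) + (b_true - 1) * a_true * t) ** (1 / (1 - b_true))
        print(fit_recession(q))   # recovers roughly (0.01, 1.5)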

  1. Second-Order Analysis of Semiparametric Recurrent Event Processes

    PubMed Central

    Guan, Yongtao

    2011-01-01

    Summary: A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this paper, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, UK to illustrate their practical value. PMID:21361885

  2. Physical and Chemical Analytical Analysis: A key component of Bioforensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velsko, S P

    The anthrax letters event of 2001 has raised our awareness of the potential importance of non-biological measurements on samples of biological agents used in a terrorism incident. Such measurements include a variety of mass spectral, spectroscopic, and other instrumental techniques that are part of the current armamentarium of the modern materials analysis or analytical chemistry laboratory. They can provide morphological, trace element, isotopic, and other molecular "fingerprints" of the agent that may be key pieces of evidence, supplementing that obtained from genetic analysis or other biological properties. The generation and interpretation of such data represents a new domain of forensic science, closely aligned with other areas of "microbial forensics". This paper describes some major elements of the R&D agenda that will define this sub-field in the immediate future and provide the foundations for a coherent national capability. Data from chemical and physical analysis of BW materials can be useful to an investigation of a bio-terror event in two ways. First, it can be used to compare evidence samples collected at different locations where such incidents have occurred (e.g. between the powders in the New York and Washington letters in the Amerithrax investigation) or between the attack samples and those seized during the investigation of sites where it is suspected the material was manufactured (if such samples exist). Matching of sample properties can help establish the relatedness of disparate incidents, and mis-matches might exclude certain scenarios, or signify a more complex etiology of the events under investigation. Chemical and morphological analysis for sample matching has a long history in forensics, and is likely to be acceptable in principle in court, assuming that match criteria are well defined and derived from known limits of precision of the measurement techniques in question. Thus, apart from certain operational issues (such as how

  3. Surface Management System Departure Event Data Analysis

    NASA Technical Reports Server (NTRS)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance for push-back events and a significantly high overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  4. Time-to-first-event versus recurrent-event analysis: points to consider for selecting a meaningful analysis strategy in clinical trials with composite endpoints.

    PubMed

    Rauch, Geraldine; Kieser, Meinhard; Binder, Harald; Bayes-Genis, Antoni; Jahn-Eimermacher, Antje

    2018-05-01

    Composite endpoints combining several event types of clinical interest often define the primary efficacy outcome in cardiologic trials. They are commonly evaluated as time-to-first-event, thereby following the recommendations of regulatory agencies. However, to assess the patient's full disease burden and to identify preventive factors or interventions, subsequent events following the first one should be considered as well. This is especially important in cohort studies and RCTs with a long follow-up, which leads to a higher number of observed events per patient. So far, there exist no recommendations as to which approach should be preferred. Recently, the Cardiovascular Round Table of the European Society of Cardiology indicated the need to investigate "how to interpret results if recurrent-event analysis results differ […] from time-to-first-event analysis" (Anker et al., Eur J Heart Fail 18:482-489, 2016). This work addresses this topic by means of a systematic simulation study. This paper compares two common analysis strategies for composite endpoints that differ with respect to the incorporation of recurrent events, for typical data scenarios motivated by a clinical trial. We show that the treatment effects estimated from a time-to-first-event analysis (Cox model) and a recurrent-event analysis (Andersen-Gill model) can systematically differ, particularly in cardiovascular trials. Moreover, we provide guidance on how to interpret these results and recommend points to consider for the choice of a meaningful analysis strategy. When planning trials with a composite endpoint, researchers and regulatory agencies should be aware that the model choice affects the estimated treatment effect and its interpretation.
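
    The two strategies can be contrasted directly. A sketch using the lifelines library (assumed available) on hypothetical counting-process data: the Andersen-Gill fit uses every at-risk interval, while the time-to-first-event Cox fit keeps only each patient's first interval.

        import pandas as pd
        from lifelines import CoxPHFitter, CoxTimeVaryingFitter

        # Hypothetical counting-process data: one row per at-risk interval,
        # several intervals per patient when events recur.
        ag = pd.DataFrame({
            "id":    [1, 1, 1, 2, 2, 3, 4, 4, 4],
            "start": [0, 5, 9, 0, 7, 0, 0, 6, 10],
            "stop":  [5, 9, 12, 7, 15, 14, 6, 10, 11],
            "event": [1, 1, 0, 1, 0, 0, 1, 1, 0],
            "treat": [1, 1, 1, 0, 0, 1, 0, 0, 0],
        })

        # Andersen-Gill: every recurrent event contributes.
        ctv = CoxTimeVaryingFitter().fit(ag, id_col="id", event_col="event",
                                         start_col="start", stop_col="stop")

        # Time-to-first-event: keep only each patient's first interval.
        first = ag.groupby("id").first().reset_index().rename(columns={"stop": "T"})
        cph = CoxPHFitter().fit(first[["T", "event", "treat"]],
                                duration_col="T", event_col="event")

        print(ctv.summary["exp(coef)"])   # hazard ratio, Andersen-Gill
        print(cph.summary["exp(coef)"])   # hazard ratio, time-to-first-event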

  5. Second-order analysis of semiparametric recurrent event processes.

    PubMed

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.

  6. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
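
    A much-simplified sketch of the inlier/outlier idea: model subsequences of a feature time series, build an affinity matrix between the models, and read candidate events off the dominant eigenvector (1-D Gaussian subsequence models and synthetic data here; the paper's features and statistical models are richer):

        import numpy as np

        rng = np.random.default_rng(3)
        # Toy mid-level audio feature: a long background with two short bursts.
        x = rng.normal(0.0, 1.0, 600)
        x[200:220] += 6.0
        x[440:480] += 6.0

        # Model each subsequence by a 1-D Gaussian (sample mean and variance).
        win = 20
        segs = x.reshape(-1, win)
        mu, var = segs.mean(axis=1), segs.var(axis=1) + 1e-6

        # Symmetrized KL divergence between the Gaussians -> affinity matrix.
        j = 0.5 * (var[:, None] / var + var / var[:, None]
                   + (mu[:, None] - mu) ** 2 * (1.0 / var + 1.0 / var[:, None])) - 1.0
        A = np.exp(-j / j.mean())

        # Background segments dominate the leading eigenvector; outlier
        # segments (candidate "interesting" events) receive small entries.
        w, v = np.linalg.eigh(A)
        score = np.abs(v[:, -1])
        print(sorted(np.argsort(score)[:3] * win))   # burst starts: 200, 440, 460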

  7. Event Reports Promoting Root Cause Analysis.

    PubMed

    Pandit, Swananda; Gong, Yang

    2016-01-01

    Improving health is the sole objective of medical care. Unfortunately, mishaps or patient safety events happen during care. If safety events were collected effectively, they would help identify patterns and underlying causes, and ultimately generate proactive and remedial solutions for the prevention of recurrence. Based on the AHRQ Common Formats, we examine the quality of patient safety incident reports and describe the initial data requirements that can support and accelerate effective root cause analysis. The ultimate goal is to develop a knowledge base of patient safety events and their common solutions which can be readily available for sharing and learning.

  8. Spacecraft-to-Earth Communications for Juno and Mars Science Laboratory Critical Events

    NASA Technical Reports Server (NTRS)

    Soriano, Melissa; Finley, Susan; Jongeling, Andre; Fort, David; Goodhart, Charles; Rogstad, David; Navarro, Robert

    2012-01-01

    Deep space communications typically utilize closed-loop receivers and Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK). Critical spacecraft events include orbit insertion and entry, descent, and landing. During such events, low-gain antennas yield a low signal-to-noise ratio, and high dynamics such as parachute deployment or spin introduce Doppler shifts, so open-loop receivers and Multiple Frequency Shift Keying (MFSK) are used instead. The Entry, Descent, and Landing (EDL) Data Analysis (EDA) system detects the MFSK tones in real time.

  9. Synopsis of key persons, events, and associations in the history of Latino psychology.

    PubMed

    Padilla, Amado M; Olmedo, Esteban

    2009-10-01

    In this article, we present a brief synopsis of six early Latino psychologists, several key conferences, the establishment of research centers, and early efforts to create an association for Latino psychologists. Our chronology runs from approximately 1930 to 2000. This history is a firsthand account of how these early leaders, conferences, and efforts to bring Latinos and Latinas together served as a backdrop to current research and practice in Latino psychology. This history of individuals and events is also intertwined with the American Psychological Association and the National Institute of Mental Health and efforts by Latino psychologists to obtain the professional support necessary to lay down the roots of a Latino presence in psychology. Copyright 2009 APA, all rights reserved.

  10. Continuous variable quantum key distribution: finite-key analysis of composable security against coherent attacks.

    PubMed

    Furrer, F; Franz, T; Berta, M; Leverrier, A; Scholz, V B; Tomamichel, M; Werner, R F

    2012-09-07

    We provide a security analysis for continuous variable quantum key distribution protocols based on the transmission of two-mode squeezed vacuum states measured via homodyne detection. We employ a version of the entropic uncertainty relation for smooth entropies to give a lower bound on the number of secret bits which can be extracted from a finite number of runs of the protocol. This bound is valid under general coherent attacks, and gives rise to keys which are composably secure. For comparison, we also give a lower bound valid under the assumption of collective attacks. For both scenarios, we find positive key rates using experimental parameters reachable today.

  11. An application of different dioids in public key cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durcheva, Mariana I., E-mail: mdurcheva66@gmail.com

    2014-11-18

    Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems, such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, a digital signature scheme is presented.
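
    For illustration, the basic dioid arithmetic involved, here in the (max, +) dioid: max plays the role of addition and + the role of multiplication. The commuting-powers property shown is the kind of algebraic structure that key-exchange constructions over semirings typically exploit, though this is not the paper's specific protocol:

        import numpy as np

        NEG_INF = -np.inf   # additive identity of the (max, +) dioid

        def maxplus_mul(A, B):
            """Matrix 'product' in the (max, +) dioid:
            result[i, k] = max_j (A[i, j] + B[j, k])."""
            return np.max(A[:, :, None] + B[None, :, :], axis=1)

        A = np.array([[0.0, 3.0], [NEG_INF, 1.0]])

        # Powers of one matrix commute in any semiring: A^2 (x) A^3 = A^3 (x) A^2.
        A2 = maxplus_mul(A, A)
        A3 = maxplus_mul(A2, A)
        print(np.array_equal(maxplus_mul(A2, A3), maxplus_mul(A3, A2)))  # True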

  12. Finite-key security analysis of quantum key distribution with imperfect light sources

    DOE PAGES

    Mizutani, Akihiro; Curty, Marcos; Lim, Charles Ci Wen; ...

    2015-09-09

    In recent years, the gap between theory and practice in quantum key distribution (QKD) has been significantly narrowed, particularly for QKD systems with arbitrarily flawed optical receivers. The status of QKD systems with imperfect light sources is, however, less satisfactory, in the sense that the resulting secure key rates are often overly dependent on the quality of state preparation. This is especially the case when the channel loss is high. Very recently, to overcome this limitation, Tamaki et al proposed a QKD protocol based on the so-called 'rejected data analysis', and showed that its security in the limit of infinitely long keys is almost independent of any encoding flaw in the qubit space, this protocol being compatible with the decoy state method. Here, as a step towards practical QKD, we show that a similar conclusion is reached in the finite-key regime, even when the intensity of the light source is unstable. More concretely, we derive security bounds for a wide class of realistic light sources and show that the bounds are also efficient in the presence of high channel loss. Our results strongly suggest the feasibility of long distance provably secure communication with imperfect light sources.

  13. Parallel Event Analysis Under Unix

    NASA Astrophysics Data System (ADS)

    Looney, S.; Nilsson, B. S.; Oest, T.; Pettersson, T.; Ranjard, F.; Thibonnier, J.-P.

    The ALEPH experiment at LEP, the CERN CN division and Digital Equipment Corp. have, in a joint project, developed a parallel event analysis system. The parallel physics code is identical to ALEPH's standard analysis code, ALPHA, only the organisation of input/output is changed. The user may switch between sequential and parallel processing by simply changing one input "card". The initial implementation runs on an 8-node DEC 3000/400 farm, using the PVM software, and exhibits a near-perfect speed-up linearity, reducing the turn-around time by a factor of 8.

  14. The comparison and analysis of extracting video key frame

    NASA Astrophysics Data System (ADS)

    Ouyang, S. Z.; Zhong, L.; Luo, R. Q.

    2018-05-01

    Video key frame extraction is an important part of large-scale data processing. Building on previous work in key frame extraction, we summarize four important key frame extraction algorithms, most of which operate by comparing the difference between consecutive frames: if the difference exceeds a threshold value, the corresponding frames are taken as distinct key frames. Following this review, key frame extraction based on mutual information is proposed: information entropy is introduced, appropriate threshold values are selected to form initial classes, and frames whose mutual information is close to the class mean are finally taken as candidate key frames. In this paper, these algorithms are used to extract key frames from tunnel traffic videos. With analysis of the experimental results and comparison of the pros and cons of these algorithms, a basis for practical applications is provided.
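
    A minimal example of the difference-based family of methods compared above, using OpenCV (assumed available; the threshold, histogram settings, and file name are arbitrary): a frame becomes a key frame when its histogram distance from the last selected key frame exceeds a threshold.

        import cv2

        def key_frames(path, threshold=0.3):
            """Select key frames by histogram distance from the last key frame."""
            cap = cv2.VideoCapture(path)
            keys, prev_hist, idx = [], None, 0
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                hist = cv2.calcHist([gray], [0], None, [64], [0, 256])
                hist = cv2.normalize(hist, hist).flatten()
                if prev_hist is None or cv2.compareHist(
                        prev_hist, hist, cv2.HISTCMP_BHATTACHARYYA) > threshold:
                    keys.append(idx)       # new key frame; update the reference
                    prev_hist = hist
                idx += 1
            cap.release()
            return keys

        # print(key_frames("tunnel_traffic.mp4"))   # hypothetical input video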

  15. Hydrometeorological Analysis of Flooding Events in San Antonio, TX

    NASA Astrophysics Data System (ADS)

    Chintalapudi, S.; Sharif, H.; Elhassan, A.

    2008-12-01

    South Central Texas is particularly vulnerable to floods due to: proximity to a moist air source (the Gulf of Mexico); the Balcones Escarpment, which concentrates rainfall runoff; a tendency for synoptic-scale features to become cut off and stall over the area; and decaying tropical cyclones stalling over the area. The San Antonio Metropolitan Area is the 7th largest city in the nation, one of the most flash-flood prone regions in North America, and has experienced a number of flooding events in the last decade (1998, 2002, 2004, and 2007). Research is being conducted to characterize the meteorological conditions that lead to these events and to apply the rainfall and watershed characteristics data to recreate the runoff events using a two-dimensional, physically-based, distributed-parameter hydrologic model. The physically based, distributed-parameter Gridded Surface Subsurface Hydrologic Analysis (GSSHA) hydrological model was used for simulating the watershed response to these storm events. Finally, observed discharges were compared to GSSHA model discharges for these storm events. Analysis of some of these events will be presented.

  16. Glaucoma progression detection: agreement, sensitivity, and specificity of expert visual field evaluation, event analysis, and trend analysis.

    PubMed

    Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier

    2013-01-01

    To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. Kappa agreement coefficient between methods and sensitivity and specificity for each method using expert opinion as gold standard were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years but only 3 cases showed progression with all 3 methods. Kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
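
    The agreement and accuracy statistics reported here are straightforward to compute. A sketch with hypothetical progression calls (scikit-learn assumed available for the kappa coefficient):

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical progression calls (1 = progression) for the same eyes.
        expert    = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0]
        gpa_event = [0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0]

        print(f"kappa = {cohen_kappa_score(expert, gpa_event):.2f}")

        # Sensitivity and specificity, taking expert opinion as gold standard.
        tp = sum(e and g for e, g in zip(expert, gpa_event))
        fn = sum(e and not g for e, g in zip(expert, gpa_event))
        tn = sum(not e and not g for e, g in zip(expert, gpa_event))
        fp = sum(not e and g for e, g in zip(expert, gpa_event))
        print(f"sensitivity = {tp / (tp + fn):.0%}, "
              f"specificity = {tn / (tn + fp):.0%}")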

  17. Transcriptomic dose-and-time-course indicators of early key events in a cytotoxicity-mediated mode of action for rodent urinary bladder tumorigenesis

    EPA Science Inventory

    Transcriptomic dose- and time-course indicators of early key events in a cytotoxicity-mediated mode of action for rodent urinary bladder tumorigenesis. Diuron is a substituted urea compound used globally as an herbicide. Urinary bladder tumors were induced in rats after chronic die...

  18. Extreme Space Weather Events: From Cradle to Grave

    NASA Astrophysics Data System (ADS)

    Riley, Pete; Baker, Dan; Liu, Ying D.; Verronen, Pekka; Singer, Howard; Güdel, Manuel

    2018-02-01

    Extreme space weather events, while rare, can have a substantial impact on our technologically-dependent society. And, although such events have only occasionally been observed, through careful analysis of a wealth of space-based and ground-based observations, historical records, and extrapolations from more moderate events, we have developed a basic picture of the components required to produce them. Several key issues, however, remain unresolved. For example, what limits are imposed on the maximum size of such events? What are the likely societal consequences of a so-called "100-year" solar storm? In this review, we summarize our current scientific understanding of extreme space weather events as we follow several examples from the Sun, through the solar corona and inner heliosphere, across the magnetospheric boundary, into the ionosphere and atmosphere, into the Earth's lithosphere, and, finally, to their impact on man-made structures and activities, such as spacecraft, GPS signals, radio communication, and the electric power grid. We describe preliminary attempts to provide probabilistic forecasts of extreme space weather phenomena, and we conclude by identifying several key areas that must be addressed if we are to better understand and, ultimately, predict extreme space weather events.

  19. Event shape analysis of deep inelastic scattering events with a large rapidity gap at HERA

    NASA Astrophysics Data System (ADS)

    ZEUS Collaboration; Breitweg, J.; Derrick, M.; Krakauer, D.; Magill, S.; Mikunas, D.; Musgrave, B.; Repond, J.; Stanek, R.; Talaga, R. L.; Yoshida, R.; Zhang, H.; Mattingly, M. C. K.; Anselmo, F.; Antonioli, P.; Bari, G.; Basile, M.; Bellagamba, L.; Boscherini, D.; Bruni, A.; Bruni, G.; Cara Romeo, G.; Castellini, G.; Cifarelli, L.; Cindolo, F.; Contin, A.; Corradi, M.; de Pasquale, S.; Gialas, I.; Giusti, P.; Iacobucci, G.; Laurenti, G.; Levi, G.; Margotti, A.; Massam, T.; Nania, R.; Palmonari, F.; Pesci, A.; Polini, A.; Ricci, F.; Sartorelli, G.; Zamora Garcia, Y.; Zichichi, A.; Amelung, C.; Bornheim, A.; Brock, I.; Coböken, K.; Crittenden, J.; Deffner, R.; Eckert, M.; Grothe, M.; Hartmann, H.; Heinloth, K.; Heinz, L.; Hilger, E.; Jakob, H.-P.; Katz, U. F.; Kerger, R.; Paul, E.; Pfeiffer, M.; Rembser, Ch.; Stamm, J.; Wedemeyer, R.; Wieber, H.; Bailey, D. S.; Campbell-Robson, S.; Cottingham, W. N.; Foster, B.; Hall-Wilton, R.; Hayes, M. E.; Heath, G. P.; Heath, H. F.; McFall, J. D.; Piccioni, D.; Roff, D. G.; Tapper, R. J.; Arneodo, M.; Ayad, R.; Capua, M.; Garfagnini, A.; Iannotti, L.; Schioppa, M.; Susinno, G.; Kim, J. Y.; Lee, J. H.; Lim, I. T.; Pac, M. Y.; Caldwell, A.; Cartiglia, N.; Jing, Z.; Liu, W.; Mellado, B.; Parsons, J. A.; Ritz, S.; Sampson, S.; Sciulli, F.; Straub, P. B.; Zhu, Q.; Borzemski, P.; Chwastowski, J.; Eskreys, A.; Figiel, J.; Klimek, K.; Przybycień , M. B.; Zawiejski, L.; Adamczyk, L.; Bednarek, B.; Bukowy, M.; Jeleń , K.; Kisielewska, D.; Kowalski, T.; Przybycień , M.; Rulikowska-Zarȩ Bska, E.; Suszycki, L.; Zaja C, J.; Duliń Ski, Z.; Kotań Ski, A.; Abbiendi, G.; Bauerdick, L. A. T.; Behrens, U.; Beier, H.; Bienlein, J. K.; Cases, G.; Deppe, O.; Desler, K.; Drews, G.; Fricke, U.; Gilkinson, D. J.; Glasman, C.; Göttlicher, P.; Haas, T.; Hain, W.; Hasell, D.; Johnson, K. F.; Kasemann, M.; Koch, W.; Kötz, U.; Kowalski, H.; Labs, J.; Lindemann, L.; Löhr, B.; Löwe, M.; Mań Czak, O.; Milewski, J.; Monteiro, T.; Ng, J. S. T.; Notz, D.; Ohrenberg, K.; Park, I. H.; Pellegrino, A.; Pelucchi, F.; Piotrzkowski, K.; Roco, M.; Rohde, M.; Roldán, J.; Ryan, J. J.; Savin, A. A.; Schneekloth, U.; Selonke, F.; Surrow, B.; Tassi, E.; Voß, T.; Westphal, D.; Wolf, G.; Wollmer, U.; Youngman, C.; Zsolararnecki, A. F.; Zeuner, W.; Burow, B. D.; Grabosch, H. J.; Meyer, A.; Schlenstedt, S.; Barbagli, G.; Gallo, E.; Pelfer, P.; Maccarrone, G.; Votano, L.; Bamberger, A.; Eisenhardt, S.; Markun, P.; Trefzger, T.; Wölfle, S.; Bromley, J. T.; Brook, N. H.; Bussey, P. J.; Doyle, A. T.; MacDonald, N.; Saxon, D. H.; Sinclair, L. E.; Strickland, E.; Waugh, R.; Bohnet, I.; Gendner, N.; Holm, U.; Meyer-Larsen, A.; Salehi, H.; Wick, K.; Gladilin, L. K.; Horstmann, D.; Kçira, D.; Klanner, R.; Lohrmann, E.; Poelz, G.; Schott, W.; Zetsche, F.; Bacon, T. C.; Butterworth, I.; Cole, J. E.; Howell, G.; Hung, B. H. Y.; Lamberti, L.; Long, K. R.; Miller, D. B.; Pavel, N.; Prinias, A.; Sedgbeer, J. K.; Sideris, D.; Walker, R.; Mallik, U.; Wang, S. M.; Wu, J. T.; Cloth, P.; Filges, D.; Fleck, J. I.; Ishii, T.; Kuze, M.; Suzuki, I.; Tokushuku, K.; Yamada, S.; Yamauchi, K.; Yamazaki, Y.; Hong, S. J.; Lee, S. B.; Nam, S. W.; Park, S. K.; Barreiro, F.; Fernández, J. P.; García, G.; Graciani, R.; Hernández, J. M.; Hervás, L.; Labarga, L.; Martínez, M.; del Peso, J.; Puga, J.; Terrón, J.; de Trocóniz, J. F.; Corriveau, F.; Hanna, D. S.; Hartmann, J.; Hung, L. W.; Murray, W. N.; Ochs, A.; Riveline, M.; Stairs, D. G.; St-Laurent, M.; Ullmann, R.; Tsurugai, T.; Bashkirov, V.; Dolgoshein, B. 
A.; Stifutkin, A.; Bashindzhagyan, G. L.; Ermolov, P. F.; Golubkov, Yu. A.; Khein, L. A.; Korotkova, N. A.; Korzhavina, I. A.; Kuzmin, V. A.; Lukina, O. Yu.; Proskuryakov, A. S.; Shcheglova, L. M.; Solomin, A. N.; Zotkin, S. A.; Bokel, C.; Botje, M.; Brümmer, N.; Chlebana, F.; Engelen, J.; Koffeman, E.; Kooijman, P.; van Sighem, A.; Tiecke, H.; Tuning, N.; Verkerke, W.; Vossebeld, J.; Vreeswijk, M.; Wiggers, L.; de Wolf, E.; Acosta, D.; Bylsma, B.; Durkin, L. S.; Gilmore, J.; Ginsburg, C. M.; Kim, C. L.; Ling, T. Y.; Nylander, P.; Romanowski, T. A.; Blaikley, H. E.; Cashmore, R. J.; Cooper-Sarkar, A. M.; Devenish, R. C. E.; Edmonds, J. K.; Große-Knetter, J.; Harnew, N.; Nath, C.; Noyes, V. A.; Quadt, A.; Ruske, O.; Tickner, J. R.; Uijterwaal, H.; Walczak, R.; Waters, D. S.; Bertolin, A.; Brugnera, R.; Carlin, R.; dal Corso, F.; Dosselli, U.; Limentani, S.; Morandin, M.; Posocco, M.; Stanco, L.; Stroili, R.; Voci, C.; Bulmahn, J.; Oh, B. Y.; Okrasiń Ski, J. R.; Toothacker, W. S.; Whitmore, J. J.; Iga, Y.; D'Agostini, G.; Marini, G.; Nigro, A.; Raso, M.; Hart, J. C.; McCubbin, N. A.; Shah, T. P.; Epperson, D.; Heusch, C.; Rahn, J. T.; Sadrozinski, H. F.-W.; Seiden, A.; Wichmann, R.; Williams, D. C.; Schwarzer, O.; Walenta, A. H.; Abramowicz, H.; Briskin, G.; Dagan, S.; Kananov, S.; Levy, A.; Abe, T.; Fusayasu, T.; Inuzuka, M.; Nagano, K.; Umemori, K.; Yamashita, T.; Hamatsu, R.; Hirose, T.; Homma, K.; Kitamura, S.; Matsushita, T.; Cirio, R.; Costa, M.; Ferrero, M. I.; Maselli, S.; Monaco, V.; Peroni, C.; Petrucci, M. C.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Dardo, M.; Bailey, D. C.; Fagerstroem, C.-P.; Galea, R.; Hartner, G. F.; Joo, K. K.; Levman, G. M.; Martin, J. F.; Orr, R. S.; Polenz, S.; Sabetfakhri, A.; Simmons, D.; Teuscher, R. J.; Butterworth, J. M.; Catterall, C. D.; Jones, T. W.; Lane, J. B.; Saunders, R. L.; Sutton, M. R.; Wing, M.; Ciborowski, J.; Grzelak, G.; Kasprzak, M.; Muchorowski, K.; Nowak, R. J.; Pawlak, J. M.; Pawlak, R.; Tymieniecka, T.; Wróblewski, A. K.; Zakrzewski, J. A.; Adamus, M.; Coldewey, C.; Eisenberg, Y.; Hochman, D.; Karshon, U.; Badgett, W. F.; Chapin, D.; Cross, R.; Dasu, S.; Foudas, C.; Loveless, R. J.; Mattingly, S.; Reeder, D. D.; Smith, W. H.; Vaiciulis, A.; Wodarczyk, M.; Deshpande, A.; Dhawan, S.; Hughes, V. W.; Bhadra, S.; Frisken, W. R.; Khakzad, M.; Schmidke, W. B.

    1998-03-01

    A global event shape analysis of the multihadronic final states observed in neutral current deep inelastic scattering events with a large rapidity gap with respect to the proton direction is presented. The analysis is performed in the range 5 ≤ Q² ≤ 185 GeV² and 160 ≤ W ≤ 250 GeV, where Q² is the virtuality of the photon and W is the virtual-photon proton centre of mass energy. Particular emphasis is placed on the dependence of the shape variables, measured in the γ*-pomeron rest frame, on the mass of the hadronic final state, M_X. With increasing M_X the multihadronic final state becomes more collimated and planar. The experimental results are compared with several models which attempt to describe diffractive events. The broadening effects exhibited by the data require in these models a significant gluon component of the pomeron.

  20. Finite-key analysis for the 1-decoy state QKD protocol

    NASA Astrophysics Data System (ADS)

    Rusca, Davide; Boaron, Alberto; Grünenfelder, Fadri; Martin, Anthony; Zbinden, Hugo

    2018-04-01

    It has been shown that in the asymptotic case of infinite key length, the 2-decoy state Quantum Key Distribution (QKD) protocol outperforms the 1-decoy state protocol. Here, we present a finite-key analysis of the 1-decoy method. Interestingly, we find that for practical block sizes of up to 10^8 bits, the 1-decoy protocol achieves higher secret key rates than the 2-decoy protocol for almost all experimental settings. Since using only one decoy is also easier to implement, we conclude that it is the best choice for QKD in most common practical scenarios.

  1. Materials Analysis: A Key to Unlocking the Mystery of the Columbia Tragedy

    NASA Technical Reports Server (NTRS)

    Mayeaux, Brian M.; Collins, Thomas E.; Piascik, Robert S.; Russel, Richard W.; Jerman, Gregory A.; Shah, Sandeep R.; McDanels, Steven J.

    2004-01-01

    Materials analyses of key forensic evidence helped unlock the mystery of the loss of space shuttle Columbia, which disintegrated February 1, 2003 while returning from a 16-day research mission. Following an intensive four-month recovery effort by federal, state, and local emergency management and law officials, Columbia debris was collected, catalogued, and reassembled at the Kennedy Space Center. Engineers and scientists from the Materials and Processes (M&P) team formed by NASA supported Columbia reconstruction efforts, provided factual data through analysis, and conducted experiments to validate the root cause of the accident. Fracture surfaces and thermal effects of selected airframe debris were assessed, and process flows for both nondestructive and destructive sampling and evaluation of debris were developed. The team also assessed left-hand (LH) airframe components that were believed to be associated with a structural breach of Columbia. Analytical data collected by the M&P team showed that a significant thermal event occurred at the left wing leading edge in the proximity of LH reinforced carbon-carbon (RCC) panels 8 and 9. The analysis also showed exposure to temperatures in excess of 1,649 °C, which would severely degrade the support structure, tiles, and RCC panel materials. The integrated failure analysis of wing leading edge debris and deposits strongly supported the hypothesis that a breach occurred at LH RCC panel 8.

  2. Combining conversation analysis and event sequencing to study health communication.

    PubMed

    Pecanac, Kristen E

    2018-06-01

    Good communication is essential in patient-centered care. The purpose of this paper is to describe conversation analysis and event sequencing and explain how integrating these methods strengthened the analysis in a study of communication between clinicians and surrogate decision makers in an intensive care unit. Conversation analysis was first used to determine how clinicians introduced the need for decision-making regarding life-sustaining treatment and how surrogate decision makers responded. Event sequence analysis then was used to determine the transitional probability (probability of one event leading to another in the interaction) that a given type of clinician introduction would lead to surrogate resistance or alignment. Conversation analysis provides a detailed analysis of the interaction between participants in a conversation. When combined with a quantitative analysis of the patterns of communication in an interaction, these data add information on the communication strategies that produce positive outcomes. Researchers can apply this mixed-methods approach to identify beneficial conversational practices and design interventions to improve health communication. © 2018 Wiley Periodicals, Inc.
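
    A minimal sketch of the event-sequencing step described above: transitional probabilities are normalized counts of antecedent-consequent pairs across coded conversations. The event codes below are invented for illustration and do not reproduce the study's coding scheme.

```python
from collections import Counter, defaultdict

def transitional_probabilities(sequences):
    """Estimate P(next event | current event) from coded event sequences.

    sequences: list of per-conversation code lists.
    Returns {antecedent: {consequent: probability}}.
    """
    counts = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            counts[current][nxt] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# Hypothetical coded conversations.
coded = [
    ["proposal", "resistance", "assessment", "alignment"],
    ["assessment", "alignment"],
    ["proposal", "alignment"],
]
print(transitional_probabilities(coded)["proposal"])
# -> {'resistance': 0.5, 'alignment': 0.5}
```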

  3. Fumonisin exposure in women linked to inhibition of an enzyme that is a key event in farm and laboratory animal diseases.

    USDA-ARS?s Scientific Manuscript database

    Fumonisin B1 (FB1) is a toxic chemical produced by molds. The molds that produce fumonisin are common in corn. Consumption of contaminated corn by farm animals has been shown to be the cause of animal disease. The proximate cause (key event) in the induction of diseases in animals is inhibition of t...

  4. Passage Key Inlet, Florida; CMS Modeling and Borrow Site Impact Analysis

    DTIC Science & Technology

    2016-06-01

    Impact Analysis by Kelly R. Legault and Sirisha Rayaprolu PURPOSE: This Coastal and Hydraulics Engineering Technical Note (CHETN) describes the...use of a nested Coastal Modeling System (CMS) model for Passage Key Inlet, which is one of the connections between the Gulf of Mexico and Tampa Bay...driven sediment transport at Passage Key Inlet. This analysis resulted in issuing a new Florida Department of Environmental Protection (FDEP) permit to

  5. Video analysis of motor events in REM sleep behavior disorder.

    PubMed

    Frauscher, Birgit; Gschliesser, Viola; Brandauer, Elisabeth; Ulmer, Hanno; Peralta, Cecilia M; Müller, Jörg; Poewe, Werner; Högl, Birgit

    2007-07-30

    In REM sleep behavior disorder (RBD), several studies focused on electromyographic characterization of motor activity, whereas video analysis has remained more general. The aim of this study was to undertake a detailed and systematic video analysis. Nine polysomnographic records from 5 Parkinson patients with RBD were analyzed and compared with sex- and age-matched controls. Each motor event in the video during REM sleep was classified according to duration, type of movement, and topographical distribution. In RBD, a mean of 54 ± 23.2 events/10 minutes of REM sleep (total 1,392) were identified and visually analyzed. Seventy-five percent of all motor events lasted <2 seconds. Of these events, 1,155 (83.0%) were classified as elementary, 188 (13.5%) as complex behaviors, 50 (3.6%) as violent, and 146 (10.5%) as vocalizations. In the control group, 3.6 ± 2.3 events/10 minutes (total 264) of predominantly elementary simple character (n = 240, 90.9%) were identified. Number and types of motor events differed significantly between patients and controls (P < 0.05). This study shows a very high number and great variety of motor events during REM sleep in symptomatic RBD. However, most motor events are minor, and violent episodes represent only a small fraction. Copyright 2007 Movement Disorder Society.

  6. Using the DOE Knowledge Base for Special Event Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified

  7. Florbetaben PET in the Early Diagnosis of Alzheimer's Disease: A Discrete Event Simulation to Explore Its Potential Value and Key Data Gaps

    PubMed Central

    Guo, Shien; Getsios, Denis; Hernandez, Luis; Cho, Kelly; Lawler, Elizabeth; Altincatal, Arman; Lanes, Stephan; Blankenburg, Michael

    2012-01-01

    The growing understanding of the use of biomarkers in Alzheimer's disease (AD) may enable physicians to make more accurate and timely diagnoses. Florbetaben, a beta-amyloid tracer used with positron emission tomography (PET), is one of these diagnostic biomarkers. This analysis was undertaken to explore the potential value of florbetaben PET in the diagnosis of AD among patients with suspected dementia and to identify key data that are needed to further substantiate its value. A discrete event simulation was developed to conduct exploratory analyses from both US payer and societal perspectives. The model simulates the lifetime course of disease progression for individuals, evaluating the impact of their patient management from initial diagnostic work-up to final diagnosis. Model inputs were obtained from specific analyses of a large longitudinal dataset from the New England Veterans Healthcare System and supplemented with data from public data sources and assumptions. The analyses indicate that florbetaben PET has the potential to improve patient outcomes and reduce costs under certain scenarios. Key data on the use of florbetaben PET, such as its influence on time to confirmation of final diagnosis, treatment uptake, and treatment persistency, are unavailable and would be required to confirm its value. PMID:23326754

  8. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    NASA Astrophysics Data System (ADS)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events and droughts, or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system) ensemble. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in the world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations.
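
    Once the factual ("as observed") and counterfactual ("world that might have been") ensembles exist, the attribution statistics reduce to comparing exceedance probabilities. A minimal sketch on synthetic ensembles; the metric, threshold, and distributions are placeholders, not outputs of the WWA system.

```python
import numpy as np

def attribution_statistics(factual, counterfactual, threshold):
    """Probability ratio and fraction of attributable risk (FAR)."""
    p1 = np.mean(np.asarray(factual) >= threshold)        # P(event | actual climate)
    p0 = np.mean(np.asarray(counterfactual) >= threshold) # P(event | natural climate)
    risk_ratio = p1 / p0 if p0 > 0 else float("inf")
    far = 1.0 - p0 / p1 if p1 > 0 else float("nan")       # FAR = 1 - p0/p1
    return p1, p0, risk_ratio, far

rng = np.random.default_rng(0)
factual = rng.normal(103.0, 10.0, 10_000)        # synthetic seasonal rainfall
counterfactual = rng.normal(100.0, 10.0, 10_000)
print(attribution_statistics(factual, counterfactual, threshold=125.0))
```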

  9. Finite-key analysis for measurement-device-independent quantum key distribution.

    PubMed

    Curty, Marcos; Xu, Feihu; Cui, Wei; Lim, Charles Ci Wen; Tamaki, Kiyoshi; Lo, Hoi-Kwong

    2014-04-29

    Quantum key distribution promises unconditionally secure communications. However, as practical devices tend to deviate from their specifications, the security of some practical systems is no longer valid. In particular, an adversary can exploit imperfect detectors to learn a large part of the secret key, even though the security proof claims otherwise. Recently, a practical approach--measurement-device-independent quantum key distribution--has been proposed to solve this problem. However, so far its security has only been fully proven under the assumption that the legitimate users of the system have unlimited resources. Here we fill this gap and provide a rigorous security proof against general attacks in the finite-key regime. This is obtained by applying large deviation theory, specifically the Chernoff bound, to perform parameter estimation. For the first time we demonstrate the feasibility of long-distance implementations of measurement-device-independent quantum key distribution within a reasonable time frame of signal transmission.

  10. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    DTIC Science & Technology

    2016-03-24

    AFIT-ENV-MS-16-M-166 — Analysis of Inpatient Hospital Staff Mental Workload by Means of Discrete-Event Simulation, Erich W… (approved for public release; distribution unlimited).

  11. Finite key analysis for symmetric attacks in quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Tim; Kampermann, Hermann; Kleinmann, Matthias

    2006-10-15

    We introduce a constructive method to calculate the achievable secret key rate for a generic class of quantum key distribution protocols, when only a finite number n of signals is given. Our approach is applicable to all scenarios in which the quantum state shared by Alice and Bob is known. In particular, we consider the six state protocol with symmetric eavesdropping attacks, and show that for a small number of signals, i.e., below n ≈ 10^4, the finite key rate differs significantly from the asymptotic value for n → ∞. However, for larger n, a good approximation of the asymptotic value is found. We also study secret key rates for protocols using higher-dimensional quantum systems.

  12. Regression analysis of mixed recurrent-event and panel-count data

    PubMed Central

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L.

    2014-01-01

    In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1–42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. PMID:24648408

  13. Event time analysis of longitudinal neuroimage data.

    PubMed

    Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce

    2014-08-15

    This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.
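
    A toy sketch of the second step, the extended Cox regression with a time-dependent imaging covariate, using the lifelines package. The data frame, column names, and values are hypothetical, and the first step (LME smoothing of the serial measurements) is assumed to have already produced the `thickness` column.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per subject per scan interval; 'event' flags, e.g., conversion
# to disease onset within the interval (all values invented).
df = pd.DataFrame({
    "id":        [1, 1, 1, 2, 2, 3, 3, 3],
    "start":     [0, 6, 12, 0, 6, 0, 6, 12],
    "stop":      [6, 12, 18, 6, 12, 6, 12, 18],
    "thickness": [2.6, 2.5, 2.3, 2.7, 2.7, 2.5, 2.4, 2.2],
    "event":     [0, 0, 1, 0, 0, 0, 0, 1],
})

ctv = CoxTimeVaryingFitter(penalizer=0.1)  # penalized: the toy data are tiny
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()
```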

  14. Reaching Out: A Break from Traditional Forensic Events. "On Interpretation Analysis."

    ERIC Educational Resources Information Center

    Seney, Ronald J.

    In recent years a new event called "Interpretation Analysis" has appeared at certain forensic events. The objective is for the student, through analysis and performance, to study a piece of literature and to communicate his or her understanding of that literature to a specific audience. Perhaps there is room within the established…

  15. An analysis of the 2016 Hitomi breakup event

    NASA Astrophysics Data System (ADS)

    Flegel, Sven; Bennett, James; Lachut, Michael; Möckel, Marek; Smith, Craig

    2017-04-01

    The breakup of Hitomi (ASTRO-H) on 26 March 2016 is analysed. Debris from the fragmentation is used to estimate the time of the event by propagating backwards and estimating the close approach with the parent object. Based on this method, the breakup event is predicted to have occurred at approximately 01:42 UTC on 26 March 2016. The Gaussian variation of parameters equations based on the instantaneous orbits at the predicted time of the event are solved to gain additional insight into the on-orbit position of Hitomi at the time of the event and to test an alternate approach of determining the event epoch and location. A conjunction analysis is carried out between Hitomi and all catalogued objects which were in orbit around the estimated time of the anomaly. Several debris objects have close approaches with Hitomi; however, there is no evidence to suggest that the breakup was caused by a catalogued object. Debris from both of the largest fragmentation events—the Iridium 33-Cosmos 2251 collision in 2009 and the intentional destruction of Fengyun 1C in 2007—is involved in close approaches with Hitomi, indicating the persistent threat these events pose to subsequent space missions. To quantify the magnitude of a potential conjunction, the fragmentation resulting from a collision with the debris is modelled using the EVOLVE-4 breakup model. The debris characteristics are estimated from two-line element data. This analysis is indicative of the threat to space assets that mission planners face due to the growing debris population. The impact of the actual event on the environment is investigated based on the debris associated with Hitomi which is currently contained in the United States Strategic Command's catalogue. A look at the active missions in the orbital vicinity of Hitomi reveals that the Hubble Space Telescope is among the spacecraft which may be immediately affected by the new debris.

  16. Civil protection and Damaging Hydrogeological Events: comparative analysis of the 2000 and 2015 events in Calabria (southern Italy)

    NASA Astrophysics Data System (ADS)

    Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela; Perrotta, Piero; Russo, Luigi; Tansi, Carlo

    2017-11-01

    Calabria (southern Italy) is a flood-prone region, due to both its rough orography and the fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach to perform a comparative analysis of two events affecting the same area 15 years apart, by collecting all the qualitative and quantitative features useful to describe both rain and damage. The aim is to understand whether similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, between 8 and 10 September 2000, damaged 109 out of 409 municipalities of the region and killed 13 people in a campsite due to a flood. The second event, between 30 October and 1 November 2015, damaged 79 municipalities and killed one man due to a flood. The comparative analysis highlights that, although the triggering daily rain was more exceptional in the 2015 event, the damage caused by the 2000 event to both infrastructure and belongings was higher, and it was aggravated by the 13 flood victims. We conclude that, in the 2015 event, the management of the pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people in hazardous situations due to landslides or floods, contributed to reducing the number of victims.

  17. Regression analysis of mixed recurrent-event and panel-count data.

    PubMed

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L

    2014-07-01

    In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Root Cause Analysis: Learning from Adverse Safety Events.

    PubMed

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. © RSNA, 2015.

  19. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  20. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  1. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  2. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  3. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  4. Meta-analysis: Association between hypoglycaemia and serious adverse events in older patients.

    PubMed

    Mattishent, Katharina; Loke, Yoon Kong

    2016-07-01

    We aimed to conduct a meta-analysis of serious adverse events (macro- and microvascular events, falls and fractures, death) associated with hypoglycaemia in older patients. We searched MEDLINE and EMBASE spanning a ten-year period up to March 2015 (with automated PubMed updates to October 2015). We selected observational studies reporting on hypoglycaemia and associated serious adverse events, and conducted a meta-analysis. We assessed study validity based on ascertainment of hypoglycaemia, adverse events and adjustment for confounders. We included 17 studies involving 1.86 million participants. Meta-analysis of eight studies demonstrated that hypoglycaemic episodes were associated with macrovascular complications, odds ratio (OR) 1.83 (95% confidence interval [CI] 1.64, 2.05), and microvascular complications in two studies, OR 1.77 (95% CI 1.49, 2.10). Meta-analysis of four studies demonstrated an association between hypoglycaemia and falls or fractures, OR 1.89 (95% CI 1.54, 2.32) and 1.92 (95% CI 1.56, 2.38), respectively. Hypoglycaemia was associated with increased likelihood of death in a meta-analysis of eight studies, OR 2.04 (95% CI 1.68, 2.47). Our meta-analysis raises major concerns about a range of serious adverse events associated with hypoglycaemia. Clinicians should prioritize individualized therapy and closer monitoring strategies to avoid hypoglycaemia in susceptible older patients. Copyright © 2016 Elsevier Inc. All rights reserved.
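
    The arithmetic behind such pooled odds ratios is inverse-variance weighting on the log scale, with each study's standard error recovered from its reported 95% CI. A sketch with invented study inputs; the model actually used in this review may differ (e.g. random effects rather than the fixed-effect pooling shown here).

```python
import math

def pooled_or(study_estimates, z=1.96):
    """Fixed-effect inverse-variance pooling of odds ratios.

    study_estimates: list of (OR, ci_low, ci_high) with 95% CIs.
    Returns (pooled OR, CI low, CI high).
    """
    weights, weighted_logs = [], []
    for or_, lo, hi in study_estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE of log-OR from CI width
        w = 1.0 / se ** 2
        weights.append(w)
        weighted_logs.append(w * math.log(or_))
    log_pooled = sum(weighted_logs) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se_pooled),
            math.exp(log_pooled + z * se_pooled))

# Illustrative inputs, not the studies in this review.
print(pooled_or([(1.8, 1.5, 2.2), (2.1, 1.6, 2.8), (1.7, 1.2, 2.4)]))
```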

  5. Performance Analysis: Work Control Events Identified January - August 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Grange, C E; Freeman, J W; Kerr, C E

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun

  6. An analysis of post-event processing in social anxiety disorder.

    PubMed

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

    Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construct of repetitive self-focused thoughts that pertains to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  7. Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model

    DTIC Science & Technology

    2012-03-01

    AFIT/GCS/ENG/12-01 — Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model (approved for public release; distribution unlimited). …challenging as the complexity of actual implementation specifics are considered. Two components common to most quantum key distribution…

  8. Identification of key regulators of pancreatic cancer progression through multidimensional systems-level analysis.

    PubMed

    Rajamani, Deepa; Bhasin, Manoj K

    2016-05-03

    Pancreatic cancer is an aggressive cancer with dismal prognosis, urgently necessitating better biomarkers to improve therapeutic options and early diagnosis. Traditional approaches of biomarker detection that consider only one aspect of the biological continuum like gene expression alone are limited in their scope and lack robustness in identifying the key regulators of the disease. We have adopted a multidimensional approach involving the cross-talk between the omics spaces to identify key regulators of disease progression. Multidimensional domain-specific disease signatures were obtained using rank-based meta-analysis of individual omics profiles (mRNA, miRNA, DNA methylation) related to pancreatic ductal adenocarcinoma (PDAC). These domain-specific PDAC signatures were integrated to identify genes that were affected across multiple dimensions of omics space in PDAC (genes under multiple regulatory controls, GMCs). To further pin down the regulators of PDAC pathophysiology, a systems-level network was generated from knowledge-based interaction information applied to the above identified GMCs. Key regulators were identified from the GMC network based on network statistics and their functional importance was validated using gene set enrichment analysis and survival analysis. Rank-based meta-analysis identified 5391 genes, 109 miRNAs and 2081 methylation-sites significantly differentially expressed in PDAC (false discovery rate ≤ 0.05). Bimodal integration of meta-analysis signatures revealed 1150 and 715 genes regulated by miRNAs and methylation, respectively. Further analysis identified 189 altered genes that are commonly regulated by miRNA and methylation, hence considered GMCs. Systems-level analysis of the scale-free GMCs network identified eight potential key regulator hubs, namely E2F3, HMGA2, RASA1, IRS1, NUAK1, ACTN1, SKI and DLL1, associated with important pathways driving cancer progression. Survival analysis on individual key regulators revealed

  9. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    PubMed

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and length of free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication related patient safety events and the models were compared. Well performing NLP models were generated to categorize medication related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
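
    One plausible baseline for this kind of free-text categorization is TF-IDF features feeding a linear classifier. The sketch below uses scikit-learn; the reports, labels, and model choice are illustrative stand-ins, not the models developed in the study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical event briefs and the four medication-event categories.
reports = [
    "medication arrived two hours after scheduled dose",
    "wrong concentration dispensed by pharmacy",
    "pyxis count did not match dispensing record",
    "order entered for wrong patient by prescriber",
]
labels = ["delivery_delay", "dispensing_error",
          "pyxis_discrepancy", "prescriber_error"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(reports, labels)
print(model.predict(["pharmacy delivered the dose three hours late"]))
```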

  10. Radionuclide data analysis in connection of DPRK event in May 2009

    NASA Astrophysics Data System (ADS)

    Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei

    2010-05-01

    The seismic event detected in the DPRK on 25.5.2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, traces linked to the DPRK event were not found. After three weeks of high alert, the PTS returned to its normal operational routine. This case illustrates the importance of objectivity and a procedural approach in data evaluation. All the data coming from particulate and noble gas stations were evaluated daily, some of the samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds of the key (CTBT-relevant) radionuclides achieved across the DPRK event area and to assess the radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for a site owing to legitimate, non-nuclear-test-related activities. Therefore, a set of hypotheses was used to test whether a detection is consistent with the event time and location through atmospheric transport modelling. The consistency of the event timing and isotopic ratios was also used in the evaluation work. As a result, it was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS system would have had no difficulty detecting it. This case also showed the importance of on-site inspections for verifying the nuclear traces of possible tests.

  11. Regression Analysis of Mixed Panel Count Data with Dependent Terminal Events

    PubMed Central

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L.

    2017-01-01

    Event history studies are commonly conducted in many fields and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types of the data above, may occur and furthermore, as with the first two types of the data, there may exist a dependent terminal event, which may preclude the occurrences of recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event and an estimating equation-based approach is proposed for estimation of regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally the methodology is applied to a childhood cancer study that motivated this study. PMID:28098397

  12. Re-Evaluation of Event Correlations in Virtual California Using Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Heflin, M. B.; Granat, R. A.; Yikilmaz, M. B.; Heien, E.; Rundle, J.; Donnellan, A.

    2010-12-01

    Fusing the results of simulation tools with statistical analysis methods has contributed to our better understanding of the earthquake process. In a previous study, we used a statistical method to investigate emergent phenomena in data produced by the Virtual California earthquake simulator. The analysis indicated that there were some interesting fault interactions and possible triggering and quiescence relationships between events. We have converted the original code from Matlab to Python/C++ and are now evaluating data from the most recent version of Virtual California in order to analyze and compare any new behavior exhibited by the model. The Virtual California earthquake simulator can be used to study fault and stress interaction scenarios for realistic California earthquakes. The simulation generates a synthetic earthquake catalog of events with a minimum size of ~M 5.8 that can be evaluated using statistical analysis methods. Virtual California utilizes realistic fault geometries and a simple Amontons-Coulomb stick-slip friction law in order to drive the earthquake process by means of a back-slip model, where loading of each segment occurs due to the accumulation of a slip deficit at the prescribed slip rate of the segment. Like any complex system, Virtual California may generate emergent phenomena unexpected even by its designers. In order to investigate this, we have developed a statistical method that analyzes the interaction between Virtual California fault elements and thereby determines whether events on any given fault elements show correlated behavior. Our method examines events on one fault element and then determines whether there is an associated event within a specified time window on a second fault element. Note that an event in our analysis is defined as any time an element slips, rather than any particular “earthquake” along the entire fault length. Results are then tabulated and differenced with an expected correlation
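
    The element-to-element test described above can be sketched as a windowed lookup: for each slip time on one element, ask whether the second element slips within a fixed window afterwards. Synthetic slip times and an invented window length are used below.

```python
import numpy as np

def windowed_association(times_a, times_b, window):
    """Fraction of slips on element A followed by a slip on element B
    within `window` time units."""
    times_b = np.sort(times_b)
    hits = sum(
        np.searchsorted(times_b, t) < np.searchsorted(times_b, t + window)
        for t in times_a
    )
    return hits / len(times_a)

rng = np.random.default_rng(3)
a = np.sort(rng.uniform(0, 1000, 80))          # slip times on element A (years)
b = np.sort(a + rng.normal(2.0, 0.5, 80))      # element B tends to follow A
print(windowed_association(a, b, window=5.0))  # near 1.0 -> correlated
```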

  13. Biological event composition

    PubMed Central

    2012-01-01

    Background In recent years, biological event extraction has emerged as a key natural language processing task, aiming to address the information overload problem in accessing the molecular biology literature. The BioNLP shared task competitions have contributed to this recent interest considerably. The first competition (BioNLP'09) focused on extracting biological events from Medline abstracts from a narrow domain, while the theme of the latest competition (BioNLP-ST'11) was generalization and a wider range of text types, event types, and subject domains were considered. We view event extraction as a building block in larger discourse interpretation and propose a two-phase, linguistically-grounded, rule-based methodology. In the first phase, a general, underspecified semantic interpretation is composed from syntactic dependency relations in a bottom-up manner. The notion of embedding underpins this phase and it is informed by a trigger dictionary and argument identification rules. Coreference resolution is also performed at this step, allowing extraction of inter-sentential relations. The second phase is concerned with constraining the resulting semantic interpretation by shared task specifications. We evaluated our general methodology on core biological event extraction and speculation/negation tasks in three main tracks of BioNLP-ST'11 (GENIA, EPI, and ID). Results We achieved competitive results in GENIA and ID tracks, while our results in the EPI track leave room for improvement. One notable feature of our system is that its performance across abstracts and articles bodies is stable. Coreference resolution results in minor improvement in system performance. Due to our interest in discourse-level elements, such as speculation/negation and coreference, we provide a more detailed analysis of our system performance in these subtasks. Conclusions The results demonstrate the viability of a robust, linguistically-oriented methodology, which clearly distinguishes

  14. Finite-size analysis of a continuous-variable quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leverrier, Anthony; Grosshans, Frederic; Grangier, Philippe

    2010-06-15

    The goal of this paper is to extend the framework of finite-size analysis recently developed for quantum key distribution to continuous-variable protocols. We do not solve this problem completely here, and we mainly consider the finite-size effects on the parameter estimation procedure. Despite the fact that some questions are left open, we are able to give an estimation of the secret key rate for protocols which do not contain a postselection procedure. As expected, these results are significantly more pessimistic than those obtained in the asymptotic regime. However, we show that recent continuous-variable protocols are able to provide fully secure secret keys in the finite-size scenario, over distances larger than 50 km.

  15. Detection and analysis of microseismic events using a Matched Filtering Algorithm (MFA)

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Eaton, David W.; Jones, Joshua P.; van der Baan, Mirko

    2016-07-01

    A new Matched Filtering Algorithm (MFA) is proposed for detecting and analysing microseismic events recorded by downhole monitoring of hydraulic fracturing. This method requires a set of well-located template (`parent') events, which are obtained using conventional microseismic processing and selected on the basis of high signal-to-noise (S/N) ratio and representative spatial distribution of the recorded microseismicity. Detection and extraction of `child' events are based on stacked, multichannel cross-correlation of the continuous waveform data, using the parent events as reference signals. The location of a child event relative to its parent is determined using an automated process, by rotation of the multicomponent waveforms into the ray-centred co-ordinates of the parent and maximizing the energy of the stacked amplitude envelope within a search volume around the parent's hypocentre. After correction for geometrical spreading and attenuation, the relative magnitude of the child event is obtained automatically using the ratio of stacked envelope peak with respect to its parent. Since only a small number of parent events require interactive analysis such as picking P- and S-wave arrivals, the MFA approach offers the potential for significant reduction in effort for downhole microseismic processing. Our algorithm also facilitates the analysis of single-phase child events, that is, microseismic events for which only one of the S- or P-wave arrivals is evident due to unfavourable S/N conditions. A real-data example using microseismic monitoring data from four stages of an open-hole slickwater hydraulic fracture treatment in western Canada demonstrates that a sparse set of parents (in this case, 4.6 per cent of the originally located events) yields a significant (more than fourfold increase) in the number of located events compared with the original catalogue. Moreover, analysis of the new MFA catalogue suggests that this approach leads to more robust interpretation
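
    At its core, the detection step is template matching: threshold the normalized cross-correlation of a parent waveform against the continuous record. A single-channel NumPy sketch with synthetic signals (the algorithm described above additionally stacks the correlation over channels and components before thresholding):

```python
import numpy as np

def matched_filter_detect(continuous, template, threshold=0.7):
    """Indices where the normalized correlation with `template` exceeds
    `threshold`, plus the full correlation trace."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(continuous) - n + 1)
    for i in range(len(cc)):
        window = continuous[i:i + n]
        sigma = window.std()
        cc[i] = 0.0 if sigma == 0 else np.dot(t, window - window.mean()) / sigma
    return np.flatnonzero(cc > threshold), cc

rng = np.random.default_rng(1)
parent = rng.normal(size=200)              # template ('parent') waveform
stream = rng.normal(scale=0.5, size=5000)  # continuous noise record
stream[1200:1400] += parent                # bury a repeat ('child') event
picks, cc = matched_filter_detect(stream, parent)
print(picks[:5], round(cc.max(), 2))       # detections near sample 1200
```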

  16. Analysis of counterfactual quantum key distribution using error-correcting theory

    NASA Astrophysics Data System (ADS)

    Li, Yan-Bing

    2014-10-01

    Counterfactual quantum key distribution is an interesting direction in quantum cryptography and has been realized by some researchers. However, it has been pointed out that it is information-theoretically insecure when used over a highly lossy channel. In this paper, we revisit its security from an error-correcting-theory point of view. The analysis indicates that the security flaw arises because, when the loss rate exceeds 50%, the error rate in the users' raw key pair is as high as it would be under Eve's attack.

  17. External events analysis for the Savannah River Site K reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornadoes, a crane failure scenario, fires, and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10^-4 per year, of which seismic events are the major contributor (1.2 × 10^-4 per year). Fire-initiated events contribute 1.4 × 10^-7 per year, tornadoes 5.8 × 10^-7 per year, dam failures 1.5 × 10^-6 per year, and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs.

  18. Regression analysis of mixed panel count data with dependent terminal events.

    PubMed

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types of data above, may occur and furthermore, as with the first two types of the data, there may exist a dependent terminal event, which may preclude the occurrences of recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event and an estimating equation-based approach is proposed for estimation of regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to a childhood cancer study that motivated this study. Copyright © 2017 John Wiley & Sons, Ltd.

  19. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.

  20. Experimental quantum key distribution with finite-key security analysis for noisy channels.

    PubMed

    Bacco, Davide; Canale, Matteo; Laurenti, Nicola; Vallone, Giuseppe; Villoresi, Paolo

    2013-01-01

    In quantum key distribution implementations, each session is typically chosen long enough so that the secret key rate approaches its asymptotic limit. However, this choice may be constrained by the physical scenario, as in the prospective use with satellites, where the passage of one terminal over the other is restricted to a few minutes. Here we demonstrate experimentally the extraction of secure keys leveraging an optimal design of the prepare-and-measure scheme, according to recent finite-key theoretical tight bounds. The experiment is performed in different channel conditions, and assuming two distinct attack models: individual attacks or general quantum attacks. The required number of exchanged qubits is then obtained as a function of the key size and of the ambient quantum bit error rate. The results indicate that viable conditions for effective symmetric, and even one-time-pad, cryptography are achievable.

  1. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    NASA Astrophysics Data System (ADS)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
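
    The quantitative core of an ETA is simple: each outcome probability is the product of the initiating-event frequency and the conditional probabilities along its branch path. A minimal sketch with invented numbers (the Arenal trees' actual branches and estimates are not reproduced):

```python
# Conditional probabilities along one branch path of an event tree.
initiating_frequency = 0.1      # ATPF-generating collapse per year (invented)
branches = {
    "flow_reaches_sector":  0.3,
    "person_present":       0.05,
    "person_not_sheltered": 0.8,
}

path_probability = initiating_frequency
for name, p in branches.items():
    path_probability *= p

# 0.1 * 0.3 * 0.05 * 0.8 = 1.2e-3 per year, which would then be mapped
# onto the linguistic risk scale (VERY HIGH, HIGH, MODERATE, ...).
print(f"individual risk along this path: {path_probability:.2e} per year")
```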

  2. An Event Restriction Interval Theory of Tense

    ERIC Educational Resources Information Center

    Beamer, Brandon Robert

    2012-01-01

    This dissertation presents a novel theory of tense and tense-like constructions. It is named after a key theoretical component of the theory, the event restriction interval. In Event Restriction Interval (ERI) Theory, sentences are semantically evaluated relative to an index which contains two key intervals, the evaluation interval and the event…

  3. Markov chains and semi-Markov models in time-to-event analysis.

    PubMed

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
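
    A minimal illustration of the point above: with an absorbing state, a discrete-time Markov chain yields survival probabilities directly by iterating the transition matrix, and competing outcomes are handled by the extra states. States and probabilities below are invented.

```python
import numpy as np

# States: 0 = stable, 1 = progressed, 2 = dead (absorbing).
P = np.array([
    [0.90, 0.08, 0.02],
    [0.10, 0.70, 0.20],
    [0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0])   # everyone starts in state 0
for t in range(1, 6):
    state = state @ P
    survival = 1.0 - state[2]       # P(absorbing state not yet reached)
    print(f"t={t}  P(stable)={state[0]:.3f}  "
          f"P(progressed)={state[1]:.3f}  survival={survival:.3f}")
```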

  4. Markov chains and semi-Markov models in time-to-event analysis

    PubMed Central

    Abner, Erin L.; Charnigo, Richard J.; Kryscio, Richard J.

    2014-01-01

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields. PMID:24818062

  5. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
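
    The flavour of the fuzzy-arithmetic part can be sketched with triangular fuzzy numbers pushed through AND/OR gates; independence is assumed, the dependency-coefficient and evidence-theory machinery of the article is omitted, and all numbers are invented.

        # Sketch: propagate triangular fuzzy probabilities (low, mode, high)
        # through FTA/ETA-style AND/OR gates. Both gate functions are
        # monotone increasing in each argument, so applying them
        # component-wise to the (low, mode, high) triples is valid.

        def tfn_and(a, b):
            # AND gate: P = Pa * Pb
            return tuple(x * y for x, y in zip(a, b))

        def tfn_or(a, b):
            # OR gate: P = 1 - (1 - Pa)(1 - Pb)
            return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

        pump_fails   = (0.01, 0.02, 0.05)   # fuzzy likelihood of basic event 1
        sensor_fails = (0.005, 0.01, 0.03)  # fuzzy likelihood of basic event 2

        top_event = tfn_and(pump_fails, sensor_fails)  # both must fail
        either    = tfn_or(pump_fails, sensor_fails)   # at least one fails
        print("AND gate:", tuple(round(p, 6) for p in top_event))
        print("OR gate: ", tuple(round(p, 6) for p in either))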

  6. Links between Characteristics of Collaborative Peer Video Analysis Events and Literacy Teachers' Outcomes

    ERIC Educational Resources Information Center

    Arya, Poonam; Christ, Tanya; Chiu, Ming

    2015-01-01

    This study examined how characteristics of Collaborative Peer Video Analysis (CPVA) events are related to teachers' pedagogical outcomes. Data included 39 transcribed literacy video events, in which 14 in-service teachers engaged in discussions of their video clips. Emergent coding and Statistical Discourse Analysis were used to analyze the data.…

  7. Drought Events and Their Impacts on Food Production in New Zealand: Historical Analysis and Outlook Model Development

    NASA Astrophysics Data System (ADS)

    Li, Y.; Yin, C.; Urich, P.; Hill, R.

    2012-12-01

    Given the importance of the primary production sector, climatic conditions have always been a significant driver of food production in New Zealand. The country has experienced a number of severe droughts throughout its history, where extended periods of low rainfall have severely impacted primary production. The characteristics of historical droughts and their impacts on the primary production sector are analysed, including the economic losses in the 1998-1999 and 2007-2009 events. We include the analysis of a set of national standardised drought monitoring indices: Standardised Precipitation Index (SPI), Standardised Precipitation Evapotranspiration Index (SPEI), Soil Moisture Index (SMI), and Standardised Pasture Growth Index (SPGI). Since drought events in New Zealand are clearly linked with ENSO, SST anomalies in key regions can be good predictors of drought events. Artificial Neural Network (ANN) information processing techniques have been applied to build local drought outlook models; the predictors are the SST anomalies of eight key regions that impact New Zealand climate, produced by the Climate Forecast System v2 (CFSv2) of NCEP, together with local NIWA-derived observed precipitation and soil moisture data. SST is a variable that CFSv2 can forecast with high skill and, after bias correction, it can be applied as a climate predictor for New Zealand. The inclusion of local data and the persistent nature of drought lead to good predictive skill; therefore, one- to three-month ensemble drought outlooks can be produced for New Zealand. The potential changes in drought intensity and frequency over the medium- to long-term future are investigated using downscaled data from 12 GCMs and multiple scenarios. The results indicate that New Zealand may experience more severe drought in many areas; therefore, adaptation should be planned and implemented.

  8. Analysis and visualization of single-trial event-related potentials

    NASA Technical Reports Server (NTRS)

    Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.

    2001-01-01

    In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image…
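
    A toy sketch of the two core steps, blind separation and the "ERP image", using scikit-learn's FastICA on synthetic data; this is not the authors' EEG pipeline, only the shape of the computation.

        import numpy as np
        from sklearn.decomposition import FastICA

        # Synthetic "EEG": 1000 samples x 8 channels built from 8 independent
        # super-Gaussian sources mixed linearly by an unknown matrix.
        rng = np.random.default_rng(0)
        n_samples, n_channels = 1000, 8
        sources = rng.laplace(size=(n_samples, n_channels))
        mixing = rng.normal(size=(n_channels, n_channels))
        eeg = sources @ mixing.T                      # observed channel data

        ica = FastICA(n_components=n_channels, random_state=0)
        components = ica.fit_transform(eeg)  # temporally independent activations
        scalp_maps = ica.mixing_             # spatially fixed column patterns

        # An "ERP image" is single-trial activations of one component stacked
        # as rows and sorted by a behavioural variable (here: fake RTs).
        trials = components[:, 0].reshape(50, 20)       # 50 trials x 20 time points
        reaction_times = rng.uniform(0.2, 0.8, size=50)
        erp_image = trials[np.argsort(reaction_times)]  # rows sorted by RT
        print(erp_image.shape)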

  9. [Analysis on the adverse events of cupping therapy in the application].

    PubMed

    Zhou, Xin; Ruan, Jing-wen; Xing, Bing-feng

    2014-10-01

    A detailed analysis was performed on the cases of adverse events and common injuries from cupping therapy encountered in recent years, in terms of manipulation and the patient's constitution. The adverse events of cupping therapy are commonly caused by improper manipulation by medical practitioners who ignore contraindications and the patient's constitution. Clinical practitioners should use cupping therapy cautiously, strictly follow the rules of standard manipulation and the core medical system, pay attention to contraindications, and take strict precautions against the occurrence of adverse events.

  10. AOP: An R Package For Sufficient Causal Analysis in Pathway ...

    EPA Pesticide Factsheets

    Summary: How can I quickly find the key events in a pathway that I need to monitor to predict that a beneficial/adverse event/outcome will occur? This is a key question when using signaling pathways for drug/chemical screening in pharmacology, toxicology and risk assessment. By identifying these sufficient causal key events, we have fewer events to monitor for a pathway, thereby decreasing assay costs and time, while maximizing the value of the information. I have developed the "aop" package which uses backdoor analysis of causal networks to identify these minimal sets of key events that are sufficient for making causal predictions. Availability and Implementation: The source and binary are available online through the Bioconductor project (http://www.bioconductor.org/) as an R package titled "aop". The R/Bioconductor package runs within the R statistical environment. The package has functions that can take pathways (as directed graphs) formatted as a Cytoscape JSON file as input, or pathways can be represented as directed graphs using the R/Bioconductor "graph" package. The "aop" package has functions that can perform backdoor analysis to identify the minimal set of key events for making causal predictions. Contact: burgoon.lyle@epa.gov This paper describes an R/Bioconductor package that was developed to facilitate the identification of key events within an AOP that are the minimal set of sufficient key events that need to be tested/monitored.
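
    The package's backdoor computation is not reproduced here, but the graph-theoretic intuition, a minimal set of intermediate key events that intercepts every path from the molecular initiating event to the outcome, can be illustrated with a minimum node cut on a toy AOP graph (Python/networkx, purely an analogy to the R package's goal).

        import networkx as nx

        # Toy AOP as a directed graph: MIE -> intermediate key events -> AO.
        aop = nx.DiGraph([
            ("MIE", "KE1"), ("MIE", "KE2"),
            ("KE1", "KE3"), ("KE2", "KE3"),
            ("KE3", "AO"),
        ])

        # A minimal set of key events whose monitoring suffices to cover every
        # causal path from MIE to AO is a minimum node cut between the two.
        sufficient_keys = nx.minimum_node_cut(aop, "MIE", "AO")
        print(sufficient_keys)  # {'KE3'}: every causal path passes through KE3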

  11. ITS risk analysis.

    DOT National Transportation Integrated Search

    1996-06-01

    Risk analysis plays a key role in the implementation of an architecture. Early definition of the situations, processes, or events that have the potential for impeding the implementation of key elements of the ITS National Architecture is a critical e...

  12. An unjustified benefit: immortal time bias in the analysis of time-dependent events.

    PubMed

    Gleiss, Andreas; Oberbauer, Rainer; Heinze, Georg

    2018-02-01

    Immortal time bias is a problem arising from methodologically wrong analyses of time-dependent events in survival analyses. We illustrate the problem by analysis of a kidney transplantation study. Following patients from transplantation to death, groups defined by the occurrence or nonoccurrence of graft failure during follow-up seemingly had equal overall mortality. Such naive analysis assumes that patients were assigned to the two groups at time of transplantation, which actually are a consequence of occurrence of a time-dependent event later during follow-up. We introduce landmark analysis as the method of choice to avoid immortal time bias. Landmark analysis splits the follow-up time at a common, prespecified time point, the so-called landmark. Groups are then defined by time-dependent events having occurred before the landmark, and outcome events are only considered if occurring after the landmark. Landmark analysis can be easily implemented with common statistical software. In our kidney transplantation example, landmark analyses with landmarks set at 30 and 60 months clearly identified graft failure as a risk factor for overall mortality. We give further typical examples from transplantation research and discuss strengths and limitations of landmark analysis and other methods to address immortal time bias such as Cox regression with time-dependent covariables. © 2017 Steunstichting ESOT.
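
    A minimal sketch of a landmark analysis under assumed column names (time, death, graft_failure_time, all hypothetical) using the lifelines library; it is meant to show the three defining steps, not the authors' exact analysis.

        from lifelines import CoxPHFitter

        def landmark_analysis(df, landmark=30.0):
            # df: pandas DataFrame with time (months to death/censoring),
            # death (0/1), graft_failure_time (months, NaN if no failure).
            # 1) keep only patients still under follow-up at the landmark
            at_risk = df[df["time"] > landmark].copy()
            # 2) group membership is fixed by events occurring BEFORE the landmark
            at_risk["failed_before_lm"] = (
                at_risk["graft_failure_time"].notna()
                & (at_risk["graft_failure_time"] <= landmark)
            ).astype(int)
            # 3) outcome time is measured from the landmark onwards only
            at_risk["time_after_lm"] = at_risk["time"] - landmark
            cph = CoxPHFitter()
            cph.fit(at_risk[["time_after_lm", "death", "failed_before_lm"]],
                    duration_col="time_after_lm", event_col="death")
            return cph

        # usage (kidney_df is a hypothetical data frame):
        # landmark_analysis(kidney_df, landmark=30.0).print_summary()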

  13. The effect of species and colony size on the bleaching response of reef-building corals in the Florida Keys during the 2005 mass bleaching event

    NASA Astrophysics Data System (ADS)

    Brandt, M. E.

    2009-12-01

    Understanding the variation in coral bleaching response is necessary for making accurate predictions of population changes and the future state of reefs in a climate of increasing thermal stress events. Individual coral colonies, belonging to inshore patch reef communities of the Florida Keys, were followed through the 2005 mass bleaching event. Overall, coral bleaching patterns followed an index of accumulated thermal stress more closely than in situ temperature measurements. Eight coral species ( Colpophyllia natans, Diploria strigosa, Montastraea cavernosa, M. faveolata, Porites astreoides, P. porites, Siderastrea siderea, and Stephanocoenia intersepta), representing >90% of the coral colonies studied, experienced intense levels of bleaching, but responses varied. Bleaching differed significantly among species: Colpophyllia natans and Diploria strigosa were most susceptible to thermal stress, while Stephanocoenia intersepta was the most tolerant. For colonies of C. natans, M. faveolata, and S. siderea, larger colonies experienced more extensive bleaching than smaller colonies. The inshore patch reef communities of the Florida Keys have historically been dominated by large colonies of Montastraea sp. and Colpophyllia natans. These results provide evidence that colony-level differences can affect bleaching susceptibility in this habitat and suggest that the impact of future thermal stress events may be biased toward larger colonies of dominant reef-building species. Predicted increases in the frequency of mass bleaching and subsequent mortality may therefore result in significant structural shifts of these ecologically important communities.

  14. Gunbarrel mafic magmatic event: A key 780 Ma time marker for Rodinia plate reconstructions

    USGS Publications Warehouse

    Harlan, S.S.; Heaman, L.; LeCheminant, A.N.; Premo, W.R.

    2003-01-01

    Precise U-Pb baddeleyite dating of mafic igneous rocks provides evidence for a widespread and synchronous magmatic event that extended for >2400 km along the western margin of the Neoproterozoic Laurentian craton. U-Pb baddeleyite analyses for eight intrusions from seven localities ranging from the northern Canadian Shield to northwestern Wyoming-southwestern Montana are statistically indistinguishable and yield a composite U-Pb concordia age for this event of 780.3 ± 1.4 Ma (95% confidence level). This 780 Ma event is herein termed the Gunbarrel magmatic event. The mafic magmatism of the Gunbarrel event represents the largest mafic dike swarm yet identified along the Neoproterozoic margin of Laurentia. The origin of the mafic magmatism is not clear, but may be related to mantle-plume activity or upwelling asthenosphere leading to crustal extension accompanying initial breakup of the supercontinent Rodinia and development of the proto-Pacific Ocean. The mafic magmatism of the Gunbarrel magmatic event at 780 Ma predates the voluminous magmatism of the 723 Ma Franklin igneous event of the northwestern Canadian Shield by ~60 m.y. The precise dating of the extensive Neoproterozoic Gunbarrel and Franklin magmatic events provides unique time markers that can ultimately be used for robust testing of Neoproterozoic continental reconstructions.

  15. Genome sequence analysis of dengue virus 1 isolated in Key West, Florida.

    PubMed

    Shin, Dongyoung; Richards, Stephanie L; Alto, Barry W; Bettinardi, David J; Smartt, Chelsea T

    2013-01-01

    Dengue virus (DENV) is transmitted to humans through the bite of mosquitoes. In November 2010, a dengue outbreak was reported in Monroe County in southern Florida (FL), including more than 20 confirmed human cases. The virus collected from the human cases was verified as DENV serotype 1 (DENV-1) and one isolate was provided for sequence analysis. RNA was extracted from the DENV-1 isolate and used in reverse transcription polymerase chain reaction (RT-PCR) to amplify PCR fragments for sequencing. Nucleic acid primers were designed to generate overlapping PCR fragments that covered the entire genome. The DENV-1 isolate found in Key West (KW), FL was sequenced for whole-genome characterization. Sequence assembly, GenBank searches, and recombination analyses were performed to verify the identity of the genome sequences and to determine percent similarity to known DENV-1 sequences. We show that the KW DENV-1 strain is 99% identical to Nicaraguan and Mexican DENV-1 strains. Phylogenetic and recombination analyses suggest that the DENV-1 isolated in KW originated from Nicaragua (NI) and that the KW strain may be circulating in KW. Recombination analysis also detected recombination events in the KW strain relative to DENV-1 strains from Puerto Rico. We evaluated the relative growth of the KW strain of DENV-1 compared to other dengue viruses to determine whether the underlying genetics of the strain is associated with a replicative advantage, an important consideration since domestic tourism can spread DENVs and give rise to local transmission.

  16. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. Following the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, with extended Petri nets used for the modelling and control of complex robotic systems. Such a system is structured, controlled, and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use and presents results in representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models are simple enough to run on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved using Petri nets, which makes discrete event systems a pragmatic tool for modelling industrial systems. To capture timing, the transport stream of the timed Petri model is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the timing of the robotic and transmission activities; measurements taken on the spot yield graphics showing the average time for the transport activity for given sets of finished products.
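
    To make the discrete-event formalism concrete, here is a minimal Petri-net executor with an invented pick-and-place robot net; it illustrates markings and transition firing only, not the Visual Object Net++ models of the paper.

        # Minimal Petri-net executor: a marking maps places to token counts; a
        # transition fires when all of its input places hold enough tokens.

        transitions = {
            # name: (input places -> tokens consumed, output places -> tokens produced)
            "pick":  ({"part_ready": 1, "robot_idle": 1}, {"robot_busy": 1}),
            "place": ({"robot_busy": 1},                  {"robot_idle": 1, "done": 1}),
        }

        def enabled(marking, t):
            pre, _ = transitions[t]
            return all(marking.get(p, 0) >= n for p, n in pre.items())

        def fire(marking, t):
            pre, post = transitions[t]
            m = dict(marking)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return m

        m = {"part_ready": 2, "robot_idle": 1}
        while any(enabled(m, t) for t in transitions):
            t = next(t for t in transitions if enabled(m, t))
            m = fire(m, t)
            print(t, "->", m)   # runs until both parts are placed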

  17. Multi-Sensory Aerosol Data and the NRL NAAPS model for Regulatory Exceptional Event Analysis

    NASA Astrophysics Data System (ADS)

    Husar, R. B.; Hoijarvi, K.; Westphal, D. L.; Haynes, J.; Omar, A. H.; Frank, N. H.

    2013-12-01

    Beyond scientific exploration and analysis, multi-sensory observations along with models are finding increasing applications in operational air quality management. EPA's Exceptional Event (EE) Rule allows the exclusion of data strongly influenced by impacts from "exceptional events," such as smoke from wildfires or dust from abnormally high winds. The EE Rule encourages the use of satellite observations and other non-standard data along with models as evidence for formal documentation of EE samples for exclusion. Thus, the implementation of the EE Rule is uniquely suited for the direct application of integrated multi-sensory observations and, indirectly, through their assimilation into an aerosol simulation model. Here we report the results of a project: NASA and NAAPS Products for Air Quality Decision Making. The project makes use of observations from multiple satellite sensors, surface-based aerosol measurements, and the NRL Aerosol Analysis and Prediction System (NAAPS) model that assimilates key satellite observations. The satellite sensor data for detecting and documenting smoke and dust events include MODIS AOD and images; OMI Aerosol Index and tropospheric NO2; and AIRS CO. The surface observations include the EPA regulatory PM2.5 network; the IMPROVE/STN aerosol chemical network; the AIRNOW PM2.5 mass network; and surface meteorological data. Within this application, a crucial role is assigned to the NAAPS model for estimating the surface concentration of windblown dust and biomass smoke. The operational model assimilates quality-assured daily MODIS data using 2DVAR to adjust the model concentrations, and uses CALIOP-based climatology to adjust the vertical profiles at 6-hour intervals. The assimilation of satellite data from multiple satellites significantly contributes to the usefulness of NAAPS for EE analysis. The NAAPS smoke and dust simulations were evaluated using the IMPROVE/STN chemical data. The multi-sensory observations along with the model simulations are integrated into a web…

  18. The Keys to the White House

    ERIC Educational Resources Information Center

    Lichtman, Allan J.

    2012-01-01

    The Keys to the White House is a historically-based system for predicting the result of the popular vote in American presidential elections. The Keys system tracks the big picture of how well the party holding the White House has governed and does not shift with events of the campaign. This model gives specificity to the idea that it is…

  19. A high dose mode of action for tetrabromobisphenol A-induced uterine adenocarcinomas in Wistar Han rats: A critical evaluation of key events in an adverse outcome pathway framework.

    PubMed

    Wikoff, D S; Rager, J E; Haws, L C; Borghoff, S J

    2016-06-01

    TBBPA is a non-genotoxic flame retardant used to improve fire safety in a wide variety of consumer products. Estimated human exposures to TBBPA are very low (<0.000084 mg/kg-day), relative to the doses (500 and 1000 mg/kg-day of TBBPA) administered in a recent bioassay that resulted in uterine tumors in Wistar Han rats following chronic exposure. As part of an effort to characterize the relevance of the uterine tumors to humans, data and biological knowledge relevant to the progression of events associated with TBBPA-induced uterine tumors in female rats were organized in an adverse outcome pathway (AOP) framework. Based on a review of possible MOAs for chemically induced uterine tumors and available TBBPA data sets, a plausible molecular initiating event (MIE) was the ability of TBBPA to bind to and inhibit estrogen sulfotransferases, the enzymes responsible for sulfation of estradiol. Subsequent key events in the AOP, including increased bioavailability of unconjugated estrogens in uterine tissue, would occur as a result of decreased sulfation, leading to a disruption in estrogen homeostasis, increased expression of estrogen responsive genes, cell proliferation, and hyperplasia. Available data support subsequent key events, including generation of reactive quinones from the metabolism of estrogens, followed by DNA damage that could contribute to the development of uterine tumors. Uncertainties associated with human relevance are highlighted by potential strain/species sensitivities to development of uterine tumors, as well as the characterization of a dose-dependent MIE. For the latter, it was determined that the TBBPA metabolic profile is altered at high doses (such as those used in the cancer bioassay), implying an MIE that is only operative under repeated, high-dose administration. The MIE and subsequent key events for the development of TBBPA-induced uterine tumors are not feasible in humans given differences in the kinetic and dynamic factors associated…

  20. Radar rainfall estimation in the context of post-event analysis of flash-flood events

    NASA Astrophysics Data System (ADS)

    Delrieu, G.; Bouilloud, L.; Boudevillain, B.; Kirstetter, P.-E.; Borga, M.

    2009-09-01

    This communication is about a methodology for radar rainfall estimation in the context of post-event analysis of flash-flood events developed within the HYDRATE project. For such extreme events, some raingauge observations (operational, amateur) are available at the event time scale, while few raingauge time series are generally available at the hydrologic time steps. Radar data are therefore the only way to access the space-time organization of rainfall, but the quality of the radar data may be highly variable as a function of (1) the relative locations of the event and the radar(s) and (2) the radar operating protocol(s) and maintenance. A positive point: heavy rainfall is associated with convection, implying better visibility and less bright-band contamination compared with more common situations. In parallel with the development of a regionalized and adaptive radar data processing system (TRADHy; Delrieu et al. 2009), a pragmatic approach is proposed here to make best use of the available radar and raingauge data for a given flash-flood event by: (1) identifying and removing residual ground clutter, (2) applying the "hydrologic visibility" concept (Pellarin et al. 2002) to correct for range-dependent errors (screening and VPR effects) for non-attenuating wavelengths, and (3) estimating an effective Z-R relationship through a radar-raingauge optimization approach to remove the mean field bias (Dinku et al. 2002). A sensitivity study, based on the high-quality volume radar datasets collected during two intense rainfall events of the Bollène 2002 experiment (Delrieu et al. 2009), is first proposed. Then the method is implemented for two other historical events that occurred in France (Avène 1997 and Aude 1999) with datasets of lesser quality. References: Delrieu, G., B. Boudevillain, J. Nicol, B. Chapon, P.-E. Kirstetter, H. Andrieu, and D. Faure, 2009: Bollène 2002 experiment: radar rainfall estimation in the Cévennes-Vivarais region, France. Journal of Applied…

  1. Formal analysis of imprecise system requirements with Event-B.

    PubMed

    Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan

    2016-01-01

    Formal analysis of functional properties of system requirements needs precise descriptions. However, stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for the specification and verification of such requirements. First, we introduce a representation of imprecise requirements in set theory. Then we make use of Event-B refinement, providing a set of translation rules from fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of a crane controller.
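
    One way to picture the set-theoretic representation step (illustrative only: the membership functions and the alpha-cut threshold are assumptions, and the paper's actual translation targets Event-B notation, not Python):

        # A fuzzy rule "IF temperature is High THEN speed is Low" read
        # set-theoretically: an alpha-cut turns each fuzzy predicate into an
        # ordinary set, over which a crisp guard/invariant can be stated.

        def high_temp(t):   # membership of "temperature is High" (assumed shape)
            return min(max((t - 60) / 20, 0.0), 1.0)

        def low_speed(s):   # membership of "speed is Low" (assumed shape)
            return min(max((40 - s) / 20, 0.0), 1.0)

        alpha = 0.5
        HIGH = {t for t in range(0, 101) if high_temp(t) >= alpha}  # alpha-cut set
        LOW  = {s for s in range(0, 101) if low_speed(s) >= alpha}

        def rule_holds(temp, speed):
            # set-theoretic reading: temp in HIGH implies speed in LOW
            return temp not in HIGH or speed in LOW

        print(rule_holds(85, 25), rule_holds(85, 70))  # True False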

  2. Necroptosis is a key pathogenic event in human and experimental murine models of non-alcoholic steatohepatitis.

    PubMed

    Afonso, Marta B; Rodrigues, Pedro M; Carvalho, Tânia; Caridade, Marta; Borralho, Paula; Cortez-Pinto, Helena; Castro, Rui E; Rodrigues, Cecília M P

    2015-10-01

    Hepatocyte cell death, inflammation and oxidative stress constitute key pathogenic mechanisms underlying non-alcoholic fatty liver disease (NAFLD). We aimed to investigate the role of necroptosis in human and experimental NAFLD and its association with tumour necrosis factor α (TNF-α) and oxidative stress. Serum markers of necrosis, liver receptor-interacting protein 3 (RIP3) and phosphorylated mixed lineage kinase domain-like (MLKL) were evaluated in control individuals and patients with NAFLD. C57BL/6 wild-type (WT) or RIP3-deficient (RIP3(-/-)) mice were fed a high-fat choline-deficient (HFCD) or methionine and choline-deficient (MCD) diet, with subsequent histological and biochemical analysis of hepatic damage. In primary murine hepatocytes, necroptosis and oxidative stress were also assessed after necrostatin-1 (Nec-1) treatment or RIP3 silencing. We show that circulating markers of necrosis and TNF-α, as well as liver RIP3 and MLKL phosphorylation were increased in NAFLD. Likewise, RIP3 and MLKL protein levels and TNF-α expression were increased in the liver of HFCD and MCD diet-fed mice. Moreover, RIP3 and MLKL sequestration in the insoluble protein fraction of NASH (non-alcoholic steatohepatitis) mice liver lysates represented an early event during steatohepatitis progression. Functional studies in primary murine hepatocytes established the association between TNF-α-induced RIP3 expression, activation of necroptosis and oxidative stress. Strikingly, RIP3 deficiency attenuated MCD diet-induced liver injury, steatosis, inflammation, fibrosis and oxidative stress. In conclusion, necroptosis is increased in the liver of NAFLD patients and in experimental models of NASH. Further, TNF-α triggers RIP3-dependent oxidative stress during hepatocyte necroptosis. As such, targeting necroptosis appears to arrest or at least impair NAFLD progression. © 2015 Authors; published by Portland Press Limited.

  3. Meta-Analysis of Rare Binary Adverse Event Data

    PubMed Central

    Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.

    2013-01-01

    We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for analysis of binary adverse event data. Special attention is paid to the case of rare adverse events which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of the heterogeneity of treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrapping test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased and the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
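
    A compact sketch of two of the estimators discussed, fixed-effect inverse-variance pooling with a continuity correction and the simple unweighted average, on invented 2x2 tables; the heterogeneity estimator and bootstrap test of the paper are not reproduced.

        import math

        # Per-study data: (events_trt, n_trt, events_ctl, n_ctl), made up.
        studies = [(0, 100, 2, 100), (1, 250, 3, 250), (2, 500, 5, 500)]

        log_ors, weights = [], []
        for a, n1, c, n2 in studies:
            b, d = n1 - a, n2 - c
            if 0 in (a, b, c, d):                # 0.5 continuity correction
                a, b, c, d = a + .5, b + .5, c + .5, d + .5
            lor = math.log((a * d) / (b * c))    # log odds ratio
            var = 1/a + 1/b + 1/c + 1/d          # its approximate variance
            log_ors.append(lor)
            weights.append(1 / var)

        fixed = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
        simple = sum(log_ors) / len(log_ors)     # unweighted average effect
        print("inverse-variance pooled OR:", round(math.exp(fixed), 3))
        print("unweighted average OR:     ", round(math.exp(simple), 3))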

  4. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    PubMed

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  5. Parallel Key Frame Extraction for Surveillance Video Service in a Smart City.

    PubMed

    Zheng, Ran; Yao, Chuanwei; Jin, Hai; Zhu, Lei; Zhang, Qin; Deng, Wei

    2015-01-01

    Surveillance video service (SVS) is one of the most important services provided in a smart city. For the utilization of SVS, it is very important to design efficient surveillance video analysis techniques. Key frame extraction is a simple yet effective technique to achieve this goal. In surveillance video applications, key frames are typically used to summarize important video content. It is therefore essential to extract key frames accurately and efficiently. A novel approach is proposed to extract key frames from traffic surveillance videos based on GPUs (graphics processing units) to ensure high efficiency and accuracy. For the determination of key frames, motion is a more salient feature in presenting actions or events, especially in surveillance videos. The motion feature is extracted in the GPU to reduce running time. It is also smoothed to reduce noise, and the frames with local maxima of motion information are selected as the final key frames. The experimental results show that this approach can extract key frames more accurately and efficiently compared with several other methods.
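
    A CPU-side sketch of the motion-based selection logic; the paper's contribution is the GPU parallelization, which is omitted here, and "traffic.avi" is a placeholder path.

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("traffic.avi")   # placeholder video file
        motion, prev = [], None
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            if prev is not None:
                # frame-differencing motion feature per frame
                motion.append(float(cv2.absdiff(gray, prev).mean()))
            prev = gray
        cap.release()

        # Smooth the motion curve to reduce noise, then take local maxima.
        m = np.array(motion)
        smooth = np.convolve(m, np.ones(9) / 9.0, mode="same")
        keys = [i for i in range(1, len(smooth) - 1)
                if smooth[i] > smooth[i - 1] and smooth[i] >= smooth[i + 1]]
        print("key frame indices:", keys)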

  6. Event-based stormwater management pond runoff temperature model

    NASA Astrophysics Data System (ADS)

    Sabouri, F.; Gharabaghi, B.; Sattar, A. M. A.; Thompson, A. M.

    2016-09-01

    Stormwater management wet ponds are generally very shallow and hence can significantly increase (about 5.4 °C on average in this study) runoff temperatures in summer months, which adversely affects receiving urban stream ecosystems. This study uses gene expression programming (GEP) and artificial neural networks (ANN) modeling techniques to advance our knowledge of the key factors governing thermal enrichment effects of stormwater ponds. The models developed in this study build upon and complement the ANN model developed by Sabouri et al. (2013) that predicts the catchment event mean runoff temperature entering the pond as a function of event climatic and catchment characteristic parameters. The key factors that control pond outlet runoff temperature include: (1) Upland Catchment Parameters (catchment drainage area and event mean runoff temperature inflow to the pond); (2) Climatic Parameters (rainfall depth, event mean air temperature, and pond initial water temperature); and (3) Pond Design Parameters (pond length-to-width ratio, pond surface area, pond average depth, and pond outlet depth). We used monitoring data for three summers from 2009 to 2011 in four stormwater management ponds, located in the cities of Guelph and Kitchener, Ontario, Canada to develop the models. The prediction uncertainties of the developed ANN and GEP models for the case study sites are around 0.4% and 1.7% of the median value. Sensitivity analysis of the trained models indicates that the thermal enrichment of the pond outlet runoff is inversely proportional to pond length-to-width ratio and pond outlet depth, and directly proportional to event runoff volume, event mean pond inflow runoff temperature, and pond initial water temperature.
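
    A sketch of the ANN half of the modeling idea with the paper's nine input factors as feature names; the training data below are random placeholders rather than the Guelph/Kitchener monitoring records, and the GEP model is not reproduced.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        features = ["drainage_area", "inflow_runoff_temp", "rainfall_depth",
                    "mean_air_temp", "initial_pond_temp", "length_to_width",
                    "surface_area", "avg_depth", "outlet_depth"]

        # Placeholder data: 120 "monitored events" with a made-up relationship.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(120, len(features)))
        y = 20 + 2 * X[:, 1] - X[:, 8] + rng.normal(scale=0.5, size=120)

        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1),
        )
        model.fit(X, y)
        print("outlet temp prediction:", model.predict(X[:1]).round(2))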

  7. Application of a temporal reasoning framework tool in analysis of medical device adverse events.

    PubMed

    Clark, Kimberly K; Sharma, Deepak K; Chute, Christopher G; Tao, Cui

    2011-01-01

    The Clinical Narrative Temporal Relation Ontology (CNTRO) project offers a semantic-web based reasoning framework, which represents temporal events and relationships within clinical narrative texts and infers new knowledge over them. In this paper, the CNTRO reasoning framework is applied to the temporal analysis of medical device adverse event files. One specific adverse event was used as a test case: late stent thrombosis. Adverse event narratives were obtained from the Food and Drug Administration's (FDA) Manufacturing and User Facility Device Experience (MAUDE) database. 15 adverse event files in which late stent thrombosis was confirmed were randomly selected across multiple drug eluting stent devices. From these files, 81 events and 72 temporal relations were annotated. 73 temporal questions were generated, of which 65 were correctly answered by the CNTRO system. This results in an overall accuracy of 89%. This system should be pursued further to continue assessing its potential benefits in the temporal analysis of medical device adverse events.

  8. Differential Fault Analysis on CLEFIA with 128, 192, and 256-Bit Keys

    NASA Astrophysics Data System (ADS)

    Takahashi, Junko; Fukunaga, Toshinori

    This paper describes a differential fault analysis (DFA) attack against CLEFIA. The proposed attack can be applied to CLEFIA with all supported keys: 128, 192, and 256-bit keys. DFA is a type of side-channel attack. This attack enables the recovery of secret keys by injecting faults into a secure device during its computation of the cryptographic algorithm and comparing the correct ciphertext with the faulty one. CLEFIA is a 128-bit blockcipher with 128, 192, and 256-bit keys developed by the Sony Corporation in 2007. CLEFIA employs a generalized Feistel structure with four data lines. We developed a new attack method that uses this characteristic structure of the CLEFIA algorithm. On the basis of the proposed attack, only 2 pairs of correct and faulty ciphertexts are needed to retrieve the 128-bit key, and 10.78 pairs on average are needed to retrieve the 192 and 256-bit keys. The proposed attack is more efficient than any previously reported. In order to verify the proposed attack and estimate the calculation time to recover the secret key, we conducted an attack simulation using a PC. The simulation results show that we can obtain each secret key within three minutes on average. This result shows that we can obtain the entire key within a feasible computational time.

  9. Urbanization and Fertility: An Event-History Analysis of Coastal Ghana

    PubMed Central

    WHITE, MICHAEL J.; MUHIDIN, SALUT; ANDRZEJEWSKI, CATHERINE; TAGOE, EVA; KNIGHT, RODNEY; REED, HOLLY

    2008-01-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field. PMID:19110898
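
    The discrete-time event-history setup can be sketched as a person-period logistic regression; the simulated data below build in an approximately 11% lower urban hazard purely to mirror the direction of the reported effect, and are not the Ghanaian life history calendar data.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Expand each woman's calendar into person-year rows, then regress the
        # birth indicator on time-varying covariates (here: urban residence).
        rng = np.random.default_rng(2)
        rows = []
        for woman in range(300):
            urban = int(rng.integers(0, 2))          # residence (fixed here for brevity)
            for age in range(15, 40):                # one row per woman-year at risk
                p = 0.20 * (0.89 if urban else 1.0)  # ~11% lower urban hazard, assumed
                rows.append({"woman": woman, "age": age, "urban": urban,
                             "birth": float(rng.random() < p)})
        pp = pd.DataFrame(rows)

        X = sm.add_constant(pp[["urban", "age"]].astype(float))
        fit = sm.Logit(pp["birth"], X).fit(disp=0)
        print(fit.params)  # negative 'urban' coefficient = lower fertility hazard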

  10. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    NASA Astrophysics Data System (ADS)

    Fedosimova, Anastasiya; Gaitinov, Adigam; Grushevskaya, Ekaterina; Lebedev, Igor

    2017-06-01

    In this work, a study of the peculiarities of multiparticle production in interactions of asymmetric nuclei is performed, searching for unusual features of such interactions. Research on long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles is carried out on the basis of analysis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC) and mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method in accordance with the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters. Events of LC type are processes of full destruction of the projectile nucleus, in which multicharge fragments are absent. In events of mixed type, several multicharge fragments of the projectile nucleus are discovered. Second, these two types have significantly different multiplicity distributions. The mean multiplicity of LC-type events is significantly higher than in mixed-type events. Research on the dependence of multiplicity on the number of target-nucleus fragments for events of various types reveals that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to the central collisions of gold nuclei and nuclei of the CNO group, i.e. nuclei with strongly asymmetric volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multi-charged fragments.
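
    For reference, a bare-bones rescaled-range (R/S) estimator of the Hurst exponent, the quantity behind the event selection described above; the input series here is random (so H should come out near 0.5), and the windowing choices are illustrative.

        import numpy as np

        def hurst_rs(series):
            """Estimate H as the slope of log(R/S) versus log(window size)."""
            ns, rs = [], []
            for n in range(8, len(series) // 2, 4):
                chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
                vals = []
                for c in chunks:
                    dev = np.cumsum(c - c.mean())    # cumulative deviations
                    s = c.std()
                    if s > 0:
                        vals.append((dev.max() - dev.min()) / s)  # rescaled range
                ns.append(n)
                rs.append(np.mean(vals))
            return np.polyfit(np.log(ns), np.log(rs), 1)[0]

        # e.g. multiplicities in 256 pseudorapidity bins of one event (random here)
        counts = np.random.default_rng(3).poisson(5, size=256).astype(float)
        print("H =", round(hurst_rs(counts), 2))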

  11. Development and assessment of stressful life events subscales - A preliminary analysis.

    PubMed

    Buccheri, Teresa; Musaad, Salma; Bost, Kelly K; Fiese, Barbara H

    2018-01-15

    Stress affects people of all ages, genders, and cultures and is associated with physical and psychological complications. Stressful life events are an important research focus, and a psychometrically valid measure could provide useful clinical information. The purpose of the study was to develop a reliable and valid measurement of stressful life events and to assess its reliability and validity using established measures of social support, stress, depression, anxiety, and maternal and child health. The authors used an adaptation of the Social Readjustment Rating Scale (SRRS) to describe the prevalence of life events; they developed 4-factor stressful life events subscales and used the Medical Outcomes Social Support Scale, the Social Support Scale, the Depression, Anxiety and Stress Scale, and 14 general health items for validity analysis. Analyses were performed with descriptive statistics, Cronbach's alpha, Spearman's rho, the Chi-square test or Fisher's exact test, and the Wilcoxon 2-sample test. The 4-factor stressful life events subscales showed acceptable reliability. The resulting subscale scores were significantly associated with established measures of social support, depression, anxiety, stress, and caregiver health indicators. The study presented a number of limitations in terms of design and recall bias. Despite these limitations, the study provided valuable insight and suggested that further investigation is needed to determine the effectiveness of the measures in revealing a family's wellbeing and to develop and strengthen a more detailed analysis of the stressful life events/health association. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Analysis of convection-permitting simulations for capturing heavy rainfall events over Myanmar Region

    NASA Astrophysics Data System (ADS)

    Acierto, R. A. E.; Kawasaki, A.

    2017-12-01

    Perennial flooding due to heavy rainfall events causes strong impacts on society and the economy. With the increasing pressures of rapid development and potential climate change impacts, Myanmar experiences a rapid increase in disaster risk. Heavy rainfall hazard assessment is key to quantifying such disaster risk under both current and future conditions. Downscaling using Regional Climate Models (RCMs) such as the Weather Research and Forecasting (WRF) model has been used extensively for assessing such heavy rainfall events. However, the use of convective parameterizations can introduce large errors in simulating rainfall. Convection-permitting simulations have been used to deal with this problem by increasing the resolution of RCMs to 4 km. This study focuses on heavy rainfall events during the wet seasons (May to September) of the six-year period 2010-2015 in Myanmar. The investigation primarily utilizes rain gauge observations for comparing downscaled heavy rainfall events at 4 km resolution, using ERA-Interim as boundary conditions with a 12 km-4 km one-way nesting method. The study aims to provide a basis for the production of high-resolution climate projections over Myanmar in order to contribute to flood hazard and risk assessment.

  13. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lisbeth A.

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information," requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not-reportable events) identified at the Idaho National Laboratory (INL) during the period of October 2012 through September 2013.

  14. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    NASA Astrophysics Data System (ADS)

    Ajayakumar, J.; Shook, E.; Turner, V. K.

    2017-10-01

    With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data could be highly skewed based on the variations of population density with respect to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis on four major extreme events in the United States: the "North American storm complex" in December 2015, the "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and the "Hurricane Matthew" in October 2016. The analysis is conducted on geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. The implications of these results suggest that, while developing software solutions to support analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources and to adapt techniques and enhance capabilities to mitigate the bias. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with social media data and will be useful…
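
    The population-based normalization the authors call for can be as simple as converting raw geotagged counts into per-capita rates before mapping; a toy sketch with invented counts and populations follows.

        import pandas as pd

        # Raw tweet counts per county versus rates per 10,000 residents, so that
        # dense urban counties do not dominate the response signal.
        df = pd.DataFrame({
            "county":     ["A", "B", "C"],
            "tweets":     [9000, 450, 300],
            "population": [2_000_000, 80_000, 40_000],
        })
        df["tweets_per_10k"] = df["tweets"] / df["population"] * 10_000
        print(df.sort_values("tweets_per_10k", ascending=False))
        # County A has the most raw tweets but the lowest normalized rate.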

  15. Analysis of Loss-of-Offsite-Power Events 1997-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Nancy Ellen; Schroeder, John Alton

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.

  16. Analysis of event data recorder data for vehicle safety improvement

    DOT National Transportation Integrated Search

    2008-04-01

    The Volpe Center performed a comprehensive engineering analysis of Event Data Recorder (EDR) data supplied by the National Highway Traffic Safety Administration (NHTSA) to assess its accuracy and usefulness in crash reconstruction and improvement of ...

  17. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    PubMed

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd.
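
    A numerical toy example of the follow-up problem described above: with invented numbers, equal naive proportions mask very different exposure-adjusted rates (the recurrent-event subtleties, which the paper notes can also mislead rate-based analyses, are ignored here).

        # events, patients, and total person-years of observed follow-up per arm
        arms = {
            "drug":    {"events": 30, "n": 200, "person_years": 150.0},  # early censoring
            "control": {"events": 30, "n": 200, "person_years": 380.0},
        }
        for name, a in arms.items():
            naive = a["events"] / a["n"]            # ignores follow-up time
            rate = a["events"] / a["person_years"]  # events per person-year
            print(f"{name:8s} naive={naive:.2f}  rate={rate:.3f}/py")
        # Equal naive proportions (0.15 vs 0.15) hide rates of 0.200 vs 0.079
        # per person-year: harm is understated when drug-arm follow-up is short.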

  18. Classification and Space-Time Analysis of Precipitation Events in Manizales, Caldas, Colombia.

    NASA Astrophysics Data System (ADS)

    Suarez Hincapie, J. N.; Vélez, J.; Romo Melo, L.; Chang, P.

    2015-12-01

    Manizales is a mid-mountain Andean city located near the Nevado del Ruiz volcano in west-central Colombia; this location exposes it to earthquakes, floods, landslides and volcanic eruptions. It is located in the intertropical convergence zone (ITCZ) and presents a climate with a bimodal rainfall regime (Cortés, 2010). Its mean annual rainfall is 2000 mm, and precipitation may be observed on 70% of the days in a year. This rain, which favors the formation of large masses of clouds, together with macroclimatic phenomena such as the "El Niño Southern Oscillation", has historically caused great impacts in the region (Vélez et al, 2012). For example, the geographical location coupled with rain events results in a high risk of landslides in the city. Manizales has a hydrometeorological network of 40 stations that measure and transmit data for up to eight climate variables. Some of these stations keep 10 years of historical data. However, until now this information has not been used for the space-time classification of precipitation events, nor have the meteorological variables that influence them been thoroughly researched. The purpose of this study was to classify historical rain events in an urban area of Manizales and investigate patterns of atmospheric behavior that influence or trigger such events. Classification of events was performed by calculating the "n" index of heavy rainfall, which describes the behavior of precipitation as a function of time throughout the event (Monjo, 2009). The analysis of meteorological variables was performed using statistical quantification over variable time periods before each event. The proposed classification allowed for an analysis of the evolution of rainfall events. In particular, it helped in looking for the influence of different meteorological variables in triggering rainfall events in hazardous areas such as the city of Manizales.

  19. Random-Effects Meta-Analysis of Time-to-Event Data Using the Expectation-Maximisation Algorithm and Shrinkage Estimators

    ERIC Educational Resources Information Center

    Simmonds, Mark C.; Higgins, Julian P. T.; Stewart, Lesley A.

    2013-01-01

    Meta-analysis of time-to-event data has proved difficult in the past because consistent summary statistics often cannot be extracted from published results. The use of individual patient data allows for the re-analysis of each study in a consistent fashion and thus makes meta-analysis of time-to-event data feasible. Time-to-event data can be…

  20. Service-Learning and Graduation: Evidence from Event History Analysis

    ERIC Educational Resources Information Center

    Yue, Hongtao; Hart, Steven M.

    2017-01-01

    This research employed Event History Analysis to understand how service-learning participation is related to students' graduation within six years. The longitudinal dataset includes 31,074 new undergraduate students who enrolled in a large western U.S. public university from Fall 2002 to Fall 2009. The study revealed that service-learning…

  1. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    PubMed Central

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD events risk (0.91, 0.85 to 0.97) per 0.1 mEq/L (P for nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P for nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD events risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480

  2. Defining Human Failure Events for Petroleum Risk Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  3. The January 2001, El Salvador event: a multi-data analysis

    NASA Astrophysics Data System (ADS)

    Vallee, M.; Bouchon, M.; Schwartz, S. Y.

    2001-12-01

    On January 13, 2001, a large normal-faulting event (Mw=7.6) occurred 100 kilometers away from the Salvadoran coast (Central America) with a centroid depth of about 50 km. The size of this event is surprising given the classical idea that such events have to be much weaker than thrust events in subduction zones. We analysed this earthquake with different types of data: because teleseismic waves are the only data which offer a good azimuthal coverage, we first built a kinematic source model with P and SH waves provided by the IRIS-GEOSCOPE networks. The ambiguity between the 30° plane (dipping toward the Pacific Ocean) and the 60° plane (dipping toward Central America) led us to carry out a parallel analysis of the two possible planes. We used a simple point-source modelling to define the main characteristics of the event and then used an extended source to retrieve the kinematic features of the rupture. For the two possible planes, this analysis reveals a downdip and northwest rupture propagation, but the difference in fit remains subtle even when using the extended source. In a second part, we confronted our models for the two planes with other seismological data: (1) regional data, (2) surface wave data through an Empirical Green Function given by a similar but much weaker earthquake which occurred in July 1996, and lastly (3) near-field data provided by Universidad Centroamericana (UCA) and Centro de Investigationes Geotecnicas (CIG). Regional data do not allow us to discriminate between the two planes either, but surface waves and especially near-field data confirm that the fault plane is the steeper one, dipping toward Central America. Moreover, the slight directivity toward the north is confirmed by surface waves.

  4. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    PubMed

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
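
    As a hedged illustration of the recommended analysis, the sketch below fits a GEE with exchangeable within-center correlation to simulated sparse-event trial data using statsmodels, requesting model-based ("naive") rather than robust standard errors; the data and parameter values are invented for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_centers, n_per = 30, 20                      # many centers, few events each
center = np.repeat(np.arange(n_centers), n_per)
treat = rng.integers(0, 2, size=n_centers * n_per)
center_eff = rng.normal(0, 0.3, n_centers)[center]
p = 1 / (1 + np.exp(-(-2.5 + 0.5 * treat + center_eff)))  # rare binary outcome
y = rng.binomial(1, p)

df = pd.DataFrame({"y": y, "treat": treat, "center": center})
X = sm.add_constant(df[["treat"]])

# GEE with exchangeable within-center correlation; model-based ("naive")
# standard errors, as suggested when events per center are sparse
model = sm.GEE(df["y"], X, groups=df["center"],
               family=sm.families.Binomial(),
               cov_struct=sm.cov_struct.Exchangeable())
result = model.fit(cov_type="naive")
print(result.summary())
```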

  5. Detection of Abnormal Events via Optical Flow Feature Analysis

    PubMed Central

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of optical flow orientation descriptor and a classification method. The details of the histogram of optical flow orientation descriptor, which describes the movement information of the global video frame or the foreground frame, are illustrated. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal-detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
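
    A hedged sketch of the descriptor-plus-one-class-classifier idea using scikit-learn: the flow fields below are random stand-ins (real flow would come from an optical-flow routine such as OpenCV's Farneback method), and chaining kernel PCA into the one-class SVM in a single pipeline is a simplification of the paper's combination.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline

def flow_orientation_histogram(flow, n_bins=8):
    """Histogram of optical-flow orientations for one frame.

    `flow` is an (H, W, 2) array of per-pixel (dx, dy) displacements.
    Orientations are weighted by flow magnitude and normalized to sum to 1.
    """
    dx, dy = flow[..., 0].ravel(), flow[..., 1].ravel()
    angles = np.arctan2(dy, dx)                 # in (-pi, pi]
    mags = np.hypot(dx, dy)
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi),
                           weights=mags)
    return hist / (hist.sum() + 1e-12)

# Train on descriptors of "normal" frames only; flag outliers afterwards
rng = np.random.default_rng(1)
normal_frames = [rng.normal(0, 1, (60, 80, 2)) for _ in range(100)]
X_train = np.array([flow_orientation_histogram(f) for f in normal_frames])

detector = make_pipeline(KernelPCA(n_components=5, kernel="rbf"),
                         OneClassSVM(nu=0.05, kernel="rbf"))
detector.fit(X_train)

test_frame = rng.normal(3, 1, (60, 80, 2))      # anomalously coherent motion
print(detector.predict([flow_orientation_histogram(test_frame)]))  # -1 = abnormal
```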

  6. Identifying causes of adverse events detected by an automated trigger tool through in-depth analysis.

    PubMed

    Muething, S E; Conway, P H; Kloppenborg, E; Lesko, A; Schoettker, P J; Seid, M; Kotagal, U

    2010-10-01

    To describe how in-depth analysis of adverse events can reveal underlying causes. Triggers for adverse events were developed using the hospital's computerised medical record (naloxone for opiate-related oversedation and administration of a glucose bolus while on insulin for insulin-related hypoglycaemia). Triggers were identified daily. Based on information from the medical record and interviews, a subject expert determined if an adverse drug event had occurred and then conducted a real-time analysis to identify event characteristics. Expert groups, consisting of frontline staff and specialist physicians, examined event characteristics and determined the apparent cause. 30 insulin-related hypoglycaemia events and 34 opiate-related oversedation events were identified by the triggers over 16 and 21 months, respectively. In the opinion of the experts, patients receiving continuous-infusion insulin and those receiving dextrose only via parenteral nutrition were at increased risk for insulin-related hypoglycaemia. Lack of standardisation in insulin-dosing decisions and variation regarding when and how much to adjust insulin doses in response to changing glucose levels were identified as common causes of the adverse events. Opiate-related oversedation events often occurred within 48 h of surgery. Variation in pain management in the operating room and post-anaesthesia care unit was identified by the experts as potential causes. Variations in practice, multiple services writing orders, multidrug regimens and variations in interpretation of patient assessments were also noted as potential contributing causes. Identification of adverse drug events through an automated trigger system, supplemented by in-depth analysis, can help identify targets for intervention and improvement.

  7. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
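
    As an illustration of the event-queue core that such discrete event techniques share (a minimal sketch, not the patented tool itself), the simulator below pops time-stamped events from a priority queue and executes them until the queue empties, mirroring the simulation module described above.

```python
import heapq
import itertools

class DiscreteEventSimulator:
    """Minimal event-queue simulator: time-stamped events are executed
    in order until the event queue is emptied."""

    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._counter = itertools.count()   # tie-breaker for equal times

    def schedule(self, delay, action, *args):
        heapq.heappush(self._queue,
                       (self.now + delay, next(self._counter), action, args))

    def run(self):
        while self._queue:
            self.now, _, action, args = heapq.heappop(self._queue)
            action(*args)

# Example: a two-mode component whose mode-transition process switches
# state after a time delay, loosely echoing the qualitative models above
sim = DiscreteEventSimulator()

def switch(component, mode):
    print(f"t={sim.now:4.1f}: {component} enters mode '{mode}'")
    if mode == "on":                        # effect statement: schedule shutdown
        sim.schedule(5.0, switch, component, "off")

sim.schedule(1.0, switch, "pump", "on")
sim.schedule(2.5, switch, "valve", "on")
sim.run()
```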

  8. Formal Analysis of Key Integrity in PKCS#11

    NASA Astrophysics Data System (ADS)

    Falcone, Andrea; Focardi, Riccardo

    PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted-key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.

  9. Alternative splicing and trans-splicing events revealed by analysis of the Bombyx mori transcriptome

    PubMed Central

    Shao, Wei; Zhao, Qiong-Yi; Wang, Xiu-Ye; Xu, Xin-Yan; Tang, Qing; Li, Muwang; Li, Xuan; Xu, Yong-Zhen

    2012-01-01

    Alternative splicing and trans-splicing events have not been systematically studied in the silkworm Bombyx mori. Here, the silkworm transcriptome was analyzed by RNA-seq. We identified 320 novel genes, modified 1140 gene models, and found thousands of alternative splicing and 58 trans-splicing events. Studies of three SR proteins show that both their alternative splicing patterns and mRNA products are conserved from insect to human, and one isoform of Srsf6 with a retained intron is expressed sex-specifically in silkworm gonads. Trans-splicing of mod(mdg4) in silkworm was experimentally confirmed. We identified integrations from a common 5′-gene with 46 newly identified alternative 3′-exons that are located on both DNA strands over a 500-kb region. Other trans-splicing events in B. mori were predicted by bioinformatic analysis, in which 12 events were confirmed by RT-PCR, six events were further validated by chimeric SNPs, and two events were confirmed by allele-specific RT-PCR in F1 hybrids from distinct silkworm lines of JS and L10, indicating that trans-splicing is more widespread in insects than previously thought. Analysis of the B. mori transcriptome by RNA-seq provides valuable information of regulatory alternative splicing events. The conservation of splicing events across species and newly identified trans-splicing events suggest that B. mori is a good model for future studies. PMID:22627775

  10. Regression Analysis of Mixed Recurrent-Event and Panel-Count Data with Additive Rate Models

    PubMed Central

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L.

    2015-01-01

    Summary Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007; Zhao et al., 2011). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013). In this paper, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. PMID:25345405

  11. Waves associated to COMPLEX EVENTS observed by STEREO

    NASA Astrophysics Data System (ADS)

    Siu Tapia, A. L.; Blanco-Cano, X.; Kajdic, P.; Aguilar-Rodriguez, E.; Russell, C. T.; Jian, L. K.; Luhmann, J. G.

    2012-12-01

    Complex Events are formed by two or more large-scale solar wind structures which interact in space. Typical cases are interactions of: (i) a Magnetic Cloud/Interplanetary Coronal Mass Ejection (MC/ICME) with another MC/ICME transient; and (ii) an ICME followed by a Stream Interaction Region (SIR). Complex Events are important for space weather studies, and studying them can enhance our understanding of collisionless plasma physics. Some of these structures can produce or enhance southward magnetic fields, a key factor in geomagnetic storm generation. Using data from the STEREO mission during the years 2006-2011, we found 17 Complex Events preceded by a shock wave. We use magnetic field and plasma data to study the micro-scale structure of the shocks and the waves associated with these shocks and within Complex Event structures. To determine wave characteristics we perform power spectral and minimum variance analysis; a sketch of the latter is given below. We also use PLASTIC WAP proton data to study foreshock extensions and the relationship between Complex Regions and particle acceleration to suprathermal energies.
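
    A minimal sketch of minimum variance analysis, assuming a plane wave so that the eigenvector of the field variance matrix with the smallest eigenvalue estimates the propagation direction; the time series is synthetic.

```python
import numpy as np

def minimum_variance_analysis(B):
    """Minimum Variance Analysis of a magnetic-field time series.

    B is an (N, 3) array of field vectors. The eigenvector of the
    variance matrix with the smallest eigenvalue estimates the wave
    normal (for a plane wave); the intermediate-to-minimum eigenvalue
    ratio is a common quality check.
    """
    M = np.cov(np.asarray(B, dtype=float).T)   # 3x3 variance matrix
    vals, vecs = np.linalg.eigh(M)             # eigenvalues ascending
    normal = vecs[:, 0]                        # minimum-variance direction
    return normal, vals, vals[1] / vals[0]

# Synthetic circularly polarized wave with normal along z, plus noise
rng = np.random.default_rng(2)
t = np.linspace(0, 20 * np.pi, 2000)
B = np.column_stack([np.cos(t), np.sin(t), 0.05 * rng.normal(size=t.size)])
normal, eigvals, ratio = minimum_variance_analysis(B)
print(normal, ratio)   # normal ~ (0, 0, +/-1); large ratio -> well determined
```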

  12. Discovering anomalous events from urban informatics data

    NASA Astrophysics Data System (ADS)

    Jayarajah, Kasthuri; Subbaraju, Vigneshwaran; Weerakoon, Dulanga; Misra, Archan; Tam, La Thanh; Athaide, Noel

    2017-05-01

    Singapore's "smart city" agenda is driving the government to provide public access to a broader variety of urban informatics sources, such as images from traffic cameras and information about buses servicing different bus stops. Such informatics data serves as probes of evolving conditions at different spatiotemporal scales. This paper explores how such multi-modal informatics data can be used to establish the normal operating conditions at different city locations, and then apply appropriate outlier-based analysis techniques to identify anomalous events at these selected locations. We will introduce the overall architecture of sociophysical analytics, where such infrastructural data sources can be combined with social media analytics to not only detect such anomalous events, but also localize and explain them. Using the annual Formula-1 race as our candidate event, we demonstrate a key difference between the discriminative capabilities of different sensing modes: while social media streams provide discriminative signals during or prior to the occurrence of such an event, urban informatics data can often reveal patterns that have higher persistence, including before and after the event. In particular, we shall demonstrate how combining data from (i) publicly available Tweets, (ii) crowd levels aboard buses, and (iii) traffic cameras can help identify the Formula-1 driven anomalies, across different spatiotemporal boundaries.

  13. Preterm Versus Term Children: Analysis of Sedation/Anesthesia Adverse Events and Longitudinal Risk.

    PubMed

    Havidich, Jeana E; Beach, Michael; Dierdorf, Stephen F; Onega, Tracy; Suresh, Gautham; Cravero, Joseph P

    2016-03-01

    Preterm and former preterm children frequently require sedation/anesthesia for diagnostic and therapeutic procedures. Our objective was to determine the age at which children who are born <37 weeks gestational age are no longer at increased risk for sedation/anesthesia adverse events. Our secondary objective was to describe the nature and incidence of adverse events. This is a prospective observational study of children receiving sedation/anesthesia for diagnostic and/or therapeutic procedures outside of the operating room by the Pediatric Sedation Research Consortium. A total of 57,227 patients 0 to 22 years of age were eligible for this study. All adverse events and descriptive terms were predefined. Logistic regression and locally weighted scatterplot regression were used for analysis. Preterm and former preterm children had higher adverse event rates (14.7% vs 8.5%) compared with children born at term. Our analysis revealed a biphasic pattern for the development of adverse sedation/anesthesia events. Airway and respiratory adverse events were most commonly reported. MRI scans were the most commonly performed procedures in both categories of patients. Patients born preterm are nearly twice as likely to develop sedation/anesthesia adverse events, and this risk continues up to 23 years of age. We recommend obtaining birth history during the formulation of an anesthetic/sedation plan, with heightened awareness that preterm and former preterm children may be at increased risk. Further prospective studies focusing on the etiology and prevention of adverse events in former preterm patients are warranted. Copyright © 2016 by the American Academy of Pediatrics.

  14. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    PubMed

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  15. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    NASA Astrophysics Data System (ADS)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated on a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we used the concept of the rain event. In fact, the discrete and intermittent nature of rain processes makes the definition of some features inadequate when they are defined over a fixed duration. Long integration times (hour, day) mix rainy and clear-air periods in the same sample. Short integration times (seconds, minutes) lead to noisy data that are highly sensitive to detector characteristics. Analysing the whole rain event, instead of individual short samples of fixed duration, makes it possible to clarify relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses the intra-event variability partly due to measurement uncertainties and allows the analysis to focus on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of the self-organizing map (SOM) is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space while preserving, as much as possible, the initial space topology in an unsupervised way. The obtained SOM reveals the dependencies between variables and consequently allows redundant variables to be removed, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features the corresponding SOM is analyzed. This analysis shows clearly the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature or the weak dependence of the Dry percentage in event (Dd%e) feature. This confirms
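
    A hedged sketch of the SOM step using the minisom package on synthetic five-feature events standing in for the study's variables; the map size, training length, and generated data are illustrative only.

```python
import numpy as np
from minisom import MiniSom   # pip install minisom

# Five event features loosely echoing the study: duration, peak rain rate,
# event depth, rain-rate standard deviation, absolute variation of order 0.5
rng = np.random.default_rng(3)
n_events = 500
duration = rng.gamma(2, 60, n_events)                       # minutes
peak = rng.gamma(2, 5, n_events)                            # mm/h
depth = 0.3 * duration * peak / 60 * rng.uniform(0.5, 1.5, n_events)
std = 0.4 * peak * rng.uniform(0.5, 1.5, n_events)
var05 = 0.2 * peak * rng.uniform(0.5, 1.5, n_events)
X = np.column_stack([duration, peak, depth, std, var05])
X = (X - X.mean(0)) / X.std(0)                              # standardize

som = MiniSom(10, 10, input_len=5, sigma=1.5, learning_rate=0.5,
              random_seed=0)
som.train_random(X, 5000)

# Map each event to its best-matching unit; co-located events form clusters
bmus = np.array([som.winner(x) for x in X])
print(np.unique(bmus, axis=0).shape[0], "occupied map units")
```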

  16. Geohazard assessment through the analysis of historical alluvial events in Southern Italy

    NASA Astrophysics Data System (ADS)

    Esposito, Eliana; Violante, Crescenzo

    2015-04-01

    The risk associated with extreme water events such as flash floods results from a combination of overflow and landslide hazards. A multi-hazard approach has been utilized to analyze the 1773 flood that occurred in conjunction with heavy rainfall, causing major damage in terms of lost lives and economic cost over an area of 200 km2, including both the coastal strip between Salerno and Maiori and the Apennine hinterland, Campania region - Southern Italy. This area has been affected by a total of 40 flood events over the last five centuries, 26 of which occurred between 1900 and 2000. Streamflow events have produced severe impacts on Cava de' Tirreni (SA) and its territory; in particular, four catastrophic floods in 1581, 1773, 1899 and 1954 caused a pervasive pattern of destruction. In the study area, rainstorm events typically occur in small and medium-sized fluvial systems, characterized by small catchment areas and high-elevation drainage basins, causing the detachment of large amounts of volcaniclastic and siliciclastic covers from the carbonate bedrock. The mobilization of these deposits (slope debris), mixed with rising floodwaters along the water paths, can produce fast-moving streamflows of large proportions with significant hazardous implications (Violante et al., 2009). In this context, the study of the 1773 historical flood allows the detection and definition of those areas where catastrophic events have repeatedly taken place over time. Moreover, it improves the understanding of the phenomena themselves, including some key elements in the management of risk mitigation, such as the restoration of the damage suffered by buildings and/or the environmental effects caused by the floods.

  17. Improved key-rate bounds for practical decoy-state quantum-key-distribution systems

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng

    2017-01-01

    The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
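
    As a schematic of why the choice of fluctuation analysis matters (this is not the paper's actual bound), the sketch below compares the deviation allowance for a binomial count under a Gaussian approximation with a distribution-free Hoeffding/Chernoff-style bound at the same failure probability; all numbers are invented.

```python
from math import sqrt, log
from scipy.stats import norm

def gaussian_dev(N, p, eps):
    """Deviation allowance for a binomial(N, p) count, Gaussian approximation."""
    return norm.ppf(1 - eps) * sqrt(N * p * (1 - p))

def hoeffding_dev(N, eps):
    """Distribution-free Hoeffding bound: P(X - N*p > d) <= eps for any p."""
    return sqrt(N / 2 * log(1 / eps))

N, p, eps = 10**6, 0.05, 1e-10     # signals, error rate, failure probability
print(gaussian_dev(N, p, eps))      # ~1.4e3
print(hoeffding_dev(N, eps))        # ~3.4e3: looser, hence a lower key rate
```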

  18. Analysis of low-frequency seismic signals generated during a multiple-iceberg calving event at Jakobshavn Isbræ, Greenland

    USGS Publications Warehouse

    Walter, Fabian; Amundson, Jason M.; O'Neel, Shad; Truffer, Martin; Fahnestock, Mark; Fricker, Helen A.

    2012-01-01

    We investigated seismic signals generated during a large-scale, multiple iceberg calving event that occurred at Jakobshavn Isbræ, Greenland, on 21 August 2009. The event was recorded by a high-rate time-lapse camera and five broadband seismic stations located within a few hundred kilometers of the terminus. During the event two full-glacier-thickness icebergs calved from the grounded (or nearly grounded) terminus and immediately capsized; the second iceberg to calve was two to three times smaller than the first. The individual calving and capsize events were well-correlated with the radiation of low-frequency seismic signals (<0.1 Hz) dominated by Love and Rayleigh waves. In agreement with regional records from previously published ‘glacial earthquakes’, these low-frequency seismic signals had maximum power and/or signal-to-noise ratios in the 0.05–0.1 Hz band. Similarly, full waveform inversions indicate that these signals were also generated by horizontal single forces acting at the glacier terminus. The signals therefore appear to be local manifestations of glacial earthquakes, although the magnitudes of the signals (twice-time integrated force histories) were considerably smaller than previously reported glacial earthquakes. We thus speculate that such earthquakes may be a common, if not pervasive, feature of all full-glacier-thickness calving events from grounded termini. Finally, a key result from our study is that waveform inversions performed on low-frequency, calving-generated seismic signals may have only limited ability to quantitatively estimate mass losses from calving. In particular, the choice of source time function has little impact on the inversion but dramatically changes the earthquake magnitude. Accordingly, in our analysis, it is unclear whether the smaller or larger of the two calving icebergs generated a larger seismic signal.

  19. Key Design Elements of a Data Utility for National Biosurveillance: Event-driven Architecture, Caching, and Web Service Model

    PubMed Central

    Tsui, Fu-Chiang; Espino, Jeremy U.; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M.

    2005-01-01

    The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance—systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions. PMID:16779138

  20. Key design elements of a data utility for national biosurveillance: event-driven architecture, caching, and Web service model.

    PubMed

    Tsui, Fu-Chiang; Espino, Jeremy U; Weng, Yan; Choudary, Arvinder; Su, Hoah-Der; Wagner, Michael M

    2005-01-01

    The National Retail Data Monitor (NRDM) has monitored over-the-counter (OTC) medication sales in the United States since December 2002. The NRDM collects data from over 18,600 retail stores and processes over 0.6 million sales records per day. This paper describes key architectural features that we have found necessary for a data utility component in a national biosurveillance system. These elements include event-driven architecture to provide analyses of data in near real time, multiple levels of caching to improve query response time, high availability through the use of clustered servers, scalable data storage through the use of storage area networks and a web-service function for interoperation with affiliated systems. The methods and architectural principles are relevant to the design of any production data utility for public health surveillance-systems that collect data from multiple sources in near real time for use by analytic programs and user interfaces that have substantial requirements for time-series data aggregated in multiple dimensions.

  1. Predicting analysis time in events-driven clinical trials using accumulating time-to-event surrogate information.

    PubMed

    Wang, Jianming; Ke, Chunlei; Yu, Zhinuan; Fu, Lei; Dornseif, Bruce

    2016-05-01

    For clinical trials with time-to-event endpoints, predicting the accrual of the events of interest with precision is critical in determining the timing of interim and final analyses. For example, overall survival (OS) is often chosen as the primary efficacy endpoint in oncology studies, with planned interim and final analyses at a pre-specified number of deaths. Often, correlated surrogate information, such as time-to-progression (TTP) and progression-free survival, are also collected as secondary efficacy endpoints. It would be appealing to borrow strength from the surrogate information to improve the precision of the analysis time prediction. Currently available methods in the literature for predicting analysis times do not consider utilizing the surrogate information. In this article, using OS and TTP as an example, a general parametric model for OS and TTP is proposed, with the assumption that disease progression could change the course of the overall survival. Progression-free survival, related both to OS and TTP, will be handled separately, as it can be derived from OS and TTP. The authors seek to develop a prediction procedure using a Bayesian method and provide detailed implementation strategies under certain assumptions. Simulations are performed to evaluate the performance of the proposed method. An application to a real study is also provided. Copyright © 2015 John Wiley & Sons, Ltd.
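
    The surrogate-augmented Bayesian procedure is beyond a short sketch, but the basic accrual-prediction task can be illustrated by Monte Carlo under exponential survival and uniform accrual; every parameter below is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(4)

def predict_analysis_time(n_enrolled, hazard, target_events, accrual_rate,
                          n_sims=2000):
    """Monte Carlo forecast of when the target number of deaths accrues.

    Patients enter uniformly at `accrual_rate` per month; survival is
    exponential with the given monthly hazard. Returns the 5th, 50th and
    95th percentiles of the calendar time of the `target_events`-th death.
    """
    times = []
    for _ in range(n_sims):
        entry = rng.uniform(0, n_enrolled / accrual_rate, n_enrolled)
        death = entry + rng.exponential(1 / hazard, n_enrolled)
        times.append(np.sort(death)[target_events - 1])
    return np.percentile(times, [5, 50, 95])

print(predict_analysis_time(n_enrolled=300, hazard=0.03,
                            target_events=150, accrual_rate=15))
```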

  2. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    NASA Astrophysics Data System (ADS)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.

  3. Two Point Autocorrelation Analysis of Auger Highest Energy Events Backtracked in Galactic Magnetic Field

    NASA Astrophysics Data System (ADS)

    Petrov, Yevgeniy

    2009-10-01

    Searches for sources of the highest-energy cosmic rays have traditionally included looking for clusters of event arrival directions on the sky. The smallest cluster is a pair of events falling within some angular window. In contrast to the standard two-point (2-pt) autocorrelation analysis, this work takes into account the influence of the galactic magnetic field (GMF). The highest-energy events, those above 50 EeV, collected by the surface detector of the Pierre Auger Observatory between January 1, 2004 and May 31, 2009 are used in the analysis. Assuming protons as primaries, events are backtracked through the BSS/S, BSS/A, ASS/S and ASS/A versions of the Harari-Mollerach-Roulet (HMR) model of the GMF. For each version of the model, a 2-pt autocorrelation analysis is applied to the backtracked events and to 10^5 isotropic Monte Carlo realizations weighted by the Auger exposure. Scans in energy, angular separation window, and different model parameters reveal clustering at different angular scales. Small-angle clustering at 2-3 deg is particularly interesting, and it is compared between the different field scenarios. The strength of the autocorrelation signal at those angular scales differs between the BSS and ASS versions of the HMR model. The BSS versions of the model tend to defocus protons as they arrive at Earth, whereas the ASS versions, on the contrary, are more likely to focus them.
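
    A sketch of the 2-pt step without the GMF backtracking or the exposure weighting: count close pairs among unit vectors and compare against isotropic Monte Carlo skies (10^3 realizations here rather than the 10^5 used in the analysis); all directions are synthetic.

```python
import numpy as np

def pair_count(vecs, max_angle_deg):
    """Number of event pairs separated by less than max_angle_deg."""
    cosmin = np.cos(np.radians(max_angle_deg))
    dots = vecs @ vecs.T
    iu = np.triu_indices(len(vecs), k=1)
    return int(np.sum(dots[iu] > cosmin))

def isotropic_sky(n, rng):
    """Isotropic unit vectors (uniform exposure assumed here; the real
    analysis weights realizations by the Auger exposure)."""
    z = rng.uniform(-1, 1, n)
    phi = rng.uniform(0, 2 * np.pi, n)
    s = np.sqrt(1 - z**2)
    return np.column_stack([s * np.cos(phi), s * np.sin(phi), z])

rng = np.random.default_rng(5)
events = isotropic_sky(60, rng)       # stand-in for the backtracked events
obs = pair_count(events, 3.0)

# Chance probability: fraction of isotropic skies with at least as many pairs
mc = [pair_count(isotropic_sky(60, rng), 3.0) for _ in range(10**3)]
print(obs, np.mean(np.array(mc) >= obs))
```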

  4. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.

    2016-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.

  5. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    DOE PAGES

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; ...

    2017-01-24

    We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  6. Asymmetry of perceived key movement in chorale sequences: converging evidence from a probe-tone analysis.

    PubMed

    Cuddy, L L; Thompson, W F

    1992-01-01

    In a probe-tone experiment, two groups of listeners--one trained, the other untrained, in traditional music theory--rated the goodness of fit of each of the 12 notes of the chromatic scale to four-voice harmonic sequences. Sequences were 12 simplified excerpts from Bach chorales, 4 nonmodulating, and 8 modulating. Modulations occurred either one or two steps in either the clockwise or the counterclockwise direction on the cycle of fifths. A consistent pattern of probe-tone ratings was obtained for each sequence, with no significant differences between listener groups. Two methods of analysis (Fourier analysis and regression analysis) revealed a directional asymmetry in the perceived key movement conveyed by modulating sequences. For a given modulation distance, modulations in the counterclockwise direction effected a clearer shift in tonal organization toward the final key than did clockwise modulations. The nature of the directional asymmetry was consistent with results reported for identification and rating of key change in the sequences (Thompson & Cuddy, 1989a). Further, according to the multiple-regression analysis, probe-tone ratings did not merely reflect the distribution of tones in the sequence. Rather, ratings were sensitive to the temporal structure of the tonal organization in the sequence.

  7. What can we learn from the deadly flash floods? Post Event Review Capability (PERC) analysis of the Bavaria and Baden-Wurttemberg flood events in Summer 2016

    NASA Astrophysics Data System (ADS)

    Szoenyi, Michael

    2017-04-01

    In May/June 2016, stationary low-pressure systems brought intense rainfall with record-breaking intensities of well above 100 mm of rain in a few hours locally in the southern states of Baden-Wurttemberg and Bavaria, Germany. In steep terrain, small channels and creeks became devastating torrents impacting, among others, the villages of Simbach/Inn, Schwäbisch-Gmünd and Braunsbach. Just a few days prior, France had also seen devastating rainfall and flooding. Damage in Germany alone is estimated at 2.8 M USD, of which less than 50% is insured. The loss of life was significant, with 18 fatalities reported across the events. This new forensic event analysis, part of Zurich's Post Event Review Capability (PERC), investigates the flash flood events following these record rainfalls in Southern Germany and tries to answer the following questions holistically, across the five capitals (5C) and the full disaster risk management (DRM) cycle, which are key to understanding how to become more resilient to such flood events: - Why have these intense rainfall events led to such devastating consequences? The EU Floods Directive and its implementation in the various member states, as well as the 2002 and 2013 Germany floods, have focused on larger rivers and the main asset concentrations. The pathway and mechanism of the 2016 floods are very different and need to be better understood. Flash floods and surface flooding may need to become the new focus and be much better communicated to people at risk, as awareness of such perils has been identified as low. - How can the prevalence of such flash floods be better identified and mapped? Research indicated that affected people and decision makers alike regard the occurrence of such flash floods as arbitrary, but we argue that hotspots can and must be identified based on an overlay of rainfall intensity maps, topography prone to flash flood processes, and vulnerable assets. In Germany, there are currently no comprehensive hazard

  8. Fault Tree Analysis: An Operations Research Tool for Identifying and Reducing Undesired Events in Training.

    ERIC Educational Resources Information Center

    Barker, Bruce O.; Petersen, Paul D.

    This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault tree investigates potentially undesirable events and then looks for failures in sequence that would lead to their occurring. Relationships among these events are symbolized by AND or OR logic gates, AND used when single events must coexist to…
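
    A minimal sketch of how such a tree is evaluated quantitatively, assuming independent basic events: AND gates multiply probabilities, OR gates combine complements; the training-failure example and its numbers are invented.

```python
from math import prod

def and_gate(probs):
    """All inputs must occur (independence assumed): P = product."""
    return prod(probs)

def or_gate(probs):
    """Any input suffices: P = 1 - product of complements."""
    return 1 - prod(1 - p for p in probs)

# Invented training-failure tree: the task fails if the materials are
# unclear OR (the instructor is unprepared AND no job aid is available)
p_unclear, p_unprepared, p_no_aid = 0.05, 0.10, 0.30
p_top = or_gate([p_unclear, and_gate([p_unprepared, p_no_aid])])
print(f"P(top event) = {p_top:.4f}")   # 0.0785
```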

  9. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series.

    PubMed

    Lilly, Jonathan M

    2017-04-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized 'events'. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event's 'region of influence' within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry.
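
    The generalized Morse wavelet machinery is beyond a short sketch, but the core idea of locating events at wavelet-transform maxima can be illustrated with PyWavelets' Morlet CWT as a stand-in; the median-based threshold below is a crude substitute for the paper's false-detection-rate analysis, and the signal is synthetic.

```python
import numpy as np
import pywt                        # pip install PyWavelets
from scipy.signal import find_peaks

rng = np.random.default_rng(6)
t = np.arange(2048)
signal = rng.normal(0, 1, t.size)                # unit-variance noise
for center in (400, 1200, 1650):                 # three localized wave packets
    signal += (5 * np.exp(-0.5 * ((t - center) / 20) ** 2)
               * np.cos(0.3 * (t - center)))

# Continuous wavelet transform; the Morlet wavelet here stands in for the
# paper's generalized Morse wavelets, which PyWavelets does not provide
scales = np.arange(4, 64)
coeffs, _ = pywt.cwt(signal, scales, "morl")
power = np.abs(coeffs)

# Events = well-separated maxima of |W| across scales above a crude threshold
ridge = power.max(axis=0)
peaks, _ = find_peaks(ridge, height=4 * np.median(ridge), distance=100)
print(peaks)                                     # near 400, 1200, 1650
```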

  10. Requirements analysis for a hardware, discrete-event, simulation engine accelerator

    NASA Astrophysics Data System (ADS)

    Taylor, Paul J., Jr.

    1991-12-01

    An analysis of a general Discrete Event Simulation (DES), executing on the distributed architecture of an eight-node Intel iPSC/2 hypercube, was performed. The most time-consuming portions of the general DES algorithm were determined to be the functions associated with message passing of required simulation data between processing nodes of the hypercube architecture. A behavioral description, using the IEEE standard VHSIC Hardware Description and Design Language (VHDL), for a general DES hardware accelerator is presented. The behavioral description specifies the operational requirements for a DES coprocessor to augment the hypercube's execution of DES simulations. The DES coprocessor design implements the functions necessary to perform distributed discrete event simulations using a conservative time synchronization protocol.

  11. Rule-Based Event Processing and Reaction Rules

    NASA Astrophysics Data System (ADS)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT/Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In recent decades, various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.

  12. Multi-spacecraft solar energetic particle analysis of FERMI gamma-ray flare events within the HESPERIA H2020 project

    NASA Astrophysics Data System (ADS)

    Tziotziou, Kostas; Malandraki, Olga; Valtonen, Eino; Heber, Bernd; Zucca, Pietro; Klein, Karl-Ludwig; Vainio, Rami; Tsiropoula, Georgia; Share, Gerald

    2017-04-01

    Multi-spacecraft observations of solar energetic particle (SEP) events are important for understanding the acceleration processes and the interplanetary propagation of particles released during eruptive events. In this work, we have carefully studied 25 gamma-ray flare events observed by FERMI and investigated possible associations with SEP-related events observed with STEREO and L1 spacecraft in the heliosphere. A data-driven velocity dispersion analysis (VDA) and Time-Shifting Analysis (TSA) are used for deriving the release times of protons and electrons at the Sun and for comparing them with the respective times stemming from the gamma-ray event analysis and their X-ray signatures, in an attempt to interconnect the SEPs and Fermi events and better understand the physics involved. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
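
    A minimal sketch of the VDA step under its usual assumptions of a common release time and scatter-free propagation: onset times are regressed on inverse particle speed via t_onset = t_release + (L/c)(1/beta); the energies and onsets below are synthetic, not event data.

```python
import numpy as np

AU_PER_MIN = 0.1202   # distance light travels in one minute, in AU

def velocity_dispersion_analysis(onset_min, kinetic_MeV, mass_MeV=938.3):
    """Fit t_onset = t_release + (L/c) * (1/beta) by least squares.

    onset_min: onset times at the spacecraft, minutes after a reference;
    kinetic_MeV: proton kinetic energies of the channels.
    Returns the release time (min) and apparent path length L (AU).
    """
    gamma = 1 + np.asarray(kinetic_MeV) / mass_MeV
    inv_beta = 1 / np.sqrt(1 - 1 / gamma**2)
    slope, intercept = np.polyfit(inv_beta, onset_min, 1)
    return intercept, slope * AU_PER_MIN

# Synthetic event: release at t = 10 min, 1.2 AU path length, small noise
rng = np.random.default_rng(7)
energies = np.array([5, 10, 20, 40, 80])          # MeV protons
gamma = 1 + energies / 938.3
inv_beta = 1 / np.sqrt(1 - 1 / gamma**2)
onsets = 10 + 1.2 / AU_PER_MIN * inv_beta + rng.normal(0, 0.5, 5)
print(velocity_dispersion_analysis(onsets, energies))   # ~ (10, 1.2)
```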

  13. Modeling time-to-event (survival) data using classification tree analysis.

    PubMed

    Linden, Ariel; Yarnold, Paul R

    2017-12-01

    Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (ie, easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.
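
    The CTA software itself is specialized, but the Cox regression baseline the paper compares against can be sketched with the lifelines package; the data below are simulated and the column names are arbitrary.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter   # pip install lifelines

rng = np.random.default_rng(8)
n = 400
age = rng.normal(60, 10, n)
treat = rng.integers(0, 2, n)
hazard = 0.02 * np.exp(0.03 * (age - 60) - 0.5 * treat)
event_time = rng.exponential(1 / hazard)          # latent event times
censor_time = rng.uniform(0, 60, n)               # administrative censoring

df = pd.DataFrame({
    "T": np.minimum(event_time, censor_time),     # observed follow-up
    "E": (event_time <= censor_time).astype(int), # 1 = event, 0 = censored
    "age": age,
    "treat": treat,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()   # hazard ratios to compare against a CTA-style rule set
```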

  14. Mining key elements for severe convection prediction based on CNN

    NASA Astrophysics Data System (ADS)

    Liu, Ming; Pan, Ning; Zhang, Changan; Sha, Hongzhou; Zhang, Bolei; Liu, Liang; Zhang, Meng

    2017-04-01

    Severe convective weather is a kind of weather disasters accompanied by heavy rainfall, gust wind, hail, etc. Along with recent developments on remote sensing and numerical modeling, there are high-volume and long-term observational and modeling data accumulated to capture massive severe convective events over particular areas and time periods. With those high-volume and high-variety weather data, most of the existing studies and methods carry out the dynamical laws, cause analysis, potential rule study, and prediction enhancement by utilizing the governing equations from fluid dynamics and thermodynamics. In this study, a key-element mining method is proposed for severe convection prediction based on convolution neural network (CNN). It aims to identify the key areas and key elements from huge amounts of historical weather data including conventional measurements, weather radar, satellite, so as numerical modeling and/or reanalysis data. Under this manner, the machine-learning based method could help the human forecasters on their decision-making on operational weather forecasts on severe convective weathers by extracting key information from the real-time and historical weather big data. In this paper, it first utilizes computer vision technology to complete the data preprocessing work of the meteorological variables. Then, it utilizes the information such as radar map and expert knowledge to annotate all images automatically. And finally, by using CNN model, it cloud analyze and evaluate each weather elements (e.g., particular variables, patterns, features, etc.), and identify key areas of those critical weather elements, then help forecasters quickly screen out the key elements from huge amounts of observation data by current weather conditions. Based on the rich weather measurement and model data (up to 10 years) over Fujian province in China, where the severe convective weathers are very active during the summer months, experimental tests are conducted with

  15. Uncovering key patterns in self-harm in adolescents: Sequence analysis using the Card Sort Task for Self-harm (CaTS).

    PubMed

    Townsend, E; Wadman, R; Sayal, K; Armstrong, M; Harroe, C; Majumder, P; Vostanis, P; Clarke, D

    2016-12-01

    Self-harm is a significant clinical issue in adolescence. There is little research on the interplay of key factors in the months, weeks, days and hours leading to self-harm. We developed the Card Sort Task for Self-harm (CaTS) to investigate the pattern of thoughts, feelings, events and behaviours leading to self-harm. Forty-five young people (aged 13-21 years) with recent repeated self-harm completed the CaTS to describe their first ever/most recent self-harm episode. Lag sequential analysis determined significant transitions in factors leading to self-harm (presented in state transition diagrams). A significant sequential structure to the card sequences produced was observed, demonstrating similarities and important differences in antecedents to first and most recent self-harm. Life events were distal in the self-harm pathway and more heterogeneous. Of significant clinical concern was that the wish to die and hopelessness emerged as important antecedents in the most recent episode. First-ever self-harm was associated with feeling better afterward, but this disappeared for the most recent episode. Larger sample sizes are necessary to examine longer chains of sequences and differences in gender, age and type of self-harm. The sample was self-selected, with 53% having experience of living in care. The CaTS offers a systematic approach to understanding the dynamic interplay of factors that lead to self-harm in young people. It offers a method to target key points for intervention in the self-harm pathway. Crucially, the factors most proximal to self-harm (negative emotions, impulsivity and access to means) are modifiable with existing clinical interventions. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
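
    A sketch of the lag-1 sequential step on hypothetical card sequences (not the CaTS data): observed transition counts are compared with independence expectations via adjusted residuals; the state labels and sequences are illustrative only.

```python
import numpy as np

STATES = ["life event", "negative emotion", "impulsivity",
          "access to means", "self-harm"]

def lag1_adjusted_residuals(sequences, states=STATES):
    """Lag-1 sequential analysis: observed transition counts versus the
    counts expected under independence, as adjusted (z-like) residuals.
    Large positive values mark transitions occurring above chance."""
    k = len(states)
    idx = {s: i for i, s in enumerate(states)}
    obs = np.zeros((k, k))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            obs[idx[a], idx[b]] += 1
    n = obs.sum()
    row = obs.sum(axis=1, keepdims=True)
    col = obs.sum(axis=0, keepdims=True)
    exp = row * col / n
    var = exp * (1 - row / n) * (1 - col / n)
    with np.errstate(divide="ignore", invalid="ignore"):
        return (obs - exp) / np.sqrt(var)   # NaN where a state never occurs

seqs = [
    ["life event", "negative emotion", "impulsivity", "self-harm"],
    ["negative emotion", "access to means", "self-harm"],
    ["life event", "negative emotion", "access to means", "self-harm"],
]
print(np.round(lag1_adjusted_residuals(seqs), 2))
```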

  16. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    NASA Astrophysics Data System (ADS)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.

  17. Neural network approach in multichannel auditory event-related potential analysis.

    PubMed

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.
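
    A minimal stand-in for the ANN classifier, using a small multilayer perceptron from scikit-learn on synthetic single-channel (Cz-like) epochs with an attenuated P300 in the patient group; the amplitudes, latency, and epoch length are invented.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)

def synth_erp(n, p300_amp):
    """Synthetic single-channel ERP epochs (256 samples); the P300 is a
    positive deflection around sample 150, attenuated in 'patients'."""
    t = np.arange(256)
    base = rng.normal(0, 1, (n, 256))
    return base + p300_amp * np.exp(-0.5 * ((t - 150) / 15) ** 2)

X = np.vstack([synth_erp(80, 3.0),    # controls: clear P300
               synth_erp(80, 1.0)])   # patients: reduced P300
y = np.array([0] * 80 + [1] * 80)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                                  random_state=0))
print(cross_val_score(clf, X, y, cv=5).mean())   # well above chance
```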

  18. Geophysical Hazards and Preventive Disaster Management of Extreme Natural Events

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Takeuchi, K.

    2007-12-01

    A geophysical hazard is a potentially damaging natural event and/or phenomenon that may cause loss of life or injury, property damage, social and economic disruption, or environmental degradation. Extreme natural hazards are a key manifestation of the complex, hierarchical, nonlinear Earth system. Understanding, accurately modeling, and forecasting extreme hazards are among the most important scientific challenges. Several recent extreme natural events (e.g., the 2004 great Indian Ocean earthquake and tsunami and the violent 2005 Hurricane Katrina) demonstrated strong coupling between the solid Earth and the ocean, and between the ocean and the atmosphere. These events resulted in great humanitarian tragedies because of weak preventive disaster management. The less often natural events occur (and extreme events are rare by definition), the more readily disaster managers postpone preparedness for them. The tendency to reduce funding for preventive disaster management of natural catastrophes seldom follows the rules of responsible stewardship for future generations, whether in developing countries or in highly developed economies, where it must be considered next to malfeasance. Protecting human life and property against earthquake disasters requires an uninterrupted chain of tasks: from (i) understanding the physics of the events, analysis, and monitoring, through (ii) interpretation, modeling, hazard assessment, and prediction, to (iii) public awareness, preparedness, and preventive disaster management.

  19. Transcriptome and metabolome of synthetic Solanum autotetraploids reveal key genomic stress events following polyploidization.

    PubMed

    Fasano, Carlo; Diretto, Gianfranco; Aversano, Riccardo; D'Agostino, Nunzio; Di Matteo, Antonio; Frusciante, Luigi; Giuliano, Giovanni; Carputo, Domenico

    2016-06-01

    Polyploids are generally classified as autopolyploids, derived from a single species, and allopolyploids, arising from interspecific hybridization. The former represent ideal materials with which to study the consequences of genome doubling and ascertain whether there are molecular and functional rules operating following polyploidization events. To investigate whether the effects of autopolyploidization are common to different species, or if species-specific or stochastic events are prevalent, we performed a comprehensive transcriptomic and metabolomic characterization of diploids and autotetraploids of Solanum commersonii and Solanum bulbocastanum. Autopolyploidization remodelled the transcriptome and the metabolome of both species. In S. commersonii, differentially expressed genes (DEGs) were highly enriched in pericentromeric regions. Most changes were stochastic, suggesting a strong genotypic response. However, a set of robustly regulated transcripts and metabolites was also detected, including purine bases and nucleosides, which are likely to underlie a common response to polyploidization. We hypothesize that autopolyploidization results in nucleotide pool imbalance, which in turn triggers a genomic shock responsible for the stochastic events observed. The more extensive genomic stress and the higher number of stochastic events observed in S. commersonii with respect to S. bulbocastanum could be the result of the higher nucleoside depletion observed in this species. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.

  20. Ramadan fasting is not usually associated with the risk of cardiovascular events: A systematic review and meta-analysis

    PubMed Central

    Turin, Tanvir C.; Ahmed, Salim; Shommu, Nusrat S.; Afzal, Arfan R.; Al Mamun, Mohammad; Qasqas, Mahdi; Rumana, Nahid; Vaska, Marcus; Berka, Noureddine

    2016-01-01

    Over one billion Muslims worldwide fast during the month of Ramadan. Ramadan fasting brings about some changes in the daily lives of practicing Muslims, especially in their diet and sleep patterns, which are associated with the risk of cardiovascular diseases. Over the years, many original studies have attempted to identify the possible impact of the Ramadan fast on cardiovascular diseases. This systematic review and meta-analysis presents a summary of key findings from those articles and an appraisal of selected literature. A systematic search using the keywords “Ramadan fasting” and “cardiovascular diseases” was conducted in primary research article and gray-literature repositories, in combination with hand searching and snowballing. Fifteen studies were finally selected for data extraction on the outcomes of stroke, myocardial infarction, and congestive heart failure. The analysis revealed that the incidence of cardiovascular events during the Ramadan fast was similar to that in the nonfasting period. The Ramadan fast is not associated with any change in the incidence of acute cardiovascular disease. PMID:27186152

  1. Impact of including or excluding both-armed zero-event studies on using standard meta-analysis methods for rare event outcome: a simulation study

    PubMed Central

    Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana

    2016-01-01

    Objectives There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handle them differently depending on the choice of effect measure and the authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analyses of RCTs with rare outcome events through a simulation study. Methods We simulated 2500 data sets for different scenarios, varying the parameters of baseline event rate, treatment effect, number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis, namely Peto, Mantel-Haenszel with fixed-effects and random-effects models, and the inverse variance method with fixed-effects and random-effects models, using bias, root mean square error, length of the 95% CI, and coverage. Results The overall performance of including or excluding BA0E studies in meta-analysis varied according to the magnitude of the true treatment effect. Including BA0E studies introduced very little bias, decreased the mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the approach of excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effect is uncertain.
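
    For concreteness, a minimal Python sketch of the inclusion/exclusion choice for one of the pooling methods evaluated above (fixed-effects Mantel-Haenszel) follows. The 2x2 tables and the 0.5 continuity correction are illustrative assumptions, not data or settings from the simulation.

        def mh_odds_ratio(tables, include_zero_event=True, cc=0.5):
            # Mantel-Haenszel pooled odds ratio over 2x2 tables given as
            # (a, b, c, d) = (events_trt, non-events_trt, events_ctl, non-events_ctl).
            # Both-armed zero-event studies get a continuity correction `cc`
            # when included, and are dropped entirely otherwise.
            num = den = 0.0
            for a, b, c, d in tables:
                if a == 0 and c == 0:
                    if not include_zero_event:
                        continue
                    a, b, c, d = a + cc, b + cc, c + cc, d + cc
                n = a + b + c + d
                num += a * d / n
                den += b * c / n
            return num / den

        tables = [(3, 97, 6, 94), (0, 50, 0, 50), (1, 199, 4, 196)]
        print(mh_odds_ratio(tables, include_zero_event=True))   # ~0.42
        print(mh_odds_ratio(tables, include_zero_event=False))  # ~0.39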

  2. Survey of critical failure events in on-chip interconnect by fault tree analysis

    NASA Astrophysics Data System (ADS)

    Yokogawa, Shinji; Kunii, Kyousuke

    2018-07-01

    In this paper, a framework based on reliability physics is proposed for applying fault tree analysis (FTA) to the on-chip interconnect system of a semiconductor. By integrating expert knowledge and experience regarding the failure possibilities of basic events, critical issues of on-chip interconnect reliability are evaluated by FTA. In particular, FTA is used to identify the minimal cut sets with high risk priority. Critical events affecting on-chip interconnect reliability are identified and discussed from the viewpoint of long-term reliability assessment. The moisture impact is evaluated as an external event.
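
    As a toy sketch of the core FTA computation named here, the following Python snippet enumerates minimal cut sets by recursive gate expansion. The tree, gate structure, and basic-event names are invented for illustration and are not taken from the paper.

        from itertools import product

        def cut_sets(node):
            # Gates are ('AND', ...) or ('OR', ...); leaves are basic events.
            if isinstance(node, str):
                return [frozenset([node])]
            kind, *children = node
            child_sets = [cut_sets(c) for c in children]
            if kind == 'OR':     # union of the children's cut sets
                return [cs for sets in child_sets for cs in sets]
            # AND: combine one cut set from each child
            return [frozenset().union(*combo) for combo in product(*child_sets)]

        def minimal(sets):
            sets = set(sets)
            return [s for s in sets if not any(t < s for t in sets)]

        # TOP fails if (electromigration AND moisture ingress) OR TDDB occurs.
        tree = ('OR', ('AND', 'electromigration', 'moisture'), 'TDDB')
        print(minimal(cut_sets(tree)))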

  3. Characterization Of Dissolved Organic Matter In The Florida Keys Ecosystem

    NASA Astrophysics Data System (ADS)

    Adams, D. G.; Shank, G. C.

    2009-12-01

    Over the past few decades, Scleractinian coral populations in the Florida Keys have increasingly experienced mortality due to bleaching events as well as microbially mediated illnesses such as black band and white band disease. Such pathologies seem to be most correlated with elevated sea surface temperatures, increased UV exposures, and shifts in the microbial community living on the coral itself. Recent studies indicate that corals' exposure to UV in the Florida Keys is primarily controlled by the concentration of CDOM (Chromophoric Dissolved Organic Matter) in the water column. Further, microbial community alterations may be linked to changes in the concentration and chemical composition of the larger DOM (Dissolved Organic Matter) pool. Our research characterized the spatial and temporal properties of DOM in Florida Bay and along the Keys ecosystems using DOC analyses, in-situ water column optical measurements, and spectral analyses including absorbance and fluorescence measurements. We analyzed DOM characteristics along transects running from the mouth of the Shark River at the southwest base of the Everglades, through Florida Bay, and along near-shore Keys coastal waters. Two 12-hour time-series samplings were also performed at the Seven-Mile Bridge, the primary Florida Bay discharge channel to the lower Keys region. Photo-bleaching experiments showed that the chemical characteristics of the DOM pool are altered by exposure to solar radiation. Results also show that DOC (~0.8-5.8 mg C/L) and CDOM (absorbance coefficient at 305 nm ~0.5-16.5) concentrations exhibit seasonal fluctuations in our study region. EEM analyses suggest seasonal transitions between primarily marine (summer) and terrestrial (winter) sources along the Keys. We are currently combining EEM-PARAFAC analysis with in-situ optical measurements to model changes in the spectral properties of DOM in the water column. Additionally, we are using stable δ13C isotopic analysis to further characterize DOM

  4. Systematic identification and analysis of frequent gene fusion events in metabolic pathways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Christopher S.; Lerma-Ortiz, Claudia; Gerdes, Svetlana Y.

    Here, gene fusions are the most powerful type of in silico-derived functional associations. However, many fusion compilations were made when <100 genomes were available, and algorithms for identifying fusions need updating to handle the current avalanche of sequenced genomes. The availability of a large fusion dataset would help probe functional associations and enable systematic analysis of where and why fusion events occur. As a result, here we present a systematic analysis of fusions in prokaryotes. We manually generated two training sets: (i) 121 fusions in the model organism Escherichia coli; (ii) 131 fusions found in B vitamin metabolism. These sets were used to develop a fusion prediction algorithm that captured the training set fusions with only 7% false negatives and 50% false positives, a substantial improvement over existing approaches. This algorithm was then applied to identify 3.8 million potential fusions across 11,473 genomes. The results of the analysis are available in a searchable database. A functional analysis identified 3,000 reactions associated with frequent fusion events and revealed areas of metabolism where fusions are particularly prevalent. In conclusion, customary definitions of fusions were shown to be ambiguous, and a stricter one was proposed. Exploring the genes participating in fusion events showed that they most commonly encode transporters, regulators, and metabolic enzymes. The major rationales for fusions between metabolic genes appear to be overcoming pathway bottlenecks, avoiding toxicity, controlling competing pathways, and facilitating expression and assembly of protein complexes. Finally, our fusion dataset provides powerful clues to decipher the biological activities of domains of unknown function.

  5. Systematic identification and analysis of frequent gene fusion events in metabolic pathways

    DOE PAGES

    Henry, Christopher S.; Lerma-Ortiz, Claudia; Gerdes, Svetlana Y.; ...

    2016-06-24

    Here, gene fusions are the most powerful type of in silico-derived functional associations. However, many fusion compilations were made when <100 genomes were available, and algorithms for identifying fusions need updating to handle the current avalanche of sequenced genomes. The availability of a large fusion dataset would help probe functional associations and enable systematic analysis of where and why fusion events occur. As a result, here we present a systematic analysis of fusions in prokaryotes. We manually generated two training sets: (i) 121 fusions in the model organism Escherichia coli; (ii) 131 fusions found in B vitamin metabolism. These sets were used to develop a fusion prediction algorithm that captured the training set fusions with only 7% false negatives and 50% false positives, a substantial improvement over existing approaches. This algorithm was then applied to identify 3.8 million potential fusions across 11,473 genomes. The results of the analysis are available in a searchable database. A functional analysis identified 3,000 reactions associated with frequent fusion events and revealed areas of metabolism where fusions are particularly prevalent. In conclusion, customary definitions of fusions were shown to be ambiguous, and a stricter one was proposed. Exploring the genes participating in fusion events showed that they most commonly encode transporters, regulators, and metabolic enzymes. The major rationales for fusions between metabolic genes appear to be overcoming pathway bottlenecks, avoiding toxicity, controlling competing pathways, and facilitating expression and assembly of protein complexes. Finally, our fusion dataset provides powerful clues to decipher the biological activities of domains of unknown function.

  6. Recognition of maximum flooding events in mixed siliciclastic-carbonate systems: Key to global chronostratigraphic correlation

    USGS Publications Warehouse

    Mancini, E.A.; Tew, B.H.

    1997-01-01

    The maximum flooding event within a depositional sequence is an important datum for correlation because it represents a virtually synchronous horizon. This event is typically recognized by a distinctive physical surface and/or a significant change in microfossil assemblages (relative fossil abundance peaks) in siliciclastic deposits from shoreline to continental slope environments in a passive margin setting. Recognition of maximum flooding events in mixed siliciclastic-carbonate sediments is more complicated because the entire section usually represents deposition in continental shelf environments with varying rates of biologic and carbonate productivity versus siliciclastic influx. Hence, this event cannot be consistently identified simply by relative fossil abundance peaks. Factors such as siliciclastic input, carbonate productivity, sediment accumulation rates, and paleoenvironmental conditions dramatically affect the relative abundances of microfossils. Failure to recognize these complications can lead to a sequence stratigraphic interpretation that substantially overestimates the number of depositional sequences of 1 to 10 m.y. duration.

  7. Diagnostic evaluation of distributed physically based model at the REW scale (THREW) using rainfall-runoff event analysis

    NASA Astrophysics Data System (ADS)

    Tian, F.; Sivapalan, M.; Li, H.; Hu, H.

    2007-12-01

    The importance of diagnostic analysis of hydrological models is increasingly recognized by the scientific community (Sivapalan et al., 2003; Gupta et al., 2007). Model diagnosis refers to identifying model structures and parameters not only by statistical comparison of system state variables and outputs but also by process understanding in a specific watershed. Process understanding can be gained by analysis of observational data and model results at the specific watershed as well as through regionalization. Although remote sensing technology can provide valuable data about the inputs, state variables, and outputs of the hydrological system, observational rainfall-runoff data still constitute the most accurate, reliable, and direct component of hydrology-related databases. One critical question in model diagnostic analysis is, therefore, what signature characteristics can be extracted from rainfall and runoff data. To date, only a few studies have focused on this question (e.g., Merz et al., 2006; Lana-Renault et al., 2007), and none has related event analysis to model diagnosis in an explicit, rigorous, and systematic manner. Our work focuses on the identification of the dominant runoff generation mechanisms from event analysis of rainfall-runoff data, including correlation analysis and analysis of timing patterns. The correlation analysis involves identifying the complex relationship among rainfall depth, intensity, runoff coefficient, and antecedent conditions, and the timing pattern analysis aims to identify the clustering pattern of runoff events in relation to the patterns of rainfall events. Our diagnostic analysis illustrates the changing pattern of runoff generation mechanisms in the DMIP2 test watersheds located in the Oklahoma region, which is also well reproduced by numerical simulations based on the TsingHua Representative Elementary Watershed (THREW) model. The result suggests the usefulness of

  8. Assessing and quantifying changes in precipitation patterns using event-driven analysis

    USDA-ARS?s Scientific Manuscript database

    Studies have claimed that climate change may adversely affect precipitation patterns by increasing the occurrence of extreme events. The effects of climate change on precipitation are expected to take place over a long period of time and will require long-term data to demonstrate. Frequency analysis ...

  9. Sources of Infrasound events listed in IDC Reviewed Event Bulletin

    NASA Astrophysics Data System (ADS)

    Bittner, Paulina; Polich, Paul; Gore, Jane; Ali, Sherif; Medinskaya, Tatiana; Mialle, Pierrick

    2017-04-01

    Until 2003, two waveform technologies, seismic and hydroacoustic, were used to detect and locate events included in the International Data Centre (IDC) Reviewed Event Bulletin (REB). The first atmospheric event was published in the REB in 2003; however, automatic processing required significant improvements to reduce the number of false events. At the beginning of 2010, infrasound technology was reintroduced into IDC operations and has contributed to both the automatic and reviewed IDC bulletins. The primary contribution of infrasound technology is to detect atmospheric events. These events may also be observed at seismic stations, which significantly improves event location. Example sources of REB events detected by the International Monitoring System (IMS) infrasound network include fireballs (e.g., the Bangkok fireball, 2015), volcanic eruptions (e.g., Calbuco, Chile, 2015), and large surface explosions (e.g., Tianjin, China, 2015). Quarry blasts (e.g., Zheleznogorsk) and large earthquakes (e.g., Italy, 2016) belong to events primarily recorded at seismic stations of the IMS network but often detected at the infrasound stations. In the case of earthquakes, analysis of infrasound signals may help to estimate the area affected by ground vibration. Infrasound associations for quarry-blast events may help to obtain better source locations. The role of IDC analysts is to verify and improve the location of events detected by the automatic system and to add events that were missed in the automatic process. Open-source materials may help to identify the nature of some events. Well-recorded examples may be added to the Reference Infrasound Event Database to aid the analysis process. This presentation will provide examples of events generated by different sources which were included in the IDC bulletins.

  10. A Hierarchical Convolutional Neural Network for vesicle fusion event classification.

    PubMed

    Li, Haohan; Mao, Yunxiang; Yin, Zhaozheng; Xu, Yingke

    2017-09-01

    Quantitative analysis of vesicle exocytosis and classification of different modes of vesicle fusion from fluorescence microscopy are of primary importance for biomedical research. In this paper, we propose a novel Hierarchical Convolutional Neural Network (HCNN) method to automatically identify vesicle fusion events in time-lapse Total Internal Reflection Fluorescence Microscopy (TIRFM) image sequences. First, a detection and tracking method is developed to extract image patch sequences containing potential fusion events. Then, a Gaussian Mixture Model (GMM) is applied to each image patch of the patch sequence, with outliers rejected for robust Gaussian fitting. By utilizing the high-level time-series intensity change features introduced by the GMM and the visual appearance features embedded in key moments of the fusion process, the proposed HCNN architecture is able to classify each candidate patch sequence into three classes: full fusion event, partial fusion event, and non-fusion event. Finally, we validate the performance of our method on 9 challenging datasets annotated by cell biologists; our method achieves better performance than three previous methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
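
    The robust-fitting step described above can be sketched as follows: fit a 2D Gaussian to an image patch, reject high-residual pixels, and refit. The synthetic patch, initial guesses, and clipping threshold are illustrative assumptions; this is not the authors' implementation.

        import numpy as np
        from scipy.optimize import curve_fit

        def gauss2d(xy, amp, x0, y0, sigma, offset):
            x, y = xy
            return amp * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2)) + offset

        def robust_fit(patch, n_iter=3, clip=3.0):
            # Iteratively fit, then drop pixels whose residual exceeds
            # `clip` standard deviations before refitting.
            yy, xx = np.indices(patch.shape)
            x, y, z = xx.ravel(), yy.ravel(), patch.ravel().astype(float)
            keep = np.ones_like(z, dtype=bool)
            p = [z.max() - z.min(), patch.shape[1] / 2, patch.shape[0] / 2, 2.0, z.min()]
            for _ in range(n_iter):
                p, _ = curve_fit(gauss2d, (x[keep], y[keep]), z[keep], p0=p)
                resid = z - gauss2d((x, y), *p)
                keep = np.abs(resid) < clip * resid[keep].std()
            return p   # amp, x0, y0, sigma, offset

        rng = np.random.default_rng(0)
        yy, xx = np.indices((15, 15))
        patch = 100 * np.exp(-((xx - 7)**2 + (yy - 7)**2) / 8.0) + 10
        patch += rng.normal(0, 2, patch.shape)   # camera noise
        patch[3, 4] += 80                        # a hot pixel (outlier)
        print(robust_fit(patch))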

  11. Life stress events and alcohol misuse: distinguishing contributing stress events from consequential stress events.

    PubMed

    Hart, Kenneth E; Fazaa, Norman

    2004-07-01

    This study examined the relationship between life stress events and level of alcohol misuse using two stress indices. The first index consisted of stress events that are not likely to be caused by alcohol misuse (i.e., alcohol-uncontaminated stress events). The second index consisted of items judged to be likely consequences of alcohol misuse (i.e., alcohol-contaminated stress events). Results based on a questionnaire study of 378 undergraduates in 2000 showed that level of alcohol misuse was much more strongly related to alcohol-contaminated life stress events than to alcohol-uncontaminated life events. Comparative analysis of the coefficients of determination indicated that the effect size of the association with alcohol-contaminated life stress events was 240% larger than the corresponding effect size for the association with alcohol-uncontaminated life events. The results suggest that studies testing the tension-reduction hypothesis should employ greater methodological rigor to ensure that measures of life stress events are not inadvertently assessing the consequences of alcohol misuse. The results highlight the need to distinguish between stressful life events that contribute to alcohol misuse and stressful life events that are consequential to alcohol misuse.

  12. Incompleteness and limit of security theory of quantum key distribution

    NASA Astrophysics Data System (ADS)

    Hirota, Osamu; Murakami, Dan; Kato, Kentaro; Futami, Fumio

    2012-10-01

    It is claimed in many papers that a trace distance, d, guarantees universal composition security in quantum key distribution (QKD) protocols such as BB84. In this introductory paper, we first explain explicitly the main misconception in the claim of unconditional security for QKD theory. In general terms, the cause of the misunderstanding on the security claim is a lemma in the paper of Renner. It suggests that the generation of a perfect random key is assured with probability (1-d), with failure probability d. Thus, it concludes that the generated key provides a perfectly random sequence when the protocol succeeds, so that QKD provides perfect secrecy for the one-time pad. This is the reason for the composition claim. However, the quantity of the trace distance (or variational distance) is not the probability of such an event. If d is not small enough, the generated key sequence is never uniform. One therefore needs a reconstruction of the evaluation of the trace distance if one wants to use it. One should first go back to indistinguishability theory in the computational-complexity setting and clarify the meaning of the value of the variational distance; the same analysis is necessary for the information-theoretic case. The recent serial papers by H. P. Yuen have given answers to these questions. In this paper, we give a more concise description of Yuen's theory and clarify that the upper-bound theories for the trace distance by Tomamichel et al. and Hayashi et al. are constructed from Renner's flawed reasoning and are unsuitable as security analyses. Finally, we introduce a new macroscopic quantum communication scheme to replace qubit QKD.
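
    The distinction being drawn is easy to state numerically. A minimal sketch with invented distributions: the variational (trace) distance between the distribution of a generated 2-bit key and the uniform distribution, which is a distance, not a failure probability.

        def variational_distance(p, q):
            # d(P, Q) = (1/2) * sum_x |P(x) - Q(x)| over the joint support.
            return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0))
                             for x in set(p) | set(q))

        uniform = {f"{x:02b}": 0.25 for x in range(4)}
        skewed = {"00": 0.40, "01": 0.30, "10": 0.20, "11": 0.10}
        print(variational_distance(skewed, uniform))   # 0.2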

  13. [Causes of underreporting of occupational injuries and adverse events in Chile].

    PubMed

    Luengo, Carolina; Paravic, Tatiana; Valenzuela, Sandra

    2016-02-01

    Objective To describe the causes of underreporting of occupational injuries and adverse events as identified in the international literature and by key informants in the area of health and risk prevention in Chile. Methods The study uses a qualitative descriptive approach. This includes a systematized literature review that follows the SALSA method (Search, Appraisal, Synthesis and Analysis) and is in line with the PRISMA statement (Preferred Reporting Items for Systematic Reviews and Meta-Analyses). In addition, interviews were conducted with informants in the area of health and risk prevention in Chile. Results The leading causes of underreporting of occupational injuries as described in the literature and by key informants were economic factors and ignorance. With regard to adverse events, the principal causes indicated were fear of sanctions, limited support provided by the authorities, lack of knowledge, and excessive workload. Conclusions It is important to continue working to strengthen the reporting of occupational injuries and adverse events and to implement measures aimed at minimizing factors that appear to be the leading causes of underreporting. In the case of occupational injuries, this means making sure that economic factors are not an impediment but rather an incentive to reporting. With respect to adverse events, steps should be taken to eliminate the fear of sanctions and to develop recommendations, focusing more on systemic improvements than on individuals, to promote joint learning. In both cases it will be necessary to combat ignorance through continuous, systematic training and support.

  14. Traumatic events and depressive symptoms among youth in Southwest Nigeria: a qualitative analysis.

    PubMed

    Omigbodun, Olayinka; Bakare, Kofoworola; Yusuf, Bidemi

    2008-01-01

    Traumatic experiences have dire consequences for the mental health of young persons. Despite high rates of traumatic experiences in some African cities, there are no reports for Nigerian youth. This study investigated the pattern of traumatic events and their association with depressive symptoms among youth in Southwest Nigeria, using a descriptive cross-sectional design with randomly selected youth in urban and rural schools. Participants completed self-reports on traumatic events and depressive symptoms using the Street Children's Project Questionnaire and the Youth DISC Predictive Scale (DPS). Of the 1,768 responses (88.4% response rate) entered into the analysis, 34% reported experiencing a traumatic situation. Following interpretative phenomenological analysis, 13 themes emerged. Frequently occurring traumatic events were 'road traffic accidents' (33.0%), 'sickness' (17.1%), 'lost or trapped' (11.2%) and 'armed robbery attack' (9.7%). A bad dream was described by 3.7%. Traumatic experiences were more common in males (36.2%) than in females (31.6%) (χ² = 4.2; p = .041). Experiencing a traumatic event was associated with depressive symptoms (χ² = 37.98; p < .001), especially when the event directly affected the youth, as in sexual assault or physical abuse. One-third of youth in Southwest Nigeria described having experienced a traumatic event. Road traffic accidents, armed robbery attacks, and communal disturbances depict the prevailing social environment, whereas 'bad dreams' reveal the influence of cultural beliefs. Policy makers must be aware of the social issues making an impact on the health of youth. Multi-agency interventions to improve the social environment and provide mental health services for traumatized young people are essential.

  15. Efficacy and adverse events of cold vs hot polypectomy: A meta-analysis.

    PubMed

    Fujiya, Mikihiro; Sato, Hiroki; Ueno, Nobuhiro; Sakatani, Aki; Tanaka, Kazuyuki; Dokoshi, Tatsuya; Fujibayashi, Shugo; Nomura, Yoshiki; Kashima, Shin; Gotoh, Takuma; Sasajima, Junpei; Moriichi, Kentaro; Watari, Jiro; Kohgo, Yutaka

    2016-06-21

    To compare previously reported randomized controlled trials (RCTs) of cold and hot polypectomy, we conducted a systematic review and meta-analysis to clarify the utility of cold polypectomy relative to hot polypectomy with respect to efficacy and adverse events. Published articles and abstracts from worldwide conferences were searched using the keywords "cold polypectomy". RCTs that compared either or both the effects or adverse events of cold polypectomy with those of hot polypectomy were collected. The patients' demographics, endoscopic procedures, number of examined lesions, lesion size, macroscopic and histologic findings, rates of incomplete resection, bleeding amount, perforation, and length of procedure were extracted from each study. A forest plot analysis was used to verify the relative strength of the effects and adverse events of each procedure. A funnel plot was generated to assess the possibility of publication bias. Ultimately, six RCTs were selected. No significant differences were noted in the average lesion size (less than 10 mm) between the cold and hot polypectomy groups in each study. Further, the rates of complete resection and adverse events, including delayed bleeding, did not differ markedly between cold and hot polypectomy. The average procedural time in the cold polypectomy group was significantly shorter than that in the hot polypectomy group. Cold polypectomy is a time-saving procedure for removing small polyps, with curability and safety markedly similar to hot polypectomy.

  16. Superposed epoch analysis of O+ auroral outflow during sawtooth events and substorms

    NASA Astrophysics Data System (ADS)

    Nowrouzi, N.; Kistler, L. M.; Lund, E. J.; Cai, X.

    2017-12-01

    Sawtooth events are repeated injections of energetic particles at geosynchronous orbit. Studies have shown that 94% of sawtooth events occur during magnetic storm times. The main factor that causes a sawtooth event is still an open question. Simulations have suggested that heavy ions like O+ may play a role in triggering the injections. One of the sources of O+ in the Earth's magnetosphere is the nightside aurora. O+ ions coming from the nightside auroral region have direct access to the near-Earth magnetotail. A model (Brambles et al., 2013) for interplanetary coronal mass ejection driven sawtooth events found that nightside O+ outflow caused the subsequent teeth of the sawtooth event through a feedback mechanism. This work is a superposed epoch analysis to test whether the observed auroral outflow supports this model. Using FAST spacecraft data from 1997-2007, we examine the auroral O+ outflow as a function of time relative to an injection onset. We then determine whether the profile of O+ outflow flux during sawtooth events differs from the outflow observed during isolated substorms. The auroral region boundaries are estimated using the method of Andersson et al. (2004). Subsequently, the O+ outflow flux inside these boundaries is calculated and binned as a function of superposed epoch time for substorms and sawtooth "teeth". In this way, we will determine whether sawtooth events do in fact have greater O+ outflow, and whether that outflow is predominantly from the nightside, as suggested by the model results.

  17. Broadband analysis of landslide seismic signals: example of the Oso-Steelhead landslide and other recent events

    NASA Astrophysics Data System (ADS)

    Hibert, C.; Stark, C. P.; Ekstrom, G.

    2014-12-01

    Landslide failures on the scale of mountains are spectacular, dangerous, and spontaneous, making direct observations hard to obtain. Measurement of their dynamic properties during runout is a high research priority, but a logistical and technical challenge. Seismology has begun to help in several important ways. Taking advantage of broadband seismic stations, recent advances now allow: (i) the seismic detection and location of large landslides in near-real-time, even for events in very remote areas that may have remained undetected, such as the 2014 Mt La Perouse supraglacial failure in Alaska; (ii) inversion of long-period waves generated by large landslides to yield an estimate of the forces imparted by the bulk accelerating mass; and (iii) inference of the landslide mass, its center-of-mass velocity over time, and its trajectory. Key questions persist, such as: What can the short-period seismic data tell us about the high-frequency impacts taking place within the granular flow and along its boundaries with the underlying bedrock? And how does this seismicity relate to the bulk acceleration of the landslide and the long-period seismicity generated by it? Our recent work on the joint analysis of short- and long-period seismic signals generated by past and recent events, such as the Bingham Canyon Mine and the Oso-Steelhead landslides, provides new insights into these issues. Qualitative comparison between short-period signal features and kinematic parameters inferred from long-period surface wave inversion helps to refine interpretation of the source dynamics and to understand the different mechanisms behind the origin of the short-period wave radiation. Our new results also suggest that quantitative relationships can be derived from this joint analysis, in particular between the short-period seismic signal envelope and the inferred momentum of the center-of-mass. In the future, these quantitative relationships may help to constrain and calibrate parameters used in

  18. Framing Extreme Event Attribution from the Bottom up - an Enquiry into the Social Representations of key stakeholders, of the Press and of Climate Scientists.

    NASA Astrophysics Data System (ADS)

    Vanderlinden, J. P.; Fellmer, M.; Capellini, N.; Meinke, I.; Remvikos, Y.; Bray, D.; Pacteau, C.; Von Storch, H.

    2014-12-01

    Attribution of extreme weather events has recently generated much interest among the general public, the scientific community, and stakeholders affected by meteorological extremes. This interest calls for an exploration of the potential convergence between current attribution science and the desires and needs of stakeholders. Such an enquiry contributes to the development of climate services aiming at quantifying the human responsibility for particular events. Through interviews with climate scientists, through analysis of the press coverage of extreme meteorological events, and through stakeholder (private sector, government services, and local and regional government) focus groups, we analyze how social representations of the concepts associated with extreme event attribution are theorized. From the corpuses generated in the course of this enquiry, we build a grounded, bottom-up theorization of extreme weather event attribution. This bottom-up theorization allows for a framing of potential climate services that is attuned to the needs and expectations of stakeholders. From apparently simple formulations ("what is an extreme event?", "what makes it extreme?", "what is meant by attribution of extreme weather events?", "what do we want to attribute?", "what is a climate service?"), we demonstrate the polysemy of these terms and propose ways to address the challenges associated with the juxtaposition of four highly loaded concepts: extreme, event, attribution, and climate services.

  19. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    The objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing, and analytics. The key components of NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast into-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab and integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  20. Public-key quantum digital signature scheme with one-time pad private-key

    NASA Astrophysics Data System (ADS)

    Chen, Feng-Lin; Liu, Wan-Fang; Chen, Su-Gen; Wang, Zhi-Hua

    2018-01-01

    A quantum digital signature scheme based on a public-key quantum cryptosystem is proposed for the first time. In the scheme, the verification public key is derived from the signer's identity information (such as an e-mail address) on the foundation of identity-based encryption, and the signature private key is generated by a one-time pad (OTP) protocol. The public-key and private-key pair consists of classical bits, but the signature cipher consists of quantum qubits. After the signer announces the public key and generates the final quantum signature, each verifier can publicly verify whether the signature is valid using the public key and the quantum digital digest. Analysis shows that the proposed scheme satisfies non-repudiation and unforgeability. Information-theoretic security of the scheme is ensured by quantum indistinguishability and the OTP protocol. Being based on a public-key cryptosystem, the proposed scheme is easier to realize than other quantum signature schemes under current technical conditions.

  1. An Empirical Analysis of the Cascade Secret Key Reconciliation Protocol for Quantum Key Distribution

    DTIC Science & Technology

    2011-09-01

    performance with the parity checks within each pass increasing and as a result, the processing time is expected to increase as well. A conclusion is drawn... timely manner has driven efforts to develop new key distribution methods. The most promising method is Quantum Key Distribution (QKD) and is...

  2. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task 2 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Shuai; Makarov, Yuri V.; McKinstry, Craig A.

    Task report detailing low probability tail event analysis and mitigation in the BPA control area. A tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed on fast load and wind ramps, or when non-wind generators fall short of scheduled output, causing the imbalance between generation and load to become very significant.

  3. Sources of Error and the Statistical Formulation of M_S:m_b Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS is analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H_0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh-wave magnitude (denoted M_S) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 announced nuclear weapon test by the Democratic People's Republic of Korea fails to reject the null hypothesis H_0: explosion characteristics.
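
    A schematic sketch of such a screening test follows. The offset d0, the magnitude standard deviations, and the extra model-inadequacy variance term are illustrative assumptions, not the IDC's calibrated values.

        from math import sqrt
        from scipy.stats import norm

        def screen_event(ms, mb, d0=1.0, sigma_ms=0.2, sigma_mb=0.15,
                         sigma_model=0.1, alpha=0.01):
            # One-sided test of H_0: m_b - M_S >= d0 (explosion characteristics).
            # sigma_model inflates the standard error to account for physical
            # correction model inadequacy, as the formulation above advocates.
            se = sqrt(sigma_ms**2 + sigma_mb**2 + sigma_model**2)
            z = (mb - ms - d0) / se
            p_value = norm.cdf(z)   # small when m_b - M_S is well below d0
            return z, p_value, p_value < alpha   # True -> screened out as earthquake-like

        # A weak surface wave relative to the body wave (explosion-like)
        # does not reject H_0, so the event is not screened out:
        print(screen_event(ms=3.6, mb=4.5))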

  4. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    NASA Astrophysics Data System (ADS)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have frequently been used to characterize spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events, as captured by so-called event synchronization, have been proposed as powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme floodings. Rooted in the analysis of spike-train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and thus possibly statistically, or even dynamically, interrelated) or not, without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctly different type of distribution of waiting times between subsequent events. This raises conceptual concerns about whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two
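
    A minimal Python sketch of the delay-window alternative mentioned above, event coincidence rates, for two series of extreme-event days; the event days and the tolerance window delta_t are invented for illustration.

        import numpy as np

        def event_coincidence_rate(t_a, t_b, delta_t):
            # Fraction of events at site A followed, within delta_t days,
            # by at least one event at site B.
            t_a, t_b = np.asarray(t_a), np.asarray(t_b)
            hits = sum(np.any((t_b >= t) & (t_b <= t + delta_t)) for t in t_a)
            return hits / len(t_a)

        # Days on which daily rainfall exceeded the local 95th percentile.
        site_a = [12, 40, 41, 77, 103]
        site_b = [13, 42, 90, 104]
        print(event_coincidence_rate(site_a, site_b, delta_t=2))   # 0.8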

  5. Testing key predictions of the associative account of mirror neurons in humans using multivariate pattern analysis.

    PubMed

    Oosterhof, Nikolaas N; Wiggett, Alison J; Cross, Emily S

    2014-04-01

    Cook et al. overstate the evidence supporting their associative account of mirror neurons in humans: most studies do not address a key property, namely action specificity that generalizes across the visual and motor domains. Multivariate pattern analysis (MVPA) of neuroimaging data can address this concern, and we illustrate how MVPA can be used to test key predictions of their account.

  6. Time to tenure in Spanish universities: an event history analysis.

    PubMed

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of relevant covariates associated with academic performance, social embeddedness, and mobility. We find that research productivity contributes to career acceleration, but other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. Variation in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority and rewards for loyalty, in addition to some measurements of performance and the quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility.
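
    As a sketch of the event-history machinery described here (not the authors' exact specification), a Cox proportional hazards fit with the lifelines library might look as follows; the data frame, column names, and covariate values are invented placeholders.

        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "years_to_tenure": [4, 5, 6, 7, 8, 9, 10, 12],  # PhD to tenure or censoring
            "tenured":         [1, 1, 1, 0, 1, 1, 0, 0],    # 0 = still untenured (censored)
            "publications":    [12, 7, 9, 5, 10, 6, 4, 8],
            "moved":           [0, 1, 0, 0, 1, 0, 1, 1],    # inter-university mobility
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="years_to_tenure", event_col="tenured")
        cph.print_summary()   # hazard ratios: >1 accelerates tenure, <1 delays it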

  7. Time to Tenure in Spanish Universities: An Event History Analysis

    PubMed Central

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of relevant covariates associated with academic performance, social embeddedness, and mobility. We find that research productivity contributes to career acceleration, but other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. Variation in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority and rewards for loyalty, in addition to some measurements of performance and the quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  8. Implementing recovery: an analysis of the key technologies in Scotland

    PubMed Central

    2011-01-01

    Background Over the past ten years the promotion of recovery has become a stated aim of mental health policies within a number of English-speaking countries, including Scotland. Implementation of a recovery approach involves a significant reorientation of mental health services and practices, which often poses significant challenges for reformers. This article examines how four key technologies of recovery have assisted in the move towards the creation of a recovery-oriented mental health system in Scotland. Methods Drawing on documentary analysis and a series of interviews, we examine the construction and implementation of four key recovery 'technologies' as they have been put to use in Scotland: recovery narratives, the Scottish Recovery Indicator (SRI), Wellness Recovery Action Planning (WRAP) and peer support. Results Our findings illuminate how each of these technologies works to instantiate, exemplify and disseminate a 'recovery orientation' at different sites within the mental health system in order to bring about a 'recovery-oriented' mental health system. They also enable us to identify some of the factors that facilitate or hinder the effectiveness of those technologies in bringing about a change in how mental health services are delivered in Scotland. These findings provide a basis for some general reflections on the utility of 'recovery technologies' to implement a shift towards recovery in mental health services in Scotland and elsewhere. Conclusions Our analysis of this process within the Scottish context will be valuable for policy makers and service coordinators wishing to implement recovery values within their own national mental health systems. PMID:21569633

  9. Sparganothis fruitworm degree-day benchmarks provide key treatment timings for cranberry IPM

    USDA-ARS?s Scientific Manuscript database

    Degree-day benchmarks indicate discrete biological events in the development of insect pests. For the Sparganothis fruitworm, we have isolated all key development events and linked them to degree-day accumulations. These degree-day accumulations can greatly improve treatment timings for cranberry ...

  10. Tipping the Balance: Hepatotoxicity and the Four Apical Key Events of Hepatic Steatosis

    EPA Science Inventory

    Adverse outcome pathways (AOPs) are descriptive biological sequences that start from a molecular initiating event (MIE) and end with an adverse health outcome. AOPs provide biological context for high throughput chemical testing and further prioritize environmental health risk r...

  11. Measurement-device-independent quantum key distribution for Scarani-Acin-Ribordy-Gisin 04 protocol

    PubMed Central

    Mizutani, Akihiro; Tamaki, Kiyoshi; Ikuta, Rikizo; Yamamoto, Takashi; Imoto, Nobuyuki

    2014-01-01

    Measurement-device-independent quantum key distribution (MDI QKD) was proposed to make BB84 completely free from any side-channel in detectors. As in prepare & measure QKD, the use of other protocols in the MDI setting would be advantageous in some practical situations. In this paper, we consider the SARG04 protocol in the MDI setting. The prepare & measure SARG04 protocol is proven to be able to generate a key up to two-photon emission events. In the MDI setting, we show that key generation is possible from events with single- or two-photon emission by one party and single-photon emission by the other party, but two-photon emission by both parties cannot contribute to key generation. In contrast to the prepare & measure SARG04 protocol, whose experimental setup is exactly the same as that of BB84, the measurement setup for SARG04 in the MDI setting cannot be the same as that for BB84, since the BB84 measurement setup in the MDI setting induces too many bit errors. To overcome this problem, we propose two alternative experimental setups, and we simulate the resulting key rate. Our study highlights the requirements that MDI QKD imposes on the implementation of a variety of QKD protocols. PMID:24913431

  12. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    NASA Astrophysics Data System (ADS)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days on which the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths were recorded for natural mortality, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (odds ratio (OR) = 1.41; 95% confidence interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and among elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had immediate and delayed effects on mortality.

  13. Using Pattern Recognition and Discriminant Analysis to Predict Critical Events in Large Signal Databases

    NASA Astrophysics Data System (ADS)

    Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang

    2009-09-01

    Many applications in plant management require close monitoring of equipment performance, in particular with the objective of preventing certain critical events. At each point in time, the information available for classifying the criticality of the process is represented by the historical signal database together with the current measurement. This paper presents an approach to detecting and predicting critical events, based on pattern recognition and discriminant analysis.
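
    A minimal sketch of the classification step this abstract describes, using linear discriminant analysis on features extracted from signal windows; the features, class parameters, and the new measurement are synthetic placeholders.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        # 200 signal windows x 3 features (e.g. mean level, variance, trend slope)
        normal = rng.normal([0.0, 1.0, 0.0], 0.3, size=(100, 3))
        critical = rng.normal([0.8, 2.0, 0.5], 0.3, size=(100, 3))
        X = np.vstack([normal, critical])
        y = np.array([0] * 100 + [1] * 100)   # 1 = precursor of a critical event

        lda = LinearDiscriminantAnalysis().fit(X, y)
        new_window = [[0.7, 1.8, 0.4]]
        print(lda.predict(new_window), lda.predict_proba(new_window))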

  14. Reverse translation of adverse event reports paves the way for de-risking preclinical off-targets.

    PubMed

    Maciejewski, Mateusz; Lounkine, Eugen; Whitebread, Steven; Farmer, Pierre; DuMouchel, William; Shoichet, Brian K; Urban, Laszlo

    2017-08-08

    The Food and Drug Administration Adverse Event Reporting System (FAERS) remains the primary source for post-marketing pharmacovigilance. The system is largely uncurated and unstandardized, and it lacks a method for linking drugs to the chemical structures of their active ingredients, increasing noise and artefactual trends. To address these problems, we mapped drugs to their ingredients and used natural language processing to classify and correlate drug events. Our analysis exposed key idiosyncrasies in FAERS: for example, reports of thalidomide causing a deadly ADR when used against myeloma, a likely result of the disease itself; duplications of the same report, unjustifiably increasing its importance; and correlation of reported ADRs with public events, regulatory announcements, and publications. Comparing the pharmacological, pharmacokinetic, and clinical ADR profiles of methylphenidate, aripiprazole, and risperidone, and of kinase drugs targeting the VEGF receptor, demonstrates how underlying molecular mechanisms can emerge from ADR co-analysis. The precautions and methods we describe may enable investigators to avoid confounding chemistry-based associations and reporting biases in FAERS, and illustrate how comparative analysis of ADRs can reveal underlying mechanisms.
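
    As one concrete example of the kind of signal statistic commonly computed on FAERS-style data after ingredient mapping, a proportional reporting ratio (PRR) sketch follows; the abstract does not name this particular statistic, and all counts are invented.

        def prr(a, b, c, d):
            # 2x2 counts: a = reports with drug & event, b = drug & other events,
            # c = other drugs & event, d = other drugs & other events.
            return (a / (a + b)) / (c / (c + d))

        # 40 of 2,000 reports mentioning the ingredient pair it with the ADR;
        # 500 of 600,000 other reports mention the same ADR.
        print(prr(40, 1_960, 500, 599_500))   # ~24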

  15. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE PAGES

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    2018-02-02

    The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  17. Video Traffic Analysis for Abnormal Event Detection

    DOT National Transportation Integrated Search

    2010-01-01

    We propose the use of video imaging sensors for the detection and classification of abnormal events to be used primarily for mitigation of traffic congestion. Successful detection of such events will allow for new road guidelines; for rapid deploymen...

  19. The Collaborative Heliophysics Events Knowledgebase

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Schuler, D.; Cheung, C.

    2010-12-01

    The Collaborative Heliophysics Events Knowledgebase (CHEK) leverages and integrates the existing resources developed by HEK for SDO (Hurlburt et al. 2010) to provide a collaborative framework for heliophysics researchers. This framework will enable an environment where researchers can not only identify and locate relevant data, but also deploy a social network for sharing and expanding knowledge about heliophysical events. CHEK will expand the HEK and key HEK clients into the heliosphere and geospace, and create a heliophysics social network. We describe the design and goals of the CHEK project and discuss its relation to Citizen Science in the heliosphere. Hurlburt, N. et al. 2010, “A Heliophysics Event Knowledgebase for Solar Dynamics Observatory,” Sol. Phys., in press

  20. Carbon Isotopes in Pinus elliotti from Big Pine Key, Florida: Indicators of Seasonal Precipitation, ENSO and Disturbance Events

    NASA Astrophysics Data System (ADS)

    Rebenack, C.; Willoughby, H. E.; Anderson, W. T.; Cherubini, P.

    2013-12-01

    , and disturbance events. Because slash pine growth is dependent on water availability, a chronology developed using carbon isotopes may provide greater insight into plant stress over time and ultimately may lead to better correlations with climate oscillations. The work presented here is the result of a carbon-isotope study of four slash pine trees from Big Pine Key, Florida. The δ13C data show seasonal stomatal activity in the trees that can be linked to regional precipitation and, to a larger extent, to the ENSO cycles. In addition, there are several anomalies in the carbon isotope record that may indicate the timing of disturbance events.

  1. Effects of Interventions on Relapse to Narcotics Addiction: An Event-History Analysis.

    ERIC Educational Resources Information Center

    Hser, Yih-Ing; And Others

    1995-01-01

    Event-history analysis was applied to the life history data of 581 male narcotics addicts to specify the concurrent, postintervention, and durational effects of social interventions on relapse to narcotics use. Results indicate the advisability of supporting methadone maintenance with other prevention strategies. (SLD)

  2. Characterization of Strombolian events by using independent component analysis

    NASA Astrophysics Data System (ADS)

    Ciaramella, A.; de Lauro, E.; de Martino, S.; di Lieto, B.; Falanga, M.; Tagliaferri, R.

    2004-10-01

    We apply Independent Component Analysis (ICA) to seismic signals recorded at Stromboli volcano. First, we show how ICA works on synthetic signals generated by dynamical systems. We then show that Strombolian signals, both tremor and explosions, are similar in the time domain in the high-frequency band (>0.5 Hz). This lends support to the organ-pipe model for the generation of the source of these events. Moreover, we are able to recognize in the tremor signals a low-frequency component (<0.5 Hz) with a well-defined peak at a period of 30 s.
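
    To make the separation idea concrete, here is a minimal sketch of ICA-style blind source separation on synthetic mixtures, using scikit-learn's FastICA; the two sources, the mixing matrix, and all parameter values are illustrative assumptions, not the paper's Stromboli data.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 2000)

        # Two hypothetical "sources": a slow oscillation and a spiky series
        s1 = np.sin(2 * np.pi * 0.3 * t)           # slow component (< 0.5 Hz)
        s2 = np.sign(np.sin(2 * np.pi * 3.0 * t))  # fast, explosion-like component
        S = np.c_[s1, s2] + 0.05 * rng.standard_normal((t.size, 2))

        A = np.array([[1.0, 0.5], [0.4, 1.0]])     # unknown mixing matrix
        X = S @ A.T                                # observed mixtures

        ica = FastICA(n_components=2, random_state=0)
        S_est = ica.fit_transform(X)               # recovered independent components
        print(S_est.shape)                         # (2000, 2)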

  3. ANCA: Anharmonic Conformational Analysis of Biomolecular Simulations.

    PubMed

    Parvatikar, Akash; Vacaliuc, Gabriel S; Ramanathan, Arvind; Chennubhotla, S Chakra

    2018-05-08

    Anharmonicity in time-dependent conformational fluctuations is noted to be a key feature of the functional dynamics of biomolecules. Although anharmonic events are rare, long-timescale (μs-ms and beyond) simulations facilitate probing of such events. We have previously developed quasi-anharmonic analysis to resolve higher-order spatial correlations and characterize anharmonicity in biomolecular simulations. In this article, we have extended this toolbox to resolve higher-order temporal correlations and built a scalable Python package called anharmonic conformational analysis (ANCA). ANCA has modules to: 1) measure anharmonicity in the form of higher-order statistics and its variation as a function of time, 2) output a storyboard representation of the simulations to identify key anharmonic conformational events, and 3) identify putative anharmonic conformational substates and visualize transitions between these substates. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  4. Impact of a Single Unusually Large Rainfall Event on the Level of Risk Used for Infrastructure Design

    NASA Astrophysics Data System (ADS)

    Dhakal, N.; Jain, S.

    2013-12-01

    Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is usually used to statistically describe extreme rainfall events. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century and, as a result, the parameters of the GEV distribution have changed with time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters, and consequently on the level of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated from past rainfall records) for civil infrastructure? To answer these questions, we performed a sensitivity analysis of the GEV distribution parameters and the return periods with respect to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter and a large decrease in the corresponding return period. This is a key consideration for time-varying engineering design. Such isolated extreme weather events should be considered alongside traditional statistical methodology for extreme events when designing civil infrastructure (such as dams, bridges, and culverts). Such analysis is also useful in understanding the statistical uncertainty of projecting extreme events into the future.
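
    The sensitivity experiment can be illustrated with a short sketch: fit a GEV to a series of annual maxima with and without one huge added event, and compare the estimated 100-year return level. The synthetic data and all parameter values below are assumptions for illustration, not the Maine USHCN records (note that scipy's genextreme uses a shape sign convention opposite to the climatological ξ).

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(1)
        annual_max = genextreme.rvs(c=-0.1, loc=50, scale=15, size=30,
                                    random_state=rng)   # 30 "years" of maxima

        def return_level(sample, T=100):
            c, loc, scale = genextreme.fit(sample)
            return genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

        base = return_level(annual_max)
        with_outlier = return_level(np.append(annual_max, 250.0))  # one huge event
        print(f"100-yr level: {base:.1f} -> {with_outlier:.1f} with outlier")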

  5. A tandem regression-outlier analysis of a ligand cellular system for key structural modifications around ligand binding.

    PubMed

    Lin, Ying-Ting

    2013-04-30

    A tandem technique with hard equipment is often used for the chemical analysis of a single cell: the first part separates the wanted chemicals from the bulk of the cell; the second part detects the important identities. To identify the key structural modifications around ligand binding, the present study aims to develop a cheminformatics counterpart of this tandem technique. A statistical regression and its outliers act as the computational separation step. A PPARγ (peroxisome proliferator-activated receptor gamma) agonist cellular system was subjected to such an investigation. Results show that this tandem regression-outlier analysis, or the prioritization of the context equations tagged with features of the outliers, is an effective cheminformatics regression technique for detecting key structural modifications, as well as their likely impact on ligand binding. The key structural modifications around ligand binding are effectively extracted and characterized from the cellular reactions. This is because molecular binding is the paramount factor in such a ligand cellular system, and key structural modifications around ligand binding are expected to create outliers. Therefore, such outliers can be captured by this tandem regression-outlier analysis.
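
    The "separation" step of the tandem idea can be sketched as an ordinary regression whose large-residual points are flagged as outliers. The descriptors, coefficients, and threshold below are hypothetical; the paper's actual PPARγ model is not reproduced.

        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.normal(size=(40, 3))           # hypothetical ligand descriptors
        beta = np.array([1.5, -2.0, 0.5])
        y = X @ beta + 0.3 * rng.normal(size=40)
        y[7] += 4.0                            # a "key modification" creating an outlier

        coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
        resid = y - X @ coef
        z = (resid - resid.mean()) / resid.std(ddof=1)
        outliers = np.where(np.abs(z) > 3)[0]
        print("flagged outliers:", outliers)   # expect index 7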

  6. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    NASA Astrophysics Data System (ADS)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
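
    As a rough illustration of cross-spectral analysis between cardiorespiratory series, the sketch below computes magnitude-squared coherence between a respiration-like signal and a heart-rate series modulated by it; the signals and all parameters are synthetic assumptions, not the paper's event-series method.

        import numpy as np
        from scipy.signal import coherence

        fs = 10.0                              # Hz, sampling of the rate series
        t = np.arange(0, 300, 1 / fs)
        resp = np.sin(2 * np.pi * 0.25 * t)    # respiration at 0.25 Hz
        # heart rate modulated by respiration (respiratory sinus arrhythmia)
        heart = 1.0 + 0.1 * resp \
            + 0.05 * np.random.default_rng(3).standard_normal(t.size)

        f, Cxy = coherence(resp, heart, fs=fs, nperseg=512)
        print(f"peak coherence {Cxy.max():.2f} at {f[np.argmax(Cxy)]:.2f} Hz")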

  7. Analysis of warm convective rain events in Catalonia

    NASA Astrophysics Data System (ADS)

    Ballart, D.; Figuerola, F.; Aran, M.; Rigo, T.

    2009-09-01

    Between the end of September and November, events with high amounts of rainfall are quite common in Catalonia. The high sea surface temperature of the Mediterranean Sea near the Catalan coast is one of the most important factors contributing to the development of this type of storm. Some of these events have particular characteristics: elevated rain rates during short time periods, not very deep convection, and low lightning activity. Consequently, remote sensing tools are of limited use for surveillance. The high rain efficiency is caused by internal mechanisms of the clouds and by the air mass in which the precipitation structure develops. As mentioned above, the contribution of the sea to the air mass is very relevant, not only through the increase in large condensation nuclei, but also through the high temperature of the low layers of the atmosphere, which allows clouds with 5 or 6 km of particles in the liquid phase; in fact, liquid-phase particles can be found in these clouds down to the -15°C level. Owing to these characteristics, this type of rain structure can produce large amounts of rainfall in a relatively brief period of time and, if quasi-stationary, precipitation values at the surface can be very important. From the point of view of remote sensing tools, the nature of these clouds means that the tools and methodologies commonly used for the analysis of heavy rain events are not useful here, for the following reasons: lightning is rarely observed, cloud-top temperatures are not cold enough to be enhanced in satellite imagery, and radar reflectivity values are lower than in other heavy rain cases. The third point to take into account is the vulnerability of the affected areas. A large percentage of the Catalan population lives in the coastal region. In the central coast of Catalonia, the urban areas are surrounded by a not very high mountain range with small basins and

  8. Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes

    NASA Astrophysics Data System (ADS)

    Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana

    2015-04-01

    The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, a usual tool in Statistical Process Control (SPC) but unusual in climate studies. Here, series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to conclude whether or not there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme and normal observed rainfall days. The autocorrelation among maximum precipitation values is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes in the remaining years under study can then be monitored by such attributes control charts. The results of the application of this methodology show evidence of change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
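
    A classical attributes (p) control chart for the annual fraction of extreme-rainfall days can be sketched as below; it assumes independent days, whereas the paper's New Binomial Markov Extended Process is precisely a refinement that models autocorrelation. All numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 365                                      # days per year
        extreme_days = rng.binomial(n, 0.03, size=30)  # 30 "in-control" base years
        p = extreme_days / n

        p_bar = p.mean()                             # center line from base period
        sigma = np.sqrt(p_bar * (1 - p_bar) / n)
        ucl = p_bar + 3 * sigma                      # upper control limit
        lcl = max(p_bar - 3 * sigma, 0.0)

        new_year_fraction = 22 / n                   # a later year to monitor
        print(f"UCL={ucl:.4f}  signal: {new_year_fraction > ucl}")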

  9. Analysis of the Impact of Climate Change on Extreme Hydrological Events in California

    NASA Astrophysics Data System (ADS)

    Ashraf Vaghefi, Saeid; Abbaspour, Karim C.

    2016-04-01

    Estimating the magnitude and occurrence frequency of extreme hydrological events is required for taking preventive remedial actions against the impact of climate change on the management of water resources. Examples include: characterization of extreme rainfall events to predict urban runoff, determination of river flows, and the likely severity of drought events during the design life of a water project. In recent years California has experienced its most severe drought in recorded history, causing water stress, economic loss, and an increase in wildfires. In this paper we describe the development of a Climate Change Toolkit (CCT) and demonstrate its use in the analysis of dry and wet periods in California for the years 2020-2050, comparing the results with the historic period 1975-2005. CCT provides four modules to: i) manage big databases such as those of Global Climate Models (GCMs), ii) perform bias correction using observed local climate data, iii) interpolate gridded climate data to finer resolution, and iv) calculate continuous dry- and wet-day periods based on rainfall, temperature, and soil moisture for analysis of drought and flooding risks. We used bias-corrected meteorological data from five GCMs for the extreme CO2 emission scenario RCP8.5 for California to analyze the trend of extreme hydrological events. The findings indicate that the frequency of dry periods will increase in the central and southern parts of California. The assessment of the number of wet days and the frequency of wet periods suggests an increased risk of flooding in the northern and north-western parts of California, especially in the coastal strip. Keywords: Climate Change Toolkit (CCT), Extreme Hydrological Events, California

  10. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    PubMed

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports, yet there is a paucity of evidence verifying the head impact events they record. To utilize video analysis to verify head impact events recorded by wearable sensors and describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time-synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664

  11. Statistical analysis of mixed recurrent event data with application to cancer survivor study

    PubMed Central

    Zhu, Liang; Tong, Xingwei; Zhao, Hui; Sun, Jianguo; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L.

    2014-01-01

    Event history studies occur in many fields including economics, medical studies and social science. In such studies concerning some recurrent events, two types of data have been extensively discussed in the literature. One is recurrent event data that arise if study subjects are monitored or observed continuously. In this case, the observed information provides the times of all occurrences of the recurrent events of interest. The other is panel count data, which occur if the subjects are monitored or observed only periodically. This can happen if the continuous observation is too expensive or not practical and in this case, only the numbers of occurrences of the events between subsequent observation times are available. In this paper, we discuss a third type of data, which is a mixture of recurrent event and panel count data and for which there exists little literature. For regression analysis of such data, a marginal mean model is presented and we propose an estimating equation-based approach for estimation of regression parameters. A simulation study is conducted to assess the finite sample performance of the proposed methodology and indicates that it works well for practical situations. Finally it is applied to a motivating study on childhood cancer survivors. PMID:23139023

  12. Identification of homogeneous regions for rainfall regional frequency analysis considering typhoon event in South Korea

    NASA Astrophysics Data System (ADS)

    Heo, J. H.; Ahn, H.; Kjeldsen, T. R.

    2017-12-01

    South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. Traditionally, however, regional frequency analysis models have not considered this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events occurring only in some years (typhoon). The available annual maximum 24-hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum came from either (1) a typhoon event or (2) a non-typhoon event. Then, a three-parameter GEV distribution was fitted to each sub-sample, along with a weighting parameter characterizing the proportion of historical events associated with typhoons. Spatial patterns of the model parameters were analyzed and showed that typhoon events are less commonly associated with annual maximum rainfall in the north-west part of the country (Seoul area) and more prevalent in the southern and eastern parts, leading to the formation of two distinct typhoon regions: (1) north-west, and (2) southern and eastern. Using a leave-one-out procedure, the new regional frequency model was tested and compared to a more traditional index flood method. The results showed that the impact of typhoons on design events might previously have been underestimated in the Seoul area. This suggests that the mixture model should be preferred where the typhoon phenomenon is less frequent, and can thus have a significant effect on the rainfall-frequency curve. This research was supported by a grant (2017-MPSS31
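
    The two-population idea can be sketched as a weighted mixture of GEV distributions whose return level is found by root-finding; the weight and GEV parameters below are invented for illustration, not the fitted Korean values.

        import numpy as np
        from scipy.optimize import brentq
        from scipy.stats import genextreme

        w = 0.3                                 # fraction of years with typhoon-driven maxima
        F_typ = genextreme(c=-0.2, loc=180, scale=60)   # heavier-tailed typhoon component
        F_mon = genextreme(c=-0.05, loc=120, scale=30)  # monsoon component

        def mixture_cdf(x):
            return w * F_typ.cdf(x) + (1 - w) * F_mon.cdf(x)

        def return_level(T):
            # Solve F(x) = 1 - 1/T for the T-year 24-hour rainfall
            return brentq(lambda x: mixture_cdf(x) - (1 - 1 / T), 50, 5000)

        print(f"100-yr design rainfall: {return_level(100):.0f} mm")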

  13. Uncertainty analysis in fault tree models with dependent basic events.

    PubMed

    Pedroni, Nicola; Zio, Enrico

    2013-06-01

    In general, two types of dependence need to be considered when estimating the probability of the top event (TE) of a fault tree (FT): "objective" dependence between the (random) occurrences of different basic events (BEs) in the FT, and "state-of-knowledge" (epistemic) dependence between estimates of the epistemically uncertain probabilities of some BEs of the FT model. In this article, we study the effects of objective and epistemic dependences on the TE probability. The well-known Fréchet bounds and the distribution envelope determination (DEnv) method are used to model all kinds of (possibly unknown) objective and epistemic dependences, respectively. For exemplification, the analyses are carried out on a FT with six BEs. Results show that both types of dependence significantly affect the TE probability; however, the effects of epistemic dependence are likely to be overwhelmed by those of objective dependence (if present). © 2012 Society for Risk Analysis.
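
    For a single AND gate, the Fréchet bounds reduce to a two-line computation; the sketch below shows the envelope around the independence value for two illustrative basic-event probabilities.

        def frechet_and(p1: float, p2: float) -> tuple[float, float]:
            """Bounds on P(A and B) given only the marginals P(A)=p1, P(B)=p2."""
            lower = max(0.0, p1 + p2 - 1.0)   # perfect negative dependence
            upper = min(p1, p2)               # perfect positive dependence
            return lower, upper

        # Independence lies strictly inside the envelope:
        p1, p2 = 0.10, 0.20
        lo, hi = frechet_and(p1, p2)
        print(lo, p1 * p2, hi)                # 0.0  0.02  0.1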

  14. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessment and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
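
    A minimal sketch of clustering-based anomaly detection in the spirit described above, assuming DBSCAN as the clusterer (points left unclustered are flagged as outliers); the framework's actual unsupervised algorithm is not detailed in the abstract, so this is only an analogy on toy data.

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(5)
        normal = rng.normal(0, 1, size=(500, 2))     # bulk of pixel/feature vectors
        anomalies = rng.uniform(5, 8, size=(5, 2))   # a few unusual observations
        X = np.vstack([normal, anomalies])

        labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
        outlier_idx = np.where(labels == -1)[0]      # -1 marks unclustered points
        print(f"{outlier_idx.size} points flagged as outliers")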

  15. An analysis of high-impact, low-predictive skill severe weather events in the northeast U.S

    NASA Astrophysics Data System (ADS)

    Vaughan, Matthew T.

    An objective evaluation of Storm Prediction Center slight risk convective outlooks, as well as a method to identify high-impact severe weather events with poor predictive skill, are presented in this study. The objectives are to assess severe weather forecast skill over the northeast U.S. relative to the continental U.S., build a climatology of high-impact, low-predictive skill events between 1980-2013, and investigate the dynamic and thermodynamic differences between severe weather events with low-predictive skill and high-predictive skill over the northeast U.S. Severe storm reports of hail, wind, and tornadoes are used to calculate skill scores including probability of detection (POD), false alarm ratio (FAR) and threat scores (TS) for each convective outlook. Low predictive skill events are binned into low POD (type 1) and high FAR (type 2) categories to assess temporal variability of low-predictive skill events. Type 1 events were found to occur in every year of the dataset with an average of 6 events per year. Type 2 events occur less frequently and are more common in the earlier half of the study period. An event-centered composite analysis is performed on the low-predictive skill database using the National Centers for Environmental Prediction Climate Forecast System Reanalysis 0.5° gridded dataset to analyze the dynamic and thermodynamic conditions prior to high-impact severe weather events with varying predictive skill. Deep-layer vertical shear between 1000-500 hPa is found to be a significant discriminator in slight risk forecast skill, where high-impact events with less than 31-kt shear have lower threat scores than high-impact events with higher shear values. Case study analysis of type 1 events suggests the environment over which severe weather occurs is characterized by high downdraft convective available potential energy, steep low-level lapse rates, and high lifting condensation level heights that contribute to an elevated risk of severe wind.
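
    The three verification scores named above are simple functions of a 2x2 contingency table of forecasts versus observed severe reports; the sketch below computes them for hypothetical counts.

        def skill_scores(hits: int, misses: int, false_alarms: int):
            pod = hits / (hits + misses)                 # probability of detection
            far = false_alarms / (hits + false_alarms)   # false alarm ratio
            ts = hits / (hits + misses + false_alarms)   # threat score (CSI)
            return pod, far, ts

        # Hypothetical counts for a season of slight-risk outlooks
        pod, far, ts = skill_scores(hits=42, misses=18, false_alarms=30)
        print(f"POD={pod:.2f} FAR={far:.2f} TS={ts:.2f}")  # POD=0.70 FAR=0.42 TS=0.47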

  16. "That in your hands". A comprehensive process analysis of a significant event in psychotherapy.

    PubMed

    Elliott, R

    1983-05-01

    This article illustrates a new approach to the study of change processes in psychotherapy. The approach involves selecting significant change events and analyzing them according to the Comprehensive Process Model. In this model, client and therapist behaviors are analyzed for content, interpersonal action, style and response quality by using information derived from Interpersonal Process Recall, client and therapist objective process ratings and qualitative analyses. The event selected for analysis in this paper was rated by client and therapist as significantly helpful. The focal therapist response was a reflective-interpretive intervention in which the therapist collaboratively and evocatively expanded the client's implicit meanings. The event involved working through an earlier insight and realization of progress by the client. The event suggests an association between subjective "felt shifts" and public "process shifts" in client in-therapy behaviors. A model, consistent with Gendlin's experiential psychotherapy (1970), is offered to describe the change process which occurred in this event.

  17. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high-level properties can be verified and design errors can be revealed, in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform, which offers a range of simulation and verification technologies.

  18. Modeling, Simulation and Analysis of Public Key Infrastructure

    NASA Technical Reports Server (NTRS)

    Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)

    1998-01-01

    Security is an essential part of network communication. Advances in cryptography have provided solutions to many network security requirements. Public Key Infrastructure (PKI) is the foundation of cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure using SimProcess and MATLAB software. The simulation runs from the top level all the way down to the computation needed for encryption, decryption, digital signatures, and a secure web server. The secure web server application could be utilized in wireless communications. The results of the simulation are analyzed and confirmed using queueing theory.

  19. The role of dose rate in radiation cancer risk: evaluating the effect of dose rate at the molecular, cellular and tissue levels using key events in critical pathways following exposure to low LET radiation

    PubMed Central

    Brooks, Antone L.; Hoel, David G.; Preston, R. Julian

    2016-01-01

    Abstract Purpose: This review evaluates the role of dose rate on cell and molecular responses. It focuses on the influence of dose rate on key events in critical pathways in the development of cancer. This approach is similar to that used by the U.S. EPA and others to evaluate risk from chemicals. It provides a mechanistic method to account for the influence of the dose rate from low-LET radiation, especially in the low-dose region, on cancer risk assessment. Molecular, cellular, and tissue changes are observed in many key events and change as a function of dose rate. The magnitude and direction of change can be used to help establish an appropriate dose rate effectiveness factor (DREF). Conclusions: Extensive data on key events suggest that exposures at low dose rates are less effective in producing changes than high dose rates. Most of these data at the molecular and cellular level support a large (2-30) DREF. In addition, some evidence suggests that doses delivered at a low dose rate decrease damage to levels below those observed in the controls. However, there are some human and mechanistic data that support a dose-rate effectiveness factor of 1. In summary, a review of the available molecular, cellular, and tissue data indicates that not only is dose rate an important variable in understanding radiation risk, but it also supports the selection of a DREF greater than one, as currently recommended by ICRP (2007) and BEIR VII (NRC/NAS 2006). PMID:27266588
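
    Arithmetically, a DREF simply divides a risk coefficient derived from acute, high-dose-rate data when it is applied to chronic, low-dose-rate exposure; the sketch below uses invented numbers purely for illustration.

        def low_dose_rate_risk(risk_per_gy_acute: float, dose_gy: float,
                               dref: float) -> float:
            """Linear risk estimate adjusted for chronic (low dose rate) exposure."""
            return risk_per_gy_acute * dose_gy / dref

        acute_slope = 0.05   # hypothetical excess risk per Gy from acute data
        print(low_dose_rate_risk(acute_slope, dose_gy=0.1, dref=2.0))  # 0.0025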

  20. Vulnerability of global food production to extreme climatic events.

    PubMed

    Yeni, F; Alpas, H

    2017-06-01

    It is known that the frequency, intensity, and duration of extreme climatic events have been changing substantially. The ultimate goal of this study was to identify current vulnerabilities of global primary food production to extreme climatic events, and to discuss potential entry points for adaptation planning by means of an explorative vulnerability analysis. Outcomes of this analysis were presented as a composite index in which the performance of 118 countries in maintaining the safety of food production under climate change was compared and ranked. In order to better interpret the results, the cluster analysis technique was used as a tool to group the countries based on their vulnerability index (VI) scores. Results suggested that one sixth of the countries analyzed were subject to a high level of exposure (0.45-1), one third to a high to very high level of sensitivity (0.41-1) and a low to moderate level of adaptive capacity (0-0.59). Proper adaptation strategies for reducing the microbial and chemical contamination of food products, soil, and water in the field were proposed. Finally, the availability of data on food safety management systems and on the occurrence of foodborne outbreaks with global coverage were proposed as key factors for improving the robustness of future vulnerability assessments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Observations on Rupture Behaviour of Fluid Induced Events at the Basel EGS Based on Empirical Green's Function Analysis

    NASA Astrophysics Data System (ADS)

    Folesky, J.; Kummerow, J.; Shapiro, S. A.; Asanuma, H.; Häring, M. O.

    2015-12-01

    The Empirical Green's Function (EGF) method uses pairs of events with highly similar waveforms and adjacent hypocenters to decompose the influences of the source time function, ray path, instrument site, and instrument response. The seismogram of the smaller event is considered the Green's function, which can then be deconvolved from the other seismogram. The result is a reconstructed relative source time function (RSTF) of the larger event of that event pair. Comparing the RSTFs at different stations of the observation system yields information on the rupture process of the larger event, based on the directivity effect and on changing RSTF complexities. The Basel EGS dataset of 2006-2007 consists of about 2800 localized events with magnitudes from 0.0 upward, providing event pairs of adequate magnitude difference for EGF analysis. The data are of sufficient quality to analyse events with magnitudes down to ML=0.5 for an apparent directivity effect, although the approximate rupture duration for those events is only a few milliseconds. The dataset contains a number of multiplets and repeating earthquakes known from earlier studies. The larger events seem to appear close to the rim of the microseismic cloud, and we are interested in their rupture behaviour. Using the EGF method we compute rupture orientations for about 190 event pairs and relate them to the event locations, the known fault system, and the stress regime. For the majority of events we observe a similar rupture direction, which seems to correlate with the overall shape of the microseismic cloud. The large events, however, point back to the injection source. Additionally, the rupture direction fitting yields estimates of the projection of the rupture velocity onto the horizontal plane, which seems to vary between the multiplets in the reservoir from 0.3 to 0.7 times the S-wave velocity. To our knowledge, source characterization by EGF analysis has not yet been introduced to microseismic reservoirs.
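
    A standard way to recover an RSTF from such an event pair is water-level spectral deconvolution; the sketch below applies it to synthetic traces. The water-level regularizer and all signals are assumptions for illustration, not the Basel processing chain.

        import numpy as np

        def egf_deconvolve(big, small, water_level=0.01):
            """Estimate the relative source time function (RSTF) of the big event."""
            n = len(big)
            B, S = np.fft.rfft(big, n), np.fft.rfft(small, n)
            power = np.abs(S) ** 2
            floor = water_level * power.max()   # regularize small spectral values
            return np.fft.irfft(B * np.conj(S) / np.maximum(power, floor), n)

        rng = np.random.default_rng(6)
        small = rng.standard_normal(256)              # EGF: impulse-like source
        rstf_true = np.exp(-np.arange(256) / 5.0)     # longer rupture of big event
        big = np.convolve(small, rstf_true)[:256]     # big event = EGF * RSTF
        print(egf_deconvolve(big, small)[:5].round(3))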

  2. Social Network Analysis Identifies Key Participants in Conservation Development.

    PubMed

    Farr, Cooper M; Reed, Sarah E; Pejchar, Liba

    2018-05-01

    Understanding patterns of participation in private lands conservation, which is often implemented voluntarily by individual citizens and private organizations, could improve its effectiveness at combating biodiversity loss. We used social network analysis (SNA) to examine participation in conservation development (CD), a private land conservation strategy that clusters houses in a small portion of a property while preserving the remaining land as protected open space. Using data from public records for six counties in Colorado, USA, we compared CD participation patterns among counties and identified actors that most often work with others to implement CDs. We found that social network characteristics differed among counties. The network density, or proportion of connections in the network, varied from fewer than 2% to nearly 15%, and was higher in counties with smaller populations and fewer CDs. Centralization, or the degree to which connections are held disproportionately by a few key actors, was not correlated strongly with any county characteristics. Network characteristics were not correlated with the prevalence of wildlife-friendly design features in CDs. The most highly connected actors were biological and geological consultants, surveyors, and engineers. Our work demonstrates a new application of SNA to land-use planning, in which CD network patterns are examined and key actors are identified. For better conservation outcomes of CD, we recommend using network patterns to guide strategies for outreach and information dissemination, and engaging with highly connected actor types to encourage widespread adoption of best practices for CD design and stewardship.
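
    The two network measures discussed, density and degree centralization, can be sketched with networkx on a toy collaboration graph; the actors and ties below are hypothetical, not the Colorado county records.

        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("consultant", "surveyor"), ("consultant", "engineer"),
            ("consultant", "developer"), ("surveyor", "engineer"),
            ("planner", "developer"),
        ])

        density = nx.density(G)                  # realized / possible ties

        # Freeman degree centralization: how concentrated ties are on one actor
        n = G.number_of_nodes()
        deg = dict(G.degree())
        max_deg = max(deg.values())
        centralization = sum(max_deg - d for d in deg.values()) / ((n - 1) * (n - 2))
        print(f"density={density:.2f} centralization={centralization:.2f}")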

  3. Fundamental finite key limits for one-way information reconciliation in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Tomamichel, Marco; Martinez-Mateo, Jesus; Pacher, Christoph; Elkouss, David

    2017-11-01

    The security of quantum key distribution protocols is guaranteed by the laws of quantum mechanics. However, a precise analysis of the security properties requires tools from both classical cryptography and information theory. Here, we employ recent results in non-asymptotic classical information theory to show that one-way information reconciliation imposes fundamental limitations on the amount of secret key that can be extracted in the finite key regime. In particular, we find that an often used approximation for the information leakage during information reconciliation is not generally valid. We propose an improved approximation that takes into account finite key effects and numerically test it against codes for two probability distributions, that we call binary-binary and binary-Gaussian, that typically appear in quantum key distribution protocols.
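
    The often-used approximation for reconciliation leakage that the paper revisits is commonly written as leak ≈ f · n · h(Q), with f a reconciliation efficiency, n the block size, Q the quantum bit error rate, and h the binary entropy; the sketch below evaluates it for illustrative numbers (the paper's improved finite-key correction is not reproduced).

        import math

        def h(q: float) -> float:
            """Binary entropy in bits."""
            if q in (0.0, 1.0):
                return 0.0
            return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

        def leak_ec(n: int, qber: float, f: float = 1.16) -> float:
            """Approximate bits leaked during one-way information reconciliation."""
            return f * n * h(qber)

        print(f"{leak_ec(n=10_000, qber=0.02):.0f} bits")   # ~1640 bits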

  4. Improved statistical fluctuation analysis for measurement-device-independent quantum key distribution with four-intensity decoy-state method.

    PubMed

    Mao, Chen-Chen; Zhou, Xing-Yu; Zhu, Jian-Rong; Zhang, Chun-Hui; Zhang, Chun-Mei; Wang, Qin

    2018-05-14

    Recently, Zhang et al. [Phys. Rev. A 95, 012333 (2017)] developed a new approach to estimating the failure probability of the decoy-state BB84 QKD system when taking the finite-size key effect into account, which offers security comparable to the Chernoff bound while resulting in an improved key rate and transmission distance. Based on Zhang et al.'s work, we extend this approach to the case of measurement-device-independent quantum key distribution (MDI-QKD), and for the first time implement it in the four-intensity decoy-state MDI-QKD system. Moreover, by utilizing joint constraints and collective error-estimation techniques, we can substantially increase the performance of practical MDI-QKD systems compared with either three- or four-intensity decoy-state MDI-QKD using the Chernoff bound analysis, and achieve a much higher level of security than that obtained with Gaussian approximation analysis.

  5. Device-independent secret-key-rate analysis for quantum repeaters

    NASA Astrophysics Data System (ADS)

    Holz, Timo; Kampermann, Hermann; Bruß, Dagmar

    2018-01-01

    The device-independent approach to quantum key distribution (QKD) aims to establish a secret key between two or more parties with untrusted devices, potentially under full control of a quantum adversary. The performance of a QKD protocol can be quantified by the secret key rate, which can be lower bounded via the violation of an appropriate Bell inequality in a setup with untrusted devices. We study secret key rates in the device-independent scenario for different quantum repeater setups and compare them to their device-dependent analogs. The quantum repeater setups under consideration are the original protocol by Briegel et al. [Phys. Rev. Lett. 81, 5932 (1998), 10.1103/PhysRevLett.81.5932] and the hybrid quantum repeater protocol by van Loock et al. [Phys. Rev. Lett. 96, 240501 (2006), 10.1103/PhysRevLett.96.240501]. For a given repeater scheme and a given QKD protocol, the secret key rate depends on a variety of parameters, such as the gate quality or the detector efficiency. We systematically analyze the impact of these parameters and suggest optimized strategies.
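
    One widely used device-independent bound (due to Acín et al. 2007, for collective attacks) expresses the secret fraction through the CHSH value S and the error rate Q as r ≥ 1 - h(Q) - h((1 + sqrt((S/2)² - 1))/2); the sketch below evaluates it. This is an illustration of the general approach, not the repeater-specific analysis of the paper.

        import math

        def h(q: float) -> float:
            if q in (0.0, 1.0):
                return 0.0
            return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

        def di_key_rate(S: float, qber: float) -> float:
            """Lower bound on the secret fraction for collective attacks."""
            if S <= 2.0:
                return 0.0                    # no Bell violation, no DI security
            eve_info = h((1 + math.sqrt((S / 2) ** 2 - 1)) / 2)
            return max(0.0, 1 - h(qber) - eve_info)

        print(f"{di_key_rate(S=2.7, qber=0.03):.3f}")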

  6. FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting

    NASA Astrophysics Data System (ADS)

    Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.

    2009-10-01

    The FASEA (FPGA based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex 5 FPGA card (PXI-7842R) for data acquisition and purpose-developed software for event analysis. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, confirm that the system is able to accurately emulate the behaviour of the MAC3 unit.

  7. The Frasnian-Famennian mass killing event(s), methods of identification and evaluation

    NASA Technical Reports Server (NTRS)

    Geldsetzer, H. H. J.

    1988-01-01

    The absence of an abnormally high number of earlier Devonian taxa from Famennian sediments has been repeatedly documented and can hardly be questioned. Primary recognition of the event(s) was based on paleontological data, especially common macrofossils. Most paleontologists place the disappearance of these common forms at the gigas/triangularis contact, and this boundary was recently proposed as the Frasnian-Famennian (F-F) boundary. Not unexpectedly, alternative F-F positions have been suggested, prompted by temporary Frasnian survivors or sudden post-event radiations of new forms. Secondary supporting evidence for mass killing event(s) is supplied by trace element and stable isotope geochemistry, but not with the same success as for the K/T boundary, probably owing to an additional 300 Ma of tectonic and diagenetic overprinting. Another tool is microfacies analysis, which is surprisingly rarely used even though it can explain geochemical anomalies or paleontological overlap not detectable by conventional macrofacies analysis. The combination of microfacies analysis and geochemistry was applied at two F-F sections in western Canada and showed how interdependent the two methods are. Additional F-F sections from western Canada, the western United States, France, Germany and Australia were sampled or re-sampled and await geochemical/microfacies evaluation.

  8. Adverse events with bismuth salts for Helicobacter pylori eradication: Systematic review and meta-analysis

    PubMed Central

    Ford, Alexander C; Malfertheiner, Peter; Giguère, Monique; Santana, José; Khan, Mostafizur; Moayyedi, Paul

    2008-01-01

    AIM: To assess the safety of bismuth used in Helicobacter pylori (H pylori) eradication therapy regimens. METHODS: We conducted a systematic review and meta-analysis. MEDLINE and EMBASE were searched (up to October 2007) to identify randomised controlled trials comparing bismuth with placebo or no treatment, or bismuth salts in combination with antibiotics as part of eradication therapy with the same dose and duration of antibiotics alone or, in combination, with acid suppression. Total numbers of adverse events were recorded. Data were pooled and expressed as relative risks with 95% confidence intervals (CI). RESULTS: We identified 35 randomised controlled trials containing 4763 patients. There were no serious adverse events occurring with bismuth therapy. There was no statistically significant difference detected in total adverse events with bismuth [relative risk (RR) = 1.01; 95% CI: 0.87-1.16], specific individual adverse events, with the exception of dark stools (RR = 5.06; 95% CI: 1.59-16.12), or adverse events leading to withdrawal of therapy (RR = 0.86; 95% CI: 0.54-1.37). CONCLUSION: Bismuth for the treatment of H pylori is safe and well-tolerated. The only adverse event occurring significantly more commonly was dark stools. PMID:19109870
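
    The building block of such a meta-analysis is the per-study relative risk with a 95% CI computed on the log scale; a fixed-effect pooled estimate then weights the log-RRs by inverse variance. The sketch below computes a single-study RR from invented 2x2 counts.

        import math

        def relative_risk(a: int, b: int, c: int, d: int):
            """a/b: events/non-events on bismuth; c/d: events/non-events on control."""
            rr = (a / (a + b)) / (c / (c + d))
            se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
            lo = math.exp(math.log(rr) - 1.96 * se_log)
            hi = math.exp(math.log(rr) + 1.96 * se_log)
            return rr, lo, hi

        print(relative_risk(a=30, b=170, c=25, d=175))  # RR ~ 1.2 with a wide CI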

  9. Bayesian analysis of caustic-crossing microlensing events

    NASA Astrophysics Data System (ADS)

    Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.

    2010-06-01

    Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of revealing an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework and investigate interesting options. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases and show how this can be implemented in a fully Bayesian Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.

  10. Element analysis: a wavelet-based method for analysing time-localized events in noisy time series

    PubMed Central

    2017-01-01

    A method is derived for the quantitative analysis of signals that are composed of superpositions of isolated, time-localized ‘events’. Here, these events are taken to be well represented as rescaled and phase-rotated versions of generalized Morse wavelets, a broad family of continuous analytic functions. Analysing a signal composed of replicates of such a function using another Morse wavelet allows one to directly estimate the properties of events from the values of the wavelet transform at its own maxima. The distribution of events in general power-law noise is determined in order to establish significance based on an expected false detection rate. Finally, an expression for an event’s ‘region of influence’ within the wavelet transform permits the formation of a criterion for rejecting spurious maxima due to numerical artefacts or other unsuitable events. Signals can then be reconstructed based on a small number of isolated points on the time/scale plane. This method, termed element analysis, is applied to the identification of long-lived eddy structures in ocean currents as observed by along-track measurements of sea surface elevation from satellite altimetry. PMID:28484325

  11. Low Probability Tail Event Analysis and Mitigation in BPA Control Area: Task One Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Shuai; Makarov, Yuri V.

    This is a report for task one of the tail event analysis project for BPA. Tail event refers to the situation in a power system when unfavorable forecast errors of load and wind are superposed onto fast load and wind ramps, or non-wind generators fall short of scheduled output, and the imbalance between generation and load becomes very significant. This type of event occurs infrequently and appears on the tails of the distribution of system power imbalance; it is therefore referred to as a tail event. This report analyzes what happened during the Electric Reliability Council of Texas (ERCOT) reliability event on February 26, 2008, which was widely reported because of the involvement of wind generation. The objective is to identify the sources of the problem, solutions to it, and potential improvements that can be made to the system. Lessons learned from the analysis include the following: (1) Large mismatch between generation and load can be caused by load forecast error, wind forecast error, and generation scheduling control error on traditional generators, or a combination of all of the above; (2) The capability of system balancing resources should be evaluated both in capacity (MW) and in ramp rate (MW/min), and be procured accordingly to meet both requirements. The resources need to be able to cover a range corresponding to the variability of load and wind in the system, in addition to other uncertainties; (3) Unexpected ramps caused by load and wind can both become the cause leading to serious issues; (4) A look-ahead tool evaluating the system balancing requirement during real-time operations and comparing it with available system resources should be very helpful to system operators in predicting the approach of similar events and planning ahead; and (5) Demand response (only load reduction in the ERCOT event) can effectively reduce load-generation mismatch and terminate frequency deviation in an emergency situation.

  12. A PDA-based system for online recording and analysis of concurrent events in complex behavioral processes.

    PubMed

    Held, Jürgen; Manser, Tanja

    2005-02-01

    This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.

  13. Key forecasts shaping nursing's perfect storm.

    PubMed

    Yoder-Wise, Patricia S

    2007-01-01

    Perfect storms abound in nursing and healthcare. How we plan for them, and how effectively we forecast which ones will have a tremendous impact on how we lead the profession, is a challenge to anyone who is or will be a leader. This article focuses on key forecasts that contribute to creating the perfect storms of the future. The "perfect storm" is a term found in multiple disciplines. The phrase denotes the condition that exists when events occur simultaneously, with the result that this confluence has a greater impact than what could have resulted from a chance combination. Although perfect storms are rare, they have enormous impact when they occur, and if an alteration in any of the events occurs, the overall impact is lessened.

  14. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    ERIC Educational Resources Information Center

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  15. Identification of Key Odorants in Withering-Flavored Green Tea by Aroma Extract Dilution Analysis

    NASA Astrophysics Data System (ADS)

    Mizukami, Yuzo; Yamaguchi, Yuichi

    This research aims to identify key odorants in withering-flavored green tea. Application of the aroma extract dilution analysis using the volatile fraction of green tea and withering-flavored green tea revealed 25 and 35 odor-active peaks with flavor dilution factors of ≥4, respectively. 4-mercapto-4-methylpentan-2-one, (E)-2-nonenal, linalool, (E,Z)-2,6-nonadienal and 3-methylnonane-2,4-dione were key odorants in green tea with flavor dilution factors of ≥16. As well as these 5 odorants, 1-octen-3-one, β-damascenone, geraniol, β-ionone, (Z)-methyljasmonate, indole and coumarine contributed to the withering flavor of green tea.

  16. Discrepant Events: A Challenge to Students' Intuition

    NASA Astrophysics Data System (ADS)

    González-Espada, Wilson J.; Birriel, Jennifer; Birriel, Ignacio

    2010-11-01

    Studies on cognitive aspects of science education, especially how students achieve conceptual change, have been a focus of interest for many years. Researchers of student learning and conceptual change have developed several easily applicable teaching strategies. One of these strategies is known as discrepant events. Discrepant events are very powerful ways to stimulate interest, motivate students to challenge their covert science misconceptions, and promote higher-order thinking skills. The key point is that directly challenging students' naive ideas will lead to more quality science learning going on in the classroom. In this paper, we summarize the research-based role of discrepant events in conceptual change and we share several highly successful discrepant events we use in our own classes.

  17. Tight finite-key analysis for quantum cryptography

    PubMed Central

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-01

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies. PMID:22252558

  19. Analysis of key technologies for virtual instruments metrology

    NASA Astrophysics Data System (ADS)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for the metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is the evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics, with the support of the powerful computing capability of the PC. Another concern is the evaluation of software features such as correctness, reliability, stability, security and real-time performance of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods such as vulnerability scanning and penetration analysis. In order to enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for such an automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed framework.
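
    As an illustration of the simulation-based evaluation of measurement uncertainty mentioned above, here is a minimal Monte Carlo sketch in the spirit of GUM Supplement 1, with a deliberately trivial measurement model and assumed input distributions.

    ```python
    import numpy as np

    # Monte Carlo propagation of assumed input distributions through a
    # VI's measurement algorithm; the standard deviation of the output
    # sample is the standard uncertainty of the result.
    rng = np.random.default_rng(42)
    N = 100_000

    # Hypothetical inputs: a voltage reading with Gaussian noise and a
    # gain with a uniform tolerance band.
    v = rng.normal(loc=1.000, scale=0.002, size=N)   # volts
    g = rng.uniform(low=0.998, high=1.002, size=N)   # dimensionless gain

    y = g * v  # the (trivial) measurement model implemented by the VI

    print(f"estimate     = {y.mean():.6f} V")
    print(f"std. uncert. = {y.std(ddof=1):.6f} V")
    print(f"95% interval = {np.percentile(y, [2.5, 97.5])}")
    ```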

  20. Composite Analysis of Cold Season Atmospheric River Events: Extreme Precipitation and Flooding over the Western United States

    NASA Astrophysics Data System (ADS)

    Eldardiry, H.; Hossain, F.

    2017-12-01

    Atmospheric Rivers (ARs) are narrow elongated corridors of horizontal water vapor transport located within the warm sector of extratropical cyclones. While it is widely known that most heavy rainfall events across the western United States (US) are driven by ARs, the connection between atmospheric conditions and precipitation during an AR event has not been fully documented. In this study, we present a statistical analysis of the connection between precipitation, temperature, wind, and snowpack during cold season AR events hitting the coastal regions of the western US. For each AR event, the precipitation and other atmospheric variables are retrieved through dynamic downscaling of the NCEP/NCAR Reanalysis product using the Advanced Research Weather Research and Forecasting Model (ARW-WRF). The results show a low frequency of precipitation (below 0.3) during AR events, reflecting the connection of ARs with extreme precipitation. Examining the horizontal wind speed during AR events indicates a high correlation (above 0.7) with precipitation. In addition, high levels of snow water equivalent (SWE) are also noticed along mountainous regions, e.g., the Cascade Range and the Sierra Nevada, during most AR events. Addressing the impact of duration on the frequency of precipitation, we develop Intensity-Duration-Frequency (IDF) curves for AR events that can potentially describe the future predictability of precipitation along the north and south coast. To complement our analysis, we further investigate the flooding events recorded in the National Centers for Environmental Information (NCEI) storm events database. While some flooding events are attributed to heavy rainfall associated with an AR event, other flooding events are significantly connected to the increase in snowmelt before the flooding date. Thus, we introduce an index that describes the contribution of rainfall vs snowmelt and categorizes the flooding events during an AR event
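
    A sketch of how IDF points can be derived from a long precipitation series is given below. Synthetic data stand in for the downscaled WRF output, and the Gumbel fit by method of moments is our assumption, not necessarily the authors' choice.

    ```python
    import numpy as np

    # Build IDF points: rolling sums per duration, annual maxima,
    # Gumbel fit (method of moments), then invert for return periods.
    rng = np.random.default_rng(0)
    years, hours = 30, 365 * 24
    series = rng.gamma(shape=0.1, scale=2.0, size=(years, hours))  # mm/h

    durations = [1, 6, 24]        # hours
    return_periods = [2, 10, 50]  # years

    for d in durations:
        kernel = np.ones(d)
        # annual maxima of mean intensity over d-hour windows (mm/h)
        annmax = np.array([np.convolve(y, kernel, "valid").max() / d
                           for y in series])
        beta = annmax.std(ddof=1) * np.sqrt(6) / np.pi  # Gumbel scale
        mu = annmax.mean() - 0.5772 * beta              # Gumbel location
        for T in return_periods:
            intensity = mu - beta * np.log(-np.log(1 - 1 / T))
            print(f"duration {d:2d} h, T = {T:2d} y: {intensity:.2f} mm/h")
    ```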

  1. Prehospital Interventions During Mass-Casualty Events in Afghanistan: A Case Analysis.

    PubMed

    Schauer, Steven G; April, Michael D; Simon, Erica; Maddry, Joseph K; Carter, Robert; Delorenzo, Robert A

    2017-08-01

    Mass-casualty (MASCAL) events are known to occur in the combat setting. There are very limited data at this time from the Joint Theater (Iraq and Afghanistan) wars specific to MASCAL events. The purpose of this report was to provide preliminary data for the development of prehospital planning and guidelines. Cases were identified using the Department of Defense (DoD; Virginia USA) Trauma Registry (DoDTR) and the Prehospital Trauma Registry (PHTR). These cases were identified as part of a research study evaluating Tactical Combat Casualty Care (TCCC) guidelines. Cases that were designated as or associated with denoted MASCAL events were included. Fifty subjects were identified during the course of this project. Explosives were the most common cause of injuries. There was a wide range of vital signs. Tourniquet placement and pressure dressings were the most common interventions, followed by analgesia administration. Oral transmucosal fentanyl citrate (OTFC) was the most common parenteral analgesic drug administered. Most casualties were evacuated as "routine." Follow-up data were available for 36 of the subjects; 97% were discharged alive. The most common prehospital interventions were tourniquet and pressure dressing hemorrhage control, along with pain medication administration. Larger data sets are needed to guide development of MASCAL in-theater clinical practice guidelines. Schauer SG, April MD, Simon E, Maddry JK, Carter R III, Delorenzo RA. Prehospital interventions during mass-casualty events in Afghanistan: a case analysis. Prehosp Disaster Med. 2017;32(4):465-468.

  2. The added predictive value of biphasic events in ST analysis of the fetal electrocardiogram for intrapartum fetal monitoring.

    PubMed

    Becker, Jeroen H; Krikhaar, Anniek; Schuit, Ewoud; Mårtendal, Annika; Maršál, Karel; Kwee, Anneke; Visser, Gerard H A; Amer-Wåhlin, Isis

    2015-02-01

    To study the predictive value of biphasic ST-events for interventions for suspected fetal distress and adverse neonatal outcome when using ST-analysis of the fetal electrocardiogram (FECG) for intrapartum fetal monitoring. Prospective cohort study. Three academic hospitals in Sweden. Women in labor with a high-risk singleton fetus in cephalic position beyond 36 weeks of gestation. In women in labor who were monitored with conventional cardiotocography, ST-waveform analysis was recorded and concealed. Traces with biphasic ST-events of the FECG (index) were compared with traces without biphasic events of the FECG. The ability of biphasic events to predict interventions for suspected fetal distress and adverse outcome was assessed using univariable and multivariable logistic regression analyses. Interventions for suspected fetal distress and adverse outcome (defined as presence of metabolic acidosis (i.e. umbilical cord pH <7.05 and base deficit in extracellular fluid >12 mmol/L), umbilical cord pH <7.00, 5-min Apgar score <7, admission to a neonatal intensive care unit, or perinatal death). Although the presence of biphasic events of the FECG was associated with more interventions for fetal distress and an increased risk of adverse outcome compared with cases with no biphasic events, the presence of significant (i.e. intervention advised according to cardiotocography interpretation) biphasic events showed no independent association with interventions for fetal distress [odds ratio (OR) 1.71, 95% confidence interval (CI) 0.65-4.50] or adverse outcome (OR 1.96, 95% CI 0.74-5.24). The presence of significant biphasic events did not discriminate in the prediction of interventions for fetal distress or adverse outcome. Therefore, biphasic events in relation to ST-analysis monitoring during birth should be omitted if future studies confirm our findings. © 2014 Nordic Federation of Societies of Obstetrics and Gynecology.

  3. How do blockings relate to heavy precipitation events in Europe?

    NASA Astrophysics Data System (ADS)

    Lenggenhager, Sina; Romppainen, Olivia; Brönnimann, Stefan; Croci-Maspoli, Mischa

    2017-04-01

    Atmospheric blockings are quasi-stationary high pressure systems that persist for several days. Due to their longevity, blockings can be key features for extreme weather events. While several studies have shown their relevant role for temperature extremes, the link between blockings and extreme precipitation and floods is still poorly understood. A case study of a Swiss lake flood event in the year 2000 reveals how different processes connected to blockings can favour the development of a flood. First, upstream blocks helped to form strongly elongated troughs that are known to be associated with heavy precipitation events south of the Alps. Second, recurrent precipitation events upstream of a block led to a moistening of the catchment and an increase of the lake level. Third, the progression of the upstream weather systems was slowed, thereby prolonging the precipitation period over the catchment. Additionally, cloud diabatic processes in the flood region contributed to the establishment and maintenance of blocking anticyclones. Based on this case study we extend our analysis to all of Europe. Focusing on flood-relevant precipitation events, i.e. extreme precipitation events that last for several days and affect larger areas, we show that different regions in Europe have very distinct seasonal precipitation patterns. Hence there is a strong seasonality in the occurrence of extreme events, depending on the geographical region. We further suggest that for different precipitation regimes, the preferred location of blockings varies strongly. Heavy precipitation events in southern France, for example, are often observed during Scandinavian blockings, while heavy precipitation events in south-eastern Europe coincide more often with eastern North-Atlantic blockings.

  4. The time course of symbolic number adaptation: oscillatory EEG activity and event-related potential analysis.

    PubMed

    Hsu, Yi-Fang; Szűcs, Dénes

    2012-02-15

    Several functional magnetic resonance imaging (fMRI) studies have used neural adaptation paradigms to detect anatomical locations of brain activity related to number processing. However, currently not much is known about the temporal structure of number adaptation. In the present study, we used electroencephalography (EEG) to elucidate the time course of neural events in symbolic number adaptation. The numerical distance of deviants relative to standards was manipulated. In order to avoid perceptual confounds, all levels of deviants consisted of perceptually identical stimuli. Multiple successive numerical distance effects were detected in event-related potentials (ERPs). Analysis of oscillatory activity further showed at least two distinct stages of neural processes involved in the automatic analysis of numerical magnitude, with the earlier effect emerging at around 200 ms and the later effect appearing at around 400 ms. The findings support the hypothesis that numerical magnitude processing involves a succession of cognitive events. Crown Copyright © 2011. Published by Elsevier Inc. All rights reserved.

  5. A cross-sectional analysis of pharmaceutical industry-funded events for health professionals in Australia

    PubMed Central

    Fabbri, Alice; Grundy, Quinn; Mintzes, Barbara; Swandari, Swestika; Moynihan, Ray; Walkom, Emily; Bero, Lisa A

    2017-01-01

    Objectives To analyse patterns and characteristics of pharmaceutical industry sponsorship of events for Australian health professionals and to understand the implications of recent changes in transparency provisions that no longer require reporting of payments for food and beverages. Design Cross-sectional analysis. Participants and setting 301 publicly available company transparency reports downloaded from the website of Medicines Australia, the pharmaceutical industry trade association, covering the period from October 2011 to September 2015. Results Forty-two companies sponsored 116 845 events for health professionals, on average 608 per week with 30 attendees per event. Events typically included a broad range of health professionals: 82.0% included medical doctors, including specialists and primary care doctors, and 38.3% trainees. Oncology, surgery and endocrinology were the most frequent clinical areas of focus. Most events (64.2%) were held in a clinical setting. The median cost per event was $A263 (IQR $A153–1195) and over 90% included food and beverages. Conclusions Over this 4-year period, industry-sponsored events were widespread and pharmaceutical companies maintained a high frequency of contact with health professionals. Most events were held in clinical settings, suggesting a pervasive commercial presence in everyday clinical practice. Food and beverages, known to be associated with changes to prescribing practice, were almost always provided. New Australian transparency provisions explicitly exclude meals from the reporting requirements; thus, a large proportion of potentially influential payments from pharmaceutical companies to health professionals will disappear from public view. PMID:28667226

  6. Time compression of soil erosion by the effect of largest daily event. A regional analysis of USLE database.

    NASA Astrophysics Data System (ADS)

    Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.

    2009-04-01

    When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedent erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focussed on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes. It means that an important fraction of the total soil eroded is produced in very short temporal intervals, i.e. a few events mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, produce a large amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive permanent attention, heightened at present by scenarios of global change. Nonetheless, the time compression of geomorphological processes can be approached not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a new complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by rank, whatever their magnitude. In a previous study on the time compression of soil erosion using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total number of daily erosive events recorded per plot and the percentage contribution to total soil erosion of the n-largest aggregated daily events. Now we offer a further refined analysis comparing different agricultural regions in the USA. To do that we have analyzed data from 594 erosion plots from USLE
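
    A minimal sketch of the n-largest-events calculation described here, using synthetic event magnitudes in place of USLE plot records:

    ```python
    import numpy as np

    # Rank the daily erosive events recorded on a plot and compute the
    # cumulative share of total soil loss contributed by the n largest,
    # whatever their absolute magnitude (synthetic data for illustration).
    rng = np.random.default_rng(1)
    events = rng.lognormal(mean=0.0, sigma=1.5, size=500)  # t/ha per event

    ranked = np.sort(events)[::-1]                  # largest first
    share = np.cumsum(ranked) / ranked.sum() * 100  # cumulative % of total

    for n in (1, 5, 10, 25):
        print(f"{n:3d} largest events -> {share[n - 1]:.1f}% of total erosion")
    ```

    With heavy-tailed event magnitudes, a handful of top-ranked events typically account for a large share of the total, which is the time-compression effect the abstract describes.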

  7. Statistical Analysis of Solar Events Associated with SSC over Year of Solar Maximum during Cycle 23: 1. Identification of Related Sun-Earth Events

    NASA Astrophysics Data System (ADS)

    Grison, B.; Bocchialini, K.; Menvielle, M.; Chambodut, A.; Cornilleau-Wehrlin, N.; Fontaine, D.; Marchaudon, A.; Pick, M.; Pitout, F.; Schmieder, B.; Regnier, S.; Zouganelis, Y.

    2017-12-01

    Taking the 32 sudden storm commencements (SSC) listed by the Observatori de l'Ebre / ISGI over the year 2002 (maximal solar activity) as a starting point, we performed a statistical analysis of the related solar sources, solar wind signatures, and terrestrial responses. For each event, we characterized and identified, as far as possible, (i) the sources on the Sun (coronal mass ejections, CME), with the help of a series of hereafter detailed criteria (velocities, drag coefficient, radio waves, polarity), as well as (ii) the structure and properties in the interplanetary medium, at L1, of the event associated with the SSC: magnetic clouds (MC), non-MC interplanetary coronal mass ejections (ICME), co-rotating/stream interaction regions (SIR/CIR), shocks only, and unclear events that we call "miscellaneous" events. The categorization of the events at L1 is based on published catalogues. For each potential CME/L1 event association we compare the velocity observed at L1 with the one observed at the Sun and the estimated ballistic velocity. Observations of radio emissions (Type II, Type IV detected from the ground and/or by WIND) associated with the CMEs make the solar source more probable. We also compare the polarity of the magnetic clouds with the hemisphere of the solar source. The drag coefficient (estimated with the drag-based model) is calculated for each potential association and compared to the expected range of values. We identified a solar source for 26 SSC-related events; 12 of these 26 associations match all criteria. We finally discuss the difficulty of performing such associations.

  8. Microarray analysis identifies candidate genes for key roles in coral development

    PubMed Central

    Grasso, Lauretta C; Maindonald, John; Rudd, Stephen; Hayward, David C; Saint, Robert; Miller, David J; Ball, Eldon E

    2008-01-01

    Background Anthozoan cnidarians are amongst the simplest animals at the tissue level of organization, but are surprisingly complex and vertebrate-like in terms of gene repertoire. As major components of tropical reef ecosystems, the stony corals are anthozoans of particular ecological significance. To better understand the molecular bases of both cnidarian development in general and coral-specific processes such as skeletogenesis and symbiont acquisition, microarray analysis was carried out through the period of early development – when skeletogenesis is initiated, and symbionts are first acquired. Results Of 5081 unique peptide coding genes, 1084 were differentially expressed (P ≤ 0.05) in comparisons between four different stages of coral development, spanning key developmental transitions. Genes of likely relevance to the processes of settlement, metamorphosis, calcification and interaction with symbionts were characterised further and their spatial expression patterns investigated using whole-mount in situ hybridization. Conclusion This study is the first large-scale investigation of developmental gene expression for any cnidarian, and has provided candidate genes for key roles in many aspects of coral biology, including calcification, metamorphosis and symbiont uptake. One surprising finding is that some of these genes have clear counterparts in higher animals but are not present in the closely-related sea anemone Nematostella. Secondly, coral-specific processes (i.e. traits which distinguish corals from their close relatives) may be analogous to similar processes in distantly related organisms. This first large-scale application of microarray analysis demonstrates the potential of this approach for investigating many aspects of coral biology, including the effects of stress and disease. PMID:19014561

  9. Reverse translation of adverse event reports paves the way for de-risking preclinical off-targets

    PubMed Central

    Maciejewski, Mateusz; Lounkine, Eugen; Whitebread, Steven; Farmer, Pierre; DuMouchel, William; Shoichet, Brian K; Urban, Laszlo

    2017-01-01

    The Food and Drug Administration Adverse Event Reporting System (FAERS) remains the primary source for post-marketing pharmacovigilance. The system is largely un-curated, unstandardized, and lacks a method for linking drugs to the chemical structures of their active ingredients, increasing noise and artefactual trends. To address these problems, we mapped drugs to their ingredients and used natural language processing to classify and correlate drug events. Our analysis exposed key idiosyncrasies in FAERS, for example reports of thalidomide causing a deadly ADR when used against myeloma, a likely result of the disease itself; multiplications of the same report, unjustifiably increasing its importance; correlation of reported ADRs with public events, regulatory announcements, and with publications. Comparing the pharmacological, pharmacokinetic, and clinical ADR profiles of methylphenidate, aripiprazole, and risperidone, and of kinase drugs targeting the VEGF receptor, demonstrates how underlying molecular mechanisms can emerge from ADR co-analysis. The precautions and methods we describe may enable investigators to avoid confounding chemistry-based associations and reporting biases in FAERS, and illustrate how comparative analysis of ADRs can reveal underlying mechanisms. DOI: http://dx.doi.org/10.7554/eLife.25818.001 PMID:28786378
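
    A minimal sketch of the kind of disproportionality statistic used in such pharmacovigilance analyses, the proportional reporting ratio (PRR), computed from a 2x2 table of invented counts:

    ```python
    # PRR from a FAERS-like 2x2 contingency table (hypothetical counts).
    #
    #                   reports with ADR   reports without ADR
    #  drug of interest         a                   b
    #  all other drugs          c                   d

    a, b = 120, 9880     # drug of interest
    c, d = 800, 199200   # all other drugs

    prr = (a / (a + b)) / (c / (c + d))
    print(f"PRR = {prr:.2f}")  # PRR >> 1 flags a disproportionate signal
    ```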

  10. Asthma exacerbations in children immediately following stressful life events: a Cox's hierarchical regression.

    PubMed

    Sandberg, S; Järvenpää, S; Penttinen, A; Paton, J Y; McCann, D C

    2004-12-01

    A recent prospective study of children with asthma employing a within-subject, over-time analysis using dynamic logistic regression showed that severely negative life events significantly increased the risk of an acute exacerbation during the subsequent 6 week period. The timing of the maximum risk depended on the degree of chronic psychosocial stress also present. A hierarchical Cox regression analysis was undertaken to examine whether there were any immediate effects of negative life events in children without a background of high chronic stress. Sixty children with verified chronic asthma were followed prospectively for 18 months with continuous monitoring of asthma by daily symptom diaries and peak flow measurements, accompanied by repeated interview assessments of life events. The key outcome measures were asthma exacerbations and severely negative life events. An immediate effect, evident within the first 2 days following a severely negative life event, increased the risk of a new asthma attack by a factor of 4.69, 95% confidence interval 2.33 to 9.44 (p<0.001) [corrected]. In the period 3-10 days after a severe event there was no increased risk of an asthma attack (p = 0.5). In addition to the immediate effect, an increased risk of 1.81 (95% confidence interval 1.24 to 2.65) [corrected] was found 5-7 weeks after a severe event (p = 0.002). This is consistent with earlier findings. There was a statistically significant variation due to unobserved factors in the incidence of asthma attacks between the children. The use of statistical methods capable of investigating short time lags showed that stressful life events significantly increase the risk of a new asthma attack immediately after the event; a more delayed increase in risk was also evident 5-7 weeks later.

  11. Whole Genome Sequence Analysis of Mutations Accumulated in rad27Δ Yeast Strains with Defects in the Processing of Okazaki Fragments Indicates Template-Switching Events

    PubMed Central

    Omer, Sumita; Lavi, Bar; Mieczkowski, Piotr A.; Covo, Shay; Hazkani-Covo, Einat

    2017-01-01

    Okazaki fragments that are formed during lagging strand DNA synthesis include an initiating primer consisting of both RNA and DNA. The RNA fragment must be removed before the fragments are joined. In Saccharomyces cerevisiae, a key player in this process is the structure-specific flap endonuclease, Rad27p (human homolog FEN1). To obtain a genomic view of the mutational consequences of loss of RAD27, a S. cerevisiae rad27Δ strain was subcultured for 25 generations and sequenced using Illumina paired-end sequencing. Out of the 455 changes observed in the 10 colonies isolated, the two most common types of events were insertions or deletions (INDELs) in simple sequence repeats (SSRs) and INDELs mediated by short direct repeats. Surprisingly, we also detected a previously neglected class of 21 template-switching events. These events were presumably generated by quasi-palindrome to palindrome correction, as well as palindrome elongation. The formation of these events is best explained by folding back of the stalled nascent strand and resumption of DNA synthesis using the same nascent strand as a template. Evidence of quasi-palindrome to palindrome correction that could be generated by template switching also appears in yeast genome evolution. Out of the 455 events, 55 appeared in multiple isolates; further analysis indicates that these loci are mutational hotspots. Since Rad27 acts on the lagging strand when the leading strand should not contain any gaps, we propose a mechanism favoring intramolecular strand switching over an intermolecular mechanism. We note that our results open new ways of understanding template switching that occurs during genome instability and evolution. PMID:28974572

  12. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    PubMed

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial, in which locally advanced breast cancer patients were randomised to either taxane- or anthracycline-based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane- versus anthracycline-based chemotherapies on progression-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
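
    As a schematic of the direct/indirect decomposition (not the authors' estimator), the toy calculation below assumes effects combine multiplicatively on the hazard-ratio scale, so the proportion mediated is a ratio of log hazard ratios; all numbers are hypothetical.

    ```python
    import math

    # Proportion mediated on the log hazard-ratio scale (simplified):
    # total effect (log scale) = direct + indirect, so the share carried
    # by the mediator is log(HR_indirect) / log(HR_total).
    hr_total = 0.80     # hypothetical total treatment effect on survival
    hr_indirect = 0.99  # hypothetical component via pathological response

    proportion_mediated = math.log(hr_indirect) / math.log(hr_total)
    print(f"proportion mediated = {100 * proportion_mediated:.1f}%")  # ~4.5%
    ```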

  13. Key Parameters Evaluation for Hip Prosthesis with Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Guo, Hongqiang; Li, Dichen; Lian, Qin; Li, Xiang; Jin, Zhongmin

    2007-09-01

    Stem length and cross section are two key parameters that influence the stability and longevity of metallic hip prostheses in total hip arthroplasty (THA). In order to assess their influence on the stress and fatigue behavior of hip prostheses, a series of hip prosthesis models with round-shaped or drum-shaped cross sections and different stem lengths were created. These models were analyzed under both static and dynamic loading conditions with finite element analysis; dynamic loading representing normal walking was used in the dynamic analysis. The stresses on the metallic stem, cement, and adjacent bone were obtained, as was the micromotion on the cement-metal interface. Safety factors for the fatigue life of the hip prosthesis were calculated based on data obtained from the dynamic analysis. Static analysis shows that a drum-shaped cross section can decrease the displacement of the stem; that stress on a drum-shaped stem concentrates at the corner of the femoral neck and the distal part of the prosthesis, whereas the stress on a round-shaped stem distributes evenly over most of the stem; that the maximum stem stress fluctuates with stem length, bottoming out in the range from 80 mm to 110 mm; and that drum-shaped stems with a drum height of 8 mm generate more stress at the distal part of the stem than drum-shaped stems with a drum height of 10 mm and round stems do. Dynamic and fatigue analysis shows that the drum-shaped stem with a drum height of 10 mm and a stem length of 90 mm has the greatest safety factor and therefore the longest fatigue life.

  14. Ontology-Based Combinatorial Comparative Analysis of Adverse Events Associated with Killed and Live Influenza Vaccines

    PubMed Central

    Sarntivijai, Sirarat; Xiang, Zuoshuang; Shedden, Kerby A.; Markel, Howard; Omenn, Gilbert S.; Athey, Brian D.; He, Yongqun

    2012-01-01

    Vaccine adverse events (VAEs) are adverse bodily changes occurring after vaccination. Understanding adverse event (AE) profiles is a crucial step in identifying serious AEs. Two different types of seasonal influenza vaccine have been used on the market: trivalent (killed) inactivated influenza vaccine (TIV) and trivalent live attenuated influenza vaccine (LAIV). The different adverse event profiles induced by these two groups of seasonal influenza vaccines were studied based on data drawn from the CDC Vaccine Adverse Event Reporting System (VAERS). Extracted from VAERS were 37,621 AE reports for four TIVs (Afluria, Fluarix, Fluvirin, and Fluzone) and 3,707 AE reports for the only LAIV (FluMist). The AE report data were analyzed by a novel combinatorial, ontology-based detection of AE method (CODAE). CODAE detects AEs using the Proportional Reporting Ratio (PRR), a Chi-square significance test, and base level filtration, and groups identified AEs by ontology-based hierarchical classification. In total, 48 TIV-enriched and 68 LAIV-enriched AEs were identified (PRR>2, Chi-square score >4, and number of cases >0.2% of total reports). These AE terms were classified using the Ontology of Adverse Events (OAE), MedDRA, and SNOMED-CT. The OAE method provided better classification results than the two other methods. Thirteen of the 48 TIV-enriched AEs were related to neurological and muscular processing, such as paralysis, movement disorders, and muscular weakness. In contrast, 15 of the 68 LAIV-enriched AEs were associated with inflammatory response and respiratory system disorders. There was evidence of two severe adverse events (Guillain-Barré syndrome and paralysis) in TIV. Although these severe adverse events occurred at a low incidence rate, they were found to be more significantly enriched in TIV-vaccinated patients than in LAIV-vaccinated patients. Therefore, our novel combinatorial bioinformatics analysis discovered that LAIV had a lower chance of inducing these two
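
    A minimal sketch of the screening criteria quoted above (PRR > 2, Chi-square score > 4, case count > 0.2% of total reports); the counts are hypothetical and the implementation is ours, not the CODAE code:

    ```python
    def chi_square(a, b, c, d):
        """Pearson chi-square for a 2x2 table (no continuity correction)."""
        n = a + b + c + d
        return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

    def codae_style_flag(a, b, c, d, total_reports):
        """Flag an AE term as enriched for one vaccine group.
        a,b = reports with/without the AE for that group;
        c,d = reports with/without the AE for the comparator group."""
        prr = (a / (a + b)) / (c / (c + d))
        frequent = a > 0.002 * total_reports
        return prr > 2 and chi_square(a, b, c, d) > 4 and frequent

    # hypothetical TIV-vs-LAIV counts for one AE term
    print(codae_style_flag(a=150, b=37471, c=5, d=3702, total_reports=37621))
    ```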

  15. Reduced-Order Modeling and Wavelet Analysis of Turbofan Engine Structural Response Due to Foreign Object Damage (FOD) Events

    NASA Technical Reports Server (NTRS)

    Turso, James; Lawrence, Charles; Litt, Jonathan

    2004-01-01

    The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.

  16. Reduced-Order Modeling and Wavelet Analysis of Turbofan Engine Structural Response Due to Foreign Object Damage "FOD" Events

    NASA Technical Reports Server (NTRS)

    Turso, James A.; Lawrence, Charles; Litt, Jonathan S.

    2007-01-01

    The development of a wavelet-based feature extraction technique specifically targeting FOD-event induced vibration signal changes in gas turbine engines is described. The technique performs wavelet analysis of accelerometer signals from specified locations on the engine and is shown to be robust in the presence of significant process and sensor noise. It is envisioned that the technique will be combined with Kalman filter thermal/ health parameter estimation for FOD-event detection via information fusion from these (and perhaps other) sources. Due to the lack of high-frequency FOD-event test data in the open literature, a reduced-order turbofan structural model (ROM) was synthesized from a finite-element model modal analysis to support the investigation. In addition to providing test data for algorithm development, the ROM is used to determine the optimal sensor location for FOD-event detection. In the presence of significant noise, precise location of the FOD event in time was obtained using the developed wavelet-based feature.
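
    An illustrative sketch (not the authors' feature extraction) of how a wavelet decomposition exposes a short FOD-like transient buried in noise; it assumes the PyWavelets package is available:

    ```python
    import numpy as np
    import pywt  # PyWavelets, assumed installed

    # Synthetic accelerometer-like signal: a smooth 30 Hz component plus
    # noise, with a sharp decaying impulse injected near sample 400.
    rng = np.random.default_rng(7)
    fs = 1000
    t = np.arange(0, 1.0, 1 / fs)
    signal = 0.5 * np.sin(2 * np.pi * 30 * t) + rng.normal(0, 0.3, t.size)
    signal[400:410] += 3.0 * np.exp(-np.arange(10) / 3.0)  # FOD-like impulse

    # Discrete wavelet transform; the finest detail band carries the
    # high-frequency energy of the transient.
    coeffs = pywt.wavedec(signal, "db4", level=3)
    d1 = coeffs[-1]  # finest-scale detail coefficients

    # Robust noise estimate from the median absolute deviation, then flag
    # coefficients well above the noise floor.
    sigma = np.median(np.abs(d1)) / 0.6745
    hits = np.nonzero(np.abs(d1) > 4 * sigma)[0]
    print("transient near sample(s):", hits * 2)  # ~2 samples per d1 coeff
    ```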

  17. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    USDA-ARS?s Scientific Manuscript database

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...
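
    One standard option for such data, treating artificially terminated events as right-censored observations rather than discarding them or counting them as complete, can be sketched with a minimal Kaplan-Meier estimate (hypothetical durations):

    ```python
    import numpy as np

    # Behavior durations in minutes; observed = 0 marks events ended by
    # the experimenter (artificially terminated, i.e. right-censored).
    durations = np.array([12.0, 30.0, 7.5, 45.0, 22.0, 60.0])
    observed = np.array([1, 1, 1, 0, 1, 0])

    order = np.argsort(durations)
    t, e = durations[order], observed[order]

    at_risk = len(t)
    survival = 1.0
    for ti, ei in zip(t, e):
        if ei:  # only insect-ended terminations step the curve down
            survival *= (at_risk - 1) / at_risk
            print(f"t = {ti:5.1f} min  S(t) = {survival:.3f}")
        at_risk -= 1  # censored records still leave the risk set
    ```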

  18. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    NASA Astrophysics Data System (ADS)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of a novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods, because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on its state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the triggering of an event, the greater its potential impact will be on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis with existing entropy-based sensitivity analysis methods. The results show a 10% improvement in computational efficiency with no compromise in accuracy. They also show that the computational time to perform the sensitivity analysis is 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs with the final scope of recovering the content

  19. Total Thrombus-formation Analysis System Predicts Periprocedural Bleeding Events in Patients With Coronary Artery Disease Undergoing Percutaneous Coronary Intervention.

    PubMed

    Oimatsu, Yu; Kaikita, Koichi; Ishii, Masanobu; Mitsuse, Tatsuro; Ito, Miwa; Arima, Yuichiro; Sueta, Daisuke; Takahashi, Aya; Iwashita, Satomi; Yamamoto, Eiichiro; Kojima, Sunao; Hokimoto, Seiji; Tsujita, Kenichi

    2017-04-24

    Periprocedural bleeding events are common after percutaneous coronary intervention. We evaluated the association of periprocedural bleeding events with thrombogenicity, which was measured quantitatively by the Total Thrombus-formation Analysis System equipped with microchips bearing thrombogenic surfaces (collagen, platelet chip [PL]; collagen plus tissue factor, atheroma chip [AR]). Between August 2013 and March 2016, 313 consecutive patients with coronary artery disease undergoing elective percutaneous coronary intervention were enrolled. They were divided into those with or without periprocedural bleeding events. We defined bleeding events as composites of major bleeding events as defined by the International Society on Thrombosis and Hemostasis and minor bleeding events (eg, minor hematoma, arteriovenous shunt, and pseudoaneurysm). Blood samples obtained at percutaneous coronary intervention were analyzed for thrombus formation area under the curve (PL24-AUC10 for the PL chip; AR10-AUC30 for the AR chip) by the Total Thrombus-formation Analysis System and for P2Y12 reaction units by the VerifyNow system. Periprocedural bleeding events occurred in 37 patients. PL24-AUC10 levels were significantly lower in patients with such events than in those without (P=0.002). Multiple logistic regression analyses showed an association between low PL24-AUC10 levels and periprocedural bleeding events (odds ratio, 2.71 [1.22-5.99]; P=0.01), and an association between PL24-AUC10 and periprocedural bleeding events in 176 patients of the femoral approach group (odds ratio, 2.88 [1.11-7.49]; P=0.03). However, PL24-AUC10 levels in 127 patients of the radial approach group were not significantly different between patients with and without periprocedural bleeding events. PL24-AUC10 measured by the Total Thrombus-formation Analysis System is a potentially useful predictor of periprocedural bleeding events in coronary artery disease patients undergoing elective percutaneous coronary

  20. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    NASA Astrophysics Data System (ADS)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when a particle strike interacts with critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effect influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit successive approximation analog-to-digital converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  1. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    NASA Astrophysics Data System (ADS)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

    Indonesia is very vulnerable to flood disaster because it experiences high rainfall throughout the year. Flood is categorized as the most important hazard because it causes social, economic and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) that occurred before flood disasters in areas of monsoonal, equatorial and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall dataset at 0.05° spatial resolution and daily temporal resolution, the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  2. Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Kumar, Ranjan; Ghosh, Achyuta Krishna

    2017-04-01

    Mine systems such as ventilation systems, strata support systems, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, temperature, etc., and safety improvement of such systems is preferably done during the planning and design stage. However, existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system either at the component or the system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
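
    A minimal sketch of how fault-tree gate probabilities can feed an event-tree accident sequence, under an independence assumption and with invented probabilities (not the paper's case-study numbers):

    ```python
    # Fault-tree gates for independent basic events.
    def or_gate(*p):
        """Fails if any input fails."""
        q = 1.0
        for pi in p:
            q *= (1.0 - pi)
        return 1.0 - q

    def and_gate(*p):
        """Fails only if all inputs fail."""
        prod = 1.0
        for pi in p:
            prod *= pi
        return prod

    # Fault trees for two hypothetical safety barriers.
    p_ventilation_fails = or_gate(0.01, 0.005)     # fan failure OR duct blockage
    p_ignition_control_fails = and_gate(0.1, 0.2)  # both suppression layers fail

    # Event tree: initiating event -> barrier 1 -> barrier 2.
    p_initiator = 0.02  # methane accumulation event (hypothetical)
    p_explosion = p_initiator * p_ventilation_fails * p_ignition_control_fails
    print(f"top hazard probability = {p_explosion:.2e}")
    ```

    Redundancy allocation then amounts to adding parallel components inside whichever gate dominates this top hazard probability.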

  3. Final Report for Dynamic Models for Causal Analysis of Panel Data. Approaches to the Censoring Problem in Analysis of Event Histories. Part III, Chapter 2.

    ERIC Educational Resources Information Center

    Tuma, Nancy Brandon; Hannan, Michael T.

    The document, part of a series of chapters described in SO 011 759, considers the problem of censoring in the analysis of event-histories (data on dated events, including dates of change from one qualitative state to another). Censoring refers to the lack of information on events that occur before or after the period for which data are available.…

  4. A Systematic Review and Meta-analysis of Thrombotic Events Following Endovenous Thermal Ablation of the Great Saphenous Vein.

    PubMed

    Healy, Donagh A; Kimura, Shiori; Power, David; Elhaj, Abubaker; Abdeldaim, Yasser; Cross, Keith S; McGreal, Gerard T; Burke, Paul E; Moloney, Tony; Manning, Brian J; Kavanagh, Eamon G

    2018-06-09

    A systematic review and meta-analysis was performed to determine the incidence of thrombotic events following great saphenous vein (GSV) endovenous thermal ablation (EVTA). MEDLINE, Embase and conference abstracts were searched. Eligible studies were randomised controlled trials and case series that included at least 100 patients who underwent GSV EVTA (endovenous laser ablation [EVLA] or radiofrequency ablation [RFA]) with duplex ultrasound (DUS) within 30 days. The systematic review focused on the complications of endovenous heat induced thrombosis (EHIT), deep venous thrombosis (DVT), and pulmonary embolism (PE). The primary outcome for the meta-analysis was deep venous thrombotic events, defined as DVT or EHIT Type 2, 3, or 4. Secondary outcomes for the meta-analysis were EHIT Type 2, 3, or 4, DVT and PE. Subgroup analyses were performed for both the RFA and EVLA groups. Pooled proportions were calculated using random effects modelling. Fifty-two studies (16,398 patients) were included. Thrombotic complications occurred infrequently. Deep venous thrombotic events occurred in 1.7% of cases (95% CI 0.9-2.7%) (25 studies; 10,012 patients; 274 events). EHIT Type 2, 3, or 4 occurred in 1.4% of cases (95% CI 0.8-2.3%) (26 studies; 10,225 patients; 249 events). DVT occurred in 0.3% of cases (95% CI 0.2-0.5%) (49 studies; 15,676 patients; 48 events). PE occurred in 0.1% of cases (95% CI 0.1-0.2%) (29 studies; 8223 patients; 3 events). Similar results were found when the RFA and EVLA groups were analysed separately. Thrombotic events occur infrequently following GSV EVTA. Given the large number of procedures worldwide and the potential for serious consequences, further research is needed on the burden of these complications and their management. Copyright © 2018 European Society for Vascular Surgery. Published by Elsevier B.V. All rights reserved.
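
    A sketch of random-effects pooling of proportions (DerSimonian-Laird on the logit scale, a common choice for rare-event proportions) with hypothetical study data; the paper's actual pooling choices may differ:

    ```python
    import numpy as np

    # Hypothetical studies: (events, patients).
    studies = [(3, 250), (1, 180), (6, 400), (2, 150), (4, 320)]

    # Logit-transformed proportions and within-study variances.
    y = np.array([np.log(e / (n - e)) for e, n in studies])
    v = np.array([1 / e + 1 / (n - e) for e, n in studies])

    # DerSimonian-Laird between-study variance tau^2.
    w = 1 / v
    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)

    # Random-effects pooled logit, back-transformed to a proportion.
    w_star = 1 / (v + tau2)
    pooled_logit = np.sum(w_star * y) / np.sum(w_star)
    pooled = 1 / (1 + np.exp(-pooled_logit))
    print(f"pooled proportion = {100 * pooled:.2f}%")
    ```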

  5. Incidence and risk factors of intraoperative adverse events during donor lobectomy for living-donor liver transplantation: a retrospective analysis.

    PubMed

    Araz, Coskun; Pirat, Arash; Unlukaplan, Aytekin; Torgay, Adnan; Karakayali, Hamdi; Arslan, Gulnaz; Moray, Gokhan; Haberal, Mehmet

    2012-04-01

    To evaluate the frequency, type, and predictors of intraoperative adverse events during donor hepatectomy for living-donor liver transplant. Retrospective analyses of the data from 182 consecutive living-donor liver transplant donors between May 2002 and September 2008. Ninety-one patients (50%) had at least 1 intraoperative adverse event including hypothermia (39%), hypotension (26%), need for transfusions (17%), and hypertension (7%). Patients with an adverse event were older (P = .001), had a larger graft weight (P = .023), more frequently underwent a right hepatectomy (P = .019), and were more frequently classified as American Society of Anesthesiologists physical status class II (P = .027) than those who did not have these adverse events. Logistic regression analysis revealed that only age (95% confidence interval 1.018-1.099; P = .001) was a risk factor for intraoperative adverse events. Patients with these adverse events more frequently required admission to the intensive care unit and were hospitalized longer postoperatively. A before and after analysis showed that after introduction of in-line fluid warmers and more frequent use of acute normovolemic hemodilution, the frequency of intraoperative adverse events was significantly lower (80% vs 29%; P < .001). Intraoperative adverse events such as hypothermia and hypotension were common in living-donor liver transplant donors, and older age was associated with an increased risk of these adverse events. However, the effect of these adverse events on postoperative recovery is not clear.

  6. Event selection services in ATLAS

    NASA Astrophysics Data System (ADS)

    Cranshaw, J.; Cuhadar-Donszelmann, T.; Gallas, E.; Hrivnac, J.; Kenyon, M.; McGlone, H.; Malon, D.; Mambelli, M.; Nowak, M.; Viegas, F.; Vinek, E.; Zhang, Q.

    2010-04-01

    ATLAS has developed and deployed event-level selection services based upon event metadata records ("TAGS") and supporting file and database technology. These services allow physicists to extract events that satisfy their selection predicates from any stage of data processing and use them as input to later analyses. One component of these services is a web-based Event-Level Selection Service Interface (ELSSI). ELSSI supports event selection by integrating run-level metadata, luminosity-block-level metadata (e.g., detector status and quality information), and event-by-event information (e.g., triggers passed and physics content). The list of events that survive after some selection criterion is returned in a form that can be used directly as input to local or distributed analysis; indeed, it is possible to submit a skimming job directly from the ELSSI interface using grid proxy credential delegation. ELSSI allows physicists to explore ATLAS event metadata as a means to understand, qualitatively and quantitatively, the distributional characteristics of ATLAS data. In fact, the ELSSI service provides an easy interface to see the highest missing ET events or the events with the most leptons, to count how many events passed a given set of triggers, or to find events that failed a given trigger but nonetheless look relevant to an analysis based upon the results of offline reconstruction, and more. This work provides an overview of ATLAS event-level selection services, with an emphasis upon the interactive Event-Level Selection Service Interface.
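
    The flavor of such event-level selection can be conveyed with a toy table of TAG-like records; the field names below are hypothetical stand-ins, not the actual ATLAS TAG schema, and pandas is assumed to be available:

    ```python
    import pandas as pd

    # Each row is one event's metadata; a selection predicate returns the
    # surviving (run, event) list that could seed a skimming job.
    tags = pd.DataFrame({
        "run":        [201200, 201200, 201354, 201354],
        "event":      [1001, 1002, 2001, 2002],
        "n_leptons":  [2, 0, 3, 1],
        "missing_et": [55.2, 12.1, 140.7, 30.5],  # GeV
        "trig_pass":  [True, False, True, True],
    })

    selected = tags.query("trig_pass and n_leptons >= 2 and missing_et > 50")
    print(selected[["run", "event"]].to_records(index=False))
    ```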

  7. Low frequency events on Montserrat

    NASA Astrophysics Data System (ADS)

    Visser, K.; Neuberg, J.

    2003-04-01

    Earthquake swarms observed on volcanoes generally consist of low frequency events. The low frequency content of these events indicates the presence of interface waves at the boundary of the magma-filled conduit and the surrounding country rock. The observed seismic signal at the surface therefore shows a complicated interference pattern of waves originating at various parts of the magma-filled conduit and interacting with the free surface and interfaces in the volcanic edifice. This research investigates the applicability of conventional seismic tools to these low frequency events, focusing on hypocenter location analysis using arrival times and particle motion analysis for the Soufrière Hills Volcano on Montserrat. Both single low frequency events and swarms are observed on this volcano. Synthetic low frequency events are used for comparison. Results show that reliable hypocenter locations and particle motions can only be obtained if the low frequency events are single events with an identifiable P wave onset, for example the single events preceding swarms on Montserrat or the first low frequency event of a swarm. Consecutive events of the same swarm are dominated by interface waves which are converted at the top of the conduit into weak secondary P waves and surface waves. Conventional seismic tools fail to correctly analyse these events.

  8. Superposed ruptile deformational events revealed by field and VOM structural analysis

    NASA Astrophysics Data System (ADS)

    Kumaira, Sissa; Guadagnin, Felipe; Keller Lautert, Maiara

    2017-04-01

    Virtual outcrop models (VOMs) are becoming an important tool in the analysis of geological structures due to the possibility of obtaining the geometry, and in some cases kinematic aspects, of the analyzed structures in a three-dimensional photorealistic space. These data are used to gain quantitative information on deformational features which, coupled with numerical models, can assist in understanding deformational processes. Old basement units commonly register superposed deformational events, either ductile or ruptile, along their evolution. The Porongos Belt, located in southern Brazil, has a complex deformational history, registering at least five ductile and ruptile deformational events. In this study, we present a structural analysis of a quarry in the Porongos Belt, coupling field and VOM structural information to understand the processes involved in the last two deformational events. Field information was acquired using traditional structural methods for the analysis of ruptile structures, such as descriptions, drawings, acquisition of orientation vectors and kinematic analysis. The VOM was created with the image-based modeling method through photogrammetric data acquisition and orthorectification. Photogrammetric data were acquired using a Sony a3500 camera; a total of 128 photographs were taken from ca. 10-20 m from the outcrop in different orientations. Thirty-two control point coordinates were acquired using a combination of RTK dGPS surveying and total station work, providing a precision of a few millimeters for x, y and z. Photographs were imported into the PhotoScan software to create a 3D dense point cloud with a structure-from-motion algorithm, which was triangulated and textured to generate the VOM. The VOM was georeferenced (oriented and scaled) using the ground control points, and later analyzed in the OpenPlot software to extract structural information. Data were imported into the Wintensor software to obtain tensor orientations, and the Move software to process and

  9. Using Key Part-of-Speech Analysis to Examine Spoken Discourse by Taiwanese EFL Learners

    ERIC Educational Resources Information Center

    Lin, Yen-Liang

    2015-01-01

    This study reports on a corpus analysis of samples of spoken discourse between a group of British and Taiwanese adolescents, with the aim of exploring the statistically significant differences in the use of grammatical categories between the two groups of participants. The key word method extended to a part-of-speech level using the web-based…
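
    The keyness statistic conventionally used in such corpus comparisons is Dunning's log-likelihood G2; a minimal sketch with hypothetical part-of-speech tag counts (the corpora and counts are invented for illustration):

    ```python
    import math

    # Frequency of one POS tag and total corpus sizes (hypothetical).
    a, corpus1 = 320, 10_000   # learner corpus
    b, corpus2 = 180, 12_000   # comparison corpus

    # Expected frequencies under the null of equal relative frequency.
    e1 = corpus1 * (a + b) / (corpus1 + corpus2)
    e2 = corpus2 * (a + b) / (corpus1 + corpus2)

    g2 = 2 * (a * math.log(a / e1) + b * math.log(b / e2))
    print(f"G2 = {g2:.2f}")  # > 3.84 ~ significant at p < 0.05 (1 df)
    ```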

  10. Reinvestigation and analysis a landslide dam event in 2012 using UAV

    NASA Astrophysics Data System (ADS)

    Wang, Kuo-Lung; Huang, Zji-Jie; Lin, Jun-Tin

    2015-04-01

    Taiwan's geological conditions are fractured, and the island is located in the Pacific Rim seismic zone. Typhoons usually strike during summer, and the steep mountains are highly weathered, which induces landslides in mountainous areas. This situation has occurred more frequently in recent years due to the effects of climate change. Most landslides are very far away from residential areas. Field investigation is time-consuming, expensive and dangerous, and collects only limited data. Investigation with satellite images has disadvantages such as poor resolution and a limited picture of actual site conditions. Thus, the possibility of slope investigation with UAVs is proposed and discussed in this research. UAVs have been adopted for hazard investigation and monitoring in recent years, as they offer advantages such as light weight, small volume, high mobility, safety, easy maintenance and low cost, and investigations can be executed in high-risk areas. Using mature aerial photogrammetry, aerial photos are combined with control points. Digital surface models (DSM) and orthophotos can be produced once the control points are aligned. The resolution can be better than 5 cm, so the products can be used for temporal monitoring of slope creep before a landslide happens. A large landslide site at the 75 km mark of road No. 14 was investigated in this research. The landslide happened in June 2012 under heavy rainfall, and a landslide dam formed quickly afterwards. The failure and mechanism of this landslide are discussed in this research using DEMs produced before the event from aerial photos and after the event with the UAV. Residual slope stability analysis is then carried out using strength parameters obtained from the analysis described above, so that advice on subsequent potential landslide conditions can be provided.

  11. Psychiatric adverse events during treatment with brodalumab: Analysis of psoriasis clinical trials.

    PubMed

    Lebwohl, Mark G; Papp, Kim A; Marangell, Lauren B; Koo, John; Blauvelt, Andrew; Gooderham, Melinda; Wu, Jashin J; Rastogi, Shipra; Harris, Susan; Pillai, Radhakrishnan; Israel, Robert J

    2018-01-01

    Individuals with psoriasis are at increased risk for psychiatric comorbidities, including suicidal ideation and behavior (SIB). To distinguish between the underlying risk and potential for treatment-induced psychiatric adverse events in patients with psoriasis being treated with brodalumab, a fully human anti-interleukin 17 receptor A monoclonal antibody. Data were evaluated from a placebo-controlled, phase 2 clinical trial; the open-label, long-term extension of the phase 2 clinical trial; and three phase 3, randomized, double-blind, controlled clinical trials (AMAGINE-1, AMAGINE-2, and AMAGINE-3) and their open-label, long-term extensions of patients with moderate-to-severe psoriasis. The analysis included 4464 patients with 9161.8 patient-years of brodalumab exposure. The follow-up time-adjusted incidence rates of SIB events were comparable between the brodalumab and ustekinumab groups throughout the 52-week controlled phases (0.20 vs 0.60 per 100 patient-years). In the brodalumab group, 4 completed suicides were reported, 1 of which was later adjudicated as indeterminate; all patients had underlying psychiatric disorders or stressors. There was no comparator arm past week 52. Controlled study periods were not powered to detect differences in rare events such as suicide. Comparison with controls and the timing of events do not indicate a causal relationship between SIB and brodalumab treatment. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
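
    For readers unfamiliar with follow-up time-adjusted rates such as the 0.20 vs 0.60 per 100 patient-years quoted above, the arithmetic is simply events divided by accumulated exposure; the sketch below uses invented counts, not the trial's data:

    ```python
    # Follow-up time-adjusted incidence rate (hypothetical counts).
    events = 18
    patient_years = 9000.0  # summed follow-up across all patients

    rate = events / patient_years * 100
    print(f"{rate:.2f} events per 100 patient-years")
    ```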

  12. Grieving experiences amongst adolescents orphaned by AIDS: Analysis from event history calendars.

    PubMed

    Thupayagale-Tshweneagae, Gloria

    2012-09-07

    Mental health is an essential component of adolescent health and wellbeing. Mental health practitioners assess adolescents' mental health status to identify possible issues that may lead to mental health problems. However, very few of the tools used to assess the mental health status of adolescents include assessment for grieving and coping patterns. The current tools used for assessing an individual's mental health are lengthy and not comprehensive. The purpose of this study was to assess grieving patterns of adolescents orphaned by AIDS and to appraise the usefulness of an event history calendar as an assessment tool for identifying grieving experiences, in order to guide and support these adolescents through the grieving process. One hundred and two adolescents aged 14-18 years, who had been orphaned by AIDS, completed an event history calendar, reviewed it with the researcher and reported their perceptions of it. Thematic analysis of the event history calendar content revealed that it is an effective, time-efficient, adolescent-friendly tool that facilitated identification and discussion of the orphaned adolescents' grieving patterns. Crying, isolation, silence and violent outbursts were the main grieving patterns reported by adolescents orphaned by AIDS. The researcher recommends use of the event history calendar for identification of orphaned adolescents' grieving experiences. Early identification would enable mental health practitioners to support them in order to prevent the occurrence of mental illness due to maladaptive grieving.

  13. Combining geomorphic and documentary flood evidence to reconstruct extreme events in Mediterranean basins

    NASA Astrophysics Data System (ADS)

    Thorndycraft, V. R.; Benito, G.; Barriendos, M.; Rico, M.; Sánchez-Moya, Y.; Sopeña, A.; Casas, A.

    2009-09-01

    Palaeoflood hydrology is the reconstruction of flood magnitude and frequency from geomorphological flood evidence and is particularly valuable for extending the record of extreme floods to times before instrumental data series became available. This paper provides a review of recent developments in palaeoflood hydrology, presented in three parts: 1) an overview of the key methodological approaches used in palaeoflood hydrology and of the use of historical documentary evidence for reconstructing extreme events; 2) a summary of the Llobregat River palaeoflood case study (Catalonia, NE Spain); and 3) an analysis of the AD 1617 flood and its impacts across Catalonia (including the rivers Llobregat, Ter and Segre). The key findings of the Llobregat case study were that at least eight floods occurred with discharges significantly larger than any event in the instrumental record; at the Pont de Vilomara study reach, for example, the palaeodischarges of these events were 3700-4300 m3/s, compared with 2300 m3/s for the 1971 flood, the largest on record. Five of these floods were dated to the last 3000 years, and the three events directly dated by radiocarbon all occurred during cold phases of global climate. Comparison of the palaeoflood record with documentary evidence indicated that one flood, radiocarbon dated to cal. AD 1540-1670, was likely the AD 1617 event, the largest flood of the last 700 years. Historical records indicate that this event was caused by rainfall from the 2nd to the 6th of November, and the resultant flooding caused widespread socio-economic impacts including the destruction of at least 389 houses, 22 bridges and 17 water mills. Discharges estimated from palaeoflood records and historical flood marks indicate that the Llobregat (4680 m3/s) and Ter (2700-4500 m3/s) rivers witnessed extreme discharges in comparison with observed floods in the instrumental record (2300 and 2350 m3/s, respectively); whilst further east in the Segre River

  14. Key drivers of precipitation isotopes in Windhoek, Namibia (2012-2016)

    NASA Astrophysics Data System (ADS)

    Kaseke, K. F.; Wang, L.; Wanke, H.

    2017-12-01

    Southern African climate is characterized by large variability, with summer precipitation estimates differing between models by as much as 70%. This difference arises partly because most models associate precipitation over Southern Africa with moisture inputs from the Indian Ocean while excluding inputs from the Atlantic Ocean. However, growing evidence suggests that the Atlantic Ocean may also contribute significant amounts of moisture to the region. This four-year (2012-2016) study investigates the event-scale isotopic composition (δ18O, δ2H and δ17O) of precipitation, the key drivers of isotope variation, and the origins of precipitation experienced in Windhoek, Namibia. Results indicate large storm-to-storm isotopic variability in δ18O (25‰), δ2H (180‰) and δ17O (13‰) over the study period. Univariate analysis showed significant correlations between event precipitation isotopes and local meteorological parameters: lifted condensation level, relative humidity (RH), precipitation amount, average wind speed, and surface and air temperature (p < 0.05). The number of significant correlations between local meteorological parameters and monthly isotopes was much lower, suggesting loss of information through data aggregation. Nonetheless, the most significant isotope driver at both event and monthly scales was RH, consistent with the semi-arid classification of the site. Multiple linear regression analysis suggested that RH, precipitation amount and air temperature were the most significant local drivers of precipitation isotopes, accounting for about 50% of the variation, implying that about 50% could be attributed to source origins. HYSPLIT trajectories indicated that 78% of precipitation originated from the Indian Ocean while 21% originated from the Atlantic Ocean. Given that three of the four study years were droughts while two of the three drought years were El Niño related, our data also suggests that δ'17O-δ'18O could be a useful tool to
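
    The multiple linear regression step described above can be sketched in a few lines. This is an illustrative reconstruction on synthetic data, not the study's code: the sample size, assumed slopes, and noise level are all placeholders.

```python
# Hedged sketch: OLS regression of event-scale d18O on local drivers
# (RH, precipitation amount, air temperature). All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 120                                    # hypothetical number of events
rh = rng.uniform(20, 90, n)                # relative humidity, %
amount = rng.gamma(2.0, 8.0, n)            # event precipitation, mm
t_air = rng.uniform(10, 30, n)             # air temperature, deg C

# Synthetic isotope signal: RH-dominated, plus noise (assumed effect sizes).
d18o = -2.0 - 0.08 * rh - 0.03 * amount + 0.05 * t_air + rng.normal(0, 1.5, n)

X = np.column_stack([np.ones(n), rh, amount, t_air])
beta, *_ = np.linalg.lstsq(X, d18o, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((d18o - pred) ** 2) / np.sum((d18o - d18o.mean()) ** 2)
print("intercept, b_RH, b_amount, b_T:", np.round(beta, 3))
print(f"R^2 = {r2:.2f}  # ~0.5 would mirror the ~50% local control reported")
```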

  15. Identifying Key Hospital Service Quality Factors in Online Health Communities

    PubMed Central

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain

    2015-01-01

    78% on average. Extraction and classification performance still has room for improvement, but the extraction results are applicable to more detailed analysis. Further analysis of the extracted information reveals that there are differences in the details of social media–based key quality factors for hospitals according to the regions in Korea, and the patterns of change seem to accurately reflect social events (eg, influenza epidemics). Conclusions These findings could be used to provide timely information to caregivers, hospital officials, and medical officials for health care policies. PMID:25855612

  16. Identifying key hospital service quality factors in online health communities.

    PubMed

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain; Kim, Minki

    2015-04-07

    classification performance still has room for improvement, but the extraction results are applicable to more detailed analysis. Further analysis of the extracted information reveals that there are differences in the details of social media-based key quality factors for hospitals according to the regions in Korea, and the patterns of change seem to accurately reflect social events (eg, influenza epidemics). These findings could be used to provide timely information to caregivers, hospital officials, and medical officials for health care policies.

  17. Key Informant Interviews with Coordinators of Special Events Conducted to Increase Cancer Screening in the United States

    ERIC Educational Resources Information Center

    Escoffery, Cam; Rodgers, Kirsten; Kegler, Michelle C.; Haardörfer, Regine; Howard, David; Roland, Katherine B.; Wilson, Katherine M.; Castro, Georgina; Rodriguez, Juan

    2014-01-01

    Special events such as health fairs, cultural festivals and charity runs are commonly employed in the community to increase cancer screening; however, little is known about their effectiveness. The purpose of this study is to assess the activities, screening outcomes, barriers and recommendations of special events to increase breast, cervical and…

  18. Accounting for unintended binding events in the analysis of quartz crystal microbalance kinetic data.

    PubMed

    Heller, Gabriella T; Zwang, Theodore J; Sarapata, Elizabeth A; Haber, Michael A; Sazinsky, Matthew H; Radunskaya, Ami E; Johal, Malkiat S

    2014-05-01

    Previous methods for analyzing protein-ligand binding events using the quartz crystal microbalance with dissipation monitoring (QCM-D) fail to account for unintended binding that inevitably occurs during surface measurements and obscure kinetic information. In this article, we present a system of differential equations that accounts for both reversible and irreversible unintended interactions. This model is tested on three protein-ligand systems, each of which has different features, to establish the feasibility of using the QCM-D for protein binding analysis. Based on this analysis, we were able to obtain kinetic information for the intended interaction that is consistent with those obtained in literature via bulk-phase methods. In the appendix, we include a method for decoupling these from the intended binding events and extracting relevant affinity information. Copyright © 2014 Elsevier B.V. All rights reserved.
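
    The correction idea lends itself to a small simulation, shown below. Under an assumed generic three-pathway model (intended reversible, unintended reversible, unintended irreversible binding) with made-up rate constants, the measured total signal diverges from the intended-binding component; this is a sketch of the concept, not the paper's actual system of equations.

```python
# Minimal sketch: ODE system where the QCM-D response is the sum of
# intended and unintended (reversible + irreversible) binding.
# Rate constants, capacities, and concentration are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

C = 100e-9  # bulk analyte concentration (M), hypothetical

def rhs(t, y):
    b_int, b_rev, b_irr = y
    db_int = 1e5 * C * (1.0 - b_int) - 1e-3 * b_int   # intended, reversible
    db_rev = 3e4 * C * (0.4 - b_rev) - 5e-4 * b_rev   # unintended, reversible
    db_irr = 1e4 * C * (0.2 - b_irr)                  # unintended, irreversible
    return [db_int, db_rev, db_irr]

sol = solve_ivp(rhs, (0, 600), [0.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0, 600, 7)
total = sol.sol(t).sum(axis=0)   # what the sensor "sees"
intended = sol.sol(t)[0]         # what a naive single-site fit should target
print("total signal:  ", np.round(total, 3))
print("intended only: ", np.round(intended, 3))
```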

  19. Genome Alignment Spanning Major Poaceae Lineages Reveals Heterogeneous Evolutionary Rates and Alters Inferred Dates for Key Evolutionary Events.

    PubMed

    Wang, Xiyin; Wang, Jingpeng; Jin, Dianchuan; Guo, Hui; Lee, Tae-Ho; Liu, Tao; Paterson, Andrew H

    2015-06-01

    Multiple comparisons among genomes can clarify their evolution, speciation, and functional innovations. To date, the genome sequences of eight grasses representing the most economically important Poaceae (grass) clades have been published, and their genomic-level comparison is an essential foundation for evolutionary, functional, and translational research. Using a formal and conservative approach, we aligned these genomes. Direct comparison of paralogous gene pairs all duplicated simultaneously reveal striking variation in evolutionary rates among whole genomes, with nucleotide substitution slowest in rice and up to 48% faster in other grasses, adding a new dimension to the value of rice as a grass model. We reconstructed ancestral genome contents for major evolutionary nodes, potentially contributing to understanding the divergence and speciation of grasses. Recent fossil evidence suggests revisions of the estimated dates of key evolutionary events, implying that the pan-grass polyploidization occurred ∼96 million years ago and could not be related to the Cretaceous-Tertiary mass extinction as previously inferred. Adjusted dating to reflect both updated fossil evidence and lineage-specific evolutionary rates suggested that maize subgenome divergence and maize-sorghum divergence were virtually simultaneous, a coincidence that would be explained if polyploidization directly contributed to speciation. This work lays a solid foundation for Poaceae translational genomics. Copyright © 2015 The Author. Published by Elsevier Inc. All rights reserved.

  20. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration.

    PubMed

    Renfro, Lindsay A; Grothey, Axel M; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J

    2014-12-01

    Clinical trials are expensive and lengthy, where success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events only off by eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Real-time projection of the final analysis time during a
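
    A minimal Monte Carlo version of this projection logic is sketched below, assuming uniform accrual over the accrual period and exponentially distributed disease-free survival; the accrual rate, median survival, and event target are hypothetical stand-ins for the trial-specific and ACCENT-based models.

```python
# Hedged sketch: simulate accrual and event times, report when the
# target event total is reached. All rates are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def projected_analysis_month(n_total=1000, accrual_per_month=25,
                             median_dfs_months=60.0, target_events=400,
                             n_sims=2000):
    hazard = np.log(2) / median_dfs_months          # exponential event rate
    sims = []
    for _ in range(n_sims):
        entry = rng.uniform(0, n_total / accrual_per_month, n_total)
        event_time = entry + rng.exponential(1 / hazard, n_total)
        sims.append(np.sort(event_time)[target_events - 1])  # k-th event month
    return np.percentile(sims, [2.5, 50, 97.5])

lo, med, hi = projected_analysis_month()
print(f"projected final analysis: month {med:.0f} (95% band {lo:.0f}-{hi:.0f})")
```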

  1. Projecting Event-Based Analysis Dates in Clinical Trials: An Illustration Based on the International Duration Evaluation of Adjuvant Chemotherapy (IDEA) Collaboration

    PubMed Central

    Renfro, Lindsay A.; Grothey, Axel M.; Paul, James; Floriani, Irene; Bonnetain, Franck; Niedzwiecki, Donna; Yamanaka, Takeharu; Souglakos, Ioannis; Yothers, Greg; Sargent, Daniel J.

    2015-01-01

    Purpose Clinical trials are expensive and lengthy, where success of a given trial depends on observing a prospectively defined number of patient events required to answer the clinical question. The point at which this analysis time occurs depends on both patient accrual and primary event rates, which typically vary throughout the trial's duration. We demonstrate real-time analysis date projections using data from a collection of six clinical trials that are part of the IDEA collaboration, an international preplanned pooling of data from six trials testing the duration of adjuvant chemotherapy in stage III colon cancer, and we additionally consider the hypothetical impact of one trial's early termination of follow-up. Patients and Methods In the absence of outcome data from IDEA, monthly accrual rates for each of the six IDEA trials were used to project subsequent trial-specific accrual, while historical data from similar Adjuvant Colon Cancer Endpoints (ACCENT) Group trials were used to construct a parametric model for IDEA's primary endpoint, disease-free survival, under the same treatment regimen. With this information and using the planned total accrual from each IDEA trial protocol, individual patient accrual and event dates were simulated and the overall IDEA interim and final analysis times projected. Projections were then compared with actual (previously undisclosed) trial-specific event totals at a recent census time for validation. The change in projected final analysis date assuming early termination of follow-up for one IDEA trial was also calculated. Results Trial-specific predicted event totals were close to the actual number of events per trial for the recent census date at which the number of events per trial was known, with the overall IDEA projected number of events only off by eight patients. Potential early termination of follow-up by one IDEA trial was estimated to postpone the overall IDEA final analysis date by 9 months. Conclusions Real

  2. Detection and analysis of high-temperature events in the BIRD mission

    NASA Astrophysics Data System (ADS)

    Zhukov, Boris; Briess, Klaus; Lorenz, Eckehard; Oertel, Dieter; Skrbek, Wolfgang

    2005-01-01

    The primary mission objective of the new small Bi-spectral InfraRed Detection (BIRD) satellite is the detection and quantitative analysis of high-temperature events such as fires and volcanoes. The absence of saturation in the BIRD infrared channels makes it possible to improve false-alarm rejection as well as to retrieve quantitative characteristics of hot targets, including their effective fire temperature, area and radiative energy release. Examples are given of the detection and analysis of wildfires and coal seam fires, of volcanic activity, and of oil fires in Iraq. The smallest fires detected by BIRD and verified on the ground had an area of 12 m2 in the daytime and 4 m2 at night.
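
    Quantitative retrieval from two IR bands is classically a Dozier-type inversion: the pixel radiance in each band is a mixture of fire and background Planck radiances, solved for fire temperature and area fraction. The sketch below forward-simulates one pixel and inverts it; the band wavelengths, background temperature, and fire parameters are assumptions, and BIRD's operational processing may differ.

```python
# Hedged sketch: two-band (Dozier-style) hot-spot retrieval on one pixel.
import numpy as np
from scipy.optimize import fsolve

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck's h, c, Boltzmann's k

def planck(wl, T):
    """Spectral radiance (W m-2 sr-1 m-1) at wavelength wl (m), temp T (K)."""
    return 2 * H * C**2 / wl**5 / (np.exp(H * C / (wl * K * T)) - 1)

wl_mir, wl_tir, t_bg = 3.8e-6, 8.9e-6, 300.0   # assumed bands and background

# Forward-simulate a pixel: 0.1% of the area burning at 800 K.
p_true, t_true = 1e-3, 800.0
l_mir = p_true * planck(wl_mir, t_true) + (1 - p_true) * planck(wl_mir, t_bg)
l_tir = p_true * planck(wl_tir, t_true) + (1 - p_true) * planck(wl_tir, t_bg)

def residual(x):
    p, tf = x
    return [p * planck(wl_mir, tf) + (1 - p) * planck(wl_mir, t_bg) - l_mir,
            p * planck(wl_tir, tf) + (1 - p) * planck(wl_tir, t_bg) - l_tir]

p_est, t_est = fsolve(residual, x0=[1e-4, 600.0])
print(f"retrieved area fraction {p_est:.1e}, fire temperature {t_est:.0f} K")
```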

  3. Timing Processes Are Correlated when Tasks Share a Salient Event

    ERIC Educational Resources Information Center

    Zelaznik, Howard N.; Rosenbaum, David A.

    2010-01-01

    Event timing is manifested when participants make discrete movements such as repeatedly tapping a key. Emergent timing is manifested when participants make continuous movements such as repeatedly drawing a circle. Here we pursued the possibility that providing salient perceptual events to mark the completion of time intervals could allow circle…

  4. Automated Detection of Surgical Adverse Events from Retrospective Clinical Data

    ERIC Educational Resources Information Center

    Hu, Zhen

    2017-01-01

    The Detection of surgical adverse events has become increasingly important with the growing demand for quality improvement and public health surveillance with surgery. Event reporting is one of the key steps in determining the impact of postoperative complications from a variety of perspectives and is an integral component of improving…

  5. Inner core rotation from event-pair analysis

    NASA Astrophysics Data System (ADS)

    Song, Xiaodong; Poupinet, Georges

    2007-09-01

    The last decade has witnessed an animated debate on whether the inner core rotation is a fact or an artifact. Here we examine the temporal change of inner core waves using a technique that compares differential travel times at the same station between two events. The method does not require precise knowledge of earthquake locations or earth models. The pairing of the events creates a large data set for the application of statistical tools. Using measurements from 87 events in the South Sandwich Islands recorded at the College, Alaska station, we conclude that the temporal change is robust. The estimates of the temporal change range from about 0.07 to 0.10 s/decade over the past 50 yr. Using only pairs with small inter-event distances, which reduces the influence of mantle heterogeneity, the rates range from 0.084 to 0.098 s/decade, nearly identical to the rate inferred from waveform doublets by Zhang et al. [Zhang, J., Song, X.D., Li, Y.C., Richards, P.G., Sun, X.L., Waldhauser, F., Inner core differential motion confirmed by earthquake waveform doublets, Science 309 (5739) (2005) 1357-1360.]. The rate of the DF change seems to change with time, which may be explained by lateral variation of the inner core structure or by a change in rotation rate on a decadal time scale.
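
    The event-pair idea can be illustrated with a toy calculation: difference the differential travel times of paired events so that common path and model terms cancel, then regress on the inter-event time. Everything below is synthetic; the study's measurements and corrections are far richer.

```python
# Toy sketch of event-pair analysis: estimate a temporal rate (s/decade)
# from pairwise differences of differential travel times. Synthetic data.
import numpy as np

rng = np.random.default_rng(2)
years = rng.uniform(1955, 2005, 87)                     # hypothetical event years
ddt = 0.009 * (years - 1955) + rng.normal(0, 0.05, 87)  # ~0.09 s/decade + noise

i, j = np.triu_indices(len(years), k=1)   # all event pairs
dt_pair = ddt[j] - ddt[i]                 # change in differential time
dy_pair = years[j] - years[i]             # inter-event time, years

slope = np.polyfit(dy_pair, dt_pair, 1)[0] * 10.0
print(f"estimated rate: {slope:.3f} s/decade")
```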

  6. Application of Key Events and Analysis to Chemical Carcinogens and Noncarcinogens

    EPA Science Inventory

    The existence of thresholds for toxicants is a matter of debate in chemical risk assessment and regulation. Current risk assessment methods are based on the assumption that, in the absence of sufficient data, carcinogenesis does not have a threshold, while non-carcinogenic endpoi...

  7. 78 FR 70901 - Safety Zone; Bone Island Triathlon, Atlantic Ocean; Key West, FL

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ... 1625-AA00 Safety Zone; Bone Island Triathlon, Atlantic Ocean; Key West, FL AGENCY: Coast Guard, DHS... zone on the waters of the Atlantic Ocean in Key West, Florida, during the Bone Island Triathlon on... event. C. Discussion of Proposed Rule On January 25, 2014, Questor Multisport, LLC. is hosting the Bone...

  8. 77 FR 75853 - Safety Zone; Bone Island Triathlon, Atlantic Ocean; Key West, FL

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-26

    ...-AA00 Safety Zone; Bone Island Triathlon, Atlantic Ocean; Key West, FL AGENCY: Coast Guard, DHS. ACTION... Atlantic Ocean in Key West, Florida, during the Bone Island Triathlon on Saturday, January 12, 2013. The..., Questor Multisport, LLC is hosting the Bone Island Triathlon. The event will be held on the waters of the...

  9. Wheat domestication: Key to agricultural revolutions past and future

    USDA-ARS?s Scientific Manuscript database

    The domestication of wheat was instrumental in the transition of human behavior from hunter-gatherers to farmers. It was a key event in the agricultural revolution that occurred about 10,000 years ago in the Fertile Crescent of the Middle East. Transitions of forms with natural seed dispersal mechan...

  10. Early blood pressure lowering treatment in acute stroke. Ordinal analysis of vascular events in the Scandinavian Candesartan Acute Stroke Trial (SCAST).

    PubMed

    Jusufovic, Mirza; Sandset, Else Charlotte; Bath, Philip M; Berge, Eivind

    2016-08-01

    Early blood pressure-lowering treatment appears to be beneficial in patients with acute intracerebral haemorrhage and potentially in ischaemic stroke. We used a new method for the analysis of vascular events in the Scandinavian Candesartan Acute Stroke Trial to see if the effect was dependent on the timing of treatment. The Scandinavian Candesartan Acute Stroke Trial was a randomized, placebo-controlled trial of candesartan given within 30 h of ischaemic or haemorrhagic stroke. Of 2029 patients, 231 (11.4%) had a vascular event (vascular death, nonfatal stroke or nonfatal myocardial infarction) during the first 6 months. The modified Rankin Scale (mRS) score following a vascular event was used to categorize vascular events in order of severity: no event (n = 1798), minor (mRS 0-2, n = 59), moderately severe (mRS 3-4, n = 57) and major event (mRS 5-6, n = 115). We used ordinal logistic regression for the analysis and adjusted for predefined prognostic variables. Candesartan had no overall effect on vascular events (adjusted common odds ratio 1.11, 95% confidence interval 0.84-1.47, P = 0.48), and the effects were the same in ischaemic and haemorrhagic stroke. Among the patients treated within 6 h, the adjusted common odds ratio for vascular events was 0.37, 95% confidence interval 0.16-0.84, P = 0.02, with no heterogeneity of effect between ischaemic and haemorrhagic strokes. Ordinal analysis of vascular events showed no overall effect of candesartan in the subacute phase of stroke. The effect of treatment given within 6 h of stroke onset appears promising and will be addressed in ongoing trials. Ordinal analysis of vascular events is feasible and can be used in future trials.
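
    A minimal sketch of the ordinal (proportional-odds) logistic model over the four severity categories is given below, with a treatment indicator only; the trial analysis also adjusted for predefined prognostic covariates, and all numbers here are simulated.

```python
# Hedged sketch: ordered logit of event severity (0 = no event, 1 = minor,
# 2 = moderately severe, 3 = major) on a treatment indicator. Synthetic
# data built to give rare events and a near-null treatment effect.
import numpy as np
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(3)
n = 2029
treat = rng.integers(0, 2, n)
latent = 0.05 * treat + rng.logistic(size=n)     # assumed tiny effect
severity = np.digitize(latent, [2.0, 2.5, 3.0])  # 0..3 ordered categories

model = OrderedModel(severity, treat.reshape(-1, 1), distr='logit')
res = model.fit(method='bfgs', disp=False)
print(f"common odds ratio for treatment: {np.exp(res.params[0]):.2f}")
```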

  11. Stock market returns and clinical trial results of investigational compounds: an event study analysis of large biopharmaceutical companies.

    PubMed

    Hwang, Thomas J

    2013-01-01

    For biopharmaceutical companies, investments in research and development are risky, and the results from clinical trials are key inflection points in the process. Few studies have explored how and to what extent the public equity market values clinical trial results. Our study dataset matched announcements of clinical trial results for investigational compounds from January 2011 to May 2013 with daily stock market returns of large United States-listed pharmaceutical and biotechnology companies. Event study methodology was used to examine the relationship between clinical research events and changes in stock returns. We identified public announcements for clinical trials of 24 investigational compounds, including 16 (67%) positive and 8 (33%) negative events. The majority of announcements were for Phase 3 clinical trials (N = 13, 54%), and for oncologic (N = 7, 29%) and neurologic (N = 6, 24%) indications. The median cumulative abnormal returns on the day of the announcement were 0.8% (95% confidence interval [CI]: -2.3, 13.4%; P = 0.02) for positive events and -2.0% (95% CI: -9.1, 0.7%; P = 0.04) for negative events, with statistically significant differences from zero. In the day immediately following the announcement, firms with positive events were associated with stock price corrections, with median cumulative abnormal returns falling to 0.4% (95% CI: -3.8, 12.3%; P = 0.33). For firms with negative announcements, the median cumulative abnormal returns were -1.7% (95% CI: -9.5, 1.0%; P = 0.03), and remained significantly negative over the two day event window. The magnitude of abnormal returns did not differ statistically by indication, by trial phase, or between biotechnology and pharmaceutical firms. The release of clinical trial results is an economically significant event and has meaningful effects on market value for large biopharmaceutical companies. Stock return underperformance due to negative events is greater in magnitude and persists longer than

  12. Stock Market Returns and Clinical Trial Results of Investigational Compounds: An Event Study Analysis of Large Biopharmaceutical Companies

    PubMed Central

    Hwang, Thomas J.

    2013-01-01

    Background For biopharmaceutical companies, investments in research and development are risky, and the results from clinical trials are key inflection points in the process. Few studies have explored how and to what extent the public equity market values clinical trial results. Methods Our study dataset matched announcements of clinical trial results for investigational compounds from January 2011 to May 2013 with daily stock market returns of large United States-listed pharmaceutical and biotechnology companies. Event study methodology was used to examine the relationship between clinical research events and changes in stock returns. Results We identified public announcements for clinical trials of 24 investigational compounds, including 16 (67%) positive and 8 (33%) negative events. The majority of announcements were for Phase 3 clinical trials (N = 13, 54%), and for oncologic (N = 7, 29%) and neurologic (N = 6, 24%) indications. The median cumulative abnormal returns on the day of the announcement were 0.8% (95% confidence interval [CI]: –2.3, 13.4%; P = 0.02) for positive events and –2.0% (95% CI: –9.1, 0.7%; P = 0.04) for negative events, with statistically significant differences from zero. In the day immediately following the announcement, firms with positive events were associated with stock price corrections, with median cumulative abnormal returns falling to 0.4% (95% CI: –3.8, 12.3%; P = 0.33). For firms with negative announcements, the median cumulative abnormal returns were –1.7% (95% CI: –9.5, 1.0%; P = 0.03), and remained significantly negative over the two day event window. The magnitude of abnormal returns did not differ statistically by indication, by trial phase, or between biotechnology and pharmaceutical firms. Conclusions The release of clinical trial results is an economically significant event and has meaningful effects on market value for large biopharmaceutical companies. Stock return
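
    The core event-study computation is compact: fit a market model over a pre-event estimation window, then cumulate abnormal returns over the event window. The sketch below uses synthetic daily returns and an assumed two-day event window.

```python
# Hedged sketch of event-study methodology (market model + CAR).
# Returns, windows, and the injected announcement jump are synthetic.
import numpy as np

rng = np.random.default_rng(4)
mkt = rng.normal(0.0004, 0.01, 250)                    # market returns
firm = 0.0002 + 1.1 * mkt + rng.normal(0, 0.008, 250)  # firm returns
firm[220:222] += 0.015                                 # "announcement" effect

est, event = slice(0, 200), slice(220, 222)            # estimation / event windows
beta, alpha = np.polyfit(mkt[est], firm[est], 1)       # market-model fit
abnormal = firm[event] - (alpha + beta * mkt[event])   # abnormal returns
print(f"two-day CAR: {abnormal.sum():+.2%}")
```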

  13. A matter of definition--key elements identified in a discourse analysis of definitions of palliative care.

    PubMed

    Pastrana, T; Jünger, S; Ostgathe, C; Elsner, F; Radbruch, L

    2008-04-01

    For more than 30 years, the term "palliative care" has been used. From the outset, the term has undergone a series of transformations in its definitions and, consequently, in its tasks and goals. There remains a lack of consensus on a definition. The aim of this article is to analyse the definitions of palliative care in the specialist literature and to identify the key elements of palliative care using discourse analysis, a qualitative methodology. The literature search focused on definitions of the terms 'palliative medicine' and 'palliative care' in the World Wide Web and in medical reference books in English and German. A total of 37 English and 26 German definitions were identified and analysed. Our study confirmed the lack of a consistent meaning of the investigated terms, reflecting ongoing discussion about the nature of the field among palliative care practitioners. Several common key elements were identified. Four main categories emerged from the discourse analysis of the definition of palliative care: target groups, structure, tasks and expertise. In addition, the theoretical principles and goals of palliative care were discussed and found to be key elements, with relief and prevention of suffering and improvement of quality of life as the main goals. The identified key elements can contribute to the definition of the concept 'palliative care'. Our study confirms the importance of semantic and ethical influences on palliative care that should be considered in future research on semantics in different languages.

  14. Making adjustments to event annotations for improved biological event extraction.

    PubMed

    Baek, Seung-Cheol; Park, Jong C

    2016-09-16

    Current state-of-the-art approaches to biological event extraction train statistical models in a supervised manner on corpora annotated with event triggers and event-argument relations. Inspecting such corpora, we observe that there is ambiguity in the span of event triggers (e.g., "transcriptional activity" vs. "transcriptional"), leading to inconsistencies across event trigger annotations. Such inconsistencies make it quite likely that similar phrases are annotated with different spans of event triggers, suggesting the possibility that a statistical learning algorithm misses an opportunity for generalizing from such event triggers. We anticipate that adjustments to the span of event triggers to reduce these inconsistencies would meaningfully improve the present performance of event extraction systems. In this study, we look into this possibility with the corpora provided by the 2009 BioNLP shared task as a proof of concept. We propose an Informed Expectation-Maximization (EM) algorithm, which trains models using the EM algorithm with a posterior regularization technique that consults the gold-standard event trigger annotations in the form of constraints. We further propose four constraints on the possible event trigger annotations to be explored by the EM algorithm. The algorithm is shown to outperform the state-of-the-art algorithm on the development corpus in a statistically significant manner and on the test corpus by a narrow margin. The analysis of the annotations generated by the algorithm shows that there are various types of ambiguity in event annotations, even though they could be small in number.

  15. Recollection-dependent memory for event duration in large-scale spatial navigation

    PubMed Central

    Barense, Morgan D.

    2017-01-01

    Time and space represent two key aspects of episodic memories, forming the spatiotemporal context of events in a sequence. Little is known, however, about how temporal information, such as the duration and the order of particular events, are encoded into memory, and if it matters whether the memory representation is based on recollection or familiarity. To investigate this issue, we used a real world virtual reality navigation paradigm where periods of navigation were interspersed with pauses of different durations. Crucially, participants were able to reliably distinguish the durations of events that were subjectively “reexperienced” (i.e., recollected), but not of those that were familiar. This effect was not found in temporal order (ordinal) judgments. We also show that the active experience of the passage of time (holding down a key while waiting) moderately enhanced duration memory accuracy. Memory for event duration, therefore, appears to rely on the hippocampally supported ability to recollect or reexperience an event enabling the reinstatement of both its duration and its spatial context, to distinguish it from other events in a sequence. In contrast, ordinal memory appears to rely on familiarity and recollection to a similar extent. PMID:28202714

  16. Cardiovascular Events Following Smoke-Free Legislations: An Updated Systematic Review and Meta-Analysis

    PubMed Central

    Jones, Miranda R.; Barnoya, Joaquin; Stranges, Saverio; Losonczy, Lia; Navas-Acien, Ana

    2014-01-01

    Background Legislations banning smoking in indoor public places and workplaces are being implemented worldwide to protect the population from secondhand smoke exposure. Several studies have reported reductions in hospitalizations for acute coronary events following the enactment of smoke-free laws. Objective We set out to conduct a systematic review and meta-analysis of epidemiologic studies examining how legislations that ban smoking in indoor public places impact the risk of acute coronary events. Methods We searched MEDLINE, EMBASE, and relevant bibliographies including previous systematic reviews for studies that evaluated changes in acute coronary events, following implementation of smoke-free legislations. Studies were identified through December 2013. We pooled relative risk (RR) estimates for acute coronary events comparing post- vs. pre-legislation using inverse-variance weighted random-effects models. Results Thirty-one studies providing estimates for 47 locations were included. The legislations were implemented between 1991 and 2010. Following the enactment of smoke-free legislations, there was a 12 % reduction in hospitalizations for acute coronary events (pooled RR: 0.88, 95 % CI: 0.85–0.90). Reductions were 14 % in locations that implemented comprehensive legislations compared to an 8 % reduction in locations that only had partial restrictions. In locations with reductions in smoking prevalence post-legislation above the mean (2.1 % reduction) there was a 14 % reduction in events compared to 10 % in locations below the mean. The RRs for acute coronary events associated with enacting smoke-free legislation were 0.87 vs. 0.89 in locations with smoking prevalence pre-legislation above and below the mean (23.1 %), and 0.87 vs. 0.89 in studies from the Americas vs. other regions. Conclusion The implementation of smoke-free legislations was related to reductions in acute coronary event hospitalizations in most populations evaluated. Benefits are greater
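
    The pooling step named above (inverse-variance weighted random-effects) can be sketched in a few lines. The three relative risks and confidence intervals below are made-up stand-ins for the 47 location-level estimates, and the between-study variance uses the DerSimonian-Laird moment estimator.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of relative risks.
import numpy as np

rr = np.array([0.85, 0.90, 0.80])        # hypothetical location-level RRs
ci_lo = np.array([0.75, 0.82, 0.65])
ci_hi = np.array([0.96, 0.99, 0.98])

y = np.log(rr)                                     # log-RR
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)  # SE recovered from 95% CI
w = 1 / se**2                                      # fixed-effect weights

q = np.sum(w * (y - np.sum(w * y) / w.sum())**2)   # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

w_re = 1 / (se**2 + tau2)                          # random-effects weights
pooled = np.sum(w_re * y) / w_re.sum()
se_p = 1 / np.sqrt(w_re.sum())
print(f"pooled RR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se_p):.2f}-"
      f"{np.exp(pooled + 1.96 * se_p):.2f})")
```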

  17. Issues in Researching Self-Regulated Learning as Patterns of Events

    ERIC Educational Resources Information Center

    Winne, Philip H.

    2014-01-01

    New methods for gathering and analyzing data about events that comprise self-regulated learning (SRL) support discoveries about patterns among events and tests of hypotheses about roles patterns play in learning. Five such methodologies are discussed in the context of four key questions that shape investigations into patterns in SRL. A framework…

  18. A lengthy look at the daily grind: time series analysis of events, mood, stress, and satisfaction.

    PubMed

    Fuller, Julie A; Stanton, Jeffrey M; Fisher, Gwenith G; Spitzmuller, Christiane; Russell, Steven S; Smith, Patricia C

    2003-12-01

    The present study investigated processes by which job stress and satisfaction unfold over time by examining the relations between daily stressful events, mood, and these variables. Using a Web-based daily survey of stressor events, perceived strain, mood, and job satisfaction completed by 14 university workers, 1,060 occasions of data were collected. Transfer function analysis, a multivariate version of time series analysis, was used to examine the data for relationships among the measured variables after factoring out the contaminating influences of serial dependency. Results revealed a contrast effect in which a stressful event associated positively with higher strain on the same day and associated negatively with strain on the following day. Perceived strain increased over the course of a semester for a majority of participants, suggesting that effects of stress build over time. Finally, the data were consistent with the notion that job satisfaction is a distal outcome that is mediated by perceived strain. (© 2003 APA, all rights reserved)

  19. Interaction between the serotonin transporter gene (5-HTTLPR), stressful life events, and risk of depression: a meta-analysis.

    PubMed

    Risch, Neil; Herrell, Richard; Lehner, Thomas; Liang, Kung-Yee; Eaves, Lindon; Hoh, Josephine; Griem, Andrea; Kovacs, Maria; Ott, Jurg; Merikangas, Kathleen Ries

    2009-06-17

    Substantial resources are being devoted to identify candidate genes for complex mental and behavioral disorders through inclusion of environmental exposures following the report of an interaction between the serotonin transporter linked polymorphic region (5-HTTLPR) and stressful life events on an increased risk of major depression. To conduct a meta-analysis of the interaction between the serotonin transporter gene and stressful life events on depression using both published data and individual-level original data. Search of PubMed, EMBASE, and PsycINFO databases through March 2009 yielded 26 studies of which 14 met criteria for the meta-analysis. Criteria for studies for the meta-analyses included published data on the association between 5-HTTLPR genotype (SS, SL, or LL), number of stressful life events (0, 1, 2, > or = 3) or equivalent, and a categorical measure of depression defined by the Diagnostic and Statistical Manual of Mental Disorders (Fourth Edition) or the International Statistical Classification of Diseases, 10th Revision (ICD-10) or use of a cut point to define depression from standardized rating scales. To maximize our ability to use a common framework for variable definition, we also requested original data from all studies published prior to 2008 that met inclusion criteria. Of the 14 studies included in the meta-analysis, 10 were also included in a second sex-specific meta-analysis of original individual-level data. Logistic regression was used to estimate the effects of the number of short alleles at 5-HTTLPR, the number of stressful life events, and their interaction on depression. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated separately for each study and then weighted averages of the individual estimates were obtained using random-effects meta-analysis. Both sex-combined and sex-specific meta-analyses were conducted. Of a total of 14,250 participants, 1769 were classified as having depression; 12,481 as not having
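
    The model form at the heart of this meta-analysis, a logistic regression with a gene-by-environment product term, is sketched below on simulated data that contain a life-events main effect only; the sample size and effect sizes are arbitrary.

```python
# Hedged sketch: logistic regression of depression on 5-HTTLPR short-allele
# count (0-2), stressful life events (0-3), and their interaction.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 5000
s_alleles = rng.integers(0, 3, n)   # number of short alleles
events = rng.integers(0, 4, n)      # number of stressful life events

# Simulated truth: events raise risk; no genetic or interaction effect.
logit = -2.0 + 0.45 * events
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([s_alleles, events, s_alleles * events]))
fit = sm.Logit(y, X).fit(disp=False)
print("ORs (gene, events, interaction):", np.round(np.exp(fit.params[1:]), 2))
```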

  20. Impacts of Extreme Events on Human Health. Chapter 4

    NASA Technical Reports Server (NTRS)

    Bell, Jesse E.; Herring, Stephanie C.; Jantarasami, Lesley; Adrianopoli, Carl; Benedict, Kaitlin; Conlon, Kathryn; Escobar, Vanessa; Hess, Jeremy; Luvall, Jeffrey; Garcia-Pando, Carlos Perez; et al.

    2016-01-01

    Increased Exposure to Extreme Events. Key Finding 1: Health impacts associated with climate-related changes in exposure to extreme events include death, injury, or illness; exacerbation of underlying medical conditions; and adverse effects on mental health [High Confidence]. Climate change will increase exposure risk in some regions of the United States due to projected increases in the frequency and/or intensity of drought, wildfires, and flooding related to extreme precipitation and hurricanes [Medium Confidence]. Disruption of Essential Infrastructure. Key Finding 2: Many types of extreme events related to climate change cause disruption of infrastructure, including power, water, transportation, and communication systems, that are essential to maintaining access to health care and emergency response services and safeguarding human health [High Confidence]. Vulnerability to Coastal Flooding. Key Finding 3: Coastal populations with greater vulnerability to health impacts from coastal flooding include persons with disabilities or other access and functional needs, certain populations of color, older adults, pregnant women and children, low-income populations, and some occupational groups [High Confidence]. Climate change will increase exposure risk to coastal flooding due to increases in extreme precipitation and in hurricane intensity and rainfall rates, as well as sea level rise and the resulting increases in storm surge.

  1. Multivariate hydrological frequency analysis for extreme events using Archimedean copula. Case study: Lower Tunjuelo River basin (Colombia)

    NASA Astrophysics Data System (ADS)

    Gómez, Wilmar

    2017-04-01

    Analyzing the spatial and temporal variability of extreme precipitation events helps to prevent or reduce the associated threat and risk. Many water resources projects require joint probability distributions of random variables, such as precipitation intensity and duration, that cannot be assumed independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of its marginals through copulas. This paper presents a general framework for bivariate and multivariate frequency analysis of extreme hydroclimatological events, such as severe storms, using Archimedean copulas. The analysis was conducted for precipitation events in the lower Tunjuelo River basin in Colombia. The results show that, for a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, yielding more accurate and reliable information on design storms and associated risks. The use of copulas greatly simplifies the study of multivariate distributions and introduces the concept of the joint return period, which properly represents the needs of hydrological design in frequency analysis.
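
    Once the marginal distributions are fitted, the joint return period under an Archimedean copula reduces to one formula. The sketch below assumes a Gumbel-Hougaard copula with an arbitrary dependence parameter and illustrative marginal non-exceedance probabilities; it demonstrates the concept rather than the basin-specific fit.

```python
# Hedged sketch: "OR" joint return period T = 1 / (1 - C(u, v)) for annual
# events under a Gumbel-Hougaard (Archimedean) copula. Parameters assumed.
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula CDF; theta >= 1 (theta = 1 is independence)."""
    return np.exp(-(((-np.log(u))**theta + (-np.log(v))**theta)**(1 / theta)))

theta = 2.0   # hypothetical dependence between intensity and duration
u = 0.99      # P(intensity <= i): marginally a 100-yr intensity
v = 0.98      # P(duration <= d): marginally a 50-yr duration

t_joint = 1.0 / (1.0 - gumbel_copula(u, v, theta))
t_indep = 1.0 / (1.0 - u * v)   # what assuming independence would give
print(f"joint return period: {t_joint:.0f} yr (independence: {t_indep:.0f} yr)")
```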

  2. Community Response and Engagement During Extreme Water Events in Saskatchewan, Canada and Queensland, Australia

    NASA Astrophysics Data System (ADS)

    McMartin, Dena W.; Sammel, Alison J.; Arbuthnott, Katherine

    2018-01-01

    Technology alone cannot address the challenges of how societies, communities, and individuals understand water accessibility, water management, and water consumption, particularly under extreme conditions like floods and droughts. At the community level, people are increasingly aware of the challenges related to responses to, and impacts of, extreme water events. This research begins with an assessment of the social and political capacities of communities in two Commonwealth jurisdictions, Queensland, Australia and Saskatchewan, Canada, in response to major flooding events. It further reviews how such capacities affect community engagement to address and mitigate the risks associated with extreme water events, and it provides evidence of key gaps in skills, understanding, and agency for addressing impacts at the community level. Secondary data were collected using template analysis to elucidate challenges associated with education (formal and informal), social and political capacity, community ability to respond appropriately, and formal government responses to extreme water events in these two jurisdictions. The results indicate that enhanced community engagement, alongside elements of an empowerment model, can provide avenues for identifying and addressing community vulnerability to the negative impacts of flood and drought.

  3. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions the error so that model error appears as a component of variance, and correctly reduces the station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with the better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
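
    Both combination rules named above are short to state: Fisher's statistic, -2 Σ ln p_i, is referred to a chi-square distribution with 2k degrees of freedom, while Tippett's rule refers the minimum p-value to its exact null distribution. A sketch with hypothetical per-phenomenology p-values:

```python
# Hedged sketch: Fisher's and Tippett's combined hypothesis tests.
import numpy as np
from scipy import stats

p = np.array([0.08, 0.20, 0.04])   # hypothetical per-phenomenology p-values
k = len(p)

fisher_stat = -2 * np.log(p).sum()
fisher_p = stats.chi2.sf(fisher_stat, df=2 * k)

tippett_p = 1 - (1 - p.min())**k   # P(min p <= observed) under H0

print(f"Fisher combined p = {fisher_p:.3f}, Tippett combined p = {tippett_p:.3f}")
```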

  4. Analysis and modeling of a hail event consequences on a building portfolio

    NASA Astrophysics Data System (ADS)

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

    North-West Switzerland was affected by a severe hailstorm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event was around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 Cantons (out of 26) with a public insurance. The aim of this project is to use the collected insurance data to better understand and estimate the risk of such events. In a first step, a simple hail event simulator, developed for a previous hail episode, is modified: the geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians, instead of the 1D Gaussians fitted to profiles used in the previous version. The tool is then tested on this new event to establish its ability to give a fast damage estimate based on the radar image and on building values and locations. In a further step, the geometrical properties are used to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, the orientation of the roof is expected to influence the aging of the material by controlling the frequency and amplitude of thaw-freeze cycles, thus changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures; this information is then compared with insurance claims. A last part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the

  5. Wide coverage biomedical event extraction using multiple partially overlapping corpora

    PubMed Central

    2013-01-01

    Background Biomedical events are key to understanding physiological processes and disease, and wide coverage extraction is required for comprehensive automatic analysis of statements describing biomedical systems in the literature. In turn, the training and evaluation of extraction methods requires manually annotated corpora. However, as manual annotation is time-consuming and expensive, any single event-annotated corpus can only cover a limited number of semantic types. Although combined use of several such corpora could potentially allow an extraction system to achieve broad semantic coverage, there has been little research into learning from multiple corpora with partially overlapping semantic annotation scopes. Results We propose a method for learning from multiple corpora with partial semantic annotation overlap, and implement this method to improve our existing event extraction system, EventMine. An evaluation using seven event annotated corpora, including 65 event types in total, shows that learning from overlapping corpora can produce a single, corpus-independent, wide coverage extraction system that outperforms systems trained on single corpora and exceeds previously reported results on two established event extraction tasks from the BioNLP Shared Task 2011. Conclusions The proposed method allows the training of a wide-coverage, state-of-the-art event extraction system from multiple corpora with partial semantic annotation overlap. The resulting single model makes broad-coverage extraction straightforward in practice by removing the need to either select a subset of compatible corpora or semantic types, or to merge results from several models trained on different individual corpora. Multi-corpus learning also allows annotation efforts to focus on covering additional semantic types, rather than aiming for exhaustive coverage in any single annotation effort, or extending the coverage of semantic types annotated in existing corpora. PMID:23731785

  6. Microbial-based evaluation of foaming events in full-scale wastewater treatment plants by microscopy survey and quantitative image analysis.

    PubMed

    Leal, Cristiano; Amaral, António Luís; Costa, Maria de Lourdes

    2016-08-01

    Activated sludge systems are prone to foaming occurrences, which cause the sludge to rise in the reactor and affect wastewater treatment plant (WWTP) performance. Nonetheless, there is currently a knowledge gap hindering the development of tools to predict foaming events, one that may be filled by quantitative monitoring of activated sludge biota and sludge characteristics. As such, the present study focuses on the assessment of foaming events in full-scale WWTPs by quantitative analysis of protozoa, metazoa, filamentous bacteria, and sludge characteristics, further used to clarify the inner relationships between these parameters. A conventional activated sludge system (CAS) and an oxidation ditch (OD) were surveyed over periods of 2 and 3 months, respectively, regarding their biota and sludge characteristics. The biota community was monitored by microscopic observation, and a new filamentous bacteria index was developed to quantify their occurrence. Sludge characteristics (aggregated and filamentous biomass contents and aggregate size) were determined by quantitative image analysis (QIA). The resulting data were then processed by principal components analysis (PCA), cross-correlation analysis, and decision trees to assess the foaming occurrences and elucidate the inner relationships. Such events were best assessed by the combined use of the relative abundance of testate amoebae and the nocardioform filamentous index, with a 92.9 % success rate for overall foaming events, and 87.5 and 100 %, respectively, for persistent and mild events.
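
    The analysis chain described above (quantitative predictors, PCA, then a decision tree separating foaming from non-foaming observations) can be sketched with scikit-learn; the features, scales, and class rule below are assumptions, not the surveyed plants' data.

```python
# Hedged sketch: synthetic biota/sludge features -> PCA -> decision tree.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(6)
n = 60
X = np.column_stack([
    rng.normal(10, 3, n),     # testate amoebae relative abundance, %
    rng.normal(2, 1, n),      # nocardioform filamentous index (assumed scale)
    rng.normal(150, 40, n),   # median aggregate size, um
])
foaming = (0.5 * X[:, 0] + 4.0 * X[:, 1] + rng.normal(0, 3, n)) > 14

scores = PCA(n_components=2).fit_transform((X - X.mean(0)) / X.std(0))
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(scores, foaming)
print(f"training accuracy: {tree.score(scores, foaming):.0%}")
```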

  7. Two damaging hydrogeological events in Calabria, September 2000 and November 2015. Comparative analysis of causes and effects

    NASA Astrophysics Data System (ADS)

    Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela

    2016-04-01

    Each year, especially during the winter season, episodes of intense rain affect Calabria, the southernmost Italian peninsular region, triggering flash floods and mass movements that cause damage and fatalities. This work presents a comparative analysis of two events that affected the southeast sector of the region in 2000 and 2015, respectively. The event of 9th-10th September 2000 is known in Italy as the Soverato event, after the name of the municipality where the damage was most severe. In the Soverato area, more than 200 mm of rain falling in 24 hours caused a disastrous flood that swept away a campsite at about 4 a.m., killing 13 people and injuring 45. The rain also affected a larger area, causing damage in 89 (out of 409) municipalities of the region. Flooding was the most common process, damaging housing and trade; landslides mostly affected the road network, housing and cultivations. The more recent event affected the same regional sector between 30th October and 2nd November 2015. The daily rain recorded at some of the rain gauges in the area almost reached 400 mm. Of the 409 municipalities of Calabria, 109 suffered damage. The most frequent processes were flash floods and landslides. The most heavily damaged element was the road network: the representative picture of the event is a railway bridge destroyed by the river flow. Housing was damaged too, and 486 people were temporarily evacuated from their homes. The event also caused one fatality, a person killed by a flood. The event-centred study approach aims to highlight differences and similarities in both the causes and the effects of two events that occurred 15 years apart. The comparative analysis focuses on three main aspects: the intensity of the triggering rain, the modifications of urbanised areas, and the evolution of emergency management. The comparative analysis of rain is made by comparing the return period of both daily and

  8. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    NASA Astrophysics Data System (ADS)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

    Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence of statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and may help to identify hot-spot regions for potential climate-triggered violent social conflicts.
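
    Event-coincidence analysis reduces to counting windowed co-occurrences and comparing the count against surrogates. Below is a minimal sketch on synthetic monthly series; the window length, event rates, and shift-surrogate scheme are all assumptions for illustration.

```python
# Hedged sketch: coincidence count + time-shift surrogate significance test.
import numpy as np

rng = np.random.default_rng(7)
months = 360
extremes = rng.uniform(size=months) < 0.05    # climate-extreme months
conflicts = rng.uniform(size=months) < 0.03   # conflict-onset months

def coincidences(a, b, window=2):
    """Count onsets in b preceded (within 'window' months) by an event in a."""
    return sum(a[max(0, t - window):t + 1].any() for t in np.flatnonzero(b))

obs = coincidences(extremes, conflicts)
null = [coincidences(np.roll(extremes, rng.integers(1, months)), conflicts)
        for _ in range(2000)]
p = (np.sum(np.array(null) >= obs) + 1) / (len(null) + 1)
print(f"observed coincidences: {obs}, surrogate p = {p:.3f}")
```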

  9. OAE: The Ontology of Adverse Events.

    PubMed

    He, Yongqun; Sarntivijai, Sirarat; Lin, Yu; Xiang, Zuoshuang; Guo, Abra; Zhang, Shelley; Jagannathan, Desikan; Toldo, Luca; Tao, Cui; Smith, Barry

    2014-01-01

    A medical intervention is a medical procedure or application intended to relieve or prevent illness or injury. Examples of medical interventions include vaccination and drug administration. After a medical intervention, adverse events (AEs) may occur which lie outside the intended consequences of the intervention. The representation and analysis of AEs are critical to the improvement of public health. The Ontology of Adverse Events (OAE), previously named Adverse Event Ontology (AEO), is a community-driven ontology developed to standardize and integrate data relating to AEs arising subsequent to medical interventions, as well as to support computer-assisted reasoning. OAE has over 3,000 terms with unique identifiers, including terms imported from existing ontologies and more than 1,800 OAE-specific terms. In OAE, the term 'adverse event' denotes a pathological bodily process in a patient that occurs after a medical intervention. Causal adverse events are defined by OAE as those events that are causal consequences of a medical intervention. OAE represents various adverse events based on patient anatomic regions and clinical outcomes, including symptoms, signs, and abnormal processes. OAE has been used in the analysis of several different sorts of vaccine and drug adverse event data. For example, using the data extracted from the Vaccine Adverse Event Reporting System (VAERS), OAE was used to analyse vaccine adverse events associated with the administrations of different types of influenza vaccines. OAE has also been used to represent and classify the vaccine adverse events cited in package inserts of FDA-licensed human vaccines in the USA. OAE is a biomedical ontology that logically defines and classifies various adverse events occurring after medical interventions. OAE has successfully been applied in several adverse event studies. The OAE ontological framework provides a platform for systematic representation and analysis of adverse events and of the factors (e

  10. Forensic hydro-meteorological analysis of an extreme flash flood: The 2016-05-29 event in Braunsbach, SW Germany.

    PubMed

    Bronstert, Axel; Agarwal, Ankit; Boessenkool, Berry; Crisologo, Irene; Fischer, Madlen; Heistermann, Maik; Köhn-Reich, Lisei; López-Tarazón, José Andrés; Moran, Thomas; Ozturk, Ugur; Reinhardt-Imjela, Christian; Wendi, Dadiyorto

    2018-07-15

    The flash-flood in Braunsbach in the north-eastern part of Baden-Wuerttemberg/Germany was a particularly strong and localized event which took place during the floods in southern Germany at the end of May/early June 2016. This article presents a detailed analysis of the hydro-meteorological forcing and the hydrological consequences of this event. A specific approach, the "forensic hydrological analysis", was followed in order to retrospectively include and combine a variety of data from different disciplines. Such an approach investigates the origins, mechanisms and course of such natural events, if possible in a "near real time" mode, in order to follow the most recent traces of the event. The results show that it was a very rare rainfall event with extreme intensities which, in combination with catchment properties, led to extreme runoff plus severe geomorphological hazards, i.e. large debris flows, which together resulted in immense damage in the small rural town of Braunsbach. It was definitely a record-breaking event and greatly exceeded existing design guidelines for extreme flood discharge for this region, i.e. by a factor of about 10. Being such a rare or even unique event, it is not reliably feasible to put it into a crisp probabilistic context. However, one can conclude that a return period clearly above 100 years can be assigned to all event components: rainfall, peak discharge and sediment transport. Due to the complex and interacting processes, no single cause of the flood or of the very high damage can be identified, since only the interplay and cascading of these processes led to such an event. The roles of different human activities in the origin and/or intensification of such an extreme event are finally discussed. Copyright © 2018. Published by Elsevier B.V.

  11. Do event horizons exist?

    NASA Astrophysics Data System (ADS)

    Baccetti, Valentina; Mann, Robert B.; Terno, Daniel R.

    Event horizons are the defining feature of classical black holes. They are the key ingredient of the information loss paradox which, like paradoxes in quantum foundations, is built on a combination of predictions of quantum theory and counterfactual classical features: neither horizon formation nor its crossing by a test body can be detected by a distant observer. Furthermore, horizons are unnecessary for the production of Hawking-like radiation. We demonstrate that when this radiation is taken into account, it can prevent horizon crossing/formation in a large class of models. We conjecture that horizon avoidance is a general feature of collapse. The nonexistence of event horizons dispels the paradox, but opens up important questions about the thermodynamic properties of the resulting objects and the correlations between different degrees of freedom.

  12. A meta-analysis of the risk of total cardiovascular events of isosmolar iodixanol compared with low-osmolar contrast media.

    PubMed

    Zhang, Bu-Chun; Wu, Qiang; Wang, Cheng; Li, Dong-Ye; Wang, Zhi-Rong

    2014-04-01

    The iso-osmolar contrast agent iodixanol may be associated with a lower incidence of cardiac events than low-osmolar contrast media (LOCM), but previous trials have yielded mixed results. The aim was to compare the risk of total cardiovascular events of the iso-osmolar contrast medium iodixanol to that of LOCM. Medical literature databases were searched to identify comparisons between iodixanol and LOCM with cardiovascular events as a primary endpoint. A random-effects model was used to obtain pooled odds ratios (OR) for within-hospital and 30-day events. A total of 2 prospective cross-sectional studies and 11 randomized controlled trials (RCTs) (covering 6859 subjects) met our criteria. There was no significant difference in the incidence of within-hospital and 30-day cardiovascular events when iodixanol was compared with LOCM, with pooled ORs of 0.72 (95% CI 0.49-1.06, p=0.09) and 1.19 (95% CI 0.70-2.02, p=0.53), respectively. Subgroup analysis showed no relative difference when iodixanol was compared with ioxaglate (OR=0.92, 95% CI 0.50-1.70, p=0.80) or iohexol (OR=0.75, 95% CI 0.48-1.17, p=0.21). However, a reduction in within-hospital cardiovascular events was observed when iodixanol was compared with LOCM in the RCT subgroup (OR=0.65, 95% CI 0.44-0.96, p=0.03). Sensitivity analyses revealed that three studies had a strong impact on the association of within-hospital cardiovascular events between iodixanol and LOCM. Meta-regression analysis failed to account for the heterogeneity. No publication bias was detected. This meta-analysis demonstrates that there is no conclusive evidence that iodixanol is superior to LOCM overall with regard to fewer cardiovascular events. Copyright © 2014. Published by Elsevier Ltd.
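
    The random-effects pooling reported above can be sketched with the classic DerSimonian-Laird estimator (whether the authors used this exact estimator is not stated, so treat it as an assumption); inputs are per-study odds ratios and their 95% confidence limits:

        import numpy as np

        def pooled_or_dersimonian_laird(or_values, ci_low, ci_high):
            # Work on the log-odds-ratio scale; back out standard errors
            # from the 95% confidence limits.
            y = np.log(or_values)
            se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
            w = 1.0 / se**2                          # inverse-variance weights
            y_fe = np.sum(w * y) / np.sum(w)         # fixed-effect mean
            q = np.sum(w * (y - y_fe)**2)            # Cochran's Q
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
            w_re = 1.0 / (se**2 + tau2)              # random-effects weights
            y_re = np.sum(w_re * y) / np.sum(w_re)
            se_re = np.sqrt(1.0 / np.sum(w_re))
            return (np.exp(y_re),
                    np.exp(y_re - 1.96 * se_re),
                    np.exp(y_re + 1.96 * se_re))

    For instance, pooled_or_dersimonian_laird(np.array([0.8, 1.2, 0.6]), np.array([0.5, 0.7, 0.3]), np.array([1.3, 2.1, 1.2])) returns a pooled OR with its 95% CI for three hypothetical studies.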

  13. Analysis of Cumulus Solar Irradiance Reflectance (CSIR) Events

    NASA Technical Reports Server (NTRS)

    Laird, John L.; Harshvardhan

    1996-01-01

    Clouds are extremely important with regard to the transfer of solar radiation to the earth's surface. This study investigates Cumulus Solar Irradiance Reflection (CSIR) using ground-based pyranometers. CSIR events are short-term increases in solar radiation observed at the surface as a result of reflection off the sides of convective clouds. When the sun-cloud-observer geometry is favorable, these occurrences produce characteristic spikes in the pyranometer traces, and solar irradiance values may exceed expected clear-sky values. Ultraviolet CSIR events were investigated during the summer of 1995 using Yankee Environmental Systems UVA-1 and UVB-1 pyranometers. Observed data were compared to clear-sky curves generated using a third-degree polynomial best-fit technique. Periods during which the observed data exceeded this clear-sky curve were identified as CSIR events. The magnitude of a CSIR event was determined by two different quantitative calculations. The MAC (magnitude above clear-sky) is an absolute measure of the difference between the observed and clear-sky irradiances. Maximum MAC values of 3.4 W m^-2 and 0.069 W m^-2 were observed at the UV-A and UV-B wavelengths, respectively. The second calculation determined the percentage above clear-sky (PAC), which indicates the relative magnitude of a CSIR event. Maximum UV-A and UV-B PAC magnitudes of 10.1% and 7.8%, respectively, were observed during the study. Also of interest was the duration of the CSIR events, which is a function of sun-cloud-sensor geometry and the speed of cloud propagation over the measuring site. In both the UV-A and UV-B wavelengths, significant CSIR durations of up to 30 minutes were observed.
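
    The MAC and PAC definitions translate directly into code. A minimal sketch, assuming a boolean mask marking known clear-sky samples (how such a mask is chosen is not described in the abstract):

        import numpy as np

        def csir_mac_pac(time_h, irradiance, clear_mask):
            # Clear-sky curve: 3rd-degree polynomial fitted to samples
            # flagged as clear sky.
            coeffs = np.polyfit(time_h[clear_mask], irradiance[clear_mask], deg=3)
            clear_sky = np.polyval(coeffs, time_h)
            excess = irradiance - clear_sky
            event = excess > 0                    # CSIR: observed above clear sky
            mac = np.where(event, excess, 0.0)    # absolute excess, W m^-2
            pac = np.where(event, 100.0 * excess / clear_sky, 0.0)  # percent
            return event, mac, pac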

  14. Statistical analysis of hydrodynamic cavitation events

    NASA Astrophysics Data System (ADS)

    Gimenez, G.; Sommer, R.

    1980-10-01

    The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency is distributed according to a normal law, with parameters that do not evolve in time.
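
    For readers who want to repeat such a check on their own records, a minimal sketch with made-up pulse counts (the data below are illustrative, not from the paper):

        import numpy as np
        from scipy import stats

        # Hypothetical pulse counts per unit time; real data would come
        # from thresholded pressure-transducer records.
        counts = np.array([112, 98, 105, 121, 110, 93, 108, 116, 101, 107,
                           99, 113, 104, 109, 118, 95, 111, 102, 106, 114])

        stat, p = stats.normaltest(counts)  # D'Agostino-Pearson omnibus test
        print(f"normality: stat={stat:.2f}, p={p:.3f}")
        # Stationarity of the parameters can be probed by comparing the
        # two halves of the record:
        t, p_t = stats.ttest_ind(counts[:10], counts[10:])
        print(f"early vs late means: t={t:.2f}, p={p_t:.3f}")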

  15. Constraining shallow seismic event depth via synthetic modeling for Expert Technical Analysis at the IDC

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.; Bobrov, D.; Friberg, P. A.

    2015-12-01

    Depth of event is an important criterion in seismic event screening at the International Data Center (IDC), CTBTO. However, a thorough determination of event depth can mostly be conducted only through special analysis, because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have depth constrained to the surface. When the true origin depth is greater than that reasonable for a nuclear test (3 km based on existing observations), this may result in a heavier workload to manually distinguish between shallow and deep events. Also, the IDC depth criterion is not applicable to events with a small t(pP-P) travel-time difference, which is the case for a nuclear test. Since the shape of the first few seconds of signal of very shallow events is very sensitive to the presence of the depth phase, cross-correlation between observed and synthetic seismograms can provide an estimate of the event depth, and so extend the screening process. We exercised this approach mostly with events at teleseismic and, in part, regional distances. We found that this approach can be very efficient for the seismic event screening process, with certain caveats related mostly to poorly defined crustal models at source and receiver, which can shift the depth estimate. We used an adjustable t* teleseismic attenuation model for the synthetics, since this characteristic is not determined for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNE) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with Robert Herrmann's hudson96 program, and the regional modelling was done with the generalized ray technique by Vlastislav Cerveny, modified for complex source topography.
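
    A minimal sketch of the depth-screening idea: pick the trial depth whose synthetic seismogram (assumed precomputed, e.g. with hudson96, for a set of candidate depths) best matches the observed first seconds of signal by normalized cross-correlation. Function names and the dictionary layout are illustrative assumptions.

        import numpy as np

        def normalized_cross_correlation(a, b):
            # Peak of the normalized cross-correlation of two traces.
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            return np.correlate(a, b, mode="full").max()

        def best_depth(observed, synthetics_by_depth):
            # synthetics_by_depth: {depth_km: synthetic trace}, precomputed
            # for trial depths with an external modeling code (assumption).
            scores = {d: normalized_cross_correlation(observed, s)
                      for d, s in synthetics_by_depth.items()}
            return max(scores, key=scores.get), scores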

  16. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    NASA Astrophysics Data System (ADS)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, the latter as one possible originating cause of tsunamis, and propose to analyze Twitter messages to capture testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets, and thus for observing correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the forecast points of the tsunami scenario. We also intend to use twitter analysis for situation picture

  17. Subjective Well-Being and Adaptation to Life Events: A Meta-Analysis on Differences Between Cognitive and Affective Well-Being

    PubMed Central

    Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E.

    2012-01-01

    Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on cognitive and affective well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to four family events (marriage, divorce, bereavement, childbirth) and four work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being and that, for most events, the effects on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of the events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given. PMID:22059843

  18. Replica analysis of overfitting in regression models for time-to-event data

    NASA Astrophysics Data System (ADS)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in the literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  19. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. The findings described above could be proven also for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle) showing that strong influence of atmospheric
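
    A minimal peaks-over-threshold sketch of the extreme-value tools referred to above (a generalized Pareto fit to exceedances; the threshold choice and the use of scipy are assumptions, not the authors' code):

        import numpy as np
        from scipy import stats

        def fit_peaks_over_threshold(series, threshold):
            # Model exceedances over a high threshold with the generalized
            # Pareto distribution (location fixed at 0).
            exceedances = series[series > threshold] - threshold
            shape, _, scale = stats.genpareto.fit(exceedances, floc=0.0)
            rate = exceedances.size / series.size   # exceedance probability
            def return_level(m):
                # Level exceeded on average once every m observations.
                return threshold + stats.genpareto.ppf(
                    1.0 - 1.0 / (m * rate), shape, loc=0.0, scale=scale)
            return shape, scale, return_level

    The same machinery applies to the lower tail by fitting exceedances of the negated series, which is one way to treat ozone lows and highs symmetrically.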

  20. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    NASA Astrophysics Data System (ADS)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking has brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study phenomena in the natural sciences has prompted many specialists and scholars to innovate their research. Although politics has been integrally involved in hyperlinked-word issues since the 1990s, and the automatic assembly of different geospatial web services and distributed geospatial information systems using service chaining has recently been explored and built, information collection and data visualisation for geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpectedness of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology and text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationship of text information and to trace the dissemination of information regarding geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion in political events.

  1. Transcriptome Bioinformatical Analysis of Vertebrate Stages of Schistosoma japonicum Reveals Alternative Splicing Events

    PubMed Central

    Wang, Xinye; Xu, Xindong; Lu, Xingyu; Zhang, Yuanbin; Pan, Weiqing

    2015-01-01

    Alternative splicing is a molecular process that contributes greatly to the diversification of the proteome and to gene functions. Understanding the mechanisms of stage-specific alternative splicing can provide a better understanding of the development of eukaryotes and the functions of different genes. Schistosoma japonicum is an infectious blood-dwelling trematode with a complex lifecycle that causes the tropical disease schistosomiasis. In this study, we analyzed the transcriptome of Schistosoma japonicum to discover alternative splicing events in this parasite, by applying RNA-seq to cDNA libraries of adults and schistosomula. Results were validated by RT-PCR and sequencing. We found 11,623 alternative splicing events among 7,099 protein-encoding genes, and the average proportion of alternative splicing events per gene was 42.14%. We showed that exon skipping is the most common type of alternative splicing event, as in higher eukaryotes, whereas intron retention is the least common. According to intron boundary analysis, the parasite possesses the same intron boundaries as other organisms, namely the classic "GT-AG" rule, although in alternatively spliced introns or exons this rule is less strict. We also attempted to detect alternative splicing events in genes encoding proteins with signal peptides and transmembrane helices, suggesting that alternative splicing could change the subcellular locations of specific gene products. Our results indicate that alternative splicing is prevalent in this parasitic worm, and that in this respect the worm is close to its hosts. The revealed secretome involved in alternative splicing implies a new perspective on understanding the interaction between the parasite and its host. PMID:26407301
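
    As a hedged illustration of how exon-skipping events can be recognized from splice junctions (a simplified single-gene model with made-up coordinates, not the authors' pipeline):

        def exon_skip_junctions(exons, junctions):
            # exons: sorted (start, end) pairs for one gene's annotated exons.
            # junctions: (donor_end, acceptor_start) pairs observed in RNA-Seq
            # reads. A junction that spans a whole annotated exon is evidence
            # of an exon-skipping event.
            skips = []
            for donor, acceptor in junctions:
                skipped = [e for e in exons if donor < e[0] and e[1] < acceptor]
                if skipped:
                    skips.append(((donor, acceptor), skipped))
            return skips

        exons = [(100, 200), (300, 400), (500, 600)]
        junctions = {(200, 300), (200, 500)}   # the second skips exon (300, 400)
        print(exon_skip_junctions(exons, junctions))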

  2. The Braunsbach Flashflood of May 29, 2016: A forensic analysis of the meteorological origin and the hydrological development of an extreme hydro-meteorological event

    NASA Astrophysics Data System (ADS)

    Bronstert, Axel; Agarwal, Ankit; Boessenkool, Berry; Fischer, Madlen; Heistermann, Maik; Köhn-Reich, Lisei; Moran, Thomas; Wendi, Dadiyorto

    2017-04-01

    The flash-flood of 29 May 2016 in the vicinity of the village of Braunsbach in Southwestern Germany, State of Baden-Wuerttemberg, was a particularly localized event among the floods occurring in southern Germany at the end of May / early June 2016. This extreme event was triggered by a convective high-intensity rain storm, causing extreme discharge rates and a subsequent debris flow in the local creek. This led to severe flooding of the village with immense damage. Besides its extreme nature, the event is characterized by very local and short-term scales: the catchment of the creek covers an area of only six km2, and the whole event lasted only two hours. This contribution presents a retrospective analysis with regard to meteorology and hydrology to obtain a quantitative assessment of the governing processes and their development. We term this a "forensic analysis" because, due to the very local and sudden nature of this flash-flood event, the processes could not be directly measured during the event and/or at the site. Instead, they had to be reconstructed and estimated after the event from a variety of rather different information sources and "soft" data. Using these types of post-event observations and analyses, we aim at obtaining a rather comprehensive picture of the event and its consequences. Regarding rainfall, both station data from the surroundings of the catchment and radar data from the German Weather Service were analyzed, including an analysis of different error types and of the dynamic features of the convective system. The flood hydrograph, including the maximum discharge rate during the event, was estimated by three different approaches, which were compared to obtain an idea of the associated uncertainty. The overall results of this forensic analysis show that it was a very rare rainfall event with extreme rainfall intensities, e.g. a return period exceeding 100 years. Catalyzed by catchment properties, this led to extreme runoff, severe soil erosion

  3. Mediators of the relationship between social anxiety and post-event rumination.

    PubMed

    Chen, Junwen; Rapee, Ronald M; Abbott, Maree J

    2013-01-01

    A variety of cognitive and attentional factors are hypothesised to be associated with post-event rumination, a key construct that has been proposed to contribute to the maintenance of social anxiety disorder (SAD). The present study aimed to explore factors contributing to post-event rumination following delivery of a speech in a clinical population. 121 participants with SAD completed measures of trait social anxiety a week before they undertook a speech task. After the speech, participants answered several questionnaires assessing their state anxiety, self-evaluation of performance, perceived focus of attention, and the probability and cost of expected negative evaluation. One week later, participants completed measures of negative rumination experienced over the week. Results showed two pathways leading to post-event rumination: (1) a direct path from trait social anxiety to post-event rumination and (2) indirect paths from trait social anxiety to post-event rumination via its relationships with inappropriate attentional focus and self-evaluation of performance. The results suggest that post-event rumination is at least partly predicted by the extent to which socially anxious individuals negatively perceive their own performance and allocate attentional resources to this negative self-image. The current findings support the key relationships among cognitive processes proposed by cognitive models. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Single Particle Analysis by Combined Chemical Imaging to Study Episodic Air Pollution Events in Vienna

    NASA Astrophysics Data System (ADS)

    Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne

    2017-04-01

    The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on regional and continental scales. The identification of major aerosol constituents for basic source apportionment and air quality issues requires a high analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on time scales of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time, and the related sample amount, necessary to apply the full range of bulk analytical methods needed for chemical characterization. Additionally, morphological and single-particle features are hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a technique complementary to bulk analytical methods, chemical imaging offers a new way to study air pollution events, obtaining major aerosol constituents together with single-particle features at high temporal resolution and with small sample volumes. The analysis of chemical imaging datasets is assisted by multivariate statistics, with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel approach in chemical imaging is combined chemical imaging, or so-called multisensor hyperspectral imaging, involving elemental imaging (electron microscopy-based energy-dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy) and mass spectrometric imaging (Time-of-Flight Secondary Ion Mass Spectrometry), with subsequent combined multivariate analytics. Combined chemical imaging of precipitated aerosol particles will be demonstrated by the following examples of air pollution events in Vienna: Exceptional episodic events like the transformation of Saharan dust by the impact of the city of Vienna will be discussed and compared to samples obtained at a high alpine

  5. Painful and provocative events scale and fearlessness about death among Veterans: Exploratory factor analysis.

    PubMed

    Poindexter, Erin K; Nazem, Sarra; Forster, Jeri E

    2017-01-15

    The interpersonal theory of suicide suggests three proximal risk factors for suicide: perceived burdensomeness, thwarted belongingness, and acquired capability. Previous literature indicates that repetitive exposure to painful and provocative events is related to increased acquired capability for suicide. Despite this, research related to the assessment of painful and provocative events has been insufficient. Research has inconsistently administered the Painful and Provocative Events Scale (PPES; a painful and provocative events assessment), and no study has examined the factor structure of the English PPES. This study explored the factor structure of the PPES and the relation between factors and fearlessness about death. The sample was cross-sectional and self-report, comprising 119 Veterans (mean age = 46.5, SD = 13.5). Findings from an exploratory factor analysis indicated a four-factor solution for the PPES; however, no factor from the PPES was significantly related to fearlessness about death (measured by the Acquired Capability for Suicide Scale - Fearlessness About Death Scale; all p > .21). Limitations include the cross-sectional design and the small Veteran sample. Findings suggest that the PPES lacks the psychometric properties necessary to reliably investigate painful and provocative factors. Consequently, this measure may not reliably capture and explain how painful and provocative events relate to fearlessness about death, which is a barrier to improving suicide risk assessment and prediction. Recommendations for the construction of a new PPES are offered. Published by Elsevier B.V.
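
    A minimal exploratory-factor-analysis sketch for questionnaire data of this kind (scikit-learn's FactorAnalysis with varimax rotation, available since scikit-learn 0.24; the four-factor choice mirrors the abstract, everything else is an assumption):

        import numpy as np
        from sklearn.decomposition import FactorAnalysis
        from sklearn.preprocessing import StandardScaler

        def efa_loadings(item_scores, n_factors=4):
            # item_scores: (n_respondents, n_items) questionnaire matrix.
            z = StandardScaler().fit_transform(item_scores)
            fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
            fa.fit(z)
            return fa.components_.T   # (n_items, n_factors) loading matrix

    Items are conventionally assigned to the factor on which their absolute loading is largest, which is how a four-factor solution like the one above would be read off.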

  6. MULTI-SPACECRAFT ANALYSIS OF ENERGETIC HEAVY ION AND INTERPLANETARY SHOCK PROPERTIES IN ENERGETIC STORM PARTICLE EVENTS NEAR 1 au

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ebert, R. W.; Dayeh, M. A.; Desai, M. I.

    2016-11-10

    We examine the longitude distribution of and relationship between interplanetary (IP) shock properties and ∼0.1–20 MeV nucleon^-1 O and Fe ions during seven multi-spacecraft energetic storm particle (ESP) events at 1 au. These ESP events were observed at two spacecraft and were primarily associated with low Mach number, quasi-perpendicular shocks. Key observations include the following: (i) the Alfvén Mach number increased from east to west of the coronal mass ejection source longitude, while the shock speed, compression ratios, and obliquity showed no clear dependence; (ii) the O and Fe time intensity profiles and peak intensities varied significantly between longitudinally separated spacecraft observing the same event, the peak intensities being larger near the nose and smaller along the flank of the IP shock; (iii) the O and Fe peak intensities had weak to no correlations with the shock parameters; (iv) the Fe/O time profiles showed intra-event variations upstream of the shock that disappeared downstream of the shock, where values plateaued to those comparable to the mean Fe/O of solar cycle 23; (v) the O and Fe spectral indices ranged from ∼1.0 to 3.4, the Fe spectra being softer in most events; and (vi) the observed spectral index was softer than the value predicted from the shock compression ratio in most events. We conclude that while the variations in IP shock properties may account for some variations in O and Fe properties within these multi-spacecraft events, detailed examination of the upstream seed population and IP turbulence, along with modeling, is required to fully characterize these observations.
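
    A sketch of the two quantities behind points (v) and (vi): the spectral index fitted from measured intensities, and the index predicted from the shock compression ratio under steady-state diffusive shock acceleration for non-relativistic ions (the prediction formula is a standard textbook result, assumed here rather than quoted from the paper):

        import numpy as np

        def fitted_spectral_index(energy, intensity):
            # Fit j(E) ~ E**(-gamma) by linear regression in log-log space.
            slope, _ = np.polyfit(np.log(energy), np.log(intensity), 1)
            return -slope

        def dsa_predicted_index(r):
            # Differential-intensity index for non-relativistic ions from
            # steady-state diffusive shock acceleration:
            # gamma = (r + 2) / (2 * (r - 1)), r = shock compression ratio.
            return (r + 2.0) / (2.0 * (r - 1.0))

    Observed spectra "softer than predicted" then simply means fitted_spectral_index(...) exceeding dsa_predicted_index(r) for the measured compression ratio.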

  7. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and of the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
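
    The event consumption/creation loop at the heart of any such engine can be sketched in a few lines (a generic priority-queue simulator, not EDSE's actual implementation):

        import heapq

        class EventDrivenSimulator:
            # A time-ordered event queue; consuming one event may create
            # (schedule) future events.
            def __init__(self):
                self.queue = []   # heap of (time, seq, handler, payload)
                self.now = 0.0
                self._seq = 0     # tie-breaker for simultaneous events

            def schedule(self, delay, handler, payload=None):
                heapq.heappush(self.queue,
                               (self.now + delay, self._seq, handler, payload))
                self._seq += 1

            def run(self, until=float("inf")):
                while self.queue and self.queue[0][0] <= until:
                    self.now, _, handler, payload = heapq.heappop(self.queue)
                    handler(self, payload)  # consumption may create new events

        # Usage: a sensor-reading event that re-schedules itself.
        def sensor_read(sim, value):
            print(f"t={sim.now:.1f} sensor value={value}")
            sim.schedule(2.0, sensor_read, value + 1)

        sim = EventDrivenSimulator()
        sim.schedule(0.0, sensor_read, 0)
        sim.run(until=6.0)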

  8. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    PubMed

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  9. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    PubMed Central

    2011-01-01

    organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions: Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  10. Rare event computation in deterministic chaotic systems using genealogical particle analysis

    NASA Astrophysics Data System (ADS)

    Wouters, J.; Bouchet, F.

    2016-09-01

    In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein-Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function.
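
    A toy version of the genealogical selection step for the Ornstein-Uhlenbeck example mentioned above (estimating P(X_T > a); the tilting strength alpha, the discretization, and the estimator layout are illustrative choices, not the paper's algorithm):

        import numpy as np

        def ou_tail_probability(a=2.5, alpha=3.0, n=100_000,
                                n_steps=100, dt=0.01, seed=0):
            # Estimate P(X_T > a) for dX = -X dt + dW, X_0 = 0.
            # Each step, particles are cloned/killed with weights
            # exp(alpha * dx), pushing the ensemble toward large values;
            # the final average is re-weighted by exp(-alpha * x) and by
            # the running normalization so the estimate stays consistent.
            rng = np.random.default_rng(seed)
            x = np.zeros(n)
            log_z = 0.0                          # running log-normalization
            for _ in range(n_steps):
                dx = -x * dt + np.sqrt(dt) * rng.standard_normal(n)
                x = x + dx
                w = np.exp(alpha * dx)
                log_z += np.log(w.mean())
                x = x[rng.choice(n, size=n, p=w / w.sum())]  # selection
            return np.exp(log_z) * np.mean((x > a) * np.exp(-alpha * x))

    Without the selection step, a plain Monte Carlo estimate of the same probability would need vastly more particles to see even a single over-threshold sample.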

  11. Deep learning based beat event detection in action movie franchises

    NASA Astrophysics Data System (ADS)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage the massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground-truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat-events on this dataset. A training dataset for each of the eleven beat categories is developed and a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels for the key frames in a particular shot are then used to assign a unique label to each shot. A simple sliding-window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented in terms of precision, recall, and F-measure. The results are compared with an existing technique, and significant improvements are recorded.
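
    The shot-grouping step described above is a simple run-length merge; a sketch (label strings are made up):

        from itertools import groupby

        def group_beat_events(shot_labels):
            # Merge runs of adjacent shots sharing the same predicted label
            # into beat events: (label, first_shot, last_shot) tuples.
            events, i = [], 0
            for label, run in groupby(shot_labels):
                n = len(list(run))
                events.append((label, i, i + n - 1))
                i += n
            return events

        print(group_beat_events(["chase", "chase", "fight",
                                 "fight", "fight", "dialog"]))
        # [('chase', 0, 1), ('fight', 2, 4), ('dialog', 5, 5)]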

  12. Best Practices in Public Outreach Events

    NASA Astrophysics Data System (ADS)

    Cobb, Whitney; Buxner, Sanlyn; Shipp, Stephanie

    2015-11-01

    Introduction: Each year the National Aeronautics and Space Administration (NASA) sponsors public outreach events designed to increase student, educator, and general public engagement in its missions and goals. NASA SMD Education's review of large-scale events, "Best Practices in Outreach Events," highlighted planning and implementation best practices, which were used by the Dawn mission to strategize and implement its Ceres arrival celebration event, i C Ceres. Background: The literature review focused on identifying best practices arising from evaluations of large-scale public outreach events. The following criteria guided the study: public, science-related events open to adults and children; events that occurred during the last 5 years; evaluations that included information on data collected from visitors and/or volunteers; and evaluations that specified the type of data collected, methodology, and associated results. Best Practices, Planning and Implementation: The literature review revealed key considerations for planning and implementing large-scale events. The best practices identified can be pertinent for all event organizers and evaluators regardless of event size. A summary of related best practices is presented below. 1) Advertise the event. 2) Use and advertise access to scientists: attendees who reported an interaction with a science professional were 15% to 19% more likely to report positive learning impacts (SFA, 2012, p. 24). 3) Recruit scientists using findings such as: high percentages of scientists (85% to 96%) from most events were interested in participating again (SFA, 2012). 4) Ensure that the event is group- and, particularly, child-friendly. 5) Target specific event outcomes. Best Practices Informing Real-world Planning, Implementation and Evaluation: The Dawn mission's collaborative design of a series of events, i C Ceres, including in-person, interactive events geared to families and live presentations, will be shared, with focus on the family event, and the evidence

  13. Trauma and recent life events in individuals at ultra high risk for psychosis: review and meta-analysis.

    PubMed

    Kraan, Tamar; Velthorst, Eva; Smit, Filip; de Haan, Lieuwe; van der Gaag, Mark

    2015-02-01

    Childhood trauma and recent life events have been related to psychotic disorders. The aim of the present study was to examine whether childhood trauma and recent life events are significantly more prevalent in patients at Ultra High Risk (UHR) of developing a psychotic disorder than in healthy controls. A search of PsychInfo and Embase was conducted, relevant papers were reviewed, and three random-effects meta-analyses were performed. One meta-analysis assessed the prevalence rate of childhood trauma in UHR subjects, and two meta-analyses compared UHR subjects and healthy control subjects on the experience of childhood trauma and recent life events. We found 12 studies on the prevalence of (childhood) trauma in UHR populations and 4 studies on recent life events in UHR populations. We performed a meta-analysis on the 6 studies for which trauma prevalence rates were available, yielding a mean prevalence rate of childhood trauma in UHR populations of 86.8% (95% CI 77%-93%). Childhood trauma was significantly more prevalent in UHR subjects than in healthy control groups (random-effects Hedges' g=1.09; Z=4.60, p<.001). In contrast to our hypothesis, life-event rates were significantly lower in UHR subjects than in healthy controls (random-effects Hedges' g=-0.53; Z=-2.36, p<.02). Our meta-analytic results illustrate that childhood trauma is highly prevalent among UHR subjects and that childhood trauma is related to UHR status. These results are in line with studies on childhood trauma in psychotic populations. In contrast to studies on recent life events in psychotic populations, our results show that recent life events are not associated with UHR status. Copyright © 2014 Elsevier B.V. All rights reserved.
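
    The effect size used above, Hedges' g, is computed from group means, SDs and sizes with a small-sample correction; a minimal sketch:

        import numpy as np

        def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
            # Standardized mean difference with the small-sample
            # correction factor J (Hedges' g).
            sd_pooled = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                                / (n1 + n2 - 2))
            d = (mean1 - mean2) / sd_pooled
            j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)  # bias correction
            return j * d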

  14. Living with extreme weather events - perspectives from climatology, geomorphological analysis, chronicles and opinion polls

    NASA Astrophysics Data System (ADS)

    Auer, I.; Kirchengast, A.; Proske, H.

    2009-09-01

    The ongoing climate change debate focuses more and more on changing extreme events. Information on past events can be derived from a number of sources, such as instrumental data and residual impacts in the landscape, but also chronicles and people's memories. A project called "A Tale of Two Valleys", within the framework of the research program "proVision", allowed us to study past extreme events in two inner-alpine valleys from the sources mentioned above. Instrumental climate time series provided information for the past 200 years; however, great attention had to be given to the homogeneity of the series. To derive homogenized time series of selected climate change indices, methods such as HOCLIS and the Vincent method were applied. Trend analyses of climate change indices inform about increases or decreases in extreme events. Traces of major geomorphodynamic processes of the past (e.g. rockfalls, landslides, debris flows) which were triggered or affected by extreme weather events are still apparent in the landscape and could be evaluated by geomorphological analysis using remote sensing and field data. Regional chronicles provided additional knowledge and covered longer periods back in time; however, compared with meteorological time series they carry a high degree of subjectivity, and intermittent recording cannot be ruled out. Finally, questionnaires and oral history complemented our picture of past extreme weather events. People were affected differently and have different memories of the events. The joint analysis of these four data sources showed agreement to some extent, but also some reasonable differences: meteorological data are point measurements only, sometimes with too coarse a temporal resolution. Due to land-use changes and improved constructional measures, the impact of an extreme meteorological event may be different today compared with earlier times.

  15. Root cause analysis of serious adverse events among older patients in the Veterans Health Administration.

    PubMed

    Lee, Alexandra; Mills, Peter D; Neily, Julia; Hemphill, Robin R

    2014-06-01

    Preventable adverse events are more likely to occur among older patients because of the clinical complexity of their care. The Veterans Health Administration (VHA) National Center for Patient Safety (NCPS) stores data about serious adverse events when a root cause analysis (RCA) has been performed. A primary objective of this study was to describe the types of adverse events occurring among older patients (age ≥ 65 years) in Department of Veterans Affairs (VA) hospitals. Secondary objectives were to determine the underlying reasons for the occurrence of these events and report on effective action plans that have been implemented in VA hospitals. In a retrospective, cross-sectional review, RCA reports were reviewed and outcomes reported using descriptive statistics for all VA hospitals that conducted an RCA for a serious geriatric adverse event from January 2010 to January 2011 that resulted in sustained injury or death. The search produced 325 RCA reports on VA patients (age ≥ 65 years). Falls (34.8%), delays in diagnosis and/or treatment (11.7%), unexpected death (9.9%), and medication errors (9.0%) were the most commonly reported adverse events among older VA patients. Communication was the most common underlying reason for these events, representing 43.9% of reported root causes. Approximately 40% of implemented action plans were judged by local staff to be effective. The RCA process identified falls and communication as important themes in serious adverse events. Concrete actions, such as process standardization and changes to communication, were reported by teams to yield some improvement. However, fewer than half of the action plans were reported to be effective. Further research is needed to guide development and implementation of effective action plans.

  16. Analysis and Prediction of Exon Skipping Events from RNA-Seq with Sequence Information Using Rotation Forest.

    PubMed

    Du, Xiuquan; Hu, Changlin; Yao, Yu; Sun, Shiwei; Zhang, Yanping

    2017-12-12

    In bioinformatics, exon skipping (ES) event prediction is an essential part of alternative splicing (AS) event analysis. Although many methods have been developed to predict ES events, a fully satisfactory solution has yet to be found. In this study, given the limitations of machine learning algorithms that use RNA-Seq data or genome sequences alone, a new feature set, called RS (RNA-Seq and sequence) features, was constructed. These features include RNA-Seq features derived from RNA-Seq data and sequence features derived from genome sequences. We propose a novel Rotation Forest classifier to predict ES events with the RS features (RotaF-RSES). To validate the efficacy of RotaF-RSES, a dataset from two human tissues was used, and RotaF-RSES achieved an accuracy of 98.4%, a specificity of 99.2%, a sensitivity of 94.1%, and an area under the curve (AUC) of 98.6%. Compared with the other available methods, these results indicate that RotaF-RSES is efficient and can predict ES events with RS features.
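
    scikit-learn has no built-in Rotation Forest, so the following is a deliberately minimal sketch of the idea (random feature subsets, a PCA rotation per subset, one tree per rotation, majority vote), not the authors' RotaF-RSES implementation:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.tree import DecisionTreeClassifier

        class MiniRotationForest:
            # For each tree: permute the features, split them into subsets,
            # fit a full PCA per subset, and train the tree on the rotated
            # (concatenated) components. Majority vote at prediction time.
            def __init__(self, n_trees=10, n_subsets=3, seed=0):
                self.n_trees, self.n_subsets = n_trees, n_subsets
                self.rng = np.random.default_rng(seed)
                self.models = []

            def _rotate(self, X, perm, pcas):
                parts = np.array_split(perm, self.n_subsets)
                return np.hstack([pca.transform(X[:, idx])
                                  for pca, idx in zip(pcas, parts)])

            def fit(self, X, y):
                for _ in range(self.n_trees):
                    perm = self.rng.permutation(X.shape[1])
                    pcas = [PCA().fit(X[:, idx])
                            for idx in np.array_split(perm, self.n_subsets)]
                    tree = DecisionTreeClassifier(random_state=0)
                    tree.fit(self._rotate(X, perm, pcas), y)
                    self.models.append((perm, pcas, tree))
                return self

            def predict(self, X):
                # Assumes integer class labels (e.g. 0 = no skip, 1 = skip).
                votes = np.stack([tree.predict(self._rotate(X, perm, pcas))
                                  for perm, pcas, tree in self.models]).astype(int)
                return np.apply_along_axis(
                    lambda v: np.bincount(v).argmax(), 0, votes)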

  17. Prediction of Intensity Change Subsequent to Concentric Eyewall Events

    NASA Astrophysics Data System (ADS)

    Mauk, Rachel Grant

    Concentric eyewall events have been documented numerous times in intense tropical cyclones over the last two decades. During a concentric eyewall event, an outer (secondary) eyewall forms around the inner (primary) eyewall. Improved instrumentation on aircraft and satellites greatly increases the likelihood of detecting an event. Despite the increased ability to detect such events, forecasts of intensity changes during and after these events remain poor. When concentric eyewall events occur near land, accurate intensity change predictions are especially critical to ensure proper emergency preparations and staging of recovery assets. A nineteen-year (1997-2015) database of concentric eyewall events is developed by analyzing microwave satellite imagery, aircraft- and land-based radar, and other published documents. Events are identified in both the North Atlantic and eastern North Pacific basins. TCs are categorized as single (1 event), serial (≥ 2 events) and super-serial (≥ 3 events). Key findings here include distinct spatial patterns for single and serial Atlantic TCs, a broad seasonal distribution for eastern North Pacific TCs, and apparent ENSO-related variability in both basins. The intensity change subsequent to the concentric eyewall event is calculated from the HURDAT2 database at time points relative to the start and to the end of the event. Intensity change is then categorized as Weaken (≤ -10 kt), Maintain (±5 kt), and Strengthen (≥ 10 kt). Environmental conditions in which each event occurred are analyzed based on the SHIPS diagnostic files. Oceanic, dynamic, thermodynamic, and TC status predictors are selected for testing in a multiple discriminant analysis procedure to determine which variables successfully discriminate the intensity change category and the occurrence of additional concentric eyewall events. Intensity models are created for 12 h, 24 h, 36 h, and 48 h after the concentric eyewall events end. Leave-one-out cross validation is
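
    The discriminant step can be sketched with scikit-learn's linear discriminant analysis and leave-one-out cross-validation (the predictor matrix and three-class coding below are assumptions based on the abstract):

        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        def loo_discriminant_accuracy(X, y):
            # X: (n_events, n_predictors) SHIPS-style predictors
            #    (e.g. shear, SST, TC status variables).
            # y: intensity-change class per event:
            #    0 = Weaken, 1 = Maintain, 2 = Strengthen.
            lda = LinearDiscriminantAnalysis()
            return cross_val_score(lda, X, y, cv=LeaveOneOut()).mean()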

  18. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    NASA Astrophysics Data System (ADS)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by, on average, distances of 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera, version 9.1, set to operate at a frame rate of 2,500 frames per second, with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5-5.6 G in the stationary sensors, and model AF-S ED 24 mm 1:1.4 in the mobile sensor). All videos were GPS (Global Positioning System) time-stamped. For this work we used a data set collected on four days of manual RAMMER operation in the 2012 and 2013 campaigns. On February 18th the data set comprises 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On February 19th a total of 5 flashes was registered by two cameras and 1 flash by three cameras. On February 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing currents. Problems in the temporal measurement of the continuing current can generate imprecision in the optical analysis; therefore, this work aims to evaluate the effects of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  19. Seasonality and Disturbance Events in the Carbon Isotope Record of Pinus elliottii Tree Rings from Big Pine Key, Florida

    NASA Astrophysics Data System (ADS)

    Rebenack, C.; Anderson, W. T.; Cherubini, P.

    2012-12-01

    , and disturbance events, such as tropical cyclone impacts. Because slash pine growth is dependent on water availability, a chronology developed using carbon isotopes may provide greater insight into plant stress over time and ultimately may lead to better correlations with climate oscillations. The work presented here is the result of a carbon-isotope study of four slash pine trees located across a freshwater gradient on Big Pine Key, Florida. A site chronology has been developed by cross-dating the δ13C records for each of the trees. The tree located on the distal edge of the freshwater gradient shows an overall enriched isotopic signature over time compared to the trees growing over a deeper part of the local freshwater lens, indicating that these trees are sensitive to water stress. In addition, the carbon isotope data show seasonal stomatal activity in the trees and indicate the timing of two disturbance events.

  20. A Global Geospatial Database of 5000+ Historic Flood Event Extents

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Sullivan, J.; Doyle, C.; Kettner, A.; Brakenridge, G. R.; Erickson, T.; Slayback, D. A.

    2017-12-01

    A key dataset that is missing for global flood model validation and for understanding historic spatial flood vulnerability is a global historical geo-database of flood event extents. Decades of earth-observing satellites and cloud computing now make it possible not only to detect floods in near real time, but to run these water detection algorithms back in time to capture the spatial extent of large numbers of specific events. This talk will show results from the largest global historical flood database developed to date. We use the Dartmouth Flood Observatory flood catalogue to map over 5000 floods (from 1985-2017) using the MODIS, Landsat, and Sentinel-1 satellites. All events are available for public download via the Earth Engine Catalogue and via a website that allows the user to query floods by area or date, assess population exposure trends over time, and download flood extents in geospatial format. In this talk, we will highlight major trends in global flood exposure per continent, land use type, and eco-region. We will also make suggestions on how to use this dataset in conjunction with other global datasets to i) validate global flood models, ii) assess the potential role of climatic change in flood exposure, iii) understand how urbanization and other land change processes may influence spatial flood exposure, iv) assess how innovative flood interventions (e.g. wetland restoration) influence flood patterns, v) control for event magnitude to assess the role of social vulnerability and damage assessment, and vi) aid in rapid probabilistic risk assessment to enable microinsurance markets. The authors are already using the database for the latter three applications and will show examples of wetland intervention analysis in Argentina, social vulnerability analysis in the USA, and microinsurance in India.

  1. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    PubMed

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large-scale data collection of driver, vehicle, and environment information in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety-critical events such as crashes and near crashes. However, finding safety-critical events in NDS data is often difficult and time consuming. Safety-critical events are currently identified using kinematic triggers, for instance searching for decelerations below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety-critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data are growing exponentially over time, this reviewing procedure may no longer be viable in the very near future. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety-critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reactions may be the key to recognizing safety-critical events. In fact, whether an event is safety-critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state-of-the-art subjective review procedures used to identify safety-critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential
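
    A minimal sketch of such a kinematic trigger (the -4 m/s² threshold and the merge gap are illustrative values, not those used in euroFOT):

        import numpy as np

        def harsh_braking_events(accel_ms2, t_s, threshold=-4.0, min_gap_s=5.0):
            # Flag candidate safety-critical events where longitudinal
            # acceleration drops below a deceleration threshold, merging
            # detections closer together than min_gap_s into one event.
            below = np.flatnonzero(accel_ms2 < threshold)
            events, last_t = [], -np.inf
            for i in below:
                if t_s[i] - last_t > min_gap_s:
                    events.append(t_s[i])   # onset of a new candidate event
                last_t = t_s[i]
            return events

    The low specificity discussed above is visible directly in such code: every hard but uneventful brake fires the trigger, which is why the candidate list still needs review.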

  2. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    PubMed

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rates and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
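
    Two of the analysis features named above, threshold-and-window event detection and the event-triggered average, can be sketched as follows (parameter values are illustrative, not g-PRIME defaults):

        import numpy as np

        def threshold_window_events(signal, lo, hi, min_separation):
            # Threshold-and-window detection: an event onset is a sample
            # whose value falls inside [lo, hi], keeping only onsets
            # separated by at least min_separation samples.
            idx = np.flatnonzero((signal >= lo) & (signal <= hi))
            keep, last = [], -min_separation
            for i in idx:
                if i - last >= min_separation:
                    keep.append(i)
                last = i
            return np.array(keep)

        def event_triggered_average(channel, event_idx, pre=50, post=100):
            # Average a second, simultaneously acquired channel around
            # the detected event times.
            segs = [channel[i - pre:i + post] for i in event_idx
                    if i - pre >= 0 and i + post <= len(channel)]
            return np.mean(segs, axis=0)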

  3. Recollection-Dependent Memory for Event Duration in Large-Scale Spatial Navigation

    ERIC Educational Resources Information Center

    Brunec, Iva K.; Ozubko, Jason D.; Barense, Morgan D.; Moscovitch, Morris

    2017-01-01

    Time and space represent two key aspects of episodic memories, forming the spatiotemporal context of events in a sequence. Little is known, however, about how temporal information, such as the duration and the order of particular events, are encoded into memory, and if it matters whether the memory representation is based on recollection or…

  4. Final report on key comparison CCQM-K100: Analysis of copper in ethanol

    NASA Astrophysics Data System (ADS)

    Zhou, Tao; Kakoulides, Elias; Zhu, Yanbei; Jaehrling, Reinhard; Rienitz, Olaf; Saxby, David; Phukphatthanachai, Pranee; Yafa, Charun; Labarraque, Guillaume; Cankur, Oktay; Can, Süleyman Z.; Konopelko, Leonid A.; Kustikov, Yu A.; Caciano de Sena, Rodrigo; Marques Rodrigues, Janaina; Fonseca Sarmanho, Gabriel; Fortunato de Carvalho Rocha, Werickson; dos Reis, Lindomar Augusto

    2014-01-01

    The increasing share of renewable sources in countries' energy matrices is an effort to reduce dependency on crude oil and the environmental impacts associated with its use. In order to help overcome the lack of widely accepted quality standards for fuel ethanol and to guarantee its competitiveness on the international trade market, national metrology institutes (NMIs) have been working to develop certified reference materials and measurement methods for bio-fuels. Inorganic impurities such as Cu, Na and Fe may be present in fuel ethanol, and their presence is associated with corrosion and the formation of oxide deposits in some engine parts. The key comparison CCQM-K100 was carried out under the auspices of the Inorganic Analysis Working Group (IAWG) and the coordination of the National Institute of Metrology, Quality and Technology (INMETRO). The objective of this key comparison was to compare the measurement capabilities of the participants for the determination of Cu in fuel ethanol. Ten NMIs participated in this exercise, and most of them used the isotope dilution method for determining the amount of Cu. The median was chosen as the key comparison reference value (KCRV). The assigned KCRV for the Cu content was 0.3589 µg/g with a combined standard uncertainty of 0.0014 µg/g. In general, there is good agreement among the participants' results. This text appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  5. Stressful life events during adolescence and risk for externalizing and internalizing psychopathology: a meta-analysis.

    PubMed

    March-Llanes, Jaume; Marqués-Feixa, Laia; Mezquita, Laura; Fañanás, Lourdes; Moya-Higueras, Jorge

    2017-12-01

    The main objective of the present research was to analyze the relations between stressful life events and the externalizing and internalizing spectra of psychopathology using meta-analytical procedures. After removing duplicates, a total of 373 papers were found in a literature search using several bibliographic databases, including PsycINFO, Medline, Scopus, and Web of Science. Twenty-seven studies were selected for the meta-analysis after applying inclusion and exclusion criteria in successive phases. The statistical procedure used a random/mixed-effects model based on the correlations found in the studies. Significant positive correlations were found in both cross-sectional and longitudinal studies. A transactional effect was thus found in the present study: stressful life events could be a cause, but also a consequence, of psychopathological spectra. The level of controllability of the life events did not affect the results. Special attention should be given to the use of stressful life events in gene-environment interaction and correlation studies, and also for clinical purposes.
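
    The random-effects pooling of study-level correlations at the core of such a meta-analysis can be sketched as follows: a generic DerSimonian-Laird estimator on Fisher-z transformed correlations. This is an assumption about the kind of model meant by "random/mixed-effects", not the authors' exact procedure.

        import numpy as np

        def pool_correlations(r, n):
            """Pool correlations r from studies with sample sizes n.
            Returns the pooled correlation and its 95% CI."""
            r, n = np.asarray(r, float), np.asarray(n, float)
            z = np.arctanh(r)                 # Fisher z per study
            v = 1.0 / (n - 3.0)               # within-study variance
            w = 1.0 / v
            z_fe = np.sum(w * z) / np.sum(w)  # fixed-effect mean
            q = np.sum(w * (z - z_fe) ** 2)   # heterogeneity statistic Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(z) - 1)) / c)  # between-study variance
            w_re = 1.0 / (v + tau2)
            z_re = np.sum(w_re * z) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            ci = (np.tanh(z_re - 1.96 * se), np.tanh(z_re + 1.96 * se))
            return np.tanh(z_re), ci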

  6. Evaluation of cool season precipitation event characteristics over the Northeast US in a suite of downscaled climate model hindcasts

    NASA Astrophysics Data System (ADS)

    Loikith, Paul C.; Waliser, Duane E.; Kim, Jinwon; Ferraro, Robert

    2017-08-01

    Cool-season precipitation event characteristics are evaluated across a suite of downscaled climate models over the northeastern US. Downscaled hindcast simulations are produced by dynamically downscaling the Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA2) using the National Aeronautics and Space Administration (NASA)-Unified Weather Research and Forecasting (WRF) regional climate model (RCM) and the Goddard Earth Observing System Model, Version 5 (GEOS-5) global climate model. NU-WRF RCM simulations are produced at 24, 12, and 4 km horizontal resolutions using a range of spectral nudging schemes, while the MERRA2 global downscaled run is provided at 12.5 km. All model runs are evaluated using four metrics designed to capture key features of precipitation events: event frequency, event intensity, event total, and event duration. Overall, the downscaling approaches result in a reasonable representation of many of the key features of precipitation events over the region; however, considerable biases exist in the magnitude of each metric. Based on this evaluation, there is no clear indication that higher-resolution simulations produce more realistic results in general; however, many small-scale features, such as orographic enhancement of precipitation, are captured only at higher resolutions, suggesting some added value over coarser resolutions. While the differences between simulations produced with and without nudging are small, there is some improvement in model fidelity when nudging is introduced, especially at a cutoff wavelength of 600 km compared to 2000 km. Based on the results of this evaluation, dynamical regional downscaling using NU-WRF results in a more realistic representation of precipitation event climatology than the global downscaling of MERRA2 using GEOS-5.
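
    The four event metrics named above can be computed from a daily precipitation series as sketched below; the 1 mm wet-day cutoff and the run-of-consecutive-wet-days event definition are illustrative assumptions, not the paper's definitions.

        import numpy as np

        def precip_event_metrics(daily_mm, wet_mm=1.0):
            """Split a daily precipitation series into events (runs of
            consecutive days >= wet_mm) and return event frequency and
            mean intensity, total, and duration per event."""
            events, cur = [], []
            for day in daily_mm:
                if day >= wet_mm:
                    cur.append(day)
                elif cur:
                    events.append(cur)
                    cur = []
            if cur:
                events.append(cur)
            totals = np.array([sum(e) for e in events], float)
            durations = np.array([len(e) for e in events], float)
            return {"frequency": len(events),
                    "total": totals.mean() if events else 0.0,
                    "duration": durations.mean() if events else 0.0,
                    "intensity": (totals / durations).mean() if events else 0.0}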

  7. A new approach to identify, classify and count drugrelated events

    PubMed Central

    Bürkle, Thomas; Müller, Fabian; Patapovas, Andrius; Sonst, Anja; Pfistermeister, Barbara; Plank-Kiegele, Bettina; Dormann, Harald; Maas, Renke

    2013-01-01

    Aims: The incidence of clinical events related to medication errors and/or adverse drug reactions reported in the literature varies by a degree that cannot solely be explained by the clinical setting, the varying scrutiny of investigators or varying definitions of drug-related events. Our hypothesis was that the individual complexity of many clinical cases may pose relevant limitations for current definitions and algorithms used to identify, classify and count adverse drug-related events. Methods: Based on clinical cases derived from an observational study we identified and classified common clinical problems that cannot be adequately characterized by the currently used definitions and algorithms. Results: It appears that some key models currently used to describe the relation of medication errors (MEs), adverse drug reactions (ADRs) and adverse drug events (ADEs) can easily be misinterpreted or contain logical inconsistencies that limit their accurate use to all but the simplest clinical cases. A key limitation of current models is the inability to deal with complex interactions such as one drug causing two clinically distinct side effects or multiple drugs contributing to a single clinical event. Using a large set of clinical cases we developed a revised model of the interdependence between MEs, ADEs and ADRs and extended current event definitions when multiple medications cause multiple types of problems. We propose algorithms that may help to improve the identification, classification and counting of drug-related events. Conclusions: The new model may help to overcome some of the limitations that complex clinical cases pose to current paper- or software-based drug therapy safety. PMID:24007453

  8. Integrative Analysis of DNA Methylation and Gene Expression Data Identifies EPAS1 as a Key Regulator of COPD

    PubMed Central

    Yoo, Seungyeul; Takikawa, Sachiko; Geraghty, Patrick; Argmann, Carmen; Campbell, Joshua; Lin, Luan; Huang, Tao; Tu, Zhidong; Foronjy, Robert; Spira, Avrum; Schadt, Eric E.; Powell, Charles A.; Zhu, Jun

    2015-01-01

    Chronic Obstructive Pulmonary Disease (COPD) is a complex disease. Genetic, epigenetic, and environmental factors are known to contribute to COPD risk and disease progression. Therefore, we developed a systematic approach to identify key regulators of COPD that integrates genome-wide DNA methylation, gene expression, and phenotype data in lung tissue from COPD and control samples. Our integrative analysis identified 126 key regulators of COPD. We identified EPAS1 as the only key regulator whose downstream genes significantly overlapped with multiple gene sets associated with COPD disease severity. EPAS1 is distinct in comparison with other key regulators in terms of methylation profile and downstream target genes. Genes predicted to be regulated by EPAS1 were enriched for biological processes including signaling, cell communication, and system development. We confirmed that EPAS1 protein levels are lower in human COPD lung tissue compared to non-disease controls and that Epas1 gene expression is reduced in mice chronically exposed to cigarette smoke. As EPAS1 downstream genes were significantly enriched for hypoxia responsive genes in endothelial cells, we tested EPAS1 function in human endothelial cells. EPAS1 knockdown by siRNA in endothelial cells impacted genes that significantly overlapped with EPAS1 downstream genes in lung tissue including hypoxia responsive genes, and genes associated with emphysema severity. Our first integrative analysis of genome-wide DNA methylation and gene expression profiles illustrates that not only does DNA methylation play a ‘causal’ role in the molecular pathophysiology of COPD, but it can be leveraged to directly identify novel key mediators of this pathophysiology. PMID:25569234

  9. Integrative analysis of DNA methylation and gene expression data identifies EPAS1 as a key regulator of COPD.

    PubMed

    Yoo, Seungyeul; Takikawa, Sachiko; Geraghty, Patrick; Argmann, Carmen; Campbell, Joshua; Lin, Luan; Huang, Tao; Tu, Zhidong; Foronjy, Robert F; Spira, Avrum; Schadt, Eric E; Powell, Charles A; Zhu, Jun

    2015-01-01

    Chronic Obstructive Pulmonary Disease (COPD) is a complex disease. Genetic, epigenetic, and environmental factors are known to contribute to COPD risk and disease progression. Therefore, we developed a systematic approach to identify key regulators of COPD that integrates genome-wide DNA methylation, gene expression, and phenotype data in lung tissue from COPD and control samples. Our integrative analysis identified 126 key regulators of COPD. We identified EPAS1 as the only key regulator whose downstream genes significantly overlapped with multiple gene sets associated with COPD disease severity. EPAS1 is distinct in comparison with other key regulators in terms of methylation profile and downstream target genes. Genes predicted to be regulated by EPAS1 were enriched for biological processes including signaling, cell communication, and system development. We confirmed that EPAS1 protein levels are lower in human COPD lung tissue compared to non-disease controls and that Epas1 gene expression is reduced in mice chronically exposed to cigarette smoke. As EPAS1 downstream genes were significantly enriched for hypoxia responsive genes in endothelial cells, we tested EPAS1 function in human endothelial cells. EPAS1 knockdown by siRNA in endothelial cells impacted genes that significantly overlapped with EPAS1 downstream genes in lung tissue including hypoxia responsive genes, and genes associated with emphysema severity. Our first integrative analysis of genome-wide DNA methylation and gene expression profiles illustrates that not only does DNA methylation play a 'causal' role in the molecular pathophysiology of COPD, but it can be leveraged to directly identify novel key mediators of this pathophysiology.

  10. Partition of some key regulating services in terrestrial ecosystems: Meta-analysis and review.

    PubMed

    Viglizzo, E F; Jobbágy, E G; Ricard, M F; Paruelo, J M

    2016-08-15

    Our knowledge about the functional foundations of ecosystem service (ES) provision is still limited, and more research is needed to elucidate key functional mechanisms. Using a simplified eco-hydrological scheme, in this work we analyzed how land-use decisions modify the partition of some essential regulatory ES by altering basic relationships between biomass stocks and water flows. A comprehensive meta-analysis and review was conducted based on global, regional and local data from peer-reviewed publications. We analyzed five datasets comprising 1348 studies and 3948 records on precipitation (PPT), aboveground biomass (AGB), AGB change, evapotranspiration (ET), water yield (WY), WY change, runoff (R) and infiltration (I). The conceptual framework was focused on ES that are associated with the ecological functions (e.g., intermediate ES) of ET, WY, R and I. ES included soil protection, carbon sequestration, local climate regulation, water-flow regulation and water recharge. To address the problem of data normality, the analysis included both parametric and non-parametric regression analysis. Results demonstrate that PPT is a first-order biophysical factor that controls ES release at broader scales. At decreasing scales, ES are partitioned as a result of PPT interactions with other biophysical and anthropogenic factors. At intermediate scales, land-use change interacts with PPT, modifying ES partition, as is the case with afforestation in dry regions, where ET and climate regulation may be enhanced at the expense of R and water-flow regulation. At smaller scales, site-specific conditions such as topography interact with PPT and AGB, displaying different ES partition patterns. The probable implications of future land-use and climate change for the production and partition of some key ES are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    PubMed

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive or intensified passive surveillance from 2007 to 2012, descriptive analyses, national and regional rates of vaccine-associated neurotropic and neurological autoimmune disease, and reporting rate ratios with their respective 95% confidence intervals were calculated for first-time vaccinees, stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND analyzed to date. Rates are similar to those in other recent studies, but in this study the age group from 5 to 9 years had the highest risk. As neurological adverse events have in general a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of the risk of infection by yellow fever virus. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that their further development is necessary to continue improving society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outdoor parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. To segment moving objects, only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of theft of object(s) from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
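
    The first step described above, dynamic background subtraction that adapts to illumination changes, can be approximated with OpenCV's stock MOG2 model, as in the sketch below. This uses a standard library routine rather than the authors' own algorithm, and the video file name is hypothetical.

        import cv2

        cap = cv2.VideoCapture("parking_lot_ir.avi")   # hypothetical IR sequence
        subtractor = cv2.createBackgroundSubtractorMOG2(history=500,
                                                        varThreshold=16,
                                                        detectShadows=False)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            fg_mask = subtractor.apply(frame)      # adapts to slow illumination change
            fg_mask = cv2.medianBlur(fg_mask, 5)   # suppress speckle noise
            contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            # Blobs large enough to be persons or vehicles; classification
            # by size/velocity would follow here.
            movers = [c for c in contours if cv2.contourArea(c) > 50]
        cap.release()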

  13. An Event History Analysis of Teacher Attrition: Salary, Teacher Tracking, and Socially Disadvantaged Schools

    ERIC Educational Resources Information Center

    Kelly, Sean

    2004-01-01

    In this event history analysis of the 1990-1991 Schools and Staffing Survey and the 1992 Teacher Follow-up Survey, a retrospective person-year database was constructed to examine teacher attrition over the course of the teaching career. Consistent with prior research, higher teacher salaries reduced attrition, but only slightly so. Teacher…

  14. Visual search of cyclic spatio-temporal events

    NASA Astrophysics Data System (ADS)

    Gautier, Jacques; Davoine, Paule-Annick; Cunty, Claire

    2018-05-01

    The analysis of spatio-temporal events, and especially of relationships between their different dimensions (space, time, thematic attributes), can be done with geovisualization interfaces. But few geovisualization tools integrate the cyclic dimension of spatio-temporal event series (natural or social events). Time Coil and Time Wave diagrams represent both linear time and cyclic time. By introducing a cyclic temporal scale, these diagrams may highlight the cyclic characteristics of spatio-temporal events. However, the settable cyclic temporal scales are limited to usual durations such as days or months. Because of that, these diagrams cannot be used to visualize cyclic events that reappear with an unusual period, and they do not allow a visual search for cyclic events. Nor do they make it possible to identify relationships between the cyclic behavior of the events and their spatial features, and especially to identify localised cyclic events. The lack of possibilities to represent cyclic time outside of the temporal diagram of multi-view geovisualization interfaces limits the analysis of relationships between the cyclic reappearance of events and their other dimensions. In this paper, we propose a method and a geovisualization tool, based on an extension of Time Coil and Time Wave, to provide a visual search for cyclic events by allowing any duration to be set as the diagram's cyclic temporal scale. We also propose a symbology approach to push the representation of cyclic time into the map, in order to improve the analysis of relationships between space and the cyclic behavior of events.
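
    The core operation behind such a freely settable cyclic scale is simply the reduction of timestamps modulo an arbitrary period; a minimal sketch, with units and period chosen for illustration:

        import numpy as np

        def cycle_phase(event_times, period):
            """Map event timestamps onto a cyclic temporal scale of
            arbitrary period; times and period share a unit (e.g. days)
            and the returned phase lies in [0, 1)."""
            return np.mod(np.asarray(event_times, float), period) / period

        # Events reappearing roughly every 17 days cluster in phase.
        phases = cycle_phase([3.0, 20.1, 37.2, 54.0], period=17.0)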

  15. Seasonality and Disturbance Events in the Carbon Isotope Record of Slash Pine (Pinus elliottii) Tree Rings from Big Pine Key, Florida

    NASA Astrophysics Data System (ADS)

    Rebenack, C.; Anderson, W. T.; Cherubini, P.

    2011-12-01

    , and disturbance events. Because slash pine growth is dependent on water availability, a chronology developed using carbon isotopes may provide greater insight into plant stress over time and ultimately may lead to better correlations with climate oscillations. The work presented here is the preliminary result of a carbon-isotope study of four slash pine trees from Big Pine Key, Florida. Initial δ13C data show seasonal stomatal activity in the trees and indicate the timing of possible disturbance events.

  16. Effect of Intravitreal Anti-Vascular Endothelial Growth Factor Therapy on the Risk of Arterial Thromboembolic Events: A Meta-Analysis

    PubMed Central

    Lu, Guo-Cai; Wei, Rui-Li

    2012-01-01

    Background: Intravitreal anti-vascular endothelial growth factor (VEGF) monoclonal antibodies are used in ocular neovascular diseases. A consensus has emerged that intravenous anti-VEGF can increase the risk of arterial thromboembolic events. However, the role of intravitreal anti-VEGF in arterial thromboembolism is controversial. Therefore, we did a systematic review and meta-analysis to investigate the effects of intravitreal anti-VEGF on the risk of arterial thromboembolic events. Methods: Electronic databases were searched to identify relevant randomized clinical trials comparing intravitreal anti-VEGF with controls. Criteria for inclusion in our meta-analysis included a study duration of no less than 12 months, the use of a randomized control group not receiving any intravitreal active agent, and the availability of outcome data for arterial thromboembolic events, myocardial infarction, cerebrovascular accidents, and vascular death. The risk ratios and 95% CIs were calculated using a fixed-effects or random-effects model, depending on the heterogeneity of the included studies. Results: A total of 4942 patients with a variety of ocular neovascular diseases from 13 randomized controlled trials were identified and included for analysis. There was no significant difference between intravitreal anti-VEGF and control in the risk of any of the events, with risk ratios of 0.87 (95% CI, 0.64 to 1.19) for arterial thromboembolic events, 0.96 (95% CI, 0.55–1.68) for cerebrovascular accidents, 0.69 (95% CI, 0.40–1.21) for myocardial infarctions, and 0.68 (95% CI, 0.37–1.27) for vascular death. Conclusions: The strength of the evidence suggests that the intravitreal use of anti-VEGF antibodies is not associated with an increased risk of arterial thromboembolic events. PMID:22829940

  17. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.

    Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source "avconv" utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
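
    Without access to the encoder's first-pass log, the same idea can be illustrated with a per-frame change score: like the compression statistics mined above, a simple inter-frame difference rises when the imaged scene changes rapidly. The sketch below is a simplified stand-in for the described software, not its actual method.

        import cv2
        import numpy as np

        def change_signal(video_path):
            """Mean absolute inter-frame difference, one value per frame pair."""
            cap = cv2.VideoCapture(video_path)
            prev, signal = None, []
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                if prev is not None:
                    signal.append(float(np.mean(cv2.absdiff(gray, prev))))
                prev = gray
            cap.release()
            return np.array(signal)

        def flag_events(signal, k=3.0):
            # Flag frames whose change score sits k robust standard
            # deviations above the median: candidate growth/shrinkage events.
            med = np.median(signal)
            mad = np.median(np.abs(signal - med)) + 1e-9
            return np.flatnonzero(signal > med + k * 1.4826 * mad)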

  18. Point pattern analysis applied to flood and landslide damage events in Switzerland (1972-2009)

    NASA Astrophysics Data System (ADS)

    Barbería, Laura; Schulte, Lothar; Carvalho, Filipe; Peña, Juan Carlos

    2017-04-01

    Damage caused by meteorological and hydrological extreme events depends on many factors: not only on hazard, but also on exposure and vulnerability. In order to reach a better understanding of the relation of these complex factors, their spatial pattern and underlying processes, the spatial dependency between values of damage recorded at sites at different distances can be investigated by point pattern analysis. For the Swiss flood and landslide damage database (1972-2009), first steps of point pattern analysis have been carried out. The most severe events were selected (severe, very severe, and catastrophic according to the GEES classification; 784 damage points in total) and Ripley's K-test and L-test, among others, were performed. For this purpose, the R library spatstat was used. The results confirm that the damage points present a statistically significant clustered pattern, which could be connected to the prevalence of damage near watercourses and also to the rainfall distribution of each event, together with other factors. On the other hand, bivariate analysis shows there is no segregated pattern depending on process type (flood/debris flow vs. landslide). This close relation points to a coupling between slope and fluvial processes, connectivity between small- and middle-size catchments, and the influence of the spatial distribution of precipitation, temperature (snow melt and snow line) and other predisposing factors such as soil moisture, land cover and environmental conditions. Therefore, further studies will investigate the relationship between the spatial pattern and one or more covariates, such as elevation, distance from watercourse or land use. The final goal will be to fit a regression model to the data, so that the adjusted model predicts the intensity of the point process as a function of the above-mentioned covariates.
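
    For readers unfamiliar with the K-test used above: a naive Ripley's K estimator (without the edge corrections that spatstat's Kest applies) is short enough to sketch. Values above pi*r^2 indicate clustering at scale r.

        import numpy as np

        def ripley_k(points, r_values, area):
            """Naive Ripley's K for a 2-D point pattern. points: (n, 2)
            coordinates; area: area of the study window; no edge correction."""
            pts = np.asarray(points, float)
            n = len(pts)
            d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
            d[np.diag_indices(n)] = np.inf      # exclude self-pairs
            return np.array([area * np.sum(d <= r) / (n * (n - 1))
                             for r in r_values])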

  19. Dealing With Major Life Events and Transitions: A Systematic Literature Review on and Occupational Analysis of Spirituality.

    PubMed

    Maley, Christine M; Pagana, Nicole K; Velenger, Christa A; Humbert, Tamera Keiter

    2016-01-01

    This systematic literature review analyzed the construct of spirituality as perceived by people who have experienced or are experiencing a major life event or transition. The researchers investigated studies that used narrative analysis or a phenomenological methodology related to the topic. Thematic analysis resulted in three major themes: (1) avenues to and through spirituality, (2) the experience of spirituality, and (3) the meaning of spirituality. The results provide insights into the intersection of spirituality, meaning, and occupational engagement as understood by people experiencing a major life event or transition and suggest further research that addresses spirituality in occupational therapy and interdisciplinary intervention. Copyright © 2016 by the American Occupational Therapy Association, Inc.

  20. Adverse drug events related to mood and emotion in paediatric patients treated for ADHD: A meta-analysis.

    PubMed

    Pozzi, Marco; Carnovale, Carla; Peeters, Gabriëlla G A M; Gentili, Marta; Antoniazzi, Stefania; Radice, Sonia; Clementi, Emilio; Nobile, Maria

    2018-05-22

    ADHD is frequently comorbid with anxiety and mood disorders, which may increase the severity of inattention and hyperactivity symptoms. Emotional symptoms (anxiety, irritability, mood lability) also affect patients without comorbidity or emerge as adverse drug events. The influence of ADHD drugs on emotional symptoms demands investigation to improve therapies. Methods: Systematic review of trials reporting adverse events in patients pharmacologically treated for ADHD; meta-analysis of the occurrence of irritability, anxiety, apathy, reduced talk, sadness, crying, emotional lability, nail biting, staring, perseveration, and euphoria; meta-regression analysis. Results: Forty-five trials were meta-analysed. The most frequently reported outcomes were irritability, anxiety, sadness, and apathy. Methylphenidates, especially immediate-release formulations, were most studied; amphetamines were studied half as often and were predominantly mixed amphetamine salts. Reports on atomoxetine were scant. Meta-analysis showed that methylphenidates reduced the risk of irritability, anxiety and euphoria, whereas they worsened the risk of apathy and reduced talk; amphetamines worsened the risk of emotional lability. Factors influencing risks were study year and design, patients' sex and age, and drug dose and release formulation. Limitations: Possible discrepancy between adverse events as indicated in clinical trials and as summarised herein; confounding due to the aggregation of drugs into groups; uninvestigated sources of bias; incomplete lists of adverse events; lack of observations on self-injury. Conclusions: Methylphenidates appeared safer than amphetamines, although younger patients and females may incur higher risks, especially with high-dose, immediate-release methylphenidates. Only atomoxetine holds a black-box warning, but amphetamines and methylphenidates also did not show a safe profile regarding mood and emotional symptoms. Copyright © 2018. Published by Elsevier B.V.

  1. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    PubMed

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine, having contributed most to the relief of human misery and the striking increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating people with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Thanks to worldwide vaccination, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  2. Analysis of brand personality to involve event involvement and loyalty: A case study of Jakarta Fashion Week 2017

    NASA Astrophysics Data System (ADS)

    Nasution, A. H.; Rachmawan, Y. A.

    2018-04-01

    Fashion trends around the world change extremely fast, and fashion has become part of people's lifestyles worldwide. Fashion week events in several regions can serve as a measure of current fashion trends. Jakarta Fashion Week (JFW) is an Indonesian fashion week event that aims to show fashion trends to people who want to improve their fashion style. People will join events that involve them, and then come to those events again and again. An annual, continuous event is important for creating loyalty among the people involved in it, which supports the organizer in staging the next event, saves a large share of the marketing budget, and leads to a higher-quality event. This study aims to examine the effect of the five brand personality dimensions on event involvement and loyalty at Jakarta Fashion Week. The study uses a quantitative confirmatory method with the Structural Equation Model (SEM) analysis technique. The sample consists of 150 respondents who participated in Jakarta Fashion Week 2017. Results show that the five brand personality dimensions had significant effects on the three dimensions of event involvement and on loyalty, with one exception: the event involvement dimension called personal self-expression had no effect on loyalty.

  3. The INTIMATE event stratigraphy of the last glacial period

    NASA Astrophysics Data System (ADS)

    Olander Rasmussen, Sune; Svensson, Anders

    2015-04-01

    The North Atlantic INTIMATE (INtegration of Ice-core, MArine and TErrestrial records) group has previously recommended an Event Stratigraphy approach for the synchronisation of records of the Last Termination using the Greenland ice core records as the regional stratotypes. A key element of these protocols has been the formal definition of numbered Greenland Stadials (GS) and Greenland Interstadials (GI) within the past glacial period as the Greenland expressions of the characteristic Dansgaard-Oeschger events that represent cold and warm phases of the North Atlantic region, respectively. Using a recent synchronization of the NGRIP, GRIP, and GISP2 ice cores that allows the parallel analysis of all three records on a common time scale, we here present an extension of the GS/GI stratigraphic template to the entire glacial period. In addition to the well-known sequence of Dansgaard-Oeschger events that were first defined and numbered in the ice core records more than two decades ago, a number of short-lived climatic oscillations have been identified in the three synchronized records. Some of these events have been observed in other studies, but we here propose a consistent scheme for discriminating and naming all the significant climatic events of the last glacial period that are represented in the Greenland ice cores. In addition to presenting the updated event stratigraphy, we make a series of recommendations on how to refer to these periods in a way that promotes unambiguous comparison and correlation between different proxy records, providing a more secure basis for investigating the dynamics and fundamental causes of these climatic perturbations. The work presented is a part of a newly published paper in an INTIMATE special issue of Quaternary Science Reviews: Rasmussen et al., 'A stratigraphic framework for abrupt climatic changes during the Last Glacial period based on three synchronized Greenland ice-core records: refining and extending the INTIMATE event

  4. A cyber-event correlation framework and metrics

    NASA Astrophysics Data System (ADS)

    Kang, Myong H.; Mayfield, Terry

    2003-08-01

    In this paper, we propose a cyber-event fusion, correlation, and situation assessment framework that, when instantiated, will allow cyber defenders to better understand the local, regional, and global cyber-situation. This framework, with associated metrics, can be used to guide assessment of our existing cyber-defense capabilities, and to help evaluate the state of cyber-event correlation research and where we must focus future cyber-event correlation research. The framework, based on cyber-event gathering activities and analysis functions, consists of five operational steps, each of which provides a richer set of contextual information to support greater situational understanding. The first three steps are increasingly richer and broader-scoped contexts achieved through correlation activity, while in the final two steps these richer contexts are achieved through analytical activities (situation assessment, and threat analysis and prediction). Category 1 correlation focuses on the detection of suspicious activities and the correlation of events from a single cyber-event source. Category 2 correlation clusters the same or similar events from multiple detectors located in close proximity and prioritizes them. Category 3 correlates events from different time periods and event sources at different locations/regions to recognize the relationships among different events; this is the category that focuses on the detection of large-scale and coordinated attacks. The situation assessment step (Category 4) focuses on the assessment of cyber-asset damage and the analysis of the impact on missions. The threat analysis and prediction step (Category 5) analyzes attacks based on attack traces and predicts the next steps. Metrics that can distinguish correlation and cyber-situation assessment tools for each category are also proposed.
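
    Category 2 correlation as defined above, clustering the same or similar events from nearby detectors within a short time window, might look like the sketch below; the alert schema ("time", "type", "sensor_zone") and the window length are invented for illustration.

        from collections import defaultdict

        def cluster_alerts(alerts, window_s=60):
            """Group alerts of the same type from the same sensor zone
            when each falls within window_s seconds of the previous one;
            larger clusters (more corroboration) are returned first."""
            by_kind = defaultdict(list)
            for a in sorted(alerts, key=lambda a: a["time"]):
                by_kind[(a["type"], a["sensor_zone"])].append(a)
            clusters = []
            for seq in by_kind.values():
                cur = [seq[0]]
                for a in seq[1:]:
                    if a["time"] - cur[-1]["time"] <= window_s:
                        cur.append(a)
                    else:
                        clusters.append(cur)
                        cur = [a]
                clusters.append(cur)
            return sorted(clusters, key=len, reverse=True)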

  5. The identification of key genes and pathways in hepatocellular carcinoma by bioinformatics analysis of high-throughput data.

    PubMed

    Zhang, Chaoyang; Peng, Li; Zhang, Yaqin; Liu, Zhaoyang; Li, Wenling; Chen, Shilian; Li, Guancheng

    2017-06-01

    Liver cancer is a serious threat to public health and has a fairly complicated pathogenesis. Therefore, the identification of key genes and pathways is of much importance for clarifying the molecular mechanisms of hepatocellular carcinoma (HCC) initiation and progression. An HCC-associated gene expression dataset was downloaded from the Gene Expression Omnibus database. The statistical software R was used for significance analysis of differentially expressed genes (DEGs) between liver cancer samples and normal samples. Gene Ontology (GO) term enrichment analysis and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway analysis, based on R, were applied for the identification of pathways in which DEGs were significantly enriched. Cytoscape software was used for the construction of a protein-protein interaction (PPI) network and for module analysis to find the hub genes and key pathways. Finally, weighted correlation network analysis (WGCNA) was conducted to further screen critical gene modules with similar expression patterns and explore their biological significance. Significance analysis identified 1230 DEGs with fold change >2, including 632 significantly down-regulated DEGs and 598 significantly up-regulated DEGs. GO term enrichment analysis suggested that the up-regulated DEGs were significantly enriched in immune response, cell adhesion, cell migration, type I interferon signaling pathway, and cell proliferation, while the down-regulated DEGs were mainly enriched in response to endoplasmic reticulum stress and the endoplasmic reticulum unfolded protein response. KEGG pathway analysis found DEGs significantly enriched in five pathways: complement and coagulation cascades, focal adhesion, ECM-receptor interaction, antigen processing and presentation, and protein processing in the endoplasmic reticulum. The top 10 hub genes in HCC, resulting from the PPI network, were GMPS, ACACA, ALB, TGFB1, KRAS, ERBB2, BCL2, EGFR, STAT3, and CD8A. The top 3 gene interaction modules in PPI network enriched

  6. Investigation of 2-stage meta-analysis methods for joint longitudinal and time-to-event data through simulation and real data application.

    PubMed

    Sudell, Maria; Tudur Smith, Catrin; Gueyffier, François; Kolamunnage-Dona, Ruwanthi

    2018-04-15

    Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses, as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies, with no methods currently available for the meta-analysis of joint model estimates from multiple studies. We propose a 2-stage method for the meta-analysis of joint model estimates. These methods are applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Using the real dataset, similar results were obtained from the separate and joint analyses. However, the simulation study indicated a benefit of using joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Where evidence of association between longitudinal and time-to-event outcomes exists, results from joint models, rather than standalone analyses, should be pooled in 2-stage meta-analyses. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  7. Spitzer Parallax Observations of Long Duration Gaia Microlensing Events

    NASA Astrophysics Data System (ADS)

    Carey, Sean; Calchi-Novati, Sebastiano; Wyrzykowski, Lukasz; Kruszynska, Katarzyna; Gromadzki, Mariusz; Rybicki, Krzysztof

    2018-05-01

    We proposed to observe on the order of ten long-duration (>100 day) microlensing events identified in Gaia survey data with the Spitzer Space Telescope. The long-duration events are likely due to massive lenses and hence could be isolated black holes. These observations could yield, for the first time, definitive mass measurements of isolated stellar-remnant black holes in our Galaxy. The Spitzer data provide a key component for making an unambiguous mass measurement by providing the microlensing parallax (as has been done for >500 events by Spitzer so far). The Gaia data are used for the detection of the events and the measurement of the astrometric motion caused by the microlensing event. From the astrometric microlensing signature, the Einstein radius of the lens can be measured; combined with the microlensing parallax, it yields the lens mass and distance.
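
    For context, the standard relation behind that last step (background knowledge, not spelled out in the abstract) combines the angular Einstein radius from the astrometric signal with the microlensing parallax:

        M_{\mathrm{lens}} = \frac{\theta_{\mathrm{E}}}{\kappa\,\pi_{\mathrm{E}}},
        \qquad
        \kappa \equiv \frac{4G}{c^{2}\,\mathrm{au}} \approx 8.14\ \mathrm{mas}\,M_{\odot}^{-1},

    so measuring both quantities fixes the lens mass, and the parallax additionally constrains the lens distance.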

  8. [Key physical parameters of hawthorn leaf granules by stepwise regression analysis method].

    PubMed

    Jiang, Qie-Ying; Zeng, Rong-Gui; Li, Zhe; Luo, Juan; Zhao, Guo-Wei; Lv, Dan; Liao, Zheng-Gen

    2017-05-01

    The purpose of this study was to investigate the effect of key physical properties of hawthorn leaf granules on their dissolution behavior. Hawthorn leaf extract was used as a model drug. The extract was mixed with microcrystalline cellulose or starch at the same ratio using different methods. Appropriate amounts of lubricant and disintegrant were added to part of the mixed powder, and granules were then prepared by extrusion granulation and high-shear granulation. The dissolution behavior of the granules was evaluated using the equilibrium dissolution quantity and the dissolution rate constant of hypericin as indicators. The effect of the physical properties on dissolution behavior was then analyzed through stepwise regression analysis. The equilibrium dissolution quantity of hypericin and the adsorption heat constant in hawthorn leaves were positively correlated with the monolayer adsorption capacity and negatively correlated with the moisture absorption rate constant. The dissolution rate constants decreased with increasing Hausner ratio, monolayer adsorption capacity and adsorption heat constant, and increased with increasing Carr index and specific surface area. The adsorption heat constant, monolayer adsorption capacity, moisture absorption rate constant, Carr index and specific surface area were the key physical properties of hawthorn leaf granules affecting their dissolution behavior. Copyright© by the Chinese Pharmaceutical Association.
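
    Stepwise regression of the kind used above can be sketched as forward selection on adjusted R^2. The paper does not state its selection criterion, so this generic implementation and all names in it are illustrative assumptions.

        import numpy as np

        def forward_stepwise(X, y, names, max_vars=5):
            """Greedily add the predictor that most improves adjusted R^2;
            stop when no candidate improves it. X: (n, p) matrix of
            physical properties; y: dissolution indicator."""
            n, p = X.shape
            chosen, best_adj = [], -np.inf
            while len(chosen) < max_vars:
                scores = {}
                for j in set(range(p)) - set(chosen):
                    cols = chosen + [j]
                    A = np.column_stack([np.ones(n), X[:, cols]])
                    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
                    resid = y - A @ beta
                    ss_tot = ((y - y.mean()) ** 2).sum()
                    r2 = 1.0 - (resid ** 2).sum() / ss_tot
                    scores[j] = 1.0 - (1.0 - r2) * (n - 1) / (n - len(cols) - 1)
                j_best = max(scores, key=scores.get)
                if scores[j_best] <= best_adj:
                    break
                chosen.append(j_best)
                best_adj = scores[j_best]
            return [names[j] for j in chosen]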

  9. LACIE analyst interpretation keys

    NASA Technical Reports Server (NTRS)

    Baron, J. G.; Payne, R. W.; Palmer, W. F. (Principal Investigator)

    1979-01-01

    Two interpretation aids, 'The Image Analysis Guide for Wheat/Small Grains Inventories' and 'The United States and Canadian Great Plains Regional Keys', were developed during LACIE phase 2 and implemented during phase 3 in order to provide analysts with a better understanding of the expected ranges in color variation of signatures for individual biostages and of the temporal sequences of LANDSAT signatures. The keys were tested using operational LACIE data, and the results demonstrate that their use provides improved labeling accuracy in all analyst experience groupings, in all geographic areas within the U.S. Great Plains, and during all periods of crop development.

  10. Decoy-state quantum key distribution with more than three types of photon intensity pulses

    NASA Astrophysics Data System (ADS)

    Chau, H. F.

    2018-04-01

    The decoy-state method closes source security loopholes in quantum key distribution (QKD) using a laser source. In this method, accurate estimates of the detection rates of vacuum and single-photon events plus the error rate of single-photon events are needed to give a good enough lower bound of the secret key rate. Nonetheless, the current estimation method for these detection and error rates, which uses three types of photon intensities, is accurate up to about 1 % relative error. Here I report an experimentally feasible way that greatly improves these estimates and hence increases the one-way key rate of the BB84 QKD protocol with unbiased bases selection by at least 20% on average in realistic settings. The major tricks are the use of more than three types of photon intensities plus the fact that estimating bounds of the above detection and error rates is numerically stable, although these bounds are related to the inversion of a high condition number matrix.
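
    For context, the estimates discussed above feed the standard decoy-state lower bound on the one-way BB84 key rate (the Lo-Ma-Chen form, quoted here as background rather than from the abstract):

        R \ge q \left\{ Q_{1}\left[1 - H_{2}(e_{1})\right] - Q_{\mu}\, f(E_{\mu})\, H_{2}(E_{\mu}) \right\}

    where Q_1 and e_1 are the single-photon gain and error rate (the quantities the decoy method bounds), Q_mu and E_mu are the measured signal gain and quantum bit error rate, f is the error-correction inefficiency, H_2 is the binary entropy function, and q is the basis-sift factor. Tighter bounds on Q_1 and e_1 translate directly into a higher provable key rate, which is the point of using more than three intensity levels.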

  11. Identification of the key regulating genes of diminished ovarian reserve (DOR) by network and gene ontology analysis.

    PubMed

    Pashaiasl, Maryam; Ebrahimi, Mansour; Ebrahimie, Esmaeil

    2016-09-01

    Diminished ovarian reserve (DOR) is one of the causes of infertility, affecting both older and young women. Ovarian reserve assessment can be used as a new prognostic tool in infertility treatment decision making. Here, up- and down-regulated gene expression profiles of granulosa cells were analysed to generate a putative interaction map of the involved genes. In addition, gene ontology (GO) analysis was used to gain insight into the biological processes and molecular functions of the proteins involved in DOR. Eleven up-regulated genes and nine down-regulated genes were identified and assessed by constructing interaction networks based on their biological processes. PTGS2, CTGF, LHCGR, CITED, SOCS2, STAR and FSTL3 were the key nodes in the up-regulated networks, while the IGF2, AMH, GREM and FOXC1 proteins were key in the down-regulated networks. MIRN101-1, MIRN153-1 and MIRN194-1 inhibited the expression of SOCS2, while CSH1 and BMP2 positively regulated IGF1 and IGF2. Ossification, ovarian follicle development, vasculogenesis, sequence-specific DNA-binding transcription factor activity, and the Golgi apparatus were the major groups differentiating the up-regulated from the down-regulated genes in DOR. Meta-analysis of publicly available transcriptomic data highlighted the high coexpression of CTGF (connective tissue growth factor) with the other key regulators of DOR. According to GO analysis, CTGF is involved in organ senescence and the focal adhesion pathway. These findings provide a comprehensive, systems biology-based insight into the aetiology of DOR through network and gene ontology analyses.

  12. Paleo-event data standards for dendrochronology

    Treesearch

    Elaine Kennedy Sutherland; P. Brewer; W. Gross

    2017-01-01

    Extreme environmental events, such as storm winds, landslides, insect infestations, and wildfire, cause loss of life, resources, and human infrastructure. Disaster risk-reduction analysis can be improved with information about the past frequency, intensity, and spatial patterns of extreme events. Tree-ring analyses can provide such information: tree rings reflect events as...

  13. Cold periods and coronary events: an analysis of populations worldwide

    PubMed Central

    Barnett, A.; Dobson, A.; McElduff, P.; Salomaa, V.; Kuulasmaa, K.; Sans, S.

    2005-01-01

    Study objective: To investigate the association between cold periods and coronary events, and the extent to which climate, sex, age, and previous cardiac history increase risk during cold weather. Design: A hierarchical analysis of populations from the World Health Organisation's MONICA project. Setting: Twenty-four populations from the WHO's MONICA project, a 21-country register compiled between 1980 and 1995. Patients: People aged 35–64 years who had a coronary event. Main results: Daily rates of coronary events were correlated with the average temperature over the current and previous three days. In cold periods, coronary event rates increased more in populations living in warm climates than in populations living in cold climates, where the increases were slight. The increase was greater in women than in men, especially in warm climates. On average, the odds of women having an event in the cold periods were 1.07 times those of men (95% posterior interval: 1.03 to 1.11). The effects of cold periods were similar in those with and without a history of a previous myocardial infarction. Conclusions: Rates of coronary events increased during comparatively cold periods, especially in warm climates. The smaller increases in colder climates suggest that some events in warmer climates are preventable. It is suggested that people living in warm climates, particularly women, should keep warm on cold days. PMID:15965137

  14. Latent profile analysis of regression-based norms demonstrates relationship of compounding MS symptom burden and negative work events.

    PubMed

    Frndak, Seth E; Smerbeck, Audrey M; Irwin, Lauren N; Drake, Allison S; Kordovski, Victoria M; Kunker, Katrina A; Khan, Anjum L; Benedict, Ralph H B

    2016-10-01

    We endeavored to clarify how distinct co-occurring symptoms relate to the presence of negative work events in employed multiple sclerosis (MS) patients. Latent profile analysis (LPA) was utilized to elucidate common disability patterns by isolating patient subpopulations. Samples of 272 employed MS patients and 209 healthy controls (HC) were administered neuroperformance tests of ambulation, hand dexterity, processing speed, and memory. Regression-based norms were created from the HC sample. LPA identified latent profiles using the regression-based z-scores. Finally, multinomial logistic regression tested for negative work event differences among the latent profiles. Four profiles were identified via LPA: a common profile (55%) characterized by slightly below average performance in all domains, a broadly low-performing profile (18%), a poor motor abilities profile with average cognition (17%), and a generally high-functioning profile (9%). Multinomial regression analysis revealed that the uniformly low-performing profile demonstrated a higher likelihood of reported negative work events. Employed MS patients with co-occurring motor, memory and processing speed impairments were most likely to report a negative work event, classifying them as uniquely at risk for job loss.
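
    Latent profile analysis is, in essence, a Gaussian mixture model with diagonal covariances, so the profile-extraction step above can be sketched with scikit-learn. The z-score matrix here is synthetic and stands in for the study's (n_patients, 4) table of regression-based z-scores.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        z = rng.standard_normal((272, 4))   # placeholder for real z-scores

        # Fit 1..6 profiles and choose the number of profiles by BIC.
        models = [GaussianMixture(n_components=k, covariance_type="diag",
                                  n_init=10, random_state=0).fit(z)
                  for k in range(1, 7)]
        best = models[int(np.argmin([m.bic(z) for m in models]))]
        profiles = best.predict(z)          # profile membership per patient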

  15. Semiparametric Time-to-Event Modeling in the Presence of a Latent Progression Event

    PubMed Central

    Rice, John D.; Tsodikov, Alex

    2017-01-01

    In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood–based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set. PMID:27556886

  16. Semiparametric time-to-event modeling in the presence of a latent progression event.

    PubMed

    Rice, John D; Tsodikov, Alex

    2017-06-01

    In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood-based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set. © 2016, The International Biometric Society.

  17. Analysis of arrhythmic events is useful to detect lead failure earlier in patients followed by remote monitoring.

    PubMed

    Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi

    2018-03-01

    Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure has often been identified only by arrhythmic events, not by impedance abnormalities. Our aim was to compare the usefulness of arrhythmic events with that of conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center at Okayama University Hospital. All transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients were followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, and not by conventional impedance abnormalities, was significantly higher than the proportion detected by impedance abnormalities (arrhythmic events 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none experienced inappropriate therapy. RM can detect lead failure early, before clinical adverse events. However, CIEDs often diagnose lead failure as mere arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.

  18. A Summary of Some Discrete-Event System Control Problems

    NASA Astrophysics Data System (ADS)

    Rudie, Karen

    A summary of the area of control of discrete-event systems is given. In this research area, automata and formal language theory are used as tools to model physical problems that arise in technological and industrial systems. The key ingredients of discrete-event control problems are a process that can be modeled by an automaton, events in that process that cannot be disabled or prevented from occurring, and a controlling agent that manipulates the events that can be disabled to guarantee that the process under control generates either all the strings in some prescribed language or as many strings as possible in some prescribed language. When multiple controlling agents act on a process, decentralized control problems arise. In decentralized discrete-event systems, it is presumed that the agents effecting control cannot each see all event occurrences. Partial observation leads to some problems that cannot be solved in polynomial time and some others that are not even decidable.
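
    The controllability idea described above can be made concrete with a toy sketch: uncontrollable events cannot be disabled, so any state from which an uncontrollable event reaches a forbidden state must itself be avoided. The plant, event sets, and names below are hypothetical, chosen only to illustrate the mechanism.

        # Toy plant: state -> {event: next_state}; all names are hypothetical.
        PLANT = {
            "idle": {"start": "busy"},
            "busy": {"finish": "idle", "overheat": "fault"},
            "fault": {},
        }
        UNCONTROLLABLE = {"finish", "overheat"}
        FORBIDDEN = {"fault"}

        def uncontrollable_closure(plant, bad):
            # A state is bad if an uncontrollable event can lead to a bad state,
            # because a supervisor is not allowed to disable that event.
            bad, changed = set(bad), True
            while changed:
                changed = False
                for state, trans in plant.items():
                    if state not in bad and any(
                        ev in UNCONTROLLABLE and nxt in bad for ev, nxt in trans.items()
                    ):
                        bad.add(state)
                        changed = True
            return bad

        bad = uncontrollable_closure(PLANT, FORBIDDEN)
        # The supervisor disables controllable events that enter the bad set.
        disable = {(s, ev) for s, trans in PLANT.items() if s not in bad
                   for ev, nxt in trans.items()
                   if ev not in UNCONTROLLABLE and nxt in bad}
        print(bad, disable)   # {'busy', 'fault'} and {('idle', 'start')}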

  19. Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event

    NASA Astrophysics Data System (ADS)

    Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.

    2009-04-01

    We analyze the possibility of uncertainty reduction in the climate response time scale by utilizing Greenland ice-core data that contain the 8.2 ka event within a Bayesian model-data intercomparison with the Earth system model of intermediate complexity CLIMBER-2.3. Within a stochastic version of the model it has been possible to mimic the 8.2 ka event within a plausible experimental setting and with relatively good accuracy considering the timing of the event, in comparison to other modeling exercises [1]. The simulation of the centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity, described by diffusion coefficients with relatively wide uncertainty ranges. The idea is to discriminate between the different values of diffusivity according to their likelihood of rightly representing the duration of the 8.2 ka event, and thus to exploit the paleo data to constrain uncertainty in model parameters, in analogy to [2]. In implementing this inverse Bayesian analysis with this model, the technical difficulty arises of establishing the related likelihood numerically in addition to the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian shape of the likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event, as a highly nonlinear effect, precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis showed a reduction of uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters is propagated to other model outputs of interest, e.g. the inverse ocean heat capacity, which is important for the dominant time scale of the climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction in the near- and medium-term future. References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the
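
    A minimal sketch of the Bayesian updating step described above, with a grid prior over diffusivity and a stand-in response function in place of CLIMBER-2.3. All numbers are illustrative, and a Gaussian likelihood is used purely for brevity even though the abstract stresses that the real likelihood is non-Gaussian.

        import numpy as np

        kappa = np.linspace(1e-5, 2e-4, 200)            # candidate diffusivities (m^2/s)
        prior = np.full(kappa.size, 1.0 / kappa.size)   # flat prior

        def simulated_duration(k):
            # Toy stand-in for the CLIMBER-2.3 response: event duration vs. diffusivity.
            return 120.0 + 4.0e5 * k

        obs, sigma = 160.0, 20.0                        # illustrative "observed" duration (yr)
        likelihood = np.exp(-0.5 * ((simulated_duration(kappa) - obs) / sigma) ** 2)
        posterior = prior * likelihood
        posterior /= posterior.sum()

        def spread(p):
            mean = np.sum(kappa * p)
            return np.sqrt(np.sum((kappa - mean) ** 2 * p))

        print(spread(prior) / spread(posterior))        # factor of uncertainty reduction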

  20. Life events in schizoaffective disorder: A systematic review.

    PubMed

    Vardaxi, Chrysoula Ch; Gonda, Xenia; Fountoulakis, Konstantinos N

    2018-02-01

    Life events play a central role in the development of psychiatric disorders and impact course and outcome. We present a systematic review of the literature on the relationship of life events with the onset and long-term course of schizoaffective disorder. MEDLINE was searched with the combination of the key words: 'life events' plus 'schizoaffective'. The PRISMA method was followed in the review process. From the identified 66 papers only 12 were considered to be of relevance to the current study and 6 more papers were identified by inspecting the reference lists of the identified papers. There are very few studies focusing on the role of life events in schizoaffective disorder indicating insufficient data concerning the relationship of life events with onset and long-term course of schizoaffective disorder. Reported effects are not generic but concern specific events like the loss of mother, and females seem to be more vulnerable. Patients with schizoaffective disorder manifest high rates of PTSD. The literature on life events with the development and course of schizoaffective disorder is limited and precludes solid conclusions. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, synonymously termed Extreme Events. The implemented simulation method is currently confined to simulating steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper proposes to demonstrate a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.
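
    The cascading mechanism described above can be illustrated with a toy sketch: remove an initiating line, redistribute its load, and trip any line pushed over capacity, recording the resulting outage sequence (a crude analogue of a "critical events corridor"). The grid, loads, capacities, and the equal-redistribution rule below are hypothetical stand-ins for a real network model.

        import networkx as nx

        # Toy grid: edges carry a load and a capacity; names are hypothetical.
        G = nx.Graph()
        for u, v, load, cap in [
            ("A", "B", 80, 100), ("B", "C", 60, 70), ("A", "C", 50, 120),
            ("C", "D", 90, 110), ("B", "D", 40, 60),
        ]:
            G.add_edge(u, v, load=load, cap=cap)

        def cascade(g, first):
            g = g.copy()
            sequence, queue = [], [(first, g.edges[first]["load"])]
            while queue:
                edge, load = queue.pop(0)
                if not g.has_edge(*edge):
                    continue                        # already tripped earlier
                g.remove_edge(*edge)
                sequence.append(edge)
                survivors = list(g.edges)
                if not survivors:
                    break
                extra = load / len(survivors)       # equal redistribution (toy rule)
                for e in survivors:
                    g.edges[e]["load"] += extra
                for e in survivors:                 # trip overloaded survivors next
                    if g.edges[e]["load"] > g.edges[e]["cap"]:
                        queue.append((e, g.edges[e]["load"]))
            return sequence

        print(cascade(G, ("A", "B")))   # repeatable outage sequence for this initiating event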

  2. Incidence and economic burden of suspected adverse events and adverse event monitoring during AF therapy.

    PubMed

    Kim, M H; Lin, J; Hussein, M; Battleman, D

    2009-12-01

    Rhythm- and rate-control therapies are an essential part of atrial fibrillation (AF) management; however, the use of existing agents is often limited by the occurrence of adverse events. The aim of this study was to evaluate suspected adverse events and adverse event monitoring, and associated medical costs, in patients receiving AF rhythm-control and/or rate-control therapy. This retrospective cohort study used claims data from the Integrated Healthcare Information Systems National Managed Care Benchmark Database from 2002-2006. Patients hospitalized for AF (primary diagnosis), and who had at least 365 days' enrollment before and after the initial (index) AF hospitalization, were included in the analysis. Suspected AF therapy-related adverse events and function tests for adverse event monitoring were identified according to pre-specified diagnosis codes/procedures, and examined over the 12 months following discharge from the index hospitalization. Events/function tests had to have occurred within 90 days of a claim for AF therapy to be considered a suspected adverse event/adverse event monitoring. Of 4174 AF patients meeting the study criteria, 3323 received AF drugs; 428 received rhythm-control only (12.9%), 2130 rate-control only (64.1%), and 765 combined rhythm/rate-control therapy (23.0%). Overall, 50.1% of treated patients had a suspected adverse event and/or function test for adverse event monitoring (45.5% with rate-control, 53.5% with rhythm-control, and 61.2% with combined rhythm/rate-control). Suspected cardiovascular adverse events were the most common events (occurring in 36.1% of patients), followed by pulmonary (6.1%), and endocrine events (5.9%). Overall, suspected adverse events/function tests were associated with mean annual per-patient costs of $3089 ($1750 with rhythm-control, $2041 with rate control, and $6755 with combined rhythm/rate-control). As a retrospective analysis, the study is subject to potential selection bias, while its reliance on

  3. Image Analysis Algorithms for Immunohistochemical Assessment of Cell Death Events and Fibrosis in Tissue Sections

    PubMed Central

    Krajewska, Maryla; Smith, Layton H.; Rong, Juan; Huang, Xianshu; Hyer, Marc L.; Zeps, Nikolajs; Iacopetta, Barry; Linke, Steven P.; Olson, Allen H.; Reed, John C.; Krajewski, Stan

    2009-01-01

    Cell death is of broad physiological and pathological importance, making quantification of biochemical events associated with cell demise a high priority for experimental pathology. Fibrosis is a common consequence of tissue injury involving necrotic cell death. Using tissue specimens from experimental mouse models of traumatic brain injury, cardiac fibrosis, and cancer, as well as human tumor specimens assembled in tissue microarray (TMA) format, we undertook computer-assisted quantification of specific immunohistochemical and histological parameters that characterize processes associated with cell death. In this study, we demonstrated the utility of image analysis algorithms for color deconvolution, colocalization, and nuclear morphometry to characterize cell death events in tissue specimens: (a) subjected to immunostaining for detecting cleaved caspase-3, cleaved poly(ADP-ribose)-polymerase, cleaved lamin-A, phosphorylated histone H2AX, and Bcl-2; (b) analyzed by terminal deoxyribonucleotidyl transferase–mediated dUTP nick end labeling assay to detect DNA fragmentation; and (c) evaluated with Masson's trichrome staining. We developed novel algorithm-based scoring methods and validated them using TMAs as a high-throughput format. The proposed computer-assisted scoring methods for digital images by brightfield microscopy permit linear quantification of immunohistochemical and histochemical stainings. Examples are provided of digital image analysis performed in automated or semiautomated fashion for successful quantification of molecular events associated with cell death in tissue sections. (J Histochem Cytochem 57:649–663, 2009) PMID:19289554
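
    One algorithm class named above, color deconvolution, is available off the shelf; a minimal sketch (assuming scikit-image is installed and using a placeholder image path) separates the stains and scores DAB positivity with an illustrative threshold.

        from skimage import io
        from skimage.color import rgb2hed

        rgb = io.imread("slide.png")[..., :3]   # placeholder path; drop any alpha channel
        hed = rgb2hed(rgb)                      # hematoxylin, eosin, DAB optical densities
        dab = hed[..., 2]
        positive = dab > 0.02                   # illustrative positivity threshold
        print("DAB-positive pixel fraction:", positive.mean())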

  4. Microseismic Event Relocation and Focal Mechanism Estimation Based on PageRank Linkage

    NASA Astrophysics Data System (ADS)

    Aguiar, A. C.; Myers, S. C.

    2017-12-01

    Microseismicity associated with enhanced geothermal systems (EGS) is key in understanding how subsurface stimulation can modify stress, fracture rock, and increase permeability. Large numbers of microseismic events are commonly associated with hydroshearing an EGS, making data mining methods useful in their analysis. We focus on PageRank, originally developed as Google's search engine, and subsequently adapted for use in seismology to detect low-frequency earthquakes by linking events directly and indirectly through cross-correlation (Aguiar and Beroza, 2014). We expand on this application by using PageRank to define signal-correlation topology for micro-earthquakes from the Newberry Volcano EGS in Central Oregon, which has been stimulated two times using high-pressure fluid injection. We create PageRank signal families from both data sets and compare these to the spatial and temporal proximity of associated earthquakes. PageRank families are relocated using differential travel times measured by waveform cross-correlation (CC) and the Bayesloc approach (Myers et al., 2007). Prior to relocation, events are loosely clustered, with some events at a distance from the cluster. After relocation, event families are found to be tightly clustered. Indirect linkage of signals using PageRank is a reliable way to increase the number of events confidently determined to be similar, suggesting an efficient and effective grouping of earthquakes with similar physical characteristics (i.e., location, focal mechanism, stress drop). We further explore the possibility of using PageRank families to identify events with similar relative phase polarities and estimate focal mechanisms following the method of Shelly et al. (2016), where CC measurements are used to determine individual polarities within event clusters. Given a positive result, PageRank might be a useful tool in adaptive approaches to enhance production at well-instrumented geothermal sites. Prepared by LLNL under Contract DE-AC52-07NA27344
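
    A minimal sketch of the linkage idea: build a graph whose edges are event pairs with cross-correlation above a threshold, then let PageRank surface events that are well connected directly or indirectly. The correlation matrix here is random stand-in data, not Newberry waveforms.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(1)
        n = 30
        cc = rng.uniform(0.0, 1.0, (n, n))
        cc = (cc + cc.T) / 2.0                  # symmetric stand-in CC matrix

        G = nx.Graph()
        G.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                if cc[i, j] >= 0.8:             # link events with high correlation
                    G.add_edge(i, j, weight=cc[i, j])

        rank = nx.pagerank(G, weight="weight")
        top = sorted(rank, key=rank.get, reverse=True)[:5]
        print("top-ranked events:", top)        # candidates for a signal family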

  5. FLOCK cluster analysis of mast cell event clustering by high-sensitivity flow cytometry predicts systemic mastocytosis.

    PubMed

    Dorfman, David M; LaPlante, Charlotte D; Pozdnyakova, Olga; Li, Betty

    2015-11-01

    In our high-sensitivity flow cytometric approach for systemic mastocytosis (SM), we identified mast cell event clustering as a new diagnostic criterion for the disease. To objectively characterize mast cell gated event distributions, we performed cluster analysis using FLOCK, a computational approach to identify cell subsets in multidimensional flow cytometry data in an unbiased, automated fashion. FLOCK identified discrete mast cell populations in most cases of SM (56/75 [75%]) but only a minority of non-SM cases (17/124 [14%]). FLOCK-identified mast cell populations accounted for 2.46% of total cells on average in SM cases and 0.09% of total cells on average in non-SM cases (P < .0001) and were predictive of SM, with a sensitivity of 75%, a specificity of 86%, a positive predictive value of 76%, and a negative predictive value of 85%. FLOCK analysis provides useful diagnostic information for evaluating patients with suspected SM, and may be useful for the analysis of other hematopoietic neoplasms. Copyright © by the American Society for Clinical Pathology.
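
    FLOCK itself is a purpose-built tool; as a stand-in, the sketch below asks the same qualitative question (is there a discrete, dense event population among the gated events?) using an off-the-shelf density-based clustering (scikit-learn's DBSCAN) on synthetic two-marker data.

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(2)
        background = rng.normal(0.0, 1.0, (500, 2))       # diffuse mast cell events
        aberrant = rng.normal([3.0, 3.0], 0.2, (40, 2))   # tight clustered population
        X = np.vstack([background, aberrant])

        labels = DBSCAN(eps=0.3, min_samples=10).fit_predict(X)
        n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
        print(n_clusters, "discrete population(s);",
              f"{(labels >= 0).mean():.1%} of events in clusters")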

  6. Incidence of cardiovascular events and associated risk factors in kidney transplant patients: a competing risks survival analysis.

    PubMed

    Seoane-Pillado, María Teresa; Pita-Fernández, Salvador; Valdés-Cañedo, Francisco; Seijo-Bestilleiro, Rocio; Pértega-Díaz, Sonia; Fernández-Rivera, Constantino; Alonso-Hernández, Ángel; González-Martín, Cristina; Balboa-Barreiro, Vanesa

    2017-03-07

    The high prevalence of cardiovascular risk factors among the renal transplant population accounts for increased mortality. The aim of this study is to determine the incidence of cardiovascular events and factors associated with cardiovascular events in these patients. An observational ambispective follow-up study of renal transplant recipients (n = 2029) in the health district of A Coruña (Spain) during the period 1981-2011 was completed. Competing risk survival analysis methods were applied to estimate the cumulative incidence of developing cardiovascular events over time and to identify which characteristics were associated with the risk of these events. Post-transplant cardiovascular events are defined as the presence of myocardial infarction, invasive coronary artery therapy, cerebral vascular events, new-onset angina, congestive heart failure, rhythm disturbances, peripheral vascular disease and cardiovascular disease and death. The cause of death was identified through the medical history and death certificate using ICD9 (390-459, except: 427.5, 435, 446, 459.0). The mean age of patients at the time of transplantation was 47.0 ± 14.2 years; 62% were male. 16.5% had suffered some cardiovascular disease prior to transplantation and 9.7% had suffered a cardiovascular event. The mean follow-up period for the patients with cardiovascular event was 3.5 ± 4.3 years. Applying competing risk methodology, it was observed that the cumulative incidence of the event was 5.0% one year after transplantation, 8.1% after five years, and 11.9% after ten years. After applying multivariate models, the variables with an independent effect for predicting cardiovascular events are: male sex, age of recipient, previous cardiovascular disorders, pre-transplant smoking and post-transplant diabetes. This study makes it possible to determine in kidney transplant patients, taking into account competitive events, the incidence of post-transplant cardiovascular events and
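
    A minimal sketch of the competing-risks estimate used here, with the Aalen-Johansen estimator from the lifelines package on synthetic data (event type 1 = cardiovascular event, 2 = competing death, 0 = censored); illustrative only, not the study's code.

        import numpy as np
        from lifelines import AalenJohansenFitter

        rng = np.random.default_rng(3)
        n = 300
        t_cv = rng.exponential(12.0, n)      # latent time to cardiovascular event
        t_death = rng.exponential(20.0, n)   # latent time to competing death
        t_cens = rng.uniform(0.0, 15.0, n)   # administrative censoring

        time = np.minimum.reduce([t_cv, t_death, t_cens])
        event = np.select([time == t_cv, time == t_death], [1, 2], default=0)

        ajf = AalenJohansenFitter()
        ajf.fit(time, event, event_of_interest=1)
        print(ajf.cumulative_density_.tail())   # cumulative incidence of CV events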

  7. Resilience in carbonate production despite three coral bleaching events in 5 years on an inshore patch reef in the Florida Keys.

    PubMed

    Manzello, Derek P; Enochs, Ian C; Kolodziej, Graham; Carlton, Renée; Valentino, Lauren

    2018-01-01

    The persistence of coral reef frameworks requires that calcium carbonate (CaCO₃) production by corals and other calcifiers outpaces CaCO₃ loss via physical, chemical, and biological erosion. Coral bleaching causes declines in CaCO₃ production, but this varies with bleaching severity and the species impacted. We conducted census-based CaCO₃ budget surveys using the established ReefBudget approach at Cheeca Rocks, an inshore patch reef in the Florida Keys, annually from 2012 to 2016. This site experienced warm-water bleaching in 2011, 2014, and 2015. In 2017, we obtained cores of the dominant calcifying coral at this site, Orbicella faveolata, to understand how calcification rates were impacted by bleaching and how they affected the reef-wide CaCO₃ budget. Bleaching depressed O. faveolata growth, and the decline of this one species led to an overestimation of mean (± std. error) reef-wide CaCO₃ production by +0.68 (± 0.167) to +1.11 (± 0.236) kg m⁻² year⁻¹ when using the static ReefBudget coral growth inputs. During non-bleaching years, the ReefBudget inputs slightly underestimated gross production by -0.10 (± 0.022) to -0.43 (± 0.100) kg m⁻² year⁻¹. Carbonate production declined after the first year of back-to-back bleaching in 2014, but then increased after 2015 to values greater than the initial surveys in 2012. Cheeca Rocks is an outlier in the Caribbean and Florida Keys in terms of coral cover, carbonate production, and abundance of O. faveolata, which is threatened under the Endangered Species Act. Given the resilience of this site to repeated bleaching events, it may deserve special management attention.

  8. 10-Year analysis of adverse event reports to the Food and Drug Administration for phosphodiesterase type-5 inhibitors.

    PubMed

    Lowe, Gregory; Costabile, Raymond A

    2012-01-01

    To ensure public safety, all Food and Drug Administration (FDA)-approved medications undergo postapproval safety analysis. Phosphodiesterase type-5 inhibitors (PDE5-i) are generally regarded as safe and effective. We performed a nonindustry-sponsored analysis of FDA reports for sildenafil, tadalafil, and vardenafil to evaluate the reported cardiovascular and mortality events over the past 10 years. Summarized reports of adverse events (AEs) for each PDE5-i were requested from the Center for Drug Evaluation and Research within the FDA. These data are available under the Freedom of Information Act and document industry and nonindustry reports of AEs entered into the computerized system maintained by the Office of Surveillance and Epidemiology. The data were analyzed for the number of AE reports, the number of objective cardiovascular events, and reported deaths. Overall, 14,818 AEs were reported for sildenafil. There were 1,824 (12.3%) reported deaths, and reports of cardiovascular AEs numbered 2,406 (16.2%). Tadalafil was associated with 5,548 AEs and 236 reported deaths. Vardenafil was associated with 6,085 AEs and 121 reported deaths. The percentage of reported severe cardiovascular disorders has stabilized at 10% to 15% of all AE reports for sildenafil and tadalafil and 5% to 10% for vardenafil. Only 10% of AE reports sent to the FDA for PDE5-i were from pharmaceutical manufacturers. Reports of deaths associated with PDE5-i remain around 5% of total reported events. Despite inherent limitations from evaluating FDA reports of AEs, it is important that these reports be reviewed outside pharmaceutical industry support in order to provide due diligence and transparency. Lowe G and Costabile RA. 10-year analysis of adverse event reports to the Food and Drug Administration for phosphodiesterase type-5 inhibitors. J Sex Med 2012;9:265-270. © 2011 International Society for Sexual Medicine.
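
    The rates quoted above are simple proportions; a sketch of how such a share and its 95% confidence interval can be computed (using statsmodels' Wilson interval, with the sildenafil counts from the abstract):

        from statsmodels.stats.proportion import proportion_confint

        deaths, reports = 1824, 14818      # counts quoted in the abstract
        lo, hi = proportion_confint(deaths, reports, alpha=0.05, method="wilson")
        print(f"{deaths / reports:.1%} (95% CI {lo:.1%}-{hi:.1%})")   # ~12.3%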

  9. Photographic Analysis Technique for Assessing External Tank Foam Loss Events

    NASA Technical Reports Server (NTRS)

    Rieckhoff, T. J.; Covan, M.; OFarrell, J. M.

    2001-01-01

    A video camera and recorder were placed inside the solid rocket booster forward skirt in order to view foam loss events over an area on the external tank (ET) intertank surface. In this Technical Memorandum, a method of processing video images to allow rapid detection of permanent changes indicative of foam loss events on the ET surface was defined and applied to accurately count, categorize, and locate such events.
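
    A minimal sketch of the underlying image-processing idea (difference successive frames and flag large changes), using OpenCV on a placeholder video file; the memo's actual method targeted permanent changes, which this crude per-frame test does not fully capture.

        import cv2

        cap = cv2.VideoCapture("et_footage.mp4")        # placeholder file name
        ok, prev = cap.read()
        prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        changes = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            diff = cv2.absdiff(gray, prev)              # pixel-wise frame difference
            _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
            if cv2.countNonZero(mask) > 500:            # illustrative change threshold
                changes += 1
            prev = gray
        cap.release()
        print("frames with candidate changes:", changes)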

  10. 76 FR 70768 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0254] Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft Report for Comment; Correction AGENCY: Nuclear Regulatory Commission. ACTION: Draft NUREG; request for comment; correction. SUMMARY: This document corrects a notice appearing...

  11. Physically-based modelling of high magnitude torrent events with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Wing-Yuen Chow, Candace; Ramirez, Jorge; Zimmermann, Markus; Keiler, Margreth

    2017-04-01

    High magnitude torrent events are associated with the rapid propagation of vast quantities of water and available sediment downslope where human settlements may be established. Assessing the vulnerability of built structures to these events is a part of consequence analysis, where hazard intensity is related to the degree of loss sustained. The specific contribution of the presented work is a procedure for simulating these damaging events with physically-based modelling and for attaching uncertainty information to the simulated results. This is a first step in the development of vulnerability curves based on several intensity parameters (i.e., maximum velocity, sediment deposition depth, and impact pressure). The investigation process begins with the collection, organization, and interpretation of detailed post-event documentation and photograph-based observation data of affected structures in three sites that exemplify the impact of highly destructive mudflows and flood occurrences on settlements in Switzerland. Hazard intensity proxies are then simulated with the physically-based FLO-2D model (O'Brien et al., 1993). Prior to modelling, global sensitivity analysis is conducted to support a better understanding of model behaviour, parameterization, and the quantification of uncertainties (Song et al., 2015). The inclusion of information describing the degree of confidence in the simulated results supports the credibility of vulnerability curves developed with the modelled data. First, key parameters are identified and selected based on a literature review. Truncated a priori ranges of parameter values are then defined by expert elicitation. Local sensitivity analysis is performed based on manual calibration to provide an understanding of the parameters relevant to the case studies of interest. Finally, automated parameter estimation is performed to comprehensively search for optimal parameter combinations and associated values, which are evaluated using the
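
    A sketch of the global sensitivity analysis step, using SALib's Sobol method on a stand-in function; in the workflow described above each sample would instead drive a FLO-2D run. Parameter names, ranges, and the surrogate are hypothetical.

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {                         # hypothetical parameters and expert ranges
            "num_vars": 3,
            "names": ["manning_n", "yield_stress", "viscosity"],
            "bounds": [[0.02, 0.2], [100.0, 2000.0], [10.0, 500.0]],
        }
        X = saltelli.sample(problem, 1024)

        def surrogate(x):
            # Toy stand-in for a FLO-2D output such as maximum flow depth.
            return (0.5 * x[:, 0] + 1e-4 * x[:, 1] + 2e-3 * x[:, 2]
                    + 0.1 * x[:, 0] * x[:, 2])

        Si = sobol.analyze(problem, surrogate(X))
        print(dict(zip(problem["names"], np.round(Si["S1"], 3))))   # first-order indices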

  12. Slips of the Typewriter Key.

    ERIC Educational Resources Information Center

    Berg, Thomas

    2002-01-01

    Presents an analysis of 500 submorphemic slips of the typewriter key that escaped the notice of authors and other proofreaders and thereby made their way into the published records of scientific research. (Author/VWL)

  13. Adverse events and treatment failure leading to discontinuation of recently approved antipsychotic drugs in schizophrenia: A network meta-analysis.

    PubMed

    Tonin, Fernanda S; Piazza, Thais; Wiens, Astrid; Fernandez-Llimos, Fernando; Pontarolo, Roberto

    2015-12-01

    Objective: We aimed to gather evidence of the discontinuation rates owing to adverse events or treatment failure for four recently approved antipsychotics (asenapine, blonanserin, iloperidone, and lurasidone). Methods: A systematic review followed by pairwise meta-analysis and mixed treatment comparison meta-analysis (MTC) was performed, including randomized controlled trials (RCTs) that compared the use of the above-mentioned drugs versus placebo in patients with schizophrenia. An electronic search was conducted in PubMed, Scopus, Science Direct, Scielo, the Cochrane Library, and International Pharmaceutical Abstracts (January 2015). The included trials were at least single blinded. The main outcome measures extracted were discontinuation owing to adverse events and discontinuation owing to treatment failure. Results: Fifteen RCTs were identified (n = 5400 participants) and 13 of them were amenable for use in our meta-analyses. No significant differences were observed between any of the four drugs and placebo as regards discontinuation owing to adverse events, whether in pairwise meta-analysis or in MTC. All drugs presented a better profile than placebo on discontinuation owing to treatment failure, both in pairwise meta-analysis and MTC. Asenapine was found to be the best therapy in terms of tolerability owing to failure, while lurasidone was the worst treatment in terms of adverse events. The evidence around blonanserin is weak. Conclusion: MTCs allowed the creation of two different rank orders of these four antipsychotic drugs in two outcome measures. This evidence-generating method allows direct and indirect comparisons, supporting approval and pricing decisions when lacking sufficient, direct, head-to-head trials.
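
    The building block beneath both the pairwise and network meta-analyses is inverse-variance pooling of per-trial effects; a minimal fixed-effect sketch on made-up log odds ratios of discontinuation (illustrative numbers, not trial data):

        import numpy as np

        log_or = np.array([-0.21, 0.10, -0.35, 0.02])   # hypothetical trial effects
        se = np.array([0.30, 0.25, 0.40, 0.20])         # their standard errors

        w = 1.0 / se**2                                 # inverse-variance weights
        pooled = np.sum(w * log_or) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print(f"pooled OR {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")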

  14. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Florita, Anthony R; Clark, Kara; Gevorgian, Vahan

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
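
    A sketch of the distribution-fitting step described above: fit candidate distributions to a frequency metric and compare likelihoods. Synthetic Cauchy data stand in for the historic records.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        metric = stats.cauchy.rvs(loc=0.0, scale=0.01, size=2000, random_state=rng)

        ll_cauchy = stats.cauchy.logpdf(metric, *stats.cauchy.fit(metric)).sum()
        ll_normal = stats.norm.logpdf(metric, *stats.norm.fit(metric)).sum()
        print("Cauchy fits better:", ll_cauchy > ll_normal)   # heavy tails favor Cauchy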

  15. Optimizing graph-based patterns to extract biomedical events from the literature

    PubMed Central

    2015-01-01

    In BioNLP-ST 2013, we participated in the BioNLP 2013 shared tasks on event extraction. Our extraction method is based on the search for an approximate subgraph isomorphism between key context dependencies of events and graphs of input sentences. Our system was able to address both the GENIA (GE) task, focusing on 13 molecular biology related event types, and the Cancer Genetics (CG) task, targeting a challenging group of 40 cancer biology related event types with varying arguments concerning 18 kinds of biological entities. In addition to adapting our system to the two tasks, we also attempted to integrate semantics into the graph matching scheme using a distributional similarity model for more events, and evaluated the event extraction impact of using paths of all possible lengths as key context dependencies beyond using only the shortest paths in our system. We achieved a 46.38% F-score in the CG task (ranking 3rd) and a 48.93% F-score in the GE task (ranking 4th). After BioNLP-ST 2013, we explored three ways to further extend our event extraction system in our previously published work: (1) we allow non-essential nodes to be skipped, and incorporated a node skipping penalty into the subgraph distance function of our approximate subgraph matching algorithm; (2) instead of assigning a unified subgraph distance threshold to all patterns of an event type, we learned a customized threshold for each pattern; (3) we implemented the well-known Empirical Risk Minimization (ERM) principle to optimize the event pattern set by balancing prediction errors on training data against regularization. When evaluated on the official GE task test data, these extensions help to improve the extraction precision from 62% to 65%. However, the overall F-score stays equivalent to the previous performance due to a 1% drop in recall. PMID:26551594
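
    The core operation in this system is matching an event pattern graph against a sentence dependency graph; the sketch below shows the exact subgraph-isomorphism primitive with networkx (the paper's matcher is approximate, additionally tolerating node skips within a distance budget, and also matches labels). Graphs and labels here are hypothetical.

        import networkx as nx
        from networkx.algorithms import isomorphism

        sentence = nx.DiGraph()              # toy dependency graph of a sentence
        sentence.add_edges_from([
            ("binds", "protein_A"), ("binds", "protein_B"), ("binds", "cell"),
        ])
        pattern = nx.DiGraph()               # toy event pattern to search for
        pattern.add_edges_from([("binds", "protein_A"), ("binds", "protein_B")])

        # Structural matching only; a real system would also compare node labels.
        gm = isomorphism.DiGraphMatcher(sentence, pattern)
        print(gm.subgraph_is_isomorphic())   # True: the pattern embeds in the sentence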

  16. High/Scope Preschool Key Experiences: Language and Literacy. [with]Curriculum Videotape.

    ERIC Educational Resources Information Center

    Brinkman, Nancy A.

    During the preschool years, children experience great strides in their ability to use language. This booklet and companion videotape help teachers and parents recognize and support six High/Scope key experiences in language and literacy: (1) talking with others about personally meaningful experiences; (2) describing objects, events, and relations;…

  17. Predictability of rogue events.

    PubMed

    Birkholz, Simon; Brée, Carsten; Demircan, Ayhan; Steinmeyer, Günter

    2015-05-29

    Using experimental data from three different rogue-wave-supporting systems, determinism and predictability of the underlying dynamics are evaluated with methods of nonlinear time series analysis. We included original records from the Draupner platform in the North Sea as well as time series from two optical systems in our analysis. One of the latter was measured in the infrared tail of optical fiber supercontinua, the other in the fluence profiles of multifilaments. All three data sets exhibit extreme-value statistics and exceed the significant wave height in the respective system by a factor larger than 2. Nonlinear time series analysis indicates a different degree of determinism in the systems. The optical fiber scenario is found to be driven by quantum noise, whereas rogue waves emerge as a consequence of turbulence in the others. With the large number of rogue events observed in the multifilament system, we can systematically explore the predictability of such events in a turbulent system. We observe that rogue events do not necessarily appear without a warning, but are often preceded by a short phase of relative order. This surprising finding sheds some new light on the fascinating phenomenon of rogue waves.
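
    The rogue criterion used above (an excursion exceeding twice the significant height of the record) is easy to make concrete; the sketch below flags such events in a synthetic heavy-tailed series.

        import numpy as np

        rng = np.random.default_rng(5)
        series = rng.standard_t(df=3, size=20000)       # heavy-tailed stand-in signal
        peaks = np.abs(series)

        # "Significant height": mean of the largest third of excursions.
        h_s = np.sort(peaks)[-peaks.size // 3:].mean()
        rogue = np.flatnonzero(peaks > 2.0 * h_s)
        print(f"{rogue.size} rogue events above 2 x H_s = {2.0 * h_s:.2f}")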

  18. Key principles to guide development of consumer medicine information--content analysis of information design texts.

    PubMed

    Raynor, David K; Dickinson, David

    2009-04-01

    Effective written consumer medicines information is essential to support safe and effective medicine taking, but the wording and layout of currently provided materials do not meet patients' needs. To identify principles from the wider discipline of information design for use by health professionals when developing or assessing written drug information for patients. Six experts in information design nominated texts on best practice in information design applicable to consumer medicines information. A content analysis identified key principles that were tabulated to bring out key themes. Six texts that met the inclusion criteria were identified, and content analysis identified four themes: words, type, lines, and layout. Within these main themes, there were 24 subthemes. Selected principles relating to these subthemes were: use short familiar words, short sentences, and short headings that stand out from the text; use a conversational tone of voice, addressing the reader as "you"; use a large type size while retaining sufficient white space; use bullet points to organize lists; use unjustified text (ragged right) and bold, lower-case text for emphasis. Pictures or graphics do not necessarily improve a document. Applying the good information design principles identified to written consumer medicines information could support health professionals when developing and assessing drug information for patients.

  19. An Oracle-based event index for ATLAS

    NASA Astrophysics Data System (ADS)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases, as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We've indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which is expected at rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  20. “Smooth” Semiparametric Regression Analysis for Arbitrarily Censored Time-to-Event Data

    PubMed Central

    Zhang, Min; Davidian, Marie

    2008-01-01

    Summary A general framework for regression analysis of time-to-event data subject to arbitrary patterns of censoring is proposed. The approach is relevant when the analyst is willing to assume that distributions governing model components that are ordinarily left unspecified in popular semiparametric regression models, such as the baseline hazard function in the proportional hazards model, have densities satisfying mild “smoothness” conditions. Densities are approximated by a truncated series expansion that, for fixed degree of truncation, results in a “parametric” representation, which makes likelihood-based inference coupled with adaptive choice of the degree of truncation, and hence flexibility of the model, computationally and conceptually straightforward with data subject to any pattern of censoring. The formulation allows popular models, such as the proportional hazards, proportional odds, and accelerated failure time models, to be placed in a common framework; provides a principled basis for choosing among them; and renders useful extensions of the models straightforward. The utility and performance of the methods are demonstrated via simulations and by application to data from time-to-event studies. PMID:17970813

  1. Developing future precipitation events from historic events: An Amsterdam case study.

    NASA Astrophysics Data System (ADS)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2016-04-01

    Due to climate change, the frequency and intensity of extreme precipitation events are expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers in order to develop efficient adaptation strategies to reduce the risks from extreme weather events. Current approaches to tailor climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern to policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis, and modifies the observed data for these events so that the outcome shows how the same event would occur in a warmer climate. This concept is introduced as 'Future Weather', and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event, based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands in the summer of 2014, which resulted in blocked highways, disruption of air transportation, and flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a return period of 5 to 15 years. The method of projecting a future event follows a non-linear delta transformation that is applied directly to the observed event, assuming a warmer climate, to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of the precipitation intensity as a function of the dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, where the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to indicate a warmer climate. The two
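
    The delta transformation scales observed intensities with warming; a common first approximation, used here in place of the paper's fitted dew-point dependence, is Clausius-Clapeyron-like scaling of about 7% per kelvin. All numbers below are illustrative.

        import numpy as np

        observed = np.array([4.0, 12.0, 35.0, 60.0])   # observed event intensities (mm/h)
        delta_t = 2.0                                  # assumed warming (K)
        scaling = 0.07                                 # ~7% more intensity per K (assumption)

        future = observed * (1.0 + scaling) ** delta_t
        print(np.round(future, 1))                     # "up-scaled" future intensities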

  2. Broadband Array Analysis of the 2005 Episodic Tremor and Slip Event in Northern Cascadia

    NASA Astrophysics Data System (ADS)

    Wech, A.; Creager, K.; McCausland, W.; Frassetto, A.; Qamar, A.; Derosier, S.; Carmichael, J.; Malone, S.; Johnson, D.

    2005-12-01

    The region of Cascadia from the Olympic Mountains through southern Vancouver Island and down-dip of the subduction megathrust has repeatedly experienced episodes of slow slip. This episodic slip, which has been observed to take place over a period of two to several weeks, is accompanied by a seismic tremor signal. Based on the average recurrence interval of 14 months, the next episodic tremor and slip (ETS) event should occur within six weeks of mid-September, 2005. Indeed, it appears to have begun on September 3, as this abstract was being written. In order to record this anticipated event, we deployed an array of 11 three-component seismometers on the northern side of the Olympic Peninsula, augmenting Pacific Northwest Seismographic Network stations as well as the first few EarthScope BigFoot stations and Plate Boundary Observatory borehole seismometers. This seismic array comprised six short-period and five broadband instruments with spacings of 500 m and 2200 m, respectively. In conjunction with this EarthScope seismic deployment, we also installed a dense network of 29 temporary, continuous GPS stations across the entire Olympic Peninsula to integrate seismic and geodetic observations. One of the primary goals of this research is to utilize the broadband instrumentation in the array to investigate the possible correlation of low frequency energy with the rest of the tremor activity. ETS has been carefully investigated at high frequency (seismic tremor at 2-6 Hz) and very low frequency (slip occurring over weeks, observed by GPS). An important goal of this experiment is to investigate the possibility that the tremor generates intermediate, low-frequency signals. Preliminary analysis of short-period array recordings of the July 2004 ETS event suggests that the tremor displays signs of lower-frequency energy (~0.5 Hz) correlated with its higher frequency activity. Our array should enable us to distinguish low-frequency signals originating in the direction

  3. Optimal attacks on qubit-based Quantum Key Recycling

    NASA Astrophysics Data System (ADS)

    Leermakers, Daan; Škorić, Boris

    2018-03-01

    Quantum Key Recycling (QKR) is a quantum cryptographic primitive that allows one to reuse keys in an unconditionally secure way. By removing the need to repeatedly generate new keys, it improves communication efficiency. Škorić and de Vries recently proposed a QKR scheme based on 8-state encoding (four bases). It does not require quantum computers for encryption/decryption but only single-qubit operations. We provide a missing ingredient in the security analysis of this scheme in the case of noisy channels: accurate upper bounds on the required amount of privacy amplification. We determine optimal attacks against the message and against the key, for 8-state encoding as well as 4-state and 6-state conjugate coding. We provide results in terms of min-entropy loss as well as accessible (Shannon) information. We show that the Shannon entropy analysis for 8-state encoding reduces to the analysis of quantum key distribution, whereas 4-state and 6-state suffer from additional leaks that make them less effective. From the optimal attacks we compute the required amount of privacy amplification and hence the achievable communication rate (useful information per qubit) of qubit-based QKR. Overall, 8-state encoding yields the highest communication rates.

  4. Analyzing phenological extreme events over the past five decades in Germany

    NASA Astrophysics Data System (ADS)

    Schleip, Christoph; Menzel, Annette; Estrella, Nicole; Graeser, Philipp

    2010-05-01

    As climate change may alter the frequency and intensity of extreme temperatures, we analysed whether warming of the last 5 decades has already changed the statistics of phenological extreme events. In this context, two extreme value statistical concepts are discussed and applied to existing phenological datasets of the German Weather Service (DWD) in order to derive probabilities of occurrence for extreme early or late phenological events. We analyse four phenological groups: "begin of flowering", "leaf foliation", "fruit ripening", and "leaf colouring", as well as DWD indicator phases of the "phenological year". Additionally, we put an emphasis on a between-species analysis: a comparison of differences in extreme onsets between three common northern conifers. Furthermore, we conducted a within-species analysis with different phases of horse chestnut throughout a year. The first statistical approach fits data to a Gaussian model using traditional statistical techniques, and then analyses the extreme quantile. The key point of this approach is the fitting of an appropriate probability density function (PDF) to the observed data and the assessment of how the PDF parameters change in time. The full analytical description in terms of the estimated PDF for defined time steps of the observation period allows probability assessments of extreme values for, e.g., annual or decadal time steps. Related to this approach is the possibility of counting the onsets that fall in our defined extreme percentiles. The estimation of the probability of extreme events on the basis of the whole data set is in contrast to analyses with the generalized extreme value distribution (GEV). The second approach deals with the extremes themselves and fits the GEV distribution to annual minima of phenological series to provide useful estimates about return levels. For flowering and leaf unfolding phases exceptionally early extremes are seen since the mid 1980s and especially for the single years 1961
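
    A sketch of the second approach: fit a GEV distribution to annual minima (earliest onsets) and read off a return level. scipy parameterizes the GEV for maxima, so the minima are negated before fitting; the onset series here is synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        # Toy annual earliest-onset anomalies (days; negative = unusually early).
        annual_min = -rng.gumbel(loc=5.0, scale=4.0, size=50)

        shape, loc, scale = stats.genextreme.fit(-annual_min)   # fit maxima of "earliness"
        r20 = -stats.genextreme.ppf(1.0 - 1.0 / 20.0, shape, loc, scale)
        print(f"20-year extreme-early onset anomaly: {r20:.1f} days")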

  5. Use of the Hadoop structured storage tools for the ATLAS EventIndex event catalogue

    NASA Astrophysics Data System (ADS)

    Favareto, A.

    2016-09-01

    The ATLAS experiment at the LHC collects billions of events each data-taking year, and processes them to make them available for physics analysis in several different formats. An even larger number of events is in addition simulated according to physics and detector models and then reconstructed and analysed to be compared to real events. The EventIndex is a catalogue of all events in each production stage; it includes for each event a few identification parameters, some basic non-mutable information coming from the online system, and the references to the files that contain the event in each format (plus the internal pointers to the event within each file for quick retrieval). Each EventIndex record is logically simple but the system has to hold many tens of billions of records, all equally important. The Hadoop technology was selected at the start of the EventIndex project development in 2012 and proved to be robust and flexible enough to accommodate this kind of information; both the insertion and query response times are acceptable for the continuous and automatic operation that started in Spring 2015. This paper describes the EventIndex data input and organisation in Hadoop and explains the operational challenges that were overcome in order to achieve the expected performance.

  6. Using the event analysis of systemic teamwork (EAST) to explore conflicts between different road user groups when making right hand turns at urban intersections.

    PubMed

    Salmon, Paul M; Lenne, Michael G; Walker, Guy H; Stanton, Neville A; Filtness, Ashleigh

    2014-01-01

    Collisions between different types of road users at intersections form a substantial component of the road toll. This paper presents an analysis of driver, cyclist, motorcyclist and pedestrian behaviour at intersections that involved the application of an integrated suite of ergonomics methods, the Event Analysis of Systemic Teamwork (EAST) framework, to on-road study data. EAST was used to analyse behaviour at three intersections using data derived from an on-road study of driver, cyclist, motorcyclist and pedestrian behaviour. The analysis shows the differences in behaviour and cognition across the different road user groups and pinpoints instances where this may be creating conflicts between different road users. The role of intersection design in creating these differences in behaviour and resulting conflicts is discussed. It is concluded that currently intersections are not designed in a way that supports behaviour across the four forms of road user studied. Interventions designed to improve intersection safety are discussed. Practitioner Summary: Intersection safety currently represents a key road safety issue worldwide. This paper presents a novel application of a framework of ergonomics methods for studying differences in road user behaviour at intersections. The findings support development of interventions that consider all road users as opposed to one group in isolation.

  7. Characterization of high-intensity, long-duration continuous auroral activity (HILDCAA) events using recurrence quantification analysis

    NASA Astrophysics Data System (ADS)

    Mendes, Odim; Oliveira Domingues, Margarete; Echer, Ezequiel; Hajra, Rajkumar; Everton Menconi, Varlei

    2017-08-01

    Considering magnetic reconnection and the viscous interaction as the fundamental mechanisms for transferring particles and energy into the magnetosphere, we study the dynamical characteristics of the auroral electrojet (AE) index during high-intensity, long-duration continuous auroral activity (HILDCAA) events, using a long-term geomagnetic database (1975-2012) and other distinct interplanetary conditions (geomagnetically quiet intervals, co-rotating interaction regions (CIRs)/high-speed streams (HSSs) not followed by HILDCAAs, and events of AE comprised in global intense geomagnetic disturbances). It is worth noting that we also study active but non-HILDCAA intervals. Examining the geomagnetic AE index, we apply a dynamics analysis composed of the phase space, the recurrence plot (RP), and the recurrence quantification analysis (RQA) methods. As a result, the quantification finds two distinct clusterings of dynamical behaviour occurring in the interplanetary medium: one corresponding to a geomagnetically quiet regime and the other to an interplanetary activity regime. Furthermore, the HILDCAAs seem to be unique events as regards their visible, intense manifestations of interplanetary Alfvénic waves; however, they are similar to the other kinds of conditions as regards their dynamical signature (based on RQA), because they are involved in the same complex mechanism of generating geomagnetic disturbances. Also, by characterizing the proper conditions of transitions from quiescent conditions to weaker geomagnetic disturbances inside the magnetosphere and ionosphere system, the RQA method clearly indicates the two fundamental dynamics (geomagnetically quiet intervals and HILDCAA events) to be evaluated with magneto-hydrodynamics simulations to better understand the critical processes related to energy and particle transfer into the magnetosphere-ionosphere system. Finally, with this work, we have also reinforced the potential applicability of the RQA method for
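
    The central object behind the RP/RQA analysis is the recurrence matrix, R[i, j] = 1 when states i and j lie within a threshold distance; a minimal sketch (no time-delay embedding) on a toy AE-like series, plus the simplest RQA measure, the recurrence rate:

        import numpy as np

        rng = np.random.default_rng(7)
        ae = np.cumsum(rng.normal(0.0, 10.0, 500)) + 200.0   # toy AE-index-like series

        x = ae[:, None]                      # 1-D "phase space" for brevity
        dist = np.abs(x - x.T)
        eps = 0.1 * ae.std()                 # a common threshold choice
        R = dist <= eps                      # recurrence matrix

        print(f"recurrence rate: {R.mean():.3f}")   # simplest RQA measure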

  8. Investigation of 2‐stage meta‐analysis methods for joint longitudinal and time‐to‐event data through simulation and real data application

    PubMed Central

    Tudur Smith, Catrin; Gueyffier, François; Kolamunnage‐Dona, Ruwanthi

    2017-01-01

    Background Joint modelling of longitudinal and time-to-event data is often preferred over separate longitudinal or time-to-event analyses as it can account for study dropout, error in longitudinally measured covariates, and correlation between longitudinal and time-to-event outcomes. The joint modelling literature focuses mainly on the analysis of single studies with no methods currently available for the meta-analysis of joint model estimates from multiple studies. Methods We propose a 2-stage method for meta-analysis of joint model estimates. These methods are applied to the INDANA dataset to combine joint model estimates of systolic blood pressure with time to death, time to myocardial infarction, and time to stroke. Results are compared to meta-analyses of separate longitudinal or time-to-event models. A simulation study is conducted to contrast separate versus joint analyses over a range of scenarios. Results Using the real dataset, similar results were obtained by using the separate and joint analyses. However, the simulation study indicated a benefit of use of joint rather than separate methods in a meta-analytic setting where association exists between the longitudinal and time-to-event outcomes. Conclusions Where evidence of association between longitudinal and time-to-event outcomes exists, results from joint models over standalone analyses should be pooled in 2-stage meta-analyses. PMID:29250814

  9. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

    Geostationary Coastal and Air Pollution Events (GEO-CAPE) is a NASA decadal survey mission to be designed to provide surface reflectance at high spectral, spatial, and temporal resolutions from a geostationary orbit necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to the global and regional NOx emissions and improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from the GEO-CAPE OSSE framework for seven observation scenarios and three instrument systems.

  10. Community Connections for I-10: A TPCB Peer Exchange Event

    DOT National Transportation Integrated Search

    2018-03-13

    This report highlights key recommendations and noteworthy practices identified at Community Connections for I-10, held on March 13-14, 2018, in Baton Rouge, Louisiana. This event was sponsored by the Transportation Planning Capacity Building (TPCB) Program.

  11. The May 17, 2012 Solar Event: Back-Tracing Analysis and Flux Reconstruction with PAMELA

    NASA Technical Reports Server (NTRS)

    Bruno, A.; Adriani, O.; Barbarino, G. C.; Bazilevskaya, G. A.; Bellotti, R.; Boezio, M.; Bogomolov, E. A.; Bongi, M.; Bonvicini, V.; Bottai, S.; et al.

    2016-01-01

    The PAMELA space experiment is providing first direct observations of Solar Energetic Particles (SEPs) with energies from about 80 MeV to several GeV in near-Earth orbit, bridging the low energy measurements by other spacecraft and the Ground Level Enhancement (GLE) data from the worldwide network of neutron monitors. Its unique observational capabilities include the possibility of measuring the flux angular distribution and thus investigating possible anisotropies associated with SEP events. The analysis is supported by an accurate back-tracing simulation based on a realistic description of the Earth's magnetosphere, which is exploited to estimate the SEP energy spectra as a function of the asymptotic direction of arrival with respect to the Interplanetary Magnetic Field (IMF). In this work we report the results for the May 17, 2012 event.

  12. Uncovering undetected hypoglycemic events

    PubMed Central

    Unger, Jeff

    2012-01-01

    Hypoglycemia is the rate-limiting factor that often prevents patients with diabetes from safely and effectively achieving their glycemic goals. Recent studies have reported that severe hypoglycemia is associated with a significant increase in the adjusted risks of major macrovascular events, major microvascular events, and mortality. Minor hypoglycemic episodes can also have serious implications for patient health, psychological well-being, and adherence to treatment regimens. Hypoglycemic events can impact the health economics of the patient, their employer, and third-party payers. Insulin treatment is a key predictor of hypoglycemia, with one large population-based study reporting an overall prevalence of 7.1% (type 1 diabetes mellitus) and 7.3% (type 2 diabetes mellitus) in insulin-treated patients, compared with 0.8% in patients with type 2 diabetes treated with an oral sulfonylurea. Patients with type 1 diabetes typically experience symptomatic hypoglycemia on average twice weekly and severe hypoglycemia once annually. The progressive loss of islet cell function in patients with type 2 diabetes results in a higher risk of both symptomatic and unrecognized hypoglycemia over time. Patients with diabetes who become hypoglycemic are also more susceptible to developing defective counter-regulation, also known as hypoglycemia-associated autonomic failure, which is life-threatening and must be aggressively addressed. In patients unable to recognize hypoglycemia symptoms, frequent home monitoring or use of continuous glucose sensors is critical. Primary care physicians play a key role in the prevention and management of hypoglycemia in patients with diabetes, particularly in those requiring intensive insulin therapy, yet physicians are often unaware of the multitude of consequences of hypoglycemia or how to deal with them. Careful monitoring, adherence to guidelines, and use of optimal treatment combinations are all important steps toward improving care in patients

  13. EGF receptor lysosomal degradation is delayed in the cells stimulated with EGF-Quantum dot bioconjugate but earlier key events of endocytic degradative pathway are similar to that of native EGF

    PubMed Central

    Leontieva, Ekaterina A.; Kornilova, Elena S.

    2017-01-01

    Quantum dots (QDs) complexed to ligands recognizing surface receptors undergoing internalization are an attractive tool for live cell imaging of ligand-receptor complex behavior and for specific tracking of cells of interest. However, conjugation of a quasi-multivalent large QD particle to a monovalent small growth factor like EGF, which binds its tyrosine kinase receptor, may affect key endocytic events that are tightly coupled to signaling. Here, by means of confocal microscopy, we have addressed the key endocytic events of the lysosomal degradative pathway stimulated by native EGF or the EGF-QD bioconjugate. We have demonstrated that the decrease in endosome number, the increase in mean endosome integrated density, and the pattern of EEA1 co-localization with EGF-EGFR complexes at early stages of endocytosis were similar for both the native and QD-conjugated ligands. In both cases enlarged hollow endosomes appeared after wortmannin treatment. This indicates that early endosomal fusions and their maturation proceed similarly for both ligands. EGF-QD and native EGF similarly accumulated in the juxtanuclear region, and live cell imaging of endosome motion revealed the behavior described elsewhere for microtubule-facilitated motility. Finally, EGF-QD and the receptor were found in lysosomes. However, degradation of the receptor part of the QD-EGF-EGFR complex was delayed compared to native EGF, but not inhibited, while QD fluorescence was detected in lysosomes even after 24 hours. Importantly, in HeLa and A549 cells both ligands behaved similarly. We conclude that during endocytosis EGF-QD behaves as a neutral marker for the degradative pathway up to the lysosomal stage and can also be used as a long-term cell marker. PMID:28574831

  14. Automated Detection of Events of Scientific Interest

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A report presents a slightly different perspective of the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the instant report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.

  15. Analysis of Infrequent (Quasi-Decadal) Large Groundwater Recharge Events: A Case Study for Northern Utah, United States

    NASA Astrophysics Data System (ADS)

    Masbruch, M.; Rumsey, C.; Gangopadhyay, S.; Susong, D.; Pruitt, T.

    2015-12-01

    There has been a considerable amount of research linking climatic variability to hydrologic responses in arid and semi-arid regions such as the western United States. Although much effort has been spent to assess and predict changes in surface-water resources, little has been done to understand how climatic events and changes affect groundwater resources. This study focuses on quantifying the effects of large quasi-decadal groundwater recharge events on groundwater in the northern Utah portion of the Great Basin for the period 1960 to 2013. Groundwater-level monitoring data were analyzed with climatic data to characterize climatic conditions and frequency of these large recharge events. Using observed water-level changes and multivariate analysis, five large groundwater recharge events were identified within the study area and period, with a frequency of about 11 to 13 years. These events were generally characterized as having above-average annual precipitation and snow water equivalent and below-average seasonal temperatures, especially during the spring (April through June). Existing groundwater flow models for several basins within the study area were used to quantify changes in groundwater storage from these events. Simulated groundwater storage increases per basin from a single event ranged from about 115 Mm3 (93,000 acre-feet) to 205 Mm3 (166,000 acre-feet). Extrapolating these amounts over the entire northern Great Basin indicates that even a single large quasi-decadal recharge event could result in billions of cubic meters (millions of acre-feet) of groundwater recharge. Understanding the role of these large quasi-decadal recharge events in replenishing aquifers and sustaining water supplies is crucial for making informed water management decisions.

  16. Vaccine adverse event text mining system for extracting features from vaccine safety reports.

    PubMed

    Botsis, Taxiarchis; Buttolph, Thomas; Nguyen, Michael D; Winiecki, Scott; Woo, Emily Jane; Ball, Robert

    2012-01-01

    To develop and evaluate a text mining system for extracting key clinical features from vaccine adverse event reporting system (VAERS) narratives to aid in the automated review of adverse event reports. Based upon clinical significance to VAERS reviewing physicians, we defined the primary (diagnosis and cause of death) and secondary (eg, symptoms) features for extraction. We built a novel vaccine adverse event text mining (VaeTM) system based on a semantic text mining strategy. The performance of VaeTM was evaluated using a total of 300 VAERS reports in three sequential evaluations of 100 reports each. Moreover, we evaluated the VaeTM contribution to case classification; an information retrieval-based approach was used for the identification of anaphylaxis cases in a set of reports and was compared with two other methods: a dedicated text classifier and an online tool. The performance of VaeTM was measured with standard text mining metrics: recall, precision and F-measure. We also conducted a qualitative difference analysis and calculated sensitivity and specificity for the classification of anaphylaxis cases based on the above three approaches. VaeTM performed best in extracting diagnosis, second level diagnosis, drug, vaccine, and lot number features (lenient F-measure in the third evaluation: 0.897, 0.817, 0.858, 0.874, and 0.914, respectively). In terms of case classification, high sensitivity was achieved (83.1%); this was equal to the sensitivity of the dedicated text classifier (83.1%) and better than that of the online tool (40.7%). Our VaeTM implementation of a semantic text mining strategy shows promise in providing accurate and efficient extraction of key features from VAERS narratives.
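
    The abstract evaluates extraction with recall, precision, and F-measure. A minimal sketch of these standard text mining metrics, assuming hypothetical sets of extracted and gold-standard feature strings for one report; the VaeTM extraction logic itself is not reproduced here:

        def precision_recall_f1(extracted: set, gold: set) -> tuple:
            """Standard text mining metrics over extracted vs gold features."""
            tp = len(extracted & gold)  # true positives
            precision = tp / len(extracted) if extracted else 0.0
            recall = tp / len(gold) if gold else 0.0
            f1 = (2 * precision * recall / (precision + recall)
                  if precision + recall else 0.0)
            return precision, recall, f1

        # Hypothetical features for a single VAERS report
        extracted = {"anaphylaxis", "urticaria", "lot 12345"}
        gold = {"anaphylaxis", "urticaria", "dyspnea"}
        print(precision_recall_f1(extracted, gold))  # roughly (0.67, 0.67, 0.67)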

  17. Drama in the Key Stage 3 English Framework. Key Stage 3: National Strategy.

    ERIC Educational Resources Information Center

    Department for Education and Skills, London (England).

    Effective drama teaching improves the following student skills: speaking and listening, reading and writing through developing thinking, communication skills, and critical analysis. Drama is part of young people's core curriculum entitlement in the United Kingdom. It is included in the English Curriculum Orders and in the Key Stage 3 Framework for…

  18. Kickoff to Conflict: A Sequence Analysis of Intra-State Conflict-Preceding Event Structures

    PubMed Central

    D'Orazio, Vito; Yonamine, James E.

    2015-01-01

    While many studies have suggested or assumed that the periods preceding the onset of intra-state conflict are similar across time and space, few have empirically tested this proposition. Using the Integrated Crisis Early Warning System's domestic event data in Asia from 1998–2010, we subject this proposition to empirical analysis. We code the similarity of government-rebel interactions in sequences preceding the onset of intra-state conflict to those preceding further periods of peace using three different metrics: Euclidean, Levenshtein, and mutual information. These scores are then used as predictors in a bivariate logistic regression to forecast whether we are likely to observe conflict in neither, one, or both of the states. We find that our model accurately classifies cases where both sequences precede peace, but struggles to distinguish between cases in which one sequence escalates to conflict and where both sequences escalate to conflict. These findings empirically suggest that generalizable patterns exist between event sequences that precede peace. PMID:25951105
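
    Of the three sequence-similarity metrics named (Euclidean, Levenshtein, and mutual information), the Levenshtein distance is the most algorithmic; a standard dynamic-programming sketch over hypothetical event-code sequences, since the authors' actual event coding is not reproduced here:

        def levenshtein(a: list, b: list) -> int:
            """Edit distance between two event-code sequences."""
            prev = list(range(len(b) + 1))
            for i, x in enumerate(a, 1):
                curr = [i]
                for j, y in enumerate(b, 1):
                    curr.append(min(prev[j] + 1,              # deletion
                                    curr[j - 1] + 1,          # insertion
                                    prev[j - 1] + (x != y)))  # substitution
                prev = curr
            return prev[-1]

        # Hypothetical sequences of coded government-rebel interactions
        print(levenshtein(["protest", "crackdown", "talks"],
                          ["protest", "talks"]))  # 1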

  19. Numerical analysis of seismic events distributions on the planetary scale and celestial bodies astrometrical parameters

    NASA Astrophysics Data System (ADS)

    Bulatova, Dr.

    2012-04-01

    Modern research in the Earth sciences is developing from descriptions of individual natural phenomena to systematic, complex research in interdisciplinary areas. For studies of this kind, in the form of numerical analysis of three-dimensional (3D) systems, the author proposes a space-time technology (STT) based on a Ptolemaic geocentric system, consisting of two modules, each with its own coordinate system: (1) a 3D model of the Earth, whose coordinates are provided by databases of the Earth's events (here seismic), and (2) a compact model of the relative motion of celestial bodies in space-time relative to Earth, known as the "Method of a moving source" (MDS), which was developed (Bulatova, 1998-2000) for 3D space. Module (2) was developed as a continuation of the geocentric Ptolemaic system of the world, built on the astronomical parameters of heavenly bodies. Based on aggregated data from the space and Earth sciences, their systematization, and cooperative analysis, this is an attempt to establish a cause-effect relationship between the position of celestial bodies (Moon, Sun) and Earth's seismic events.

  20. Pediatric emergency department census during major sporting events.

    PubMed

    Kim, Tommy Y; Barcega, Besh B; Denmark, T Kent

    2012-11-01

    Our study attempted to evaluate the effects of major sporting events on the census of a pediatric emergency department (ED) in the United States, specifically the National Football League Super Bowl, National Basketball Association (NBA) Finals, and Major League Baseball World Series. We performed a retrospective analysis of our pediatric ED census data on the number of visits during major sporting events over a 5-year period. Data during the same period 1 week after each major sporting event were collected as the control. We evaluated the medians of 2-hour increments around the event start time. Subgroup analysis was performed for games involving the local sporting teams. Our results showed no significant difference in ED census during the sporting events, except in the 6 to 8 hours after the start of the NBA Finals. Subgroup analysis of the Los Angeles Lakers games showed the same significant findings in the 6 to 8 hours after the NBA Finals. Overall, no major difference in pediatric ED census was observed during the major sporting events in the United States.

  1. Association between exogenous testosterone and cardiovascular events: an overview of systematic reviews.

    PubMed

    Onasanya, Oluwadamilola; Iyer, Geetha; Lucas, Eleanor; Lin, Dora; Singh, Sonal; Alexander, G Caleb

    2016-11-01

    Given the conflicting evidence regarding the association between exogenous testosterone and cardiovascular events, we systematically assessed published systematic reviews for evidence of the association between exogenous testosterone and cardiovascular events. We searched PubMed, MEDLINE, Embase, Cochrane Collaboration Clinical Trials, ClinicalTrials.gov, and the US Food and Drug Administration website for systematic reviews of randomised controlled trials published up to July 19, 2016. Two independent reviewers screened 954 full texts from 29 335 abstracts to identify systematic reviews of randomised controlled trials in which the cardiovascular effects of exogenous testosterone on men aged 18 years or older were examined. We extracted data for study characteristics, analytic methods, and key findings, and applied the AMSTAR (A Measurement Tool to Assess Systematic Reviews) checklist to assess methodological quality of each review. Our primary outcome measure was the direction and magnitude of association between exogenous testosterone and cardiovascular events. We identified seven reviews and meta-analyses, which had substantial clinical heterogeneity, differing statistical methods, and variable methodological quality and quality of data abstraction. AMSTAR scores ranged from 3 to 9 out of 11. Six systematic reviews that each included a meta-analysis showed no significant association between exogenous testosterone and cardiovascular events, with summary estimates ranging from 1·07 to 1·82 and imprecise confidence intervals. Two of these six meta-analyses showed increased risk in subgroup analyses of oral testosterone and men aged 65 years or older during their first treatment year. One meta-analysis showed a significant association between exogenous testosterone and cardiovascular events, in men aged 18 years or older generally, with a summary estimate of 1·54 (95% CI 1·09-2·18). Our optimal information size analysis showed that any randomised controlled
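
    For context on how such summary estimates are produced, a hedged sketch of DerSimonian-Laird random-effects pooling of odds ratios; the per-study ORs and 95% CIs below are hypothetical placeholders, not the reviewed data:

        import math

        def pool_random_effects(ors, ci_los, ci_his):
            """DerSimonian-Laird random-effects pooling of odds ratios."""
            y = [math.log(o) for o in ors]                  # log odds ratios
            se = [(math.log(h) - math.log(l)) / 3.92        # SE from a 95% CI
                  for l, h in zip(ci_los, ci_his)]
            w = [1 / s ** 2 for s in se]                    # fixed-effect weights
            ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
            q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
            c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance
            w_re = [1 / (s ** 2 + tau2) for s in se]        # random-effects weights
            est = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
            half = 1.96 * math.sqrt(1 / sum(w_re))
            return tuple(math.exp(v) for v in (est, est - half, est + half))

        # Hypothetical per-study ORs with 95% CIs (not the reviewed data)
        print(pool_random_effects([1.07, 1.54, 1.82],
                                  [0.80, 1.09, 0.90],
                                  [1.43, 2.18, 3.67]))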

  2. The Value Added National Project. Technical Report: Primary 4. Value-Added Key Stage 1 to Key Stage 2.

    ERIC Educational Resources Information Center

    Tymms, Peter

    This is the fourth in a series of technical reports that have dealt with issues surrounding the possibility of national value-added systems for primary schools in England. The main focus has been on the relative progress made by students between the ends of Key Stage 1 (KS1) and Key Stage 2 (KS2). The analysis has indicated that the strength of…

  3. Learning from Adverse Events in Obstetrics: Is a Standardized Computer Tool an Effective Strategy for Root Cause Analysis?

    PubMed

    Murray-Davis, Beth; McDonald, Helen; Cross-Sudworth, Fiona; Ahmed, Rashid; Simioni, Julia; Dore, Sharon; Marrin, Michael; DeSantis, Judy; Leyland, Nicholas; Gardosi, Jason; Hutton, Eileen; McDonald, Sarah

    2015-08-01

    Adverse events occur in up to 10% of obstetric cases, and up to one half of these could be prevented. Case reviews and root cause analysis using a structured tool may help health care providers to learn from adverse events and to identify trends and recurring systems issues. We sought to establish the reliability of a root cause analysis computer application called Standardized Clinical Outcome Review (SCOR). We designed a mixed methods study to evaluate the effectiveness of the tool. We conducted qualitative content analysis of five charts reviewed by both the traditional obstetric quality assurance methods and the SCOR tool. We also determined inter-rater reliability by having four health care providers review the same five cases using the SCOR tool. The comparative qualitative review revealed that the traditional quality assurance case review process used inconsistent language and made serious, personalized recommendations for those involved in the case. In contrast, the SCOR review provided a consistent format for recommendations, a list of action points, and highlighted systems issues. The mean percentage agreement between the four reviewers for the five cases was 75%. The different health care providers completed data entry and assessment of the case in a similar way. Missing data from the chart and poor wording of questions were identified as issues affecting percentage agreement. The SCOR tool provides a standardized, objective, obstetric-specific tool for root cause analysis that may improve identification of risk factors and dissemination of action plans to prevent future events.

  4. Spatio-Temporal Information Analysis of Event-Related BOLD Responses

    PubMed Central

    Alpert, Galit Fuhrmann; Handwerker, Dan; Sun, Felice T.; D’Esposito, Mark; Knight, Robert T.

    2009-01-01

    A new approach for the analysis of event-related fMRI (BOLD) signals is proposed. The technique is based on measures from information theory and is used both for spatial localization of task-related activity and for extracting temporal information regarding the task-dependent propagation of activation across different brain regions. This approach enables whole-brain visualization of the voxels (areas) most involved in coding of a specific task condition, the time at which they are most informative about the condition, and their average amplitude at that preferred time. The approach does not require prior assumptions about the shape of the hemodynamic response function (HRF), nor about linear relations between the BOLD response and presented stimuli (or task conditions). We show that relative delays between different brain regions can also be computed without prior knowledge of the experimental design, suggesting a general method that could be applied to the analysis of differential time delays that occur during natural, uncontrolled conditions. Here we analyze BOLD signals recorded during performance of a motor learning task. We show that during motor learning, the BOLD response of unimodal motor cortical areas precedes the response in higher-order multimodal association areas, including posterior parietal cortex. Brain areas found to be associated with reduced activity during motor learning, predominantly in prefrontal brain regions, are informative about the task typically at significantly later times. PMID:17188515
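
    As a rough illustration of the information-theoretic measure underlying such an approach, a sketch computing mutual information between a binary task condition and a discretized, synthetic BOLD amplitude; the labels, effect size, and binning are assumptions, not the authors' pipeline:

        import numpy as np
        from sklearn.metrics import mutual_info_score

        rng = np.random.default_rng(0)
        condition = rng.integers(0, 2, size=200)             # binary task condition
        bold = 0.8 * condition + rng.normal(0, 1, size=200)  # synthetic voxel response
        edges = np.quantile(bold, [0.25, 0.5, 0.75])         # discretize into 4 bins
        print(mutual_info_score(condition, np.digitize(bold, edges)))  # in nats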

  5. Classification and evaluation of the documentary-recorded storm events in the Annals of the Choson Dynasty (1392-1910), Korea

    NASA Astrophysics Data System (ADS)

    Yoo, Chulsang; Park, Minkyu; Kim, Hyeon Jun; Choi, Juhee; Sin, Jiye; Jun, Changhyun

    2015-01-01

    In this study, an analysis of documentary records on storm events in the Annals of the Choson Dynasty, covering the entire period of 519 years from 1392 to 1910, was carried out. By applying various key words related to storm events, a total of 556 documentary records could be identified. The main objective of this study was to develop rules of classification for the documentary records on storm events in the Annals of the Choson Dynasty. The results were also compared with the rainfall data of the traditional Korean rain gauge, named Chukwooki, which are available from 1777 to 1910 (about 130 years). The analysis is organized as follows. First, the frequency of the documents, their length, comments about the size of the inundated area, the number of casualties, the number of property losses, and the size of the countermeasures were considered to determine the magnitude of the events. To this end, rules of classification of the storm events were developed. Cases in which the word 'disaster' was used along with detailed information about casualties and property damage were classified as high-level storm events. The high-level storm events were additionally sub-categorized into catastrophic, extreme, and severe events. Second, by applying the developed rules of classification, a total of 326 events were identified as high-level storm events during the 519 years of the Choson Dynasty. Among these high-level storm events, only 19 were classified as catastrophic, 106 as extreme, and 201 as severe. The mean return period of these storm events was found to be about 30 years for the catastrophic events, 5 years for the extreme events, and 2-3 years for the severe events. Third, the classification results were verified against the records of the traditional Korean rain gauge; it was found that the catastrophic events are strongly distinguished from other events with a mean total rainfall and a
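
    The reported mean return periods follow from simple arithmetic on the counts above (record length divided by the number of events in each class), as this small check illustrates:

        RECORD_YEARS = 519
        event_counts = {"catastrophic": 19, "extreme": 106, "severe": 201}
        for level, n in event_counts.items():
            print(f"{level}: ~{RECORD_YEARS / n:.1f} years")
        # catastrophic: ~27.3 (reported as about 30); extreme: ~4.9 (reported as 5);
        # severe: ~2.6 (reported as 2-3)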

  6. Joint models for longitudinal and time-to-event data: a review of reporting quality with a view to meta-analysis.

    PubMed

    Sudell, Maria; Kolamunnage-Dona, Ruwanthi; Tudur-Smith, Catrin

    2016-12-05

    Joint models for longitudinal and time-to-event data are commonly used to simultaneously analyse correlated data in single study cases. Synthesis of evidence from multiple studies using meta-analysis is a natural next step but its feasibility depends heavily on the standard of reporting of joint models in the medical literature. During this review we aim to assess the current standard of reporting of joint models applied in the literature, and to determine whether current reporting standards would allow or hinder future aggregate data meta-analyses of model results. We undertook a literature review of non-methodological studies that involved joint modelling of longitudinal and time-to-event medical data. Study characteristics were extracted and an assessment of whether separate meta-analyses for longitudinal, time-to-event and association parameters were possible was made. The 65 studies identified used a wide range of joint modelling methods in a selection of software. Identified studies concerned a variety of disease areas. The majority of studies reported adequate information to conduct a meta-analysis (67.7% for longitudinal parameter aggregate data meta-analysis, 69.2% for time-to-event parameter aggregate data meta-analysis, 76.9% for association parameter aggregate data meta-analysis). In some cases model structure was difficult to ascertain from the published reports. Whilst extraction of sufficient information to permit meta-analyses was possible in a majority of cases, the standard of reporting of joint models should be maintained and improved. Recommendations for future practice include clear statement of model structure, of values of estimated parameters, of software used and of statistical methods applied.

  7. The Key Roles in the Informal Organization: A Network Analysis Perspective

    ERIC Educational Resources Information Center

    de Toni, Alberto F.; Nonino, Fabio

    2010-01-01

    Purpose: The purpose of this paper is to identify the key roles embedded in the informal organizational structure (informal networks) and to outline their contribution in the companies' performance. A major objective of the research is to find and characterize a new key informal role that synthesises problem solving, expertise, and accessibility…

  8. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the public.
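
    As a generic illustration of how an event tree yields outcome probabilities, conditional probabilities along a path multiply down to the leaf; the node values below are hypothetical, not from any VDAP tree:

        from functools import reduce

        # Hypothetical conditional probabilities along one branch of a tree
        branch = {
            "unrest observed": 1.0,
            "magmatic origin given unrest": 0.6,
            "eruption given magmatic origin": 0.3,
            "VEI >= 3 given eruption": 0.2,
        }
        p_leaf = reduce(lambda a, b: a * b, branch.values())
        print(f"P(large explosive eruption) = {p_leaf:.3f}")  # 0.036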

  9. Key-value store with internal key-value storage interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Ting, Dennis P. J.

    A key-value store is provided having one or more key-value storage interfaces. A key-value store on at least one compute node comprises a memory for storing a plurality of key-value pairs; and an abstract storage interface comprising a software interface module that communicates with at least one persistent storage device providing a key-value interface for persistent storage of one or more of the plurality of key-value pairs, wherein the software interface module provides the one or more key-value pairs to the at least one persistent storage device in a key-value format. The abstract storage interface optionally processes one or more batch operations on the plurality of key-value pairs. A distributed embodiment for a partitioned key-value store is also provided.
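
    A hedged sketch of the pattern the abstract describes: an in-memory key-value store that hands pairs to a persistent backend through an abstract key-value interface. Class and method names are illustrative, not from the cited work:

        from abc import ABC, abstractmethod

        class PersistentKVBackend(ABC):
            """Persistent storage device exposing a key-value interface."""
            @abstractmethod
            def put(self, key: bytes, value: bytes) -> None: ...
            @abstractmethod
            def get(self, key: bytes) -> bytes: ...

        class KeyValueStore:
            def __init__(self, backend: PersistentKVBackend):
                self._memory = {}        # in-memory key-value pairs
                self._backend = backend  # abstract storage interface

            def put(self, key: bytes, value: bytes) -> None:
                self._memory[key] = value
                self._backend.put(key, value)  # persisted in key-value format

            def batch_put(self, pairs: dict) -> None:
                for k, v in pairs.items():     # optional batch operation
                    self.put(k, v)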

  10. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sezen, Halil; Aldemir, Tunc; Denning, R.

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  11. Cost-effectiveness analysis of applying the Cholesterol and Recurrent Events (CARE) study protocol in Hong Kong.

    PubMed

    Chau, J; Cheung, B M; McGhee, S M; Lauder, I J; Lau, C P; Kumana, C R

    2001-12-01

    To determine the cost-effectiveness of secondary prevention with pravastatin in Hong Kong patients with coronary heart disease and average cholesterol levels. Cost-effectiveness analysis based on published results of the CARE study. Men and women post-myocardial infarction with average cholesterol levels. Cost-effectiveness analysis: cost per life saved, cost per fatal or non-fatal coronary event prevented, cost per procedure prevented, and cost per fatal or non-fatal stroke prevented. Cost-utility analysis: gross cost and net cost per quality-adjusted life year gained, calculated using two alternative models. Cost per life saved or death prevented was HK$4,442,350 (non-discounted); cost per fatal or non-fatal cardiac event prevented, HK$1,146,413; cost per procedure prevented, HK$732,759; and cost per fatal or non-fatal stroke prevented, HK$2,961,566. Net cost per quality-adjusted life year gained was HK$73,218 and HK$65,280 (non-discounted), respectively, using the two alternative models. The results of this study can assist in prioritising the use of health care resources in Hong Kong but should be considered alongside the benefits and costs of alternative interventions for coronary heart disease.
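
    The reported ratios are of the form programme cost divided by events averted; a tiny sketch with placeholder numbers (not the CARE data) makes the computation explicit:

        programme_cost_hkd = 50_000_000  # placeholder total cost of therapy
        events_prevented = {"death": 11, "cardiac event": 44, "stroke": 17}
        for outcome, n in events_prevented.items():
            print(f"cost per {outcome} prevented: HK${programme_cost_hkd / n:,.0f}")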

  12. An analysis of chemical and meteorological characteristics of haze events in the Seoul metropolitan area during January 12-18, 2013

    NASA Astrophysics Data System (ADS)

    Koo, Youn-Seo; Yun, Hui-Young; Choi, Dae-Ryun; Han, Jin-Seok; Lee, Jae-Bum; Lim, Yong-Jae

    2018-04-01

    The chemical characteristics of secondary inorganic and carbonaceous aerosols, as well as their formation mechanisms during the haze event of January 12-18, 2013, in the Seoul Metropolitan Area (SMA), were investigated using measurements at the Baengnyeong and Seoul supersites together with data from LIDAR, meteorology, and modeling. An extraordinary haze event that occurred in northern China during that period extended to the Korean Peninsula and initiated the haze event in the SMA. Local emissions of primary aerosol and gaseous precursors in the SMA then made the situation worse under adverse meteorological conditions. OM (Organic Matter) and SO42- were the major long-range transport (LRT) aerosols from the Beijing, Tianjin, and Hebei province (BTH) area to the SMA during the initial stage of the haze event. The LRT SO42- from the BTH area detected at Baengnyeong Island was mostly acidic, while in Seoul it was fully neutralized to (NH4)2SO4. The SIAs (Secondary Inorganic Aerosols), which made up 56.5% of PM2.5 during the haze period, were the major chemical species causing haze problems in the SMA. NO3- was the most dominant species among the SIAs and was locally formed by a heavy burden of NOx emissions from mobile sources in the SMA. Carbonaceous aerosols of OM and EC (Elemental Carbon) accounted for 18.9% of PM2.5 in the SMA during the haze period, but secondary organic carbon (SOC) was not the key species inducing the haze event during the January episode in the SMA.

  13. P2Y12 Polymorphisms and the Risk of Adverse Clinical Events in Patients Treated with Clopidogrel: A Meta-Analysis.

    PubMed

    Zhao, Kun; Yang, Ming; Lu, Yanxia; Sun, Shusen; Li, Wei; Li, Xingang; Zhao, Zhigang

    2018-05-23

    Some studies have reported an association between P2Y12 gene polymorphisms and clopidogrel adverse outcomes, with inconsistent results. We aimed to explore the relationship between P2Y12 polymorphisms and the risk of adverse clinical events in patients treated with clopidogrel through a meta-analysis. A systematic search of PubMed, Web of Science and the Cochrane Library was conducted. Retrieved articles were comprehensively reviewed, eligible studies were included, and the relevant data were extracted for this meta-analysis. All statistical tests were performed with the Review Manager 5.3 software. A total of 14 studies involving 8,698 patients were included. In the Han Chinese population, ischemic events were associated with the P2Y12 T744C polymorphism in the CC vs TT+CT genetic model (OR = 3.32, 95% CI = 1.62-6.82, P = 0.001), and with the P2Y12 C34T polymorphism in the TT+TC vs CC genetic model (OR = 1.70, 95% CI = 1.22-2.36, P = 0.002). However, ischemic events were not related to the P2Y12 G52T polymorphism (TT+TG vs GG: OR = 1.13, 95% CI = 0.76-1.68, P = 0.56; TT vs GG+TG: OR = 2.02, 95% CI = 0.65-6.28, P = 0.22). The associations between P2Y12 polymorphisms and ischemic events were not significant for the T744C, G52T and C34T genotypes in the subgroup of the Caucasian population (P > 0.05). Only two studies referring to bleeding events were included in this analysis of the C34T polymorphism, and no significant association was found (TT+TC vs CC: OR = 1.07, 95% CI = 0.37-3.15, P = 0.90). In the Caucasian population, P2Y12 gene polymorphisms are not associated with clinical events. However, in the Chinese Han population, the P2Y12 T744C and C34T polymorphisms are significantly associated with adverse clinical events. © Georg Thieme Verlag KG Stuttgart · New York.

  14. Meta-analysis and psychophysiology: A tutorial using depression and action-monitoring event-related potentials.

    PubMed

    Moran, Tim P; Schroder, Hans S; Kneip, Chelsea; Moser, Jason S

    2017-01-01

    Meta-analyses are regularly used to quantitatively integrate the findings of a field, assess the consistency of an effect, and make decisions based on extant research. The current article presents an overview and step-by-step tutorial of meta-analysis aimed at psychophysiological researchers. We also describe best practices and steps that researchers can take to facilitate future meta-analyses in their sub-discipline. Lastly, we illustrate each of the steps by presenting a novel meta-analysis on the relationship between depression and action-monitoring event-related potentials: the error-related negativity (ERN) and the feedback negativity (FN). This meta-analysis found that the literature on depression and the ERN is contaminated by publication bias. With respect to the FN, the meta-analysis found that depression does predict the magnitude of the FN; however, this effect was dependent on the type of task used by the study. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Development and testing of an assessment instrument for the formative peer review of significant event analyses.

    PubMed

    McKay, J; Murphy, D J; Bowie, P; Schmuck, M-L; Lough, M; Eva, K W

    2007-04-01

    To establish the content validity and specific aspects of reliability for an assessment instrument designed to provide formative feedback to general practitioners (GPs) on the quality of their written analysis of a significant event. Content validity was quantified by application of a content validity index. Reliability testing involved a nested design, with 5 cells, each containing 4 assessors, rating 20 unique significant event analysis (SEA) reports (10 each from experienced GPs and GPs in training) using the assessment instrument. The variance attributable to each identified variable in the study was established by analysis of variance. Generalisability theory was then used to investigate the instrument's ability to discriminate among SEA reports. Content validity was demonstrated with at least 8 of 10 experts endorsing all 10 items of the assessment instrument. The overall G coefficient for the instrument was moderate to good (G>0.70), indicating that the instrument can provide consistent information on the standard achieved by the SEA report. There was moderate inter-rater reliability (G>0.60) when four raters were used to judge the quality of the SEA. This study provides the first steps towards validating an instrument that can provide educational feedback to GPs on their analysis of significant events. The key area identified to improve instrument reliability is variation among peer assessors in their assessment of SEA reports. Further validity and reliability testing should be carried out to provide GPs, their appraisers and contractual bodies with a validated feedback instrument on this aspect of the general practice quality agenda.

  16. Risk of Death in Infants Who Have Experienced a Brief Resolved Unexplained Event: A Meta-Analysis.

    PubMed

    Brand, Donald A; Fazzari, Melissa J

    2018-06-01

    To estimate an upper bound on the risk of death after a brief resolved unexplained event (BRUE), a sudden alteration in an infant's breathing, color, tone, or responsiveness, previously labeled "apparent life-threatening event" (ALTE). The meta-analysis incorporated observational studies of patients with ALTE that included data on in-hospital and post-discharge deaths with at least 1 week of follow-up after hospital discharge. Pertinent studies were identified from a published review of the literature from 1970 through 2014 and a supplementary PubMed query through February 2017. The 12 included studies (n = 3005) reported 12 deaths, of which 8 occurred within 4 months of the event. Applying a Poisson-normal random effects model to the 8 proximate deaths using a 4-month time horizon yielded a post-ALTE mortality rate of about 1 in 800, which constitutes an upper bound on the risk of death after a BRUE. This risk is about the same as the baseline risk of death during the first year of life. The meta-analysis therefore supports the return-home approach (rather than routine hospitalization) advocated in a recently published clinical practice guideline for BRUE patients who have been evaluated in the emergency department and determined to be at lower risk. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. A systematic review and meta-analysis on herpes zoster and the risk of cardiac and cerebrovascular events.

    PubMed

    Erskine, Nathaniel; Tran, Hoang; Levin, Leonard; Ulbricht, Christine; Fingeroth, Joyce; Kiefe, Catarina; Goldberg, Robert J; Singh, Sonal

    2017-01-01

    Patients who develop herpes zoster or herpes zoster ophthalmicus may be at risk for cerebrovascular and cardiac complications. We systematically reviewed the published literature to determine the association between herpes zoster and its subtypes with the occurrence of cerebrovascular and cardiac events. Systematic searches of PubMed (MEDLINE), SCOPUS (Embase) and Google Scholar were performed in December 2016. Eligible studies were cohort, case-control, and self-controlled case-series examining the association between herpes zoster or subtypes of herpes zoster with the occurrence of cerebrovascular and cardiac events including stroke, transient ischemic attack, coronary heart disease, and myocardial infarction. Data on the occurrence of the examined events were abstracted. Odds ratios and their accompanying confidence intervals were estimated using random and fixed effects models with statistical heterogeneity estimated with the I2 statistic. Twelve studies examining 7.9 million patients up to 28 years after the onset of herpes zoster met our pre-defined eligibility criteria. Random and fixed effects meta-analyses showed that herpes zoster, type unspecified, and herpes zoster ophthalmicus were associated with a significantly increased risk of cerebrovascular events, without any evidence of statistical heterogeneity. Our meta-analysis also found a significantly increased risk of cardiac events associated with herpes zoster, type unspecified. Our results are consistent with the accumulating body of evidence that herpes zoster and herpes zoster ophthalmicus are significantly associated with cerebrovascular and cardiovascular events.

  18. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    PubMed

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
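
    A sketch of the kind of uncertainty-aware decision policy described: act only when the estimated event probability clears a confidence band, and abstain otherwise. The thresholds here are illustrative, not the paper's derived optima:

        def decide(p_event: float, act_above: float = 0.8,
                   clear_below: float = 0.2) -> str:
            """Alarm, clear, or abstain based on the estimated event probability."""
            if p_event >= act_above:
                return "alarm"      # delayed-detection cost dominates
            if p_event <= clear_below:
                return "no alarm"
            return "abstain"        # defer until the estimate is more confident

        for p in (0.05, 0.5, 0.93):
            print(p, decide(p))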

  19. Suspended solids transport: an analysis based on turbidity measurements and event based fully calibrated hydrodynamic models.

    PubMed

    Langeveld, J G; Veldkamp, R G; Clemens, F

    2005-01-01

    Modelling suspended solids transport is a key issue for predicting the pollution load discharged by CSOs. Nonetheless, there is still much debate on the main drivers for suspended solids transport and on the modelling approach to be adopted. Current sewer models provide suspended solids transport models. These models, however, rely upon erosion-deposition criteria developed in fluvial environments, therewith oversimplifying the sewer sediment characteristics. Consequently, the performance of these models is poor from a theoretical point of view. To get an improved understanding of the temporal and spatial variations in suspended solids transport, a measuring network was installed in the sewer system of Loenen in conjunction with a hydraulic measuring network from June through December 2001. During the measuring period, 15 storm events rendered high-quality data on both the hydraulics and the turbidity. For each storm event, a hydrodynamic model was calibrated using the Clemens' method. The conclusion of the paper is that modelling of suspended solids transport has been and will be one of the challenges in the field of urban drainage modelling. A direct relation of either shear stress or flow velocity with turbidity could not be found, likely because of the time varying characteristics of the suspended solids.

  20. The electrical heart axis and ST events in fetal monitoring: A post-hoc analysis following a multicentre randomised controlled trial.

    PubMed

    Vullings, Rik; Verdurmen, Kim M J; Hulsenboom, Alexandra D J; Scheffer, Stephanie; de Lau, Hinke; Kwee, Anneke; Wijn, Pieter F F; Amer-Wåhlin, Isis; van Laar, Judith O E H; Oei, S Guid

    2017-01-01

    Reducing perinatal morbidity and mortality is one of the major challenges in modern health care. Analysing the ST segment of the fetal electrocardiogram was thought to be the breakthrough in fetal monitoring during labour. However, its implementation in clinical practice yields many false alarms and ST monitoring is highly dependent on cardiotocogram assessment, limiting its value for the prediction of fetal distress during labour. This study aims to evaluate the relation between physiological variations in the orientation of the fetal electrical heart axis and the occurrence of ST events. A post-hoc analysis was performed following a multicentre randomised controlled trial, including 1097 patients from two participating centres. All women were monitored with ST analysis during labour. Cases of fetal metabolic acidosis, poor signal quality, missing blood gas analysis, and congenital heart disease were excluded. The orientation of the fetal electrical heart axis affects the height of the initial T/QRS baseline, and therefore the incidence of ST events. We grouped tracings with the same initial baseline T/QRS value. We depicted the number of ST events as a function of the initial baseline T/QRS value with a linear regression model. A significant increment of ST events was observed with increasing height of the initial T/QRS baseline, irrespective of the fetal condition; correlation coefficient 0.63, p<0.001. The most frequent T/QRS baseline is 0.12. The orientation of the fetal electrical heart axis and accordingly the height of the initial T/QRS baseline should be taken into account in fetal monitoring with ST analysis.
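
    An illustrative version of the regression reported (number of ST events as a function of the initial baseline T/QRS), run on synthetic data rather than the trial's:

        import numpy as np
        from scipy.stats import linregress

        baseline_tqrs = np.array([0.06, 0.09, 0.12, 0.15, 0.18, 0.21])
        n_st_events = np.array([2, 3, 5, 6, 8, 9])  # hypothetical event counts
        fit = linregress(baseline_tqrs, n_st_events)
        print(f"slope={fit.slope:.1f}, r={fit.rvalue:.2f}, p={fit.pvalue:.3g}")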

  1. Searching for Effective Training Solutions for Firefighting: The Analysis of Emergency Responses and Line of Duty Death Reports for Low Frequency, High Risk Events

    DTIC Science & Technology

    2017-09-01

    whether emergency incidents connected to low frequency and high risk events contain sufficient warning signs or indicators of imminent catastrophic events, if firefighters could identify them, and if there...

  2. Prediction of collision events: an EEG coherence analysis.

    PubMed

    Spapé, Michiel M; Serrien, Deborah J

    2011-05-01

    A common daily-life task is the interaction with moving objects, for which prediction of collision events is required. To evaluate the sources of information used in this process, this EEG study required participants to judge whether two moving objects would collide with one another or not. In addition, the effect of a distractor object was evaluated. The measurements included behavioural decision time and accuracy, eye movement fixation times, and the neural dynamics, which were determined by means of EEG coherence, expressing functional connectivity between brain areas. Collision judgment involved widespread information processing across both hemispheres. When a distractor object was present, task-related activity increased, whereas distractor activity induced modulation of local sensory processing. Also relevant were the parietal regions communicating with bilateral occipital and midline areas and a left-sided sensorimotor circuit. Besides visual cues, cognitive and strategic processes are used to establish a decision about events in time. When distracting information is introduced into the collision judgment process, it is managed at different processing levels and supported by distinct neural correlates. These data shed light on the processing mechanisms that support judgment of collision events, an ability that implicates higher-order decision-making. Copyright © 2011 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
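
    For readers unfamiliar with the connectivity measure used, a sketch of magnitude-squared coherence between two synthetic EEG channels; the channel roles, sampling rate, and signal construction are assumptions:

        import numpy as np
        from scipy.signal import coherence

        fs = 256                             # Hz, assumed sampling rate
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(1)
        shared = np.sin(2 * np.pi * 10 * t)  # common 10 Hz component
        ch_a = shared + rng.normal(0, 1, t.size)  # e.g. a parietal channel
        ch_b = shared + rng.normal(0, 1, t.size)  # e.g. an occipital channel
        f, cxy = coherence(ch_a, ch_b, fs=fs, nperseg=512)
        print(f"coherence near 10 Hz: {cxy[np.argmin(np.abs(f - 10))]:.2f}")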

  3. Skylab ATM/S-056 X-ray event analyzer: Instrument description, parameter determination, and analysis example (15 June 1973 1B/M3 flare)

    NASA Technical Reports Server (NTRS)

    Wilson, R. M.

    1976-01-01

    The Skylab ATM/S-056 X-Ray Event Analyzer, part of an X-ray telescope experiment, is described. The techniques employed in the analysis of its data to determine electron temperatures and emission measures are reviewed. The analysis of a sample event, the 15 June 1973 1B/M3 flare, is performed. Comparison of the X-Ray Event Analyzer data with the SolRad 9 observations indicates that the X-Ray Event Analyzer accurately monitored the sun's 2.5 to 7.25 A X-ray emission and, to a lesser extent, the 6.1 to 20 A emission. A mean peak temperature of 15 million K at 1412 UT and a mean peak electron density (assuming a flare volume of 10^13 cu km) of 27 million/cu mm at 1416 to 1417 UT are deduced for the event. The X-Ray Event Analyzer data, having a 2.5 s time resolution, should be invaluable in comparisons with other high-time-resolution data (e.g., radio bursts).

  4. The stressed eyewitness: the interaction of thematic arousal and post-event stress in memory for central and peripheral event information

    PubMed Central

    Echterhoff, Gerald; Wolf, Oliver T.

    2012-01-01

    Both arousal during the encoding of stimuli and subsequent stress can affect memory, often by increasing memory for important or central information. We explored whether event-based (thematic) arousal and post-event stress interact to selectively enhance eyewitnesses' memory for the central aspects of an observed incident. Specifically, we argue that memory for stimuli should be enhanced when (1) the stimuli are encoded under arousal (vs. non-arousal), and (2) stress is experienced soon after the encoding episode. We designed an experiment that extended previous research by manipulating arousal without changing the stimulus material, distinguishing between central and peripheral event information, and using a dynamic, life-like event instead of static pictures. After watching a video depicting a burglary under high or low thematic arousal, psychosocial stress was induced or not induced by the Trier Social Stress Test (TSST). Salivary cortisol was measured at standard intervals. Consistent with our prediction, we found a significant post-event stress × thematic arousal × centrality interaction, indicating that the recognition advantage for central event items over peripheral event items was most pronounced under both high thematic arousal and post-event stress. Because stress was induced after encoding this interaction cannot be explained by possible differences at encoding, such as narrowed attention. The centrality effect of post-event stress under high thematic arousal was statistically mediated by the cortisol increase, which suggests a key role of the stress hormone. We discuss implications of our findings for psychological and neuroscientific theories of emotional memory formation. PMID:22936900

  5. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    NASA Astrophysics Data System (ADS)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents, both for extreme and ordinary flood events, is a fundamental step in understanding the process, but also in driving proper decisions on protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with the associated sediment-graph, of a flood that on 14 October 2014 caused the formation of a small debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to notice that a great part of the equations developed for the computation of bedload transport capacity, for example those of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events heavily affecting the river-bed armour. These formulas do not provide reliable results if used on events, like the one under analysis, not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations to assess the sediment transport capacity overestimate the total volume of the event by several orders of magnitude. As a consequence, the following experimental bedload transport equation, valid for ordinary flood events, has been applied (D'Agostino and Lenzi, 1999): qs = 0.04 (q - qc) S^(3/2), where q is the unit water discharge, qc the unit discharge at bedload transport initiation, qs the unit bedload rate, and S the thalweg slope. In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
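
    The quoted ordinary-event formula, implemented directly with hypothetical unit discharges and slope:

        def unit_bedload_rate(q: float, qc: float, s: float) -> float:
            """qs = 0.04 (q - qc) S^(3/2); zero below the initiation discharge."""
            return 0.04 * max(0.0, q - qc) * s ** 1.5

        # Hypothetical unit discharges (m2/s) and thalweg slope
        print(unit_bedload_rate(q=0.8, qc=0.3, s=0.15))  # ~0.00116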

  6. Filtering large-scale event collections using a combination of supervised and unsupervised learning for event trigger classification.

    PubMed

    Mehryary, Farrokh; Kaewphan, Suwisa; Hakala, Kai; Ginter, Filip

    2016-01-01

    Biomedical event extraction is one of the key tasks in biomedical text mining, supporting various applications such as database curation and hypothesis generation. Several systems, some of which have been applied at a large scale, have been introduced to solve this task. Past studies have shown that the identification of the phrases describing biological processes, also known as trigger detection, is a crucial part of event extraction, and notable overall performance gains can be obtained by solely focusing on this sub-task. In this paper we propose a novel approach for filtering falsely identified triggers from large-scale event databases, thus improving the quality of knowledge extraction. Our method relies on state-of-the-art word embeddings, event statistics gathered from the whole biomedical literature, and both supervised and unsupervised machine learning techniques. We focus on EVEX, an event database covering the whole PubMed and PubMed Central Open Access literature and containing more than 40 million extracted events. The most frequent EVEX trigger words are hierarchically clustered, and the resulting cluster tree is pruned to identify words that can never act as triggers regardless of their context. For rarely occurring trigger words we introduce a supervised approach trained on the combination of the trigger word classification produced by the unsupervised clustering method and manual annotation. The method is evaluated on the official test set of the BioNLP Shared Task on Event Extraction. The evaluation shows that the method can be used to improve the performance of state-of-the-art event extraction systems. This successful effort also translates into removing 1,338,075 potentially incorrect events from EVEX, thus greatly improving the quality of the data. The method is not solely bound to the EVEX resource and can thus be used to improve the quality of any event extraction system or database. The data and source code for this work are available at

  7. A review for identification of initiating events in event tree development process on nuclear power plants

    NASA Astrophysics Data System (ADS)

    Riyadi, Eko H.

    2014-09-01

    An initiating event is defined as any event, either internal or external to the nuclear power plants (NPPs), that perturbs the steady state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or loss of coolant accident (LOCA) within the NPPs. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or large early release. Selection of initiating events consists of two steps: the first step is the definition of possible events, for example by carrying out a comprehensive engineering evaluation and by constructing a top-level logic model; the second step is the grouping of identified initiating events by the safety function to be performed or by combinations of system responses. Therefore, the purpose of this paper is to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSA). The identification of initiating events also draws on past operating experience, reviews of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By studying in detail the categorization used in traditional US PSAs, the important initiating events can be obtained, categorized into LOCA, transients, and external events.

  8. Extended analysis of the Trojan-horse attack in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Vinay, Scott E.; Kok, Pieter

    2018-04-01

    The discrete-variable quantum key distribution protocols based on the 1984 protocol of Bennett and Brassard (BB84) are known to be secure against an eavesdropper, Eve, intercepting the flying qubits and performing any quantum operation on them. However, these protocols may still be vulnerable to side-channel attacks. We investigate the Trojan-horse side-channel attack where Eve sends her own state into Alice's apparatus and measures the reflected state to estimate the key. We prove that the separable coherent state is optimal for Eve among the class of multimode Gaussian attack states, even in the presence of thermal noise. We then provide a bound on the secret key rate in the case where Eve may use any separable state.

  9. Florida Keys

    NASA Image and Video Library

    2002-12-13

    The Florida Keys are a chain of islands, islets and reefs extending from Virginia Key to the Dry Tortugas for about 309 kilometers (192 miles). The keys are chiefly limestone and coral formations. The larger islands of the group are Key West (with its airport), Key Largo, Sugarloaf Key, and Boca Chica Key. A causeway extends from the mainland to Key West. This image was acquired on October 28, 2001, by the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on NASA's Terra satellite. With its 14 spectral bands from the visible to the thermal infrared wavelength region, and its high spatial resolution of 15 to 90 meters (about 50 to 300 feet), ASTER images Earth to map and monitor the changing surface of our planet. http://photojournal.jpl.nasa.gov/catalog/PIA03890

  11. Seismic event near Jarocin (Poland)

    NASA Astrophysics Data System (ADS)

    Lizurek, Grzegorz; Plesiewicz, Beata; Wiejacz, Paweł; Wiszniowski, Jan; Trojanowski, Jacek

    2013-02-01

    An earthquake of magnitude ML = 3.8 (EMSC) took place on Friday, 6 January 2012, north-east of the town of Jarocin in the Wielkopolska Region, Poland. The only historical information about past earthquakes in the region was found in a diary from 1824; apart from this, a seismic event was noted in the vicinity of Wielkopolska in 1606 (Pagaczewski 1982). The scope of this paper is to describe the 6 January 2012 event in terms of instrumental seismology, macroseismic data analysis, and the known tectonics of the region, which should be useful in future seismic hazard analysis of Poland.

  12. Rain-on-snow Events in Southwestern British Columbia: A Long-term Analysis of Meteorological Conditions and Snowpack Response

    NASA Astrophysics Data System (ADS)

    Trubilowicz, J. W.; Moore, D.

    2015-12-01

    Snowpack dynamics and runoff generation in coastal mountain regions are complicated by rain-on-snow (ROS) events. During major ROS events associated with warm, moist air and strong winds, turbulent heat fluxes can produce substantial melt to supplement rainfall, but previous studies suggest this may not be true for smaller, more frequent events. The internal temperature and water content of the snowpack are also expected to influence runoff generation during ROS events: a cold snowpack with no liquid water content will have the ability to store significant amounts of rainfall, whereas a 'ripe' snowpack may begin to melt and generate outflow with little rain input. However, it is not well understood how antecedent snowpack conditions and energy fluxes differ between ROS events that cause large runoff events and those that do not, in large part because major flood-producing ROS events occur infrequently, and thus are often not sampled during short-term research projects. To generate greater understanding of runoff generation over the spectrum of ROS magnitudes and frequencies, we analyzed data from Automated Snow Pillow (ASP) sites, which record hourly air temperature, precipitation and snowpack water equivalent and offer up to several decades of data at each site. We supplemented the ASP data with output from the North American Regional Reanalysis (NARR) product to support point scale snow modeling for 335 ROS event records from six ASP sites in southwestern BC from 2003 to 2013. Our analysis reconstructed the weather conditions, surface energy exchanges, internal mass and energy states of the snowpack, and generation of snow melt and water available for runoff (WAR) for each ROS event. Results indicate that WAR generation during large events is largely independent of the snowpack conditions, but for smaller events, the antecedent snow conditions play a significant role in either damping or enhancing WAR generation.
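
    A hedged sketch of the water-available-for-runoff (WAR) bookkeeping such ROS analyses rest on: rain on a ripe snowpack passes through and is supplemented by melt, while a cold or dry pack first absorbs energy and liquid water. The single-layer accounting, constants, and numbers below are simplifying assumptions for illustration, not the authors' model.

```python
RHO_W = 1000.0   # density of water, kg m^-3
L_F = 0.334e6    # latent heat of fusion, J kg^-1

def war_mm(rain_mm, net_energy_j_m2, cold_content_j_m2, storage_mm):
    """Return water available for runoff (mm) for one time step."""
    # Energy first warms the pack to 0 C (cold content), then melts ice.
    melt_energy = max(0.0, net_energy_j_m2 - cold_content_j_m2)
    melt_mm = melt_energy / (RHO_W * L_F) * 1000.0  # m of water -> mm
    # A cold or dry pack can retain some rain as liquid-water storage.
    retained = min(rain_mm, storage_mm)
    return max(0.0, rain_mm - retained) + melt_mm

# Large warm event on a ripe pack: nearly all rain plus melt becomes WAR.
print(war_mm(rain_mm=60.0, net_energy_j_m2=5.0e6, cold_content_j_m2=0.0, storage_mm=2.0))
# Small event on a cold pack: rain is stored and little WAR is generated.
print(war_mm(rain_mm=5.0, net_energy_j_m2=0.5e6, cold_content_j_m2=1.0e6, storage_mm=10.0))
```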

  13. Hypothetical scenario exercises to improve planning and readiness for drinking water quality management during extreme weather events.

    PubMed

    Deere, Daniel; Leusch, Frederic D L; Humpage, Andrew; Cunliffe, David; Khan, Stuart J

    2017-03-15

    Two hypothetical scenario exercises were designed and conducted to reflect the increasingly extreme weather-related challenges faced by water utilities as the global climate changes. The first event was based on an extreme flood scenario. The second scenario involved a combination of weather events, including a wild forest fire ('bushfire') followed by runoff due to significant rainfall. For each scenario, a panel of diverse personnel from water utilities and relevant agencies (e.g. health departments) formed a hypothetical water utility and associated regulatory body to manage water quality following the simulated extreme weather event. A larger audience participated by asking questions and contributing key insights. Participants were confronted with unanticipated developments as the simulated scenarios unfolded, introduced by a facilitator. Participants were presented with information that may have challenged their conventional experiences regarding operational procedures in order to identify limitations in current procedures, assumptions, and readily available information. The process worked toward the identification of a list of specific key lessons for each event. At the conclusion of each simulation a facilitated discussion was used to establish key lessons of value to water utilities in preparing them for similar future extreme events. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Counterfactual Quantum Deterministic Key Distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Sheng; Wang, Jian; Tang, Chao-Jing

    2013-01-01

    We propose a new counterfactual quantum cryptography protocol concerning about distributing a deterministic key. By adding a controlled blocking operation module to the original protocol [T.G. Noh, Phys. Rev. Lett. 103 (2009) 230501], the correlation between the polarizations of the two parties, Alice and Bob, is extended, therefore, one can distribute both deterministic keys and random ones using our protocol. We have also given a simple proof of the security of our protocol using the technique we ever applied to the original protocol. Most importantly, our analysis produces a bound tighter than the existing ones.

  15. Shift work and vascular events: systematic review and meta-analysis.

    PubMed

    Vyas, Manav V; Garg, Amit X; Iansavichus, Arthur V; Costella, John; Donner, Allan; Laugsand, Lars E; Janszky, Imre; Mrkobrada, Marko; Parraga, Grace; Hackam, Daniel G

    2012-07-26

    To synthesise the association of shift work with major vascular events as reported in the literature. Systematic searches of major bibliographic databases, contact with experts in the field, and review of reference lists of primary articles, review papers, and guidelines. Observational studies that reported risk ratios for vascular morbidity, vascular mortality, or all cause mortality in relation to shift work were included; control groups could be non-shift ("day") workers or the general population. Study quality was assessed with the Downs and Black scale for observational studies. The three primary outcomes were myocardial infarction, ischaemic stroke, and any coronary event. Heterogeneity was measured with the I² statistic, and pooled estimates were computed with random effects models. 34 studies in 2,011,935 people were identified. Shift work was associated with myocardial infarction (risk ratio 1.23, 95% confidence interval 1.15 to 1.31; I²=0) and ischaemic stroke (1.05, 1.01 to 1.09; I²=0). Coronary events were also increased (risk ratio 1.24, 1.10 to 1.39), albeit with significant heterogeneity across studies (I²=85%). Pooled risk ratios were significant for both unadjusted analyses and analyses adjusted for risk factors. All shift work schedules with the exception of evening shifts were associated with a statistically higher risk of coronary events. Shift work was not associated with increased rates of mortality (whether vascular cause specific or overall). Presence or absence of adjustment for smoking and socioeconomic status was not a source of heterogeneity in the primary studies. 6598 myocardial infarctions, 17,359 coronary events, and 1854 ischaemic strokes occurred. On the basis of the Canadian prevalence of shift work of 32.8%, the population attributable risks related to shift work were 7.0% for myocardial infarction, 7.3% for all coronary events, and 1.6% for ischaemic stroke. Shift work is associated with vascular events, which may have implications for public
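
    The quoted population attributable risks follow from Levin's formula, PAR = p(RR - 1) / (p(RR - 1) + 1), where p is the prevalence of exposure (shift work, 32.8%) and RR the pooled risk ratio. The short check below reproduces the reported 7.0%, 7.3%, and 1.6%.

```python
def levin_par(prevalence, risk_ratio):
    """Levin's population attributable risk."""
    excess = prevalence * (risk_ratio - 1.0)
    return excess / (excess + 1.0)

p = 0.328  # Canadian prevalence of shift work
for outcome, rr in [("myocardial infarction", 1.23),
                    ("any coronary event", 1.24),
                    ("ischaemic stroke", 1.05)]:
    print(f"{outcome}: PAR = {levin_par(p, rr):.1%}")
# -> 7.0%, 7.3%, and 1.6%, matching the figures quoted above.
```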

  16. Time-to-Event Analysis of Individual Variables Associated with Nursing Students' Academic Failure: A Longitudinal Study

    ERIC Educational Resources Information Center

    Dante, Angelo; Fabris, Stefano; Palese, Alvisa

    2013-01-01

    Empirical studies and conceptual frameworks presented in the extant literature offer a static picture of academic failure. Time-to-event analysis, which captures the dynamism of individual factors, such as when they come to determine failure, so that timely strategies can be properly tailored, requires longitudinal studies, which are still lacking in the field. The…
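
    A minimal sketch of the time-to-event idea behind such studies: a hand-rolled Kaplan-Meier estimator over hypothetical student records (months until academic failure, with censoring for students who left or graduated without failing). The data are invented for illustration.

```python
def kaplan_meier(durations, observed):
    """Return (time, survival) pairs; observed=1 means failure occurred."""
    survival, points = 1.0, []
    for t in sorted(set(d for d, o in zip(durations, observed) if o)):
        at_risk = sum(1 for d in durations if d >= t)
        failures = sum(1 for d, o in zip(durations, observed) if d == t and o)
        survival *= 1.0 - failures / at_risk
        points.append((t, survival))
    return points

months = [3, 6, 6, 9, 12, 12, 18, 24]
failed = [1, 1, 0, 1, 0, 1, 0, 0]  # 0 = censored (no failure observed)
for t, s in kaplan_meier(months, failed):
    print(f"month {t}: P(no academic failure yet) = {s:.2f}")
```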

  17. Monitoring As A Helpful Means In Forensic Analysis Of Dams Static Instability Events

    NASA Astrophysics Data System (ADS)

    Solimene, Pellegrino

    2013-04-01

    Monitoring is a means of controlling the behavior of a structure which, during its operational life, is subject to external actions both ordinary (loading conditions) and disturbing; these factors overlap in the random manner described by the statistical parameter of the return period. The analysis of monitoring data is crucial for forming a reasoned opinion on the reliability of the structure and its components, and it also makes it possible to identify, within the overall operational scenario, the moment at which to prepare interventions aimed at maintaining optimal levels of functionality and safety. Monitoring as prevention is coupled with the activity of the forensic engineer who, appointed by the judiciary after an accident has occurred, turns his experience (the "scientific knowledge") into an "inverse analysis" in which he sums up the results of a survey, drawing also on the data sets produced by the continuous monitoring of causes and effects, so as to determine the correlations between these factors. His activity aims to contribute to establishing the typicality of an event, which represents, together with the "causal link" between conduct and event and its unlawfulness, the factors for judging whether a hypothesis of crime exists, and therefore liability according to law. In Italy there are about 10,000 dams of varying sizes, but only a small portion of them are considered "large dams" and subjected to a rigorous program of regular inspections and monitoring, in application of specific rules. The rest, the "small" dams, conventionally defined as such by the standard but not by their impact on the territory, receive a heterogeneous response from the local authorities entrusted with this task: there is therefore a high potential risk scenario, determined by the presence of not completely controlled structures that sometimes stand over heavily populated areas. Risk can be traced back to acceptable levels if they were implemented with the

  18. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform

    PubMed Central

    2013-01-01

    Background Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists’ capacity to use these immunoassays to evaluate human clinical trials. Results The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose–response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Conclusions Unlike other tools tailored for

  19. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform.

    PubMed

    Eckels, Josh; Nathe, Cory; Nelson, Elizabeth K; Shoemaker, Sara G; Nostrand, Elizabeth Van; Yates, Nicole L; Ashley, Vicki C; Harris, Linda J; Bollenbeck, Mark; Fong, Youyi; Tomaras, Georgia D; Piehler, Britt

    2013-04-30

    Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists' capacity to use these immunoassays to evaluate human clinical trials. The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose-response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Unlike other tools tailored for Luminex immunoassays, LabKey Server
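
    A hedged sketch of the dose-response features listed above (items v and vi): fit a four-parameter logistic (4PL) curve to titrated standards, then interpolate unknown concentrations from observed intensities. This is a Python stand-in for the R-based scripting the platform supports; the data points are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL: a = lower asymptote, d = upper, c = inflection (EC50), b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])      # standard concentrations
mfi = np.array([25.0, 60.0, 180.0, 520.0, 1100.0, 1500.0])  # measured intensities

params, _ = curve_fit(four_pl, conc, mfi, p0=[20.0, 1.0, 30.0, 1600.0], maxfev=10000)

def interpolate(y, a, b, c, d):
    """Invert the 4PL to recover concentration from an observed intensity."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Interpolate an unknown sample from its intensity reading.
print("estimated concentration:", interpolate(400.0, *params))
```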

  20. Numerical approach for unstructured quantum key distribution

    PubMed Central

    Coles, Patrick J.; Metodiev, Eric M.; Lütkenhaus, Norbert

    2016-01-01

    Quantum key distribution (QKD) allows for communication with security guaranteed by quantum theory. The main theoretical problem in QKD is to calculate the secret key rate for a given protocol. Analytical formulas are known for protocols with symmetries, since symmetry simplifies the analysis. However, experimental imperfections break symmetries, hence the effect of imperfections on key rates is difficult to estimate. Furthermore, it is an interesting question whether (intentionally) asymmetric protocols could outperform symmetric ones. Here we develop a robust numerical approach for calculating the key rate for arbitrary discrete-variable QKD protocols. Ultimately this will allow researchers to study 'unstructured' protocols, that is, those that lack symmetry. Our approach relies on transforming the key rate calculation to the dual optimization problem, which markedly reduces the number of parameters and hence the calculation time. We illustrate our method by investigating some unstructured protocols for which the key rate was previously unknown. PMID:27198739
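
    For context, a sketch of the kind of closed-form result that symmetry makes possible and that the numerical method generalises beyond: the well-known asymptotic BB84 key rate r = 1 - 2h(Q), where h is the binary entropy and Q the quantum bit error rate (Shor-Preskill bound). This formula is standard background, not taken from the paper.

```python
from math import log2

def binary_entropy(q):
    """Binary Shannon entropy h(q)."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * log2(q) - (1.0 - q) * log2(1.0 - q)

def bb84_key_rate(qber):
    """Asymptotic BB84 secret key rate, r = 1 - 2*h(Q)."""
    return 1.0 - 2.0 * binary_entropy(qber)

for q in (0.01, 0.05, 0.11):
    print(f"QBER {q:.2f}: key rate {bb84_key_rate(q):.3f}")
# The rate falls to zero near QBER = 11%, the BB84 security threshold.
```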