A comparison of various approaches to evaluating erosion risks and designing erosion control measures
NASA Astrophysics Data System (ADS)
Kapicka, Jiri
2015-04-01
At present, one methodology is used in the Czech Republic to compute and compare erosion risks; it also includes a method for designing erosion control measures. The methodology is based on the Universal Soil Loss Equation (USLE), whose result is the long-term average annual rate of soil loss (G), and it is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many problems and damages arise from local erosion episodes. The extent of these events and their impact depend on local precipitation, the current crop growth stage and soil conditions. Such erosion events can damage agricultural land, municipal property and water-management structures even at locations that are in good condition from the point of view of the long-term average annual soil loss. An alternative way to compute and compare erosion risks is an episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events had been recorded. The study area is a simple agricultural parcel without any barriers that could strongly influence water flow and sediment transport. The computation of erosion risks (for all methodologies) was based on laboratory analysis of soil samples collected in the study area. Results of the USLE and MUSLE methodologies and of the mathematical model Erosion 3D were compared. Differences in the spatial distribution of the places with the highest soil erosion were compared and discussed. A further part presents differences in the designed erosion control measures when the design is based on different methodologies. The results show that the computed erosion risks vary with the methodology used. These variations can open a discussion about different approaches to computing and evaluating erosion risks in areas of different importance.
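As a minimal sketch of the long-term average annual soil loss underlying the USLE-based methodology (the factor values below are hypothetical placeholders, not data from the study):

```python
# Minimal sketch of the Universal Soil Loss Equation (USLE):
#   A = R * K * LS * C * P
# where A is the long-term average annual soil loss (t/ha/yr),
# R is rainfall erosivity, K soil erodibility, LS the slope length/steepness
# factor, C the cover-management factor and P the support-practice factor.
def usle_soil_loss(R, K, LS, C, P):
    """Return the long-term average annual soil loss A."""
    return R * K * LS * C * P

# Hypothetical factor values for a single field (not taken from the paper).
A = usle_soil_loss(R=45.0, K=0.35, LS=1.8, C=0.25, P=1.0)
print(f"Estimated long-term average annual soil loss: {A:.2f} t/ha/yr")
```

Episode-based approaches such as MUSLE replace the long-term rainfall erosivity with event runoff terms, which is why the two approaches can rank the same locality differently.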
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and also through the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology both at data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.
Event- and interval-based measurement of stuttering: a review.
Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret
2015-01-01
Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge agreement, intra-judge agreement and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement, and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge agreement values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups than for non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. An interval duration of 5 s appears to yield acceptable agreement. Explanations for the high reproducibility values, as well as the choice of parameters used to report those data, are discussed. Both interval- and event-based methodologies used trained or experienced judges for inter- and intra-judge determination, and the data were beyond the references for good reproducibility values. Inter- and intra-judge values were reported on different metric scales across event- and interval-based studies, making it unfeasible to quantify the agreement between the two methods. © 2014 Royal College of Speech and Language Therapists.
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2017-07-01
A transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, for whom the notion of an agent or actor role is usually used. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and of the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is a domain-specific methodology with its origin in accountancy systems. The result of comparing these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.
Event-scale power law recession analysis: quantifying methodological uncertainty
NASA Astrophysics Data System (ADS)
Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.
2017-01-01
The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship between the power-law recession scale parameter and catchment antecedent wetness varies depending on recession definition and fitting choices. Considering study results, we recommend a combination of four key methodological decisions to maximize the quality of fitted recession curves, and to minimize bias in the related populations of fitted recession parameters.
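As a rough illustration of the kind of event-scale fit discussed above (an illustrative fitting choice, not the authors' exact procedure), a single recession event can be fitted to the power-law model -dQ/dt = aQ^b by linear regression in log-log space:

```python
import numpy as np

# Sketch: fit the power-law recession model -dQ/dt = a * Q^b for one event.
# This is an illustrative choice of fitting technique, not the specific
# method evaluated in the study.
def fit_recession(q, dt=1.0):
    """q: daily streamflow values of a single recession event (decreasing)."""
    dq_dt = -np.diff(q) / dt               # recession rate (positive)
    q_mid = 0.5 * (q[:-1] + q[1:])         # flow at interval midpoints
    mask = (dq_dt > 0) & (q_mid > 0)       # keep strictly receding steps
    slope, intercept = np.polyfit(np.log(q_mid[mask]), np.log(dq_dt[mask]), 1)
    return np.exp(intercept), slope        # a (scale), b (exponent)

# Hypothetical recession event (m^3/s), e.g. the days after a storm peak.
q = np.array([12.0, 9.1, 7.3, 6.1, 5.2, 4.6, 4.1, 3.7])
a, b = fit_recession(q)
print(f"a = {a:.3f}, b = {b:.2f}")
```

Choices such as how the event is delimited, whether dQ/dt is computed by forward or central differences, and whether the fit is done in log space or by nonlinear least squares are exactly the methodological decisions whose influence the paper quantifies.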
Methodology for computing the burden of disease of adverse events following immunization.
McDonald, Scott A; Nijsten, Danielle; Bollaerts, Kaatje; Bauwens, Jorgen; Praet, Nicolas; van der Sande, Marianne; Bauchau, Vincent; de Smedt, Tom; Sturkenboom, Miriam; Hahné, Susan
2018-03-24
Composite disease burden measures such as disability-adjusted life-years (DALY) have been widely used to quantify the population-level health impact of disease or injury, but application has been limited for the estimation of the burden of adverse events following immunization. Our objective was to assess the feasibility of adapting the DALY approach for estimating adverse event burden. We developed a practical methodological framework, explicitly describing all steps involved: acquisition of relative or absolute risks and background event incidence rates, selection of disability weights and durations, and computation of the years lived with disability (YLD) measure, with appropriate estimation of uncertainty. We present a worked example, in which YLD is computed for 3 recognized adverse reactions following 3 childhood vaccination types, based on background incidence rates and relative/absolute risks retrieved from the literature. YLD provided extra insight into the health impact of an adverse event over presentation of incidence rates only, as severity and duration are additionally incorporated. As well as providing guidance for the deployment of DALY methodology in the context of adverse events associated with vaccination, we also identified where data limitations potentially occur. Burden of disease methodology can be applied to estimate the health burden of adverse events following vaccination in a systematic way. As with all burden of disease studies, interpretation of the estimates must consider the quality and accuracy of the data sources contributing to the DALY computation. © 2018 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.
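A minimal sketch of the YLD computation described above, for a single adverse event following vaccination (all numbers are hypothetical placeholders, not estimates from the paper):

```python
# Sketch: years lived with disability (YLD) for one adverse event following
# immunization, using an attributable (excess) incidence derived from a
# relative risk and a background rate. All numbers are illustrative only.
def yld_adverse_event(background_rate, relative_risk, vaccinated_n,
                      disability_weight, duration_years):
    """Return attributable cases and YLD for a vaccinated cohort."""
    excess_rate = background_rate * (relative_risk - 1.0)   # per person-year
    attributable_cases = excess_rate * vaccinated_n
    yld = attributable_cases * disability_weight * duration_years
    return attributable_cases, yld

cases, yld = yld_adverse_event(background_rate=1e-4, relative_risk=3.0,
                               vaccinated_n=100_000,
                               disability_weight=0.05, duration_years=0.02)
print(f"Attributable cases: {cases:.1f}, YLD: {yld:.3f}")
```

In the full framework, uncertainty in the risk estimates, disability weights and durations would be propagated, for example by Monte Carlo sampling, rather than using the point values shown here.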
Significant events in psychotherapy: An update of research findings.
Timulak, Ladislav
2010-11-01
Significant events research represents a specific approach to studying client-identified important moments in the therapy process. The current study provides an overview of the significant events research conducted, the methodology used together with findings and implications. PsychInfo database was searched with keywords such as significant events, important events, significant moments, important moments, and counselling or psychotherapy. The references of the selected studies were also searched. This process led to the identification of 41 primary studies that used client-identified significant event(s) as a main or secondary focus of the study. These were consequently reviewed with regard to their methodology and findings. The findings are presented according to type of study conducted. The impacts of helpful events reported by clients are focused on contributions to therapeutic relationship and to in-session outcomes. Hindering events focus on some client disappointment with the therapist or therapy. The group therapy modality highlighted additional helpful impacts (like learning from others). Perspectives on what is significant in therapy differ between clients and therapists. The intensive qualitative studies reviewed confirm that the processes involved in significant events are complex and ambiguous. Studies show that the helpful events may also contain many hindering elements and that specific events are deeply contextually embedded in the preceding events of therapy. Some studies suggest that helpful significant events are therapeutically productive although this may need to be established further. Specific intensive studies show that the clients' perceptions in therapy may differ dramatically from that of the therapist. Furthermore, the relational and emotional aspects of significant moments may be more important for the clients than the cognitive aspects of therapy which are frequently stressed by therapists. 2010 The British Psychological Society.
Mazzali, Cristina; Paganoni, Anna Maria; Ieva, Francesca; Masella, Cristina; Maistrello, Mauro; Agostoni, Ornella; Scalvini, Simonetta; Frigerio, Maria
2016-07-08
Administrative data are increasingly used in healthcare research. However, in order to avoid biases, their use requires careful study planning. This paper describes the methodological principles and criteria used in a study on epidemiology, outcomes and process of care of patients hospitalized for heart failure (HF) in the largest Italian Region, from 2000 to 2012. Data were extracted from the administrative data warehouse of the healthcare system of Lombardy, Italy. Hospital discharge forms with HF-related diagnosis codes were the basis for identifying HF hospitalizations as clinical events, or episodes. In patients experiencing at least one HF event, hospitalizations for any cause, outpatient services utilization, and drug prescriptions were also analyzed. Seven hundred one thousand, seven hundred one heart failure events involving 371,766 patients were recorded from 2000 to 2012. Once all the healthcare services provided to these patients after the first HF event had been joined together, the study database totalled about 91 million records. Principles, criteria and tips utilized in order to minimize errors and characterize some relevant subgroups are described. The methodology of this study could represent the basis for future research and could be applied in similar studies concerning epidemiology, trend analysis, and healthcare resources utilization.
Analyzing time-ordered event data with missed observations.
Dokter, Adriaan M; van Loon, E Emiel; Fokkema, Wimke; Lameris, Thomas K; Nolet, Bart A; van der Jeugd, Henk P
2017-09-01
A common problem with observational datasets is that not all events of interest may be detected. For example, observing animals in the wild can be difficult when animals move, hide, or cannot be closely approached. We consider time series of events recorded in conditions where events are occasionally missed by observers or observational devices. These time series are not restricted to behavioral protocols, but can be any cyclic or recurring process where discrete outcomes are observed. Undetected events cause biased inferences on the process of interest, and statistical analyses are needed that can identify and correct the compromised detection processes. Missed observations in time series lead to observed time intervals between events at multiples of the true inter-event time, which conveys information on their detection probability. We derive the theoretical probability density function for observed intervals between events that includes a probability of missed detection. Methodology and software tools are provided for analysis of event data with potential observation bias and its removal. The methodology was applied to simulation data and a case study of defecation rate estimation in geese, which is commonly used to estimate their digestive throughput and energetic uptake, or to calculate goose usage of a feeding site from dropping density. Simulations indicate that at a moderate chance to miss arrival events (p = 0.3), uncorrected arrival intervals were biased upward by up to a factor of 3, while parameter values corrected for missed observations were within 1% of their true simulated value. A field case study shows that not accounting for missed observations leads to substantial underestimates of the true defecation rate in geese, and spurious rate differences between sites, which are introduced by differences in observational conditions. These results show that the derived methodology can be used to effectively remove observational biases in time-ordered event data.
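A small simulation illustrating the bias described above and its correction (a sketch of the general idea, not the authors' exact likelihood): with a per-event miss probability m, an observed interval equals a multiple k·τ of the true inter-event time τ with probability (1-m)·m^(k-1), so the mean observed interval is inflated by a factor 1/(1-m).

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch: events occur at regular intervals tau; each event is independently
# missed with probability m. Observed intervals become multiples of tau.
tau, m, n_events = 2.0, 0.3, 10_000
detected = rng.random(n_events) > m                 # True where event observed
times = np.arange(n_events) * tau
obs_intervals = np.diff(times[detected])

naive_estimate = obs_intervals.mean()               # biased upward
corrected_estimate = naive_estimate * (1 - m)       # correct for missed events

print(f"true interval      : {tau:.2f}")
print(f"naive estimate     : {naive_estimate:.2f}")
print(f"corrected estimate : {corrected_estimate:.2f}")
```

In practice the miss probability m is not known and is estimated from the distribution of interval multiples; it is assumed known here only to keep the illustration short.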
ERIC Educational Resources Information Center
Pillemer, Karl; Chen, Emily K.; Van Haitsma, Kimberly S.; Teresi, Jeanne; Ramirez, Mildred; Silver, Stephanie; Sukha, Gail; Lachs, Mark S.
2012-01-01
Purpose: Despite its prevalence and negative consequences, research on elder abuse has rarely considered resident-to-resident aggression (RRA) in nursing homes. This study employed a qualitative event reconstruction methodology to identify the major forms of RRA that occur in nursing homes. Design and methods: Events of RRA were identified within…
NASA Astrophysics Data System (ADS)
Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi
2015-11-01
Large-scale regional evacuation is an important part of national security emergency response planning. Large commercial shopping areas are typical service systems, and their emergency evacuation is an active research topic. A systematic methodology based on cellular automata with a dynamic floor field and an event-driven model is proposed, and the methodology is examined in a case study involving evacuation of a commercial shopping mall. Pedestrian movement is modeled with cellular automata and the event-driven model. In this paper, the event-driven model is adopted to simulate pedestrian movement patterns, and the simulation is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For simulating pedestrians' movement routes, the model takes into account customers' purchase intentions and pedestrian density. Based on the evacuation model combining cellular automata with a dynamic floor field and the event-driven model, the behavioral characteristics of customers and clerks can be reflected in both normal situations and emergency evacuation. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of cellular automata with a dynamic floor field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
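The sketch below illustrates only the floor-field mechanism in its simplest form: a static field (distance to the exit) guiding each pedestrian to the lowest-valued free neighbor cell. The dynamic field, the event-driven scheduling and the customer/clerk layers of the paper are omitted, and the grid, exit and occupants are invented.

```python
import numpy as np

# Minimal sketch of one floor-field evacuation step on a small grid.
rows, cols = 5, 7
exit_cell = (0, 3)

# Static floor field: Manhattan distance from every cell to the exit.
ii, jj = np.indices((rows, cols))
field = np.abs(ii - exit_cell[0]) + np.abs(jj - exit_cell[1])

def step(pedestrians, field):
    """Move each pedestrian to the free 4-neighbour cell with the lowest field value."""
    occupied = set(pedestrians)
    new_positions = []
    for (r, c) in pedestrians:
        best, best_val = (r, c), field[r, c]
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in occupied:
                if field[nr, nc] < best_val:
                    best, best_val = (nr, nc), field[nr, nc]
        occupied.discard((r, c))
        occupied.add(best)
        new_positions.append(best)
    return new_positions

peds = [(4, 0), (4, 6), (3, 3)]      # hypothetical initial positions
for t in range(8):
    peds = step(peds, field)
print("positions after 8 steps:", peds)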
Ho, Cheng-I; Lin, Min-Der; Lo, Shang-Lien
2010-07-01
A methodology based on the integration of a seismic-based artificial neural network (ANN) model and a geographic information system (GIS) to assess water leakage and to prioritize pipeline replacement is developed in this work. Qualified pipeline break-event data derived from the Taiwan Water Corporation Pipeline Leakage Repair Management System were analyzed. "Pipe diameter," "pipe material," and "the number of magnitude-3(+) earthquakes" were employed as the input factors of ANN, while "the number of monthly breaks" was used for the prediction output. This study is the first attempt to manipulate earthquake data in the break-event ANN prediction model. Spatial distribution of the pipeline break-event data was analyzed and visualized by GIS. Through this, the users can swiftly figure out the hotspots of the leakage areas. A northeastern township in Taiwan, frequently affected by earthquakes, is chosen as the case study. Compared to the traditional processes for determining the priorities of pipeline replacement, the methodology developed is more effective and efficient. Likewise, the methodology can overcome the difficulty of prioritizing pipeline replacement even in situations where the break-event records are unavailable.
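A rough sketch of the kind of ANN regression described above. Scikit-learn's MLPRegressor is used here as an assumed, generic stand-in; the paper's network architecture, preprocessing and data are not reproduced, and all values below are made up.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Sketch: predict monthly pipe breaks from diameter (mm), a binary material
# code and the number of magnitude-3(+) earthquakes. Data are illustrative.
X = np.array([[100, 0, 2], [150, 1, 0], [200, 0, 5], [100, 1, 1],
              [150, 0, 3], [200, 1, 0], [100, 0, 4], [150, 1, 2]], dtype=float)
y = np.array([6, 2, 9, 3, 7, 1, 8, 4], dtype=float)   # monthly break counts

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)
print("predicted monthly breaks:", model.predict([[150.0, 0.0, 4.0]]).round(2))
```

The GIS component of the methodology would then map the predicted break counts onto pipeline segments to visualize leakage hotspots.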
Methodologies for the Statistical Analysis of Memory Response to Radiation
NASA Astrophysics Data System (ADS)
Bosser, Alexandre L.; Gupta, Viyas; Tsiligiannis, Georgios; Frost, Christopher D.; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigné, Frédéric; Virtanen, Ari; Wrobel, Frédéric; Dilillo, Luigi
2016-08-01
Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].
Size Distributions of Solar Proton Events: Methodological and Physical Restrictions
NASA Astrophysics Data System (ADS)
Miroshnichenko, L. I.; Yanke, V. G.
2016-12-01
Based on the new catalogue of solar proton events (SPEs) for the period of 1997 - 2009 (Solar Cycle 23) we revisit the long-studied problem of the event-size distributions in the context of those constructed for other solar-flare parameters. Recent results on the problem of size distributions of solar flares and proton events are briefly reviewed. Even a cursory acquaintance with this research field reveals a rather mixed and controversial picture. We concentrate on three main issues: i) SPE size distribution for >10 MeV protons in Solar Cycle 23; ii) size distribution of >1 GV proton events in 1942 - 2014; iii) variations of annual numbers for >10 MeV proton events on long time scales (1955 - 2015). Different results are critically compared; most of the studies in this field are shown to suffer from vastly different input datasets as well as from insufficient knowledge of underlying physical processes in the SPEs under consideration. New studies in this field should be made on more distinct physical and methodological bases. It is important to note the evident similarity in size distributions of solar flares and superflares in Sun-like stars.
Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling
Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja
2016-01-01
Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
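A toy sketch of the closed-loop idea: the signal is reduced to a binary event sequence (here, 1 if an inter-pulse interval is shorter than a threshold), and stimulation is triggered whenever a predefined code appears. The threshold, code and trigger action are hypothetical; the real-time toolbox of the paper is not reproduced.

```python
# Toy sketch of temporal code-driven stimulation: binarize inter-event
# intervals and fire a stimulus whenever a predefined binary code occurs.
# Threshold, code and the stimulation callback are illustrative assumptions.
def binarize(intervals, threshold):
    return [1 if x < threshold else 0 for x in intervals]

def code_driven_stimulation(events, code, stimulate):
    window = []
    for t, bit in enumerate(events):
        window.append(bit)
        if len(window) > len(code):
            window.pop(0)
        if window == code:
            stimulate(t)          # closed loop: deliver stimulus on match

intervals = [0.8, 0.3, 0.2, 0.9, 0.2, 0.3, 0.2, 1.1]   # seconds (made up)
events = binarize(intervals, threshold=0.5)             # -> [0,1,1,0,1,1,1,0]
code_driven_stimulation(events, code=[1, 1, 1],
                        stimulate=lambda t: print(f"stimulate at event {t}"))
```

Comparing the response evoked by such code-triggered stimulation with a control recording (no stimulation) or with open-loop stimulation is what allows the system to be characterized.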
Prescription-event monitoring: methodology and recent progress.
Rawson, N S; Pearce, G L; Inman, W H
1990-01-01
Event monitoring was first suggested 25 years ago as a way of detecting adverse reactions to drugs. Prescription-event monitoring (PEM), which has been developed by the Drug Safety Research Unit, is the first large-scale systematic post-marketing surveillance method to use event monitoring in the U.K. PEM identifies patients, who have been prescribed a particular drug, and their doctors from photocopies of National Health Service prescriptions which are processed centrally in England. A personalized follow-up questionnaire ("green form") is mailed to each patient's general practitioner, usually on the first anniversary of the initial prescription, asking for information about the patient, especially any "events" that he or she may have experienced since beginning treatment with the drug. The methodology of PEM is presented, together with examples of analyses that can be performed using results from recent studies. The problems and benefits of PEM are discussed.
Semicompeting risks in aging research: methods, issues and needs
Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen
2015-01-01
A semicompeting risks problem involves two types of events: a nonterminal and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude the occurrence of the nonterminal event. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O. This model is a special case of multistate models, which have been an active area of methodology development. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (L models). These advances notwithstanding, semicompeting risks methodology has not penetrated biomedical research, in general, and gerontological research, in particular. Some possible reasons for this lack of uptake are: the methods are relatively new and sophisticated, conceptual problems associated with potential failure time models are difficult to overcome, paucity of expository articles aimed at educating practitioners, and non-availability of readily usable software. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of the semicompeting risks methods, (iii) to suggest appropriate methods for addressing the problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of the semicompeting risks methodology by the broader biomedical research community. PMID:24729136
ERIC Educational Resources Information Center
Trimble, Joseph E.; And Others
A review of pertinent research on the adaptation of ethnic minority elderly to life-threatening events (personal, man-made, or natural) exposes voids in the research, presents methodological considerations, and indicates that ethnic minority elderly are disproportionately victimized by life-threatening events. Unusually high numbers of…
Studies of planning behavior of aircraft pilots in normal, abnormal and emergency situations
NASA Technical Reports Server (NTRS)
Johannsen, G.; Rouse, W. B.; Hillmann, K.
1981-01-01
A methodology for the study of planning is presented and the results of applying the methodology within two experimental investigations of planning behavior of aircraft pilots in normal, abnormal, and emergency situations are discussed. Beyond showing that the methodology yields consistent results, these experiments also lead to concepts in terms of a dichotomy between event-driven and time-driven planning, subtle effects of automation on planning, and the relationship of planning to workload and flight performance.
Varney, Shawn; Hirshon, Jon Mark; Dischinger, Patricia; Mackenzie, Colin
2006-01-01
The Haddon Matrix offers a classic epidemiological model for studying injury prevention. This methodology places the public health concepts of agent, host, and environment within the three sequential phases of an injury-producing incident: pre-event, event, and post-event. This study uses this methodology to illustrate how it could be applied in systematically preparing for a mass casualty disaster such as an unconventional sarin attack in a major urban setting. Nineteen city, state, federal, and military agencies responded to the Haddon Matrix chemical terrorism preparedness exercise and offered feedback in the data review session. Four injury prevention strategies (education, engineering, enforcement, and economics) were applied to the individual factors and event phases of the Haddon Matrix. The majority of factors identified in all phases were modifiable, primarily through educational interventions focused on individual healthcare providers and first responders. The Haddon Matrix provides a viable means of studying an unconventional problem, allowing for the identification of modifiable factors to decrease the type and severity of injuries following a mass casualty disaster such as a sarin release. This strategy could be successfully incorporated into disaster planning for other weapons attacks that could potentially cause mass casualties.
ERIC Educational Resources Information Center
Spaniol, Julia; Davidson, Patrick S. R.; Kim, Alice S. N.; Han, Hua; Moscovitch, Morris; Grady, Cheryl L.
2009-01-01
The recent surge in event-related fMRI studies of episodic memory has generated a wealth of information about the neural correlates of encoding and retrieval processes. However, interpretation of individual studies is hampered by methodological differences, and by the fact that sample sizes are typically small. We submitted results from studies of…
Multi-decadal Hydrological Retrospective: Case study of Amazon floods and droughts
NASA Astrophysics Data System (ADS)
Wongchuig Correa, Sly; Paiva, Rodrigo Cauduro Dias de; Espinoza, Jhan Carlo; Collischonn, Walter
2017-06-01
Recently developed methodologies such as climate reanalysis make it possible to create a historical record of climate systems. This paper proposes a methodology called Hydrological Retrospective (HR), which simulates large rainfall datasets as input to hydrological models to develop a record of past hydrology, making it possible to analyze past floods and droughts. We developed the methodology for the Amazon basin, where studies have shown an increase in the intensity and frequency of hydrological extreme events in recent decades. We used eight large precipitation datasets (more than 30 years) as input for a large-scale hydrological and hydrodynamic model (MGB-IPH). HR products were then validated against several in situ discharge gauges controlling the main Amazon sub-basins, focusing on maximum and minimum events. For the most accurate HR, based on performance metrics, we assessed the forecast skill of HR in detecting floods and droughts, comparing the results with in-situ observations. A statistical trend analysis of time series was performed for the intensity of seasonal floods and droughts in the entire Amazon basin. Results indicate that HR could represent most past extreme events well, compared with in-situ observed data, and was consistent with many events reported in the literature. Because of their flow duration, some minor regional events were not reported in the literature but were captured by HR. To represent past regional hydrology and seasonal hydrological extreme events, we believe it is feasible to use some large precipitation datasets such as i) climate reanalysis, which is mainly based on a land surface component, and ii) datasets based on merged products. A significant upward trend in intensity was seen in maximum annual discharge (related to floods) in western and northwestern regions and in minimum annual discharge (related to droughts) in south and central-south regions of the Amazon basin. Because of the global coverage of rainfall datasets, this methodology can be transferred to other regions for better estimation of future hydrological behavior and its impact on society.
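As an illustration of the trend-analysis step, a Mann-Kendall test on an annual maximum-discharge series is a common choice for this kind of analysis; the specific test used in the study is not stated here, and the series below is synthetic.

```python
import numpy as np
from scipy import stats

# Sketch: Mann-Kendall-style trend check on annual maximum discharge.
# The series is synthetic; the study's data are not reproduced.
def mann_kendall_s(x):
    """Return the Mann-Kendall S statistic (sum of signs of pairwise differences)."""
    n = len(x)
    return sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))

rng = np.random.default_rng(1)
years = np.arange(1980, 2013)
q_max = 40_000 + 150 * (years - years[0]) + rng.normal(0, 2_000, len(years))

s = mann_kendall_s(q_max)
tau, p_value = stats.kendalltau(years, q_max)   # equivalent rank-correlation view
print(f"S = {s}, Kendall tau = {tau:.2f}, p = {p_value:.4f}")
```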
Chamala, Srikar; Feng, Guanqiao; Chavarro, Carolina; Barbazuk, W. Brad
2015-01-01
Alternative splicing (AS) plays important roles in many plant functions, but its conservation across the plant kingdom is not known. We describe a methodology to identify AS events and identify conserved AS events across large phylogenetic distances using RNA-Seq datasets. We applied this methodology to transcriptome data from nine angiosperms including Amborella, the single sister species to all other extant flowering plants. AS events within 40–70% of the expressed multi-exonic genes per species were found, 27,120 of which are conserved among two or more of the taxa studied. While many events are species-specific, many others are shared across long evolutionary distances suggesting they have functional significance. Conservation of AS event data provides an estimate of the number of ancestral AS events present at each node of the tree representing the nine species studied. Furthermore, the presence or absence of AS isoforms between species with different whole genome duplication (WGD) histories provides the opportunity to examine the impact of WGD on AS potential. Examining AS in gene families identifies those with high rates of AS, and conservation can distinguish ancient events vs. recent or species-specific adaptations. The MADS-box and SR protein families are found to represent families with low and high occurrences of AS, respectively, yet their AS events were likely present in the MRCA of angiosperms. PMID:25859541
A methodology to select a wire insulation for use in habitable spacecraft.
Paulos, T; Apostolakis, G
1998-08-01
This paper investigates electrical overheating events aboard a habitable spacecraft. The wire insulation involved in these failures plays a major role in the entire event scenario from threat development to detection and damage assessment. Ideally, if models of wire overheating events in microgravity existed, the various wire insulations under consideration could be quantitatively compared. However, these models do not exist. In this paper, a methodology is developed that can be used to select a wire insulation that is best suited for use in a habitable spacecraft. The results of this study show that, based upon the Analytic Hierarchy Process and simplifying assumptions, the criteria selected, and data used in the analysis, Tefzel is better than Teflon for use in a habitable spacecraft.
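A small sketch of the Analytic Hierarchy Process step mentioned above: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, and alternatives are scored against those weights. The criteria, comparison matrix and scores below are invented for illustration, not the paper's judgments.

```python
import numpy as np

# Sketch of the Analytic Hierarchy Process (AHP): derive criterion weights
# from a pairwise comparison matrix and rank two insulation options.
criteria = ["flammability", "toxicity", "arc tracking"]
A = np.array([[1.0, 3.0, 5.0],        # pairwise importance judgments (made up)
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()                  # principal-eigenvector weights

# Hypothetical per-criterion scores (rows: alternatives) on a 0-1 scale.
scores = np.array([[0.7, 0.6, 0.8],    # "option A" (e.g. a Tefzel-like insulation)
                   [0.6, 0.5, 0.4]])   # "option B" (e.g. a Teflon-like insulation)
overall = scores @ weights
for name, s in zip(["option A", "option B"], overall):
    print(f"{name}: {s:.3f}")
```

In the paper, the comparison matrix and scores are built from the criteria selected for habitable-spacecraft wiring and from available overheating-event data, which is what drives the Tefzel-versus-Teflon conclusion.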
Evidence-Based Psychosocial Treatments for Children and Adolescents Exposed to Traumatic Events
ERIC Educational Resources Information Center
Silverman, Wendy K.; Ortiz, Claudio D.; Viswesvaran, Chockalingham; Burns, Barbara J.; Kolko, David J.; Putnam, Frank W.; Amaya-Jackson, Lisa
2008-01-01
The article reviews the current status (1993-2007) of psychosocial treatments for children and adolescents who have been exposed to traumatic events. Twenty-one treatment studies are evaluated using criteria from Nathan and Gorman (2002) along a continuum of methodological rigor ranging from Type 1 to Type 6. All studies were, at a minimum, robust…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
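A toy sketch of the probabilistic-branching idea behind OBEST-style models: each branch point has a probabilistic outcome, and Monte Carlo sampling over the branches yields scenario likelihoods. The branch points, probabilities and outcomes below are invented for illustration and are not the OBEST model of the paper.

```python
import random
from collections import Counter

random.seed(0)

# Toy Monte Carlo sampling over probabilistic event branches, in the spirit
# of object-based event scenario trees. All probabilities are invented.
def run_scenario():
    # Branch 1: does the flight crew notice the runway incursion in time?
    noticed = random.random() < 0.85
    if noticed:
        # Branch 2: is the go-around manoeuvre successful?
        return "safe go-around" if random.random() < 0.98 else "near miss"
    # Branch 3: does ground control intervene before a conflict develops?
    return "controller intervention" if random.random() < 0.60 else "incursion incident"

counts = Counter(run_scenario() for _ in range(100_000))
for outcome, n in counts.most_common():
    print(f"{outcome:25s} {n / 100_000:.4f}")
```

Because every sampled path carries its branch probabilities, each scenario comes with a likelihood estimate without the analyst having to enumerate event orderings in advance.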
Hydrological Retrospective of floods and droughts: Case study in the Amazon
NASA Astrophysics Data System (ADS)
Wongchuig Correa, Sly; Cauduro Dias de Paiva, Rodrigo; Carlo Espinoza Villar, Jhan; Collischonn, Walter
2017-04-01
Recent studies have reported an increase in the intensity and frequency of hydrological extreme events in many regions of the Amazon basin over recent decades; these events, such as seasonal floods and droughts, have had a significant impact on human and natural systems. Methodologies such as climate reanalysis are being developed to create a coherent record of climate systems. Building on this notion, this research aims to produce a methodology called Hydrological Retrospective (HR), which essentially simulates large rainfall datasets with hydrological models in order to develop a record of past hydrology, enabling the analysis of past floods and droughts. We developed our methodology for the Amazon basin, using eight large precipitation datasets (more than 30 years) as input to a large-scale hydrological and hydrodynamic model (MGB-IPH); the HR products were then validated against several in situ discharge gauges dispersed throughout the Amazon basin, with a focus on maximum and minimum events. For the HR products with the best performance metrics, we assessed the forecast skill of HR in detecting floods and droughts against in-situ observations. Furthermore, a statistical trend analysis of time series was performed for the intensity of seasonal floods and droughts in the whole Amazon basin. Results indicate that the best HR represented well most past extreme events registered by in-situ observed data and was also consistent with many events cited in the literature; we therefore consider it viable to use some large precipitation datasets, such as climate reanalyses mainly based on a land surface component and datasets based on merged products, to represent past regional hydrology and seasonal hydrological extreme events. On the other hand, an increasing trend in intensity was observed for maximum annual discharges (related to floods) in north-western regions and for minimum annual discharges (related to droughts) in central-south regions of the Amazon basin; these features were previously detected by other researchers. For the whole basin, we estimated an upward trend of maximum annual discharges at the Amazon River. In order to better estimate future hydrological behavior and its impacts on society, HR could be used as a methodology to understand the occurrence of past extreme events in many places, considering the global coverage of rainfall datasets.
Policy Expansion of School Choice in the American States
ERIC Educational Resources Information Center
Wong, Kenneth K.; Langevin, Warren E.
2007-01-01
This research study explores the policy expansion of school choice within the methodological approach of event history analysis. The first section provides a comparative overview of state adoption of public school choice laws. After creating a statistical portrait of the contemporary landscape for school choice, the authors introduce event history…
Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan
2017-02-01
Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussions on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of a basal insulin supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied by the trade-off between data availability, clinical plausibility, and statistical feasibility. The Cox proportional hazards model allows for the evaluation of the outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
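A minimal sketch of why a competing-events view matters in this setting (an illustrative estimator, not the register's actual analysis): with the start of basal insulin as the event of interest and death as a competing event, the cumulative incidence function is estimated rather than treating deaths as plain censoring. The event times and codes below are synthetic.

```python
import numpy as np

# Sketch: Aalen-Johansen-type cumulative incidence with a competing event.
# event = 1: basal insulin initiated; event = 2: death (competing); 0: censored.
times = np.array([2, 3, 4, 5, 6, 7, 8, 9, 10, 12], dtype=float)
event = np.array([1, 0, 2, 1, 1, 2, 0, 1, 2, 1])

order = np.argsort(times)
times, event = times[order], event[order]

surv = 1.0          # overall event-free survival just before t
cif1 = 0.0          # cumulative incidence of the event of interest
n = len(times)
for i, t in enumerate(times):
    at_risk = n - i
    if event[i] == 1:
        cif1 += surv * (1.0 / at_risk)        # hazard of the event of interest
    if event[i] in (1, 2):
        surv *= 1.0 - 1.0 / at_risk           # any event removes from the risk set
print(f"Cumulative incidence of insulin initiation by t=12: {cif1:.3f}")
```

Delayed study entries would additionally require left-truncation-aware risk sets, which this short sketch omits.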
ERIC Educational Resources Information Center
Kelly, Nick; Montenegro, Maximiliano; Gonzalez, Carlos; Clasing, Paula; Sandoval, Augusto; Jara, Magdalena; Saurina, Elvira; Alarcón, Rosa
2017-01-01
Purpose: The purpose of this paper is to demonstrate the utility of combining event-centred and variable-centred approaches when analysing big data for higher education institutions. It uses a large, university-wide data set to demonstrate the methodology for this analysis by using the case study method. It presents empirical findings about…
Automatic detection of freezing of gait events in patients with Parkinson's disease.
Tripoliti, Evanthia E; Tzallas, Alexandros T; Tsipouras, Markos G; Rigas, George; Bougia, Panagiota; Leontiou, Michael; Konitsiotis, Spiros; Chondrogiorgi, Maria; Tsouli, Sofia; Fotiadis, Dimitrios I
2013-04-01
The aim of this study is to detect freezing of gait (FoG) events in patients suffering from Parkinson's disease (PD) using signals received from wearable sensors (six accelerometers and two gyroscopes) placed on the patients' body. For this purpose, an automated methodology has been developed which consists of four stages. In the first stage, missing values due to signal loss or degradation are replaced and then (second stage) low frequency components of the raw signal are removed. In the third stage, the entropy of the raw signal is calculated. Finally (fourth stage), four classification algorithms have been tested (Naïve Bayes, Random Forests, Decision Trees and Random Tree) in order to detect the FoG events. The methodology has been evaluated using several different configurations of sensors in order to determine the set of sensors which produces optimal FoG episode detection. Signals were recorded from five healthy subjects, five patients with PD who presented the symptom of FoG and six patients who suffered from PD but did not present FoG events. The signals included 93 FoG events with 405.6 s total duration. The results indicate that the proposed methodology is able to detect FoG events with 81.94% sensitivity, 98.74% specificity, 96.11% accuracy and 98.6% area under the curve (AUC) using the signals from all sensors and the Random Forests classification algorithm. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
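A rough sketch of the final classification stage only: stages 1-3 of the pipeline are reduced here to a single made-up entropy-like feature per window, and scikit-learn's RandomForestClassifier stands in for the classifier; the study's features, windows and data are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Sketch: classify windows as FoG / non-FoG from one entropy-like feature
# per sensor window. Features and labels are synthetic placeholders.
entropy_feature = np.concatenate([rng.normal(0.8, 0.1, 100),   # FoG windows
                                  rng.normal(0.4, 0.1, 100)])  # normal gait
X = entropy_feature.reshape(-1, 1)
y = np.concatenate([np.ones(100), np.zeros(100)])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.3f}")
```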
A Methodology to Compare Insulin Dosing Recommendations in Real-Life Settings.
Groat, Danielle; Grando, Maria A; Thompson, Bithika; Neto, Pedro; Soni, Hiral; Boyle, Mary E; Bailey, Marilyn; Cook, Curtiss B
2017-11-01
We propose a methodology to analyze complex real-life glucose data in insulin pump users. Patients with type 1 diabetes (T1D) on insulin pumps were recruited from an academic endocrinology practice. Glucose data, insulin bolus (IB) amounts, and self-reported alcohol consumption and exercise events were collected for 30 days. Rules were developed to retrospectively compare IB recommendations from the insulin pump bolus calculator (IPBC) against recommendations from a proposed decision aid (PDA) and for assessing the PDA's recommendation for exercise and alcohol. Data from 15 participants were analyzed. When considering instances where glucose was below target, the PDA recommended a smaller dose in 14%, but a larger dose in 13% and an equivalent IB in 73%. For glucose levels at target, the PDA suggested an equivalent IB in 58% compared to the subject's IPBC, but higher doses in 20% and lower in 22%. In events where postprandial glucose was higher than target, the PDA suggested higher doses in 25%, lower doses in 13%, and equivalent doses in 62%. In 64% of all alcohol events the PDA would have provided appropriate advice. In 75% of exercise events, the PDA appropriately advised an IB, a carbohydrate snack, or neither. This study provides a methodology to systematically analyze real-life data generated by insulin pumps and allowed a preliminary analysis of the performance of the PDA for insulin dosing. Further testing of the methodological approach in a broader diabetes population and prospective testing of the PDA are needed.
Coupling Post-Event and Prospective Analyses for El Niño-related Risk Reduction in Peru
NASA Astrophysics Data System (ADS)
French, Adam; Keating, Adriana; Mechler, Reinhard; Szoenyi, Michael; Cisneros, Abel; Chuquisengo, Orlando; Etienne, Emilie; Ferradas, Pedro
2017-04-01
Analyses in the wake of natural disasters play an important role in identifying how ex ante risk reduction and ex post hazard response activities have both succeeded and fallen short in specific contexts, thereby contributing to recommendations for improving such measures in the future. Event analyses have particular relevance in settings where disasters are likely to reoccur, and especially where recurrence intervals are short. This paper applies the Post Event Review Capability (PERC) methodology to the context of frequently reoccurring El Niño Southern Oscillation (ENSO) events in the country of Peru, where over the last several decades ENSO impacts have generated high levels of damage and economic loss. Rather than analyzing the impacts of a single event, this study builds upon the existing PERC methodology by combining empirical event analysis with a critical examination of risk reduction and adaptation measures implemented both prior to and following several ENSO events in the late 20th and early 21st centuries. Additionally, the paper explores linking the empirical findings regarding the uptake and outcomes of particular risk reduction and adaptation strategies to a prospective, scenario-based approach for projecting risk several decades into the future.
Impact of Life Events on the Relapse of Schizophrenic Patients
ERIC Educational Resources Information Center
Hussein, Hassan Ali; Jacoob, Shirooq; Sharour, Loai Abu
2016-01-01
Objectives: To investigate the relationship between stressful life events and relapse in schizophrenic patients at psychiatric hospitals in Baghdad city. Methodology: A purposive (non-probability) sample of 50 schizophrenic patients who had relapsed was involved in the present study. Data were collected through the use of the…
Unbeck, Maria; Schildmeijer, Kristina; Henriksson, Peter; Jürgensen, Urban; Muren, Olav; Nilsson, Lena; Pukk Härenstam, Karin
2013-04-15
There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool", in detecting adverse events in adult orthopaedic inpatients. We performed a three-stage structured retrospective record review process in a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were primarily reviewed by registered nurses. Records containing a potential adverse event were forwarded to physicians for review in stage 2. Physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage all adverse events that were found with the two methods together were compared and all discrepancies after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the "Harvard Medical Practice Study" method and the "Global Trigger Tool" was 40.3% and 30.4%, respectively. More adverse events were identified using the "Harvard Medical Practice Study" method than using the "Global Trigger Tool". Differences in review methodology, perception of less severe adverse events and context knowledge may explain the observed difference between two expert review teams in the detection of adverse events.
Exploring Modes of Communication among Pupils in Brazil: Gender Issues in Academic Performance
ERIC Educational Resources Information Center
Teixeira, Adla B. M.; Villani, Carlos E.; do Nascimento, Silvania S.
2008-01-01
The objective of this study was to identify gender issues in the academic performance of boys and girls during physics classes in a laboratory. The methodology adopted was the observation and interactions of pupils during eight classroom events. The interactions were recorded and events were informally discussed with the teacher. The school…
SME Innovation and Learning: The Role of Networks and Crisis Events
ERIC Educational Resources Information Center
Saunders, Mark N. K.; Gray, David E; Goregaokar, Harshita
2014-01-01
Purpose: The purpose of this paper is to contribute to the literature on innovation and entrepreneurial learning by exploring how SMEs learn and innovate, how they use both formal and informal learning and in particular the role of networks and crisis events within their learning experience. Design/methodology/approach: Mixed method study,…
Schlesinger, Sabrina; Sonntag, Svenja R.
2016-01-01
Background A growing number of studies have linked elevated concentrations of circulating asymmetric (ADMA) and symmetric (SDMA) dimethylarginine to mortality and cardiovascular disease (CVD) events. To summarize the evidence, we conducted a systematic review and quantified associations of ADMA and SDMA with the risks of all-cause mortality and incident CVD in meta-analyses accounting for different populations and methodological approaches of the studies. Methods Relevant studies were identified in PubMed until February 2015. We used random-effects models to obtain summary relative risks (RR) and 95% confidence intervals (95%CIs), comparing top versus bottom tertiles. Dose-response relations were assessed by restricted cubic spline regression models and potential non-linearity was evaluated using a likelihood ratio test. Heterogeneity between subgroups was assessed by meta-regression analysis. Results For ADMA, 34 studies (total n = 32,428) investigating associations with all-cause mortality (events = 5,035) and 30 studies (total n = 30,624) investigating the association with incident CVD (events = 3,396) were included. The summary RRs (95%CI) for all-cause mortality were 1.52 (1.37–1.68) and for CVD 1.33 (1.22–1.45), comparing high versus low ADMA concentrations. Slight differences were observed across study populations and methodological approaches, with the strongest association of ADMA being reported with all-cause mortality in critically ill patients. For SDMA, 17 studies (total n = 18,163) were included for all-cause mortality (events = 2,903), and 13 studies (total n = 16,807) for CVD (events = 1,534). High vs. low levels of SDMA were associated with increased risk of all-cause mortality [summary RR (95%CI): 1.31 (1.18–1.46)] and CVD [summary RR (95%CI): 1.36 (1.10–1.68)]. The strongest associations were observed in general population samples. Conclusions The dimethylarginines ADMA and SDMA are independent risk markers for all-cause mortality and CVD across different populations and methodological approaches. PMID:27812151
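A compact sketch of the random-effects pooling step described above (DerSimonian-Laird weights on log relative risks); the per-study numbers are invented examples, not the reviewed studies' estimates.

```python
import numpy as np

# Sketch: DerSimonian-Laird random-effects pooling of log relative risks.
# The per-study RRs and confidence intervals below are invented examples.
rr = np.array([1.4, 1.7, 1.2, 1.9, 1.5])
ci_low = np.array([1.1, 1.2, 0.9, 1.3, 1.1])
ci_high = np.array([1.8, 2.4, 1.6, 2.8, 2.0])

y = np.log(rr)                                  # study effects on the log scale
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w_fixed = 1 / se**2

# Between-study variance (tau^2) by the DerSimonian-Laird estimator.
y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
Q = np.sum(w_fixed * (y - y_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - (len(y) - 1)) / c)

w_re = 1 / (se**2 + tau2)                       # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"Summary RR: {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(y_re - 1.96 * se_re):.2f}-{np.exp(y_re + 1.96 * se_re):.2f})")
```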
Major Life Events and Daily Hassles in Predicting Health Status: Methodological Inquiry.
ERIC Educational Resources Information Center
Flannery, Raymond B., Jr.
1986-01-01
Hypothesized that both major life events and daily hassles would be associated with anxiety and depression symptomatology. While the results partially support the hypothesis, the inconsistent findings suggest methodological flaws in each life stress measure. Reviews these limitations and presents the use of the semi-structured interview as one…
Andrew Hill; Jay Beaman; Joseph O' Leary
2001-01-01
This paper is about estimating a salience scale for trip reporting. The measurement project began as a way of establishing the effects of methodological changes between 1994 and 1997 in the Canadian Travel Survey. This is a survey that Canada uses to study the travel of its residents. There were several changes in methodology that could be expected to influence how...
Applying Lean principles and Kaizen rapid improvement events in public health practice.
Smith, Gene; Poteat-Godwin, Annah; Harrison, Lisa Macon; Randolph, Greg D
2012-01-01
This case study describes a local home health and hospice agency's effort to implement Lean principles and Kaizen methodology as a rapid improvement approach to quality improvement. The agency created a cross-functional team, followed Lean Kaizen methodology, and made significant improvements in scheduling time for home health nurses that resulted in reduced operational costs, improved working conditions, and multiple organizational efficiencies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
Early Life Exposures and Cancer
Early-life events and exposures have important consequences for cancer development later in life; however, epidemiological studies of these early-life factors have faced significant methodological challenges.
An innovative and shared methodology for event reconstruction using images in forensic science.
Milliet, Quentin; Jendly, Manon; Delémont, Olivier
2015-09-01
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Tsai, Chu-Lin; Camargo, Carlos A
2009-09-01
Acute exacerbations of chronic disease are ubiquitous in clinical medicine, and thus far, there has been a paucity of integrated methodological discussion on this phenomenon. We use acute exacerbations of chronic obstructive pulmonary disease as an example to emphasize key epidemiological and statistical issues for this understudied field in clinical epidemiology. Directed acyclic graphs are a useful epidemiological tool to explain the differential effects of risk factors on health outcomes in studies of acute and chronic phases of disease. To study the pathogenesis of acute exacerbations of chronic disease, the case-crossover design and time-series analysis are well-suited study designs to differentiate acute and chronic effects. Modeling changes over time and setting appropriate thresholds are important steps to separate acute from chronic phases of disease in serial measurements. In statistical analysis, acute exacerbations are recurrent events, and some individuals are more prone to recurrences than others. Therefore, appropriate statistical modeling should take into account intraindividual dependence. Finally, we recommend the use of "event-based" number needed to treat (NNT) to prevent a single exacerbation instead of the traditional patient-based NNT. Addressing these methodological challenges will advance research quality in acute-on-chronic disease epidemiology.
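The distinction between the recommended "event-based" NNT and the traditional patient-based NNT can be made concrete with a small calculation; the rates and proportions below are hypothetical and serve only to illustrate the two definitions.

```python
# Hypothetical exacerbation counts from a trial with recurrent events.
control_events, control_person_years = 450, 300.0   # 1.50 events per patient-year
treated_events, treated_person_years = 330, 300.0   # 1.10 events per patient-year

rate_control = control_events / control_person_years
rate_treated = treated_events / treated_person_years

# Event-based NNT: patient-years of treatment needed to prevent one exacerbation.
nnt_event_based = 1.0 / (rate_control - rate_treated)

# Traditional patient-based NNT, using the proportion of patients with >= 1 event.
p_control, p_treated = 0.62, 0.55                    # hypothetical proportions
nnt_patient_based = 1.0 / (p_control - p_treated)

print(f"event-based NNT: {nnt_event_based:.1f} patient-years per exacerbation prevented")
print(f"patient-based NNT: {nnt_patient_based:.1f} patients per patient spared an event")
```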
2012-01-01
Background Adverse consequences of medical interventions are a source of concern, but clinical trials may lack power to detect elevated rates of such events, while observational studies have inherent limitations. Meta-analysis allows the combination of individual studies, which can increase power and provide stronger evidence relating to adverse events. However, meta-analysis of adverse events has associated methodological challenges. The aim of this study was to systematically identify and review the methodology used in meta-analyses where a primary outcome is an adverse or unintended event, following a therapeutic intervention. Methods Using a collection of reviews identified previously, 166 references including a meta-analysis were selected for review. At least one of the primary outcomes in each review was an adverse or unintended event. The nature of the intervention, source of funding, number of individual meta-analyses performed, number of primary studies included in the review, and use of meta-analytic methods were all recorded. Specific areas of interest relating to the methods used included the choice of outcome metric, methods of dealing with sparse events, heterogeneity, publication bias and use of individual patient data. Results The 166 included reviews were published between 1994 and 2006. Interventions included drugs and surgery among other interventions. Many of the references being reviewed included multiple meta-analyses with 44.6% (74/166) including more than ten. Randomised trials only were included in 42.2% of meta-analyses (70/166), observational studies only in 33.7% (56/166) and a mix of observational studies and trials in 15.7% (26/166). Sparse data, in the form of zero events in one or both arms where the outcome was a count of events, was found in 64 reviews of two-arm studies, of which 41 (64.1%) had zero events in both arms. Conclusions Meta-analyses of adverse events data are common and useful in terms of increasing the power to detect an association with an intervention, especially when the events are infrequent. However, with regard to existing meta-analyses, a wide variety of different methods have been employed, often with no evident rationale for using a particular approach. More specifically, the approach to dealing with zero events varies, and guidelines on this issue would be desirable. PMID:22553987
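As one illustration of the zero-event problem highlighted in this review's conclusions, the sketch below applies the common (but debated) 0.5 continuity correction to a two-arm study before computing an odds ratio; the counts are hypothetical and other corrections or exact methods may be preferable.

```python
import math

def odds_ratio_with_correction(a, b, c, d, cc=0.5):
    """Odds ratio and 95% CI for a 2x2 table (events/non-events in two arms).

    If any cell is zero, a continuity correction `cc` is added to every cell,
    one common (but debated) way to handle sparse adverse-event data.
    """
    if 0 in (a, b, c, d):
        a, b, c, d = (x + cc for x in (a, b, c, d))
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical study: 0/120 events on treatment, 3/118 on control.
print(odds_ratio_with_correction(0, 120, 3, 115))
```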
Degrees of Consciousness in the Communication of Actions and Events on the Visual Cliff. No. 58.
ERIC Educational Resources Information Center
Bierschenk, Bernhard
The consciousness of dizygotic twins in their communication of actions and events as seen in the visual cliff pictures published by E. J. Gibson and R. D. Walk (1960) was studied in Sweden. In the process of communication, many different state spaces are generated. The methodology demonstrates that ecological and biophysical properties of language…
Event-related brain potentials and the study of reward processing: Methodological considerations.
Krigolson, Olave E
2017-11-14
There is growing interest in using electroencephalography and specifically the event-related brain potential (ERP) methodology to study human reward processing. Since the discovery of the feedback related negativity (Miltner et al., 1997) and the development of theories associating the feedback related negativity and more recently the reward positivity with reinforcement learning, midbrain dopamine function, and the anterior cingulate cortex (i.e., Holroyd and Coles, 2002) researchers have used the ERP methodology to probe the neural basis of reward learning in humans. However, examination of the feedback related negativity and the reward positivity cannot be done without an understanding of some key methodological issues that must be taken into account when using ERPs and examining these ERP components. For example, even the component name - the feedback related negativity - is a source of debate within the research community as some now strongly feel that the component should be named the reward positivity (Proudfit, 2015). Here, ten key methodological issues are discussed - confusion in component naming, the reward positivity, component identification, peak quantification and the use of difference waveforms, frequency (the N200) and component contamination (the P300), the impact of feedback timing, action, and task learnability, and how learning results in changes in the amplitude of the feedback-related negativity/reward positivity. The hope here is to not provide a definitive approach for examining the feedback related negativity/reward positivity, but instead to outline the key issues that must be taken into account when examining this component to assist researchers in their study of human reward processing with the ERP methodology. Copyright © 2017 Elsevier B.V. All rights reserved.
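To make one of the listed issues concrete, the sketch below computes a reward positivity as a win-minus-loss difference waveform and quantifies it with a mean amplitude in an a priori window rather than a single peak sample; the data are synthetic, and the sampling rate, component latency and window are assumptions rather than values from the cited studies.

```python
import numpy as np

fs = 250                                   # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.8, 1 / fs)           # epoch from -200 ms to 800 ms around feedback
rng = np.random.default_rng(0)

def synth_erp(reward_amp, n_trials=80):
    """Synthetic feedback-locked ERP: a reward-sensitive deflection near 300 ms."""
    component = reward_amp * np.exp(-((t - 0.3) ** 2) / (2 * 0.04 ** 2))
    trials = component + rng.normal(0, 2.0, size=(n_trials, t.size))
    return trials.mean(axis=0)             # trial-averaged waveform

win_erp, loss_erp = synth_erp(4.0), synth_erp(1.0)
difference = win_erp - loss_erp            # reward positivity as a difference waveform

# Mean amplitude in an a priori window (often preferred over a single peak sample)
window = (t >= 0.25) & (t <= 0.35)
mean_amp = difference[window].mean()
peak_latency = t[window][np.argmax(difference[window])]
print(f"mean amplitude 250-350 ms: {mean_amp:.2f} uV, peak at {peak_latency * 1000:.0f} ms")
```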
Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari
2002-01-01
The aim of our study is to establish a methodology for analyzing the level of security requirements, for identifying suitable security measures, and for optimizing the distribution of security across every portion of medical practice. Quantitative expression is introduced wherever possible to allow easy follow-up of security procedures and easy evaluation of security outcomes. System analysis by fault tree analysis (FTA) showed that subdividing system elements in detail contributes to a much more accurate analysis. Such subdivided composition factors depend strongly on the behavior of staff, the interactive terminal devices, the kinds of service, and the routes of the network. In conclusion, we established methods, employing FTA, to analyze the level of security requirements for each medical information system in terms of basic events for each composition factor and combinations of basic events. Methods for identifying suitable security measures were also established: risk factors for each basic event, the number of elements for each composition factor, and candidate security measure elements were determined. Finally, a method to optimize the security measures for each medical information system was proposed: the optimum distribution of risk factors in terms of basic events was derived, allowing comparison between medical information systems.
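A minimal sketch of the kind of basic-event combination that underlies an FTA-based requirement analysis is shown below; the gate structure, event names and probabilities are hypothetical and are not taken from the study.

```python
def or_gate(probs):
    """Probability that at least one independent basic event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """Probability that all independent basic events occur together."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic events for one composition factor of a medical information system
p_staff_error    = 0.02   # operator mishandles an interactive terminal
p_weak_password  = 0.10   # weak credential in use
p_no_audit_trail = 0.05   # audit logging disabled on one route of the network

# Top event: unauthorised access requires a weak credential AND
# (a staff error OR a missing audit trail) - a hypothetical gate structure.
p_top = and_gate([p_weak_password, or_gate([p_staff_error, p_no_audit_trail])])
print(f"top-event probability: {p_top:.4f}")
```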
A methodology to event reconstruction from trace images.
Milliet, Quentin; Delémont, Olivier; Sapin, Eric; Margot, Pierre
2015-03-01
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and opportunities to consider the images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experiences and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence for which the results from each step rely on the previous step. However, the methodology is not linear but a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase, wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
The Carbon Aerosol / Particles Nucleation with a Lidar: Numerical Simulations and Field Studies
NASA Astrophysics Data System (ADS)
Miffre, Alain; Anselmo, Christophe; Francis, Mirvatte; David, Gregory; Rairoux, Patrick
2016-06-01
In this contribution, we present the results of two recent papers [1,2] published in Optics Express, dedicated to the development of two new lidar methodologies. In [1], while the carbon aerosol (for example, soot particles) is recognized as a major source of uncertainty for climate and public health, we couple lidar remote sensing with Laser-Induced Incandescence (LII) to retrieve the vertical profile of the very low thermal radiation emitted by the carbon aerosol, in agreement with Planck's law, in an urban atmosphere over several hundred meters altitude. In paper [2], selected as a June 2014 OSA Spotlight, we identify the optical requirements that allow an elastic lidar to be sensitive to new particle formation events (NPF events) in the atmosphere, while, in the literature, not all the ingredients initiating nucleation have yet been identified [3]. Both papers proceed with the same methodology by identifying the optical requirements from numerical simulation (Planck's and Kirchhoff's laws in [1], Mie and T-matrix numerical codes in [2]), then presenting lidar field application case studies. We believe these new lidar methodologies may be useful for climate and geophysical studies, as well as for fundamental purposes.
Calibration methodology for proportional counters applied to yield measurements of a neutron burst.
Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo
2014-01-01
This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by standard nuclear electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and on the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. The implementation of the methodology for the measurement of fast neutron yields generated in plasma focus experiments using a moderated proportional counter is discussed herein. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
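The core estimation step can be sketched as follows, assuming a mean charge per detected neutron obtained from the pulse-mode calibration and a simple first-order uncertainty; the charges, efficiency and burst charge are hypothetical, and this is not the authors' full statistical model.

```python
import numpy as np

# Pulse-mode calibration: charge per single detected neutron (hypothetical, in pC)
single_event_charges = np.array([4.8, 5.1, 5.0, 4.7, 5.3, 4.9, 5.2, 5.0, 4.6, 5.1])
q_mean = single_event_charges.mean()
q_std = single_event_charges.std(ddof=1)

# Burst measurement: total charge integrated at the counter output (hypothetical)
Q_burst = 2.6e4  # pC

# Estimated number of detected events and a first-order uncertainty:
# Poisson fluctuation of the event count plus the spread of the single-event charge.
N_est = Q_burst / q_mean
var_N = N_est + N_est * (q_std / q_mean) ** 2
print(f"detected events: {N_est:.0f} +/- {np.sqrt(var_N):.0f}")

# The neutron yield then follows from the detection efficiency (assumed value).
efficiency = 1.2e-4
print(f"estimated yield: {N_est / efficiency:.3e} neutrons")
```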
Hart, Kenneth E; Fazaa, Norman
2004-07-01
This study examined the relationship between life stress events and level of alcohol misuse using two stress indices. The first index consisted of stress events that are not likely to be caused by alcohol misuse (i.e., alcohol-uncontaminated stress events). The second stress index consisted of items that were judged to be likely consequences of alcohol misuse (i.e., alcohol-contaminated stress events). Results based on a questionnaire study of 378 undergraduates in 2000 showed that level of alcohol misuse was much more strongly related to alcohol-contaminated life stress events than to alcohol-uncontaminated life events. Comparative analysis of the coefficients of determination indicated the effect size of the association with alcohol-contaminated life stress events was 240% larger than the corresponding effect size for the association with alcohol-uncontaminated life events. Results suggest that studies that test the tension reduction hypothesis should employ greater methodological rigor to ensure measures of life stress events are not inadvertently assessing the consequences of alcohol misuse. The results highlight the need to distinguish between stressful life events that contribute to alcohol misuse and stressful life events that are consequences of alcohol misuse.
Portal Surveys of Time-Out Drinking Locations: A Tool for Studying Binge Drinking and AOD Use
ERIC Educational Resources Information Center
Voas, Robert B.; Furr-Holden, Debra; Lauer, Elizabeth; Bright, Kristin; Johnson, Mark B.; Miller, Brenda
2006-01-01
Portal surveys, defined as assessments occurring proximal to the entry point to a high-risk locale and immediately on exit, can be used in different settings to measure characteristics and behavior of attendees at an event of interest. This methodology has been developed to assess alcohol and other drug (AOD) use at specific events and has…
5 CFR 847.604 - Methodology for determining deficiency.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Methodology for determining deficiency. (a) When an event listed in the left column of the table in Appendix A... transfer to the Fund under subpart E of this part including earnings under § 847.507. (2) OPM will add the... (a)(2) of this section is greater than zero, the deficiency is equal to that amount. (c) If no event...
5 CFR 847.604 - Methodology for determining deficiency.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Methodology for determining deficiency. (a) When an event listed in the left column of the table in Appendix A... transfer to the Fund under subpart E of this part including earnings under § 847.507. (2) OPM will add the... (a)(2) of this section is greater than zero, the deficiency is equal to that amount. (c) If no event...
5 CFR 847.604 - Methodology for determining deficiency.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Methodology for determining deficiency. (a) When an event listed in the left column of the table in Appendix A... transfer to the Fund under subpart E of this part including earnings under § 847.507. (2) OPM will add the... (a)(2) of this section is greater than zero, the deficiency is equal to that amount. (c) If no event...
Guglielminotti, Jean; Dechartres, Agnès; Mentré, France; Montravers, Philippe; Longrois, Dan; Laouénan, Cedric
2015-10-01
Prognostic research studies in anesthesiology aim to identify risk factors for an outcome (explanatory studies) or calculate the risk of this outcome on the basis of patients' risk factors (predictive studies). Multivariable models express the relationship between predictors and an outcome and are used in both explanatory and predictive studies. Model development demands a strict methodology and clear reporting to assess its reliability. In this methodological descriptive review, we critically assessed the reporting and methodology of multivariable analysis used in observational prognostic studies published in anesthesiology journals. A systematic search was conducted on Medline through Web of Knowledge, PubMed, and journal websites to identify observational prognostic studies with multivariable analysis published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anaesthesia, and Anaesthesia in 2010 and 2011. Data were extracted by 2 independent readers. First, studies were analyzed with respect to reporting of outcomes, design, size, methods of analysis, model performance (discrimination and calibration), model validation, clinical usefulness, and the STROBE (i.e., Strengthening the Reporting of Observational Studies in Epidemiology) checklist. A reporting rate was calculated on the basis of 21 items of the aforementioned points. Second, they were analyzed with respect to some predefined methodological points. Eighty-six studies were included: 87.2% were explanatory and 80.2% investigated a postoperative event. The reporting was fairly good, with a median reporting rate of 79% (75% in explanatory studies and 100% in predictive studies). Six items had a reporting rate <36% (i.e., the 25th percentile), with some of them not identified in the STROBE checklist: blinded evaluation of the outcome (11.9%), reason for sample size (15.1%), handling of missing data (36.0%), assessment of colinearity (17.4%), assessment of interactions (13.9%), and calibration (34.9%). When reported, a few methodological shortcomings were observed, both in explanatory and predictive studies, such as an insufficient number of events of the outcome (44.6%), exclusion of cases with missing data (93.6%), or categorization of continuous variables (65.1%). The reporting of multivariable analysis was fairly good and could be further improved by consulting reporting guidelines and the EQUATOR Network website. Limiting the number of candidate variables, including cases with missing data, and not arbitrarily categorizing continuous variables should be encouraged.
Seismic Characterization of EGS Reservoirs
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.
2014-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller-magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS dataset and compare it to another EGS dataset. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
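As an illustration of how a 95% probability ellipsoid can be obtained from a location covariance matrix, the sketch below scales the eigen-decomposition of a hypothetical 3x3 covariance by the chi-square quantile with three degrees of freedom; it is not the MicroBayesLoc algorithm itself, and the covariance values are invented for illustration.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical 3x3 location covariance (east, north, depth), in km^2
cov = np.array([[0.040, 0.010, 0.005],
                [0.010, 0.060, 0.008],
                [0.005, 0.008, 0.120]])

# Eigen-decomposition gives the ellipsoid orientation (eigenvectors)
# and the variances along its principal axes (eigenvalues).
eigvals, eigvecs = np.linalg.eigh(cov)

# Scale by the chi-square quantile with 3 degrees of freedom for a 95% region.
k = chi2.ppf(0.95, df=3)
semi_axes_km = np.sqrt(k * eigvals)

print("95% ellipsoid semi-axes (km):", np.round(semi_axes_km, 3))
print("axis directions (columns):\n", np.round(eigvecs, 3))
```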
Bas, Esra
2014-07-01
In this paper, an integrated methodology for Quality Function Deployment (QFD) and a 0-1 knapsack model is proposed for occupational safety and health as a systems thinking approach. The House of Quality (HoQ) in QFD methodology is a systematic tool to consider the inter-relationships between two factors. In this paper, three HoQs are used to consider the interrelationships between tasks and hazards, hazards and events, and events and preventive/protective measures. The final priority weights of events are defined by considering their project-specific preliminary weights, probability of occurrence, and effects on the victim and the company. The priority weights of the preventive/protective measures obtained in the last HoQ are fed into a 0-1 knapsack model for the investment decision. Then, the selected preventive/protective measures can be adapted to the task design. The proposed step-by-step methodology can be applied to any stage of a project to design the workplace for occupational safety and health, and continuous improvement for safety is endorsed by the closed loop characteristic of the integrated methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
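A minimal sketch of the investment-decision step described above is shown below: the final HoQ priority weights of the preventive/protective measures feed a 0-1 knapsack model solved by dynamic programming under a budget constraint. The weights, costs and budget are hypothetical and are not taken from the paper.

```python
def knapsack_select(weights, costs, budget):
    """0-1 knapsack by dynamic programming.

    weights: HoQ priority weights of preventive/protective measures (benefit)
    costs:   implementation costs (integer units)
    budget:  available budget (integer units)
    Returns (best total weight, list of selected measure indices).
    """
    n = len(weights)
    best = [[0.0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best[i][b] = best[i - 1][b]
            if costs[i - 1] <= b:
                candidate = best[i - 1][b - costs[i - 1]] + weights[i - 1]
                if candidate > best[i][b]:
                    best[i][b] = candidate
    # Backtrack to recover the chosen measures
    chosen, b = [], budget
    for i in range(n, 0, -1):
        if best[i][b] != best[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return best[n][budget], sorted(chosen)

# Hypothetical priority weights (from the third HoQ) and costs for five measures
weights = [0.32, 0.25, 0.18, 0.15, 0.10]
costs   = [4, 3, 2, 2, 1]
print(knapsack_select(weights, costs, budget=7))
```

The selected measures would then be fed back into the task design, consistent with the closed-loop character of the integrated methodology.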
Decision-problem state analysis methodology
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but such resolutions appear to be one of the major areas of accident causation cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or for the development of a training simulation.
Source space analysis of event-related dynamic reorganization of brain networks.
Ioannides, Andreas A; Dimitriadis, Stavros I; Saridis, George A; Voultsidou, Marotesa; Poghosyan, Vahe; Liu, Lichan; Laskaris, Nikolaos A
2012-01-01
How the brain works is nowadays synonymous with how different parts of the brain work together and the derivation of mathematical descriptions for the functional connectivity patterns that can be objectively derived from data of different neuroimaging techniques. In most cases static networks are studied, often relying on resting state recordings. Here, we present a quantitative study of dynamic reconfiguration of connectivity for event-related experiments. Our motivation is the development of a methodology that can be used for personalized monitoring of brain activity. In line with this motivation, we use data with visual stimuli from a typical subject that participated in different experiments that were previously analyzed with traditional methods. The earlier studies identified well-defined changes in specific brain areas at specific latencies related to attention, properties of stimuli, and tasks demands. Using a recently introduced methodology, we track the event-related changes in network organization, at source space level, thus providing a more global and complete view of the stages of processing associated with the regional changes in activity. The results suggest the time evolving modularity as an additional brain code that is accessible with noninvasive means and hence available for personalized monitoring and clinical applications.
Santaguida, Pasqualina; Oremus, Mark; Walker, Kathryn; Wishart, Laurie R; Siegel, Karen Lohmann; Raina, Parminder
2012-04-01
A "review of reviews" was undertaken to assess methodological issues in studies evaluating nondrug rehabilitation interventions in stroke patients. MEDLINE, CINAHL, PsycINFO, and the Cochrane Database of Systematic Reviews were searched from January 2000 to January 2008 within the stroke rehabilitation setting. Electronic searches were supplemented by reviews of reference lists and citations identified by experts. Eligible studies were systematic reviews; excluded citations were narrative reviews or reviews of reviews. Review characteristics and criteria for assessing methodological quality of primary studies within them were extracted. The search yielded 949 English-language citations. We included a final set of 38 systematic reviews. Cochrane reviews, which have a standardized methodology, were generally of higher methodological quality than non-Cochrane reviews. Most systematic reviews used standardized quality assessment criteria for primary studies, but not all were comprehensive. Reviews showed that primary studies had problems with randomization, allocation concealment, and blinding. Baseline comparability, adverse events, and co-intervention or contamination were not consistently assessed. Blinding of patients and providers was often not feasible and was not evaluated as a source of bias. The eligible systematic reviews identified important methodological flaws in the evaluated primary studies, suggesting the need for improvement of research methods and reporting. Copyright © 2012 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Steinhauer, Karsten
2014-01-01
This article provides a selective overview of recent event-related brain potential (ERP) studies in L2 morpho-syntax, demonstrating that the ERP evidence supporting the critical period hypothesis (CPH) may be less compelling than previously thought. The article starts with a general introduction to ERP methodology and language-related ERP profiles…
Analysis of Additive Manufacturing for Sustainment of Naval Aviation Systems
2017-09-01
A component selection methodology is developed to query the aviation spare-parts inventory for identification of additive manufacturing candidates. The methodology organizes the resultant data using a top-down approach that aligns technical feasibility with programmatic objectives. Finally, a discrete event...
Identifying treatment effect heterogeneity in clinical trials using subpopulations of events: STEPP.
Lazar, Ann A; Bonetti, Marco; Cole, Bernard F; Yip, Wai-Ki; Gelber, Richard D
2016-04-01
Investigators conducting randomized clinical trials often explore treatment effect heterogeneity to assess whether treatment efficacy varies according to patient characteristics. Identifying heterogeneity is central to making informed personalized healthcare decisions. Treatment effect heterogeneity can be investigated using subpopulation treatment effect pattern plot (STEPP), a non-parametric graphical approach that constructs overlapping patient subpopulations with varying values of a characteristic. Procedures for statistical testing using subpopulation treatment effect pattern plot when the endpoint of interest is survival remain an area of active investigation. A STEPP analysis was used to explore patterns of absolute and relative treatment effects for varying levels of a breast cancer biomarker, Ki-67, in the phase III Breast International Group 1-98 randomized clinical trial, comparing letrozole to tamoxifen as adjuvant therapy for postmenopausal women with hormone receptor-positive breast cancer. Absolute treatment effects were measured by differences in 4-year cumulative incidence of breast cancer recurrence, while relative effects were measured by the subdistribution hazard ratio in the presence of competing risks using O-E (observed-minus-expected) methodology, an intuitive non-parametric method. While estimation of hazard ratio values based on O-E methodology has been shown, a similar development for the subdistribution hazard ratio has not. Furthermore, we observed that the subpopulation treatment effect pattern plot analysis may not produce results, even with 100 patients within each subpopulation. After further investigation through simulation studies, we observed inflation of the type I error rate of the traditional test statistic and sometimes singular variance-covariance matrix estimates that may lead to results not being produced. This is due to the lack of sufficient number of events within the subpopulations, which we refer to as instability of the subpopulation treatment effect pattern plot analysis. We introduce methodology designed to improve stability of the subpopulation treatment effect pattern plot analysis and generalize O-E methodology to the competing risks setting. Simulation studies were designed to assess the type I error rate of the tests for a variety of treatment effect measures, including subdistribution hazard ratio based on O-E estimation. This subpopulation treatment effect pattern plot methodology and standard regression modeling were used to evaluate heterogeneity of Ki-67 in the Breast International Group 1-98 randomized clinical trial. We introduce methodology that generalizes O-E methodology to the competing risks setting and that improves stability of the STEPP analysis by pre-specifying the number of events across subpopulations while controlling the type I error rate. The subpopulation treatment effect pattern plot analysis of the Breast International Group 1-98 randomized clinical trial showed that patients with high Ki-67 percentages may benefit most from letrozole, while heterogeneity was not detected using standard regression modeling. The STEPP methodology can be used to study complex patterns of treatment effect heterogeneity, as illustrated in the Breast International Group 1-98 randomized clinical trial. For the subpopulation treatment effect pattern plot analysis, we recommend a minimum of 20 events within each subpopulation. © The Author(s) 2015.
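The construction of overlapping subpopulations that underlies a STEPP analysis can be sketched with a sliding window over the sorted biomarker, here also enforcing a minimum number of events per subpopulation in line with the recommendation above; the data are synthetic and the window sizes are illustrative choices, not the trial's analysis settings.

```python
import numpy as np

def stepp_subpopulations(covariate, events, window_size=100, step=20, min_events=20):
    """Overlapping subpopulations along a continuous covariate (STEPP-style).

    covariate: biomarker value per patient (e.g., Ki-67 percentage)
    events:    1 if the patient experienced the event, 0 otherwise
    Returns a list of (covariate median, patient indices) for windows that
    contain at least `min_events` events.
    """
    order = np.argsort(covariate)
    subpops = []
    for start in range(0, len(order) - window_size + 1, step):
        idx = order[start:start + window_size]
        if events[idx].sum() >= min_events:
            subpops.append((float(np.median(covariate[idx])), idx))
    return subpops

# Synthetic illustration
rng = np.random.default_rng(1)
ki67 = rng.uniform(1, 60, size=600)
event = rng.binomial(1, 0.05 + 0.004 * ki67)        # event risk rising with Ki-67
for med, idx in stepp_subpopulations(ki67, event):
    print(f"median Ki-67 {med:5.1f}: {event[idx].sum():3d} events / {len(idx)} patients")
```

Within each retained subpopulation one would then estimate the treatment effect (for example, a difference in cumulative incidence or a subdistribution hazard ratio) and plot it against the subpopulation medians.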
Infantry Weapons Test METHODOLOGY Study. Volume 2. Antitank Weapons Test Methodology
1972-01-17
...the event of a hit so located (e.g., on the treads) as to disable a tank... which disable the tank's guns. Guided Missile - An unmanned vehicle moving above the surface of the earth, whose trajectory or flight path is capable... attitude and well-being. People can be classed according to a need to achieve and the characteristics displayed by high-need achievers and low-need...
Design risk assessment for burst-prone mines: Application in a Canadian mine
NASA Astrophysics Data System (ADS)
Cheung, David J.
A proactive stance towards improving the effectiveness and consistency of risk assessments has been adopted recently by mining companies and industry. Forecasts for the next 10-20 years suggest that ore deposits accessible using shallow mining techniques will diminish. The industry continues to strive for success in "deeper" mining projects in order to keep up with the continuing demand for raw materials. Although the returns are quite profitable, many projects have been sidelined due to high uncertainty and technical risk in the mining of the mineral deposit. Several hardrock mines have faced rockbursting and seismicity problems. Among those reported, mines in countries such as South Africa, Australia and Canada have documented cases of severe rockburst conditions attributed to the mining depth. Severe rockburst conditions known as "burst-prone" can be effectively managed with design. Adopting a more robust design can reduce the exposure of workers and equipment to adverse conditions and minimize the economic consequences, which can hinder the bottom line of an operation. This thesis presents a methodology created for assessing the design risk in burst-prone mines. The methodology includes an evaluation of relative risk ratings for scenarios with options of risk reduction through several design principles. With rockbursts being a hazard of seismic events, the methodology is based on research in the area of mining seismicity, factoring in rockmass failure mechanisms, which result from a combination of mining-induced stress, geological structures, rockmass properties and mining influences. The methodology was applied to case studies at Craig Mine of Xstrata Nickel in Sudbury, Ontario, which is known to contain seismically active fault zones. A customized risk assessment was created and applied to rockburst case studies, evaluating the seismic vulnerability and consequence for each case. Application of the methodology to Craig Mine demonstrates that changes in the design can reduce both exposure risk (personnel and equipment) and economic risk (revenue and costs). Fatal and catastrophic consequences can be averted through robust planning and design. Two customized approaches were developed to conduct risk assessment of case studies at Craig Mine. Firstly, the Brownfield Approach utilizes the seismic database to determine the seismic hazard from a rating system that evaluates frequency-magnitude, event size, and event-blast relation. Secondly, the Greenfield Approach utilizes the seismic database, focusing on larger magnitude events, rocktype, and geological structure. The customized Greenfield Approach can also be applied in the evaluation of design risk in deep mines with the same setting and condition as Craig Mine. Other mines with different settings and conditions can apply the principles in the methodology to evaluate design alternatives and risk reduction strategies for burst-prone mines.
Life Cycle Assessment to support the quantification of the environmental impacts of an event
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toniolo, Sara; Mazzi, Anna; Fedele, Andrea
In recent years, several tools have been used to define and quantify the environmental impacts associated with an event; however, a lack of uniform approaches for conducting environmental evaluations has been revealed. The aim of this paper is to evaluate whether the Life Cycle Assessment methodology, which is rarely applied to an event, can be an appropriate tool for calculating the environmental impacts associated with the assembly, disassembly, and use phase of an event, analysing in particular the components and the displays used to establish the exhibits. The aim is also to include the issues reported by ISO 20121:2012 involving the interested parties that can be monitored but also affected by the event owner, namely the event organiser, the workforce and the supply chain. A small event held in Northern Italy was selected as the subject of the research. The results obtained show that the main contributors are energy consumption for lighting and heating and the use of aluminium materials, such as bars for supporting the spotlights, carpet and the electronic equipment. A sensitivity analysis for estimating the effects of the impact assessment method chosen has also been conducted and an uncertainty analysis has been performed using the Monte Carlo technique. This study highlighted the importance of the energy consumed by heating and lighting on the environmental implications, and indicated that the preparation and assembly should always be considered when quantifying the environmental profile of an event. - Highlights: • LCA methodology, developed for products and services, is applied to an event. • A small event held in Northern Italy is analysed. • The main contributors are energy consumption and the use of aluminium and carpet. • Exhibition site preparation can have important environmental implications. • This study demonstrates the importance of the assembly, disassembly and use phase.
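A Monte Carlo uncertainty analysis of the kind mentioned in this abstract can be sketched as follows: each inventory flow is sampled from a distribution, multiplied by a characterization factor, and the resulting impact totals are summarized. All flows, uncertainties and factors below are hypothetical and are not the study's inventory.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 10_000

# Hypothetical inventory flows for the event: (mean, relative standard deviation)
flows = {
    "electricity_lighting_kWh": (3200.0, 0.10),
    "electricity_heating_kWh":  (5400.0, 0.15),
    "aluminium_bars_kg":        (120.0,  0.20),
    "carpet_m2":                (450.0,  0.10),
}

# Hypothetical characterization factors for climate change (kg CO2-eq per unit)
cf = {
    "electricity_lighting_kWh": 0.45,
    "electricity_heating_kWh":  0.45,
    "aluminium_bars_kg":        8.0,
    "carpet_m2":                3.5,
}

totals = np.zeros(n_runs)
for name, (mean, rel_sd) in flows.items():
    samples = rng.normal(mean, rel_sd * mean, size=n_runs)  # lognormal is also common
    totals += np.clip(samples, 0, None) * cf[name]

print(f"climate change score: {totals.mean():.0f} kg CO2-eq "
      f"(2.5-97.5%: {np.percentile(totals, 2.5):.0f}-{np.percentile(totals, 97.5):.0f})")
```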
Experimental Learning Enhancing Improvisation Skills
ERIC Educational Resources Information Center
Pereira Christopoulos, Tania; Wilner, Adriana; Trindade Bestetti, Maria Luisa
2016-01-01
Purpose: This study aims to present improvisation training and experimentation as an alternative method to deal with unexpected events in which structured processes do not seem to work. Design/Methodology/Approach: Based on the literature of sensemaking and improvisation, the study designs a framework and process model of experimental learning…
Focal Event, Contextualization, and Effective Communication in the Mathematics Classroom
ERIC Educational Resources Information Center
Nilsson, Per; Ryve, Andreas
2010-01-01
The aim of this article is to develop analytical tools for studying mathematical communication in collaborative activities. The theoretical construct of contextualization is elaborated methodologically in order to study diversity in individual thinking in relation to effective communication. The construct of contextualization highlights issues of…
Single event test methodology for integrated optoelectronics
NASA Technical Reports Server (NTRS)
Label, Kenneth A.; Cooley, James A.; Stassinopoulos, E. G.; Marshall, Paul; Crabtree, Christina
1993-01-01
A single event upset (SEU), defined as a transient or glitch on the output of a device, and its applicability to integrated optoelectronics are discussed in the context of spacecraft design and the need for more than a bit error rate viewpoint for testing and analysis. A methodology for testing integrated optoelectronic receivers and transmitters for SEUs is presented, focusing on the actual test requirements and system schemes needed for integrated optoelectronic devices. Two main causes of single event effects in the space environment, protons and galactic cosmic rays, are considered along with ground test facilities for simulating the space environment.
A Process Study of the Development of Virtual Research Environments
NASA Astrophysics Data System (ADS)
Ahmed, I.; Cooper, K.; McGrath, R.; Griego, G.; Poole, M. S.; Hanisch, R. J.
2014-05-01
In recent years, cyberinfrastructures have been deployed to create virtual research environments (VREs) - such as the Virtual Astronomical Observatory (VAO) - to enhance the quality and speed of scientific research, and to foster global scientific communities. Our study utilizes process methodology to study the evolution of VREs. This approach focuses on a series of events that bring about or lead to some outcome, and attempts to specify the generative mechanism that could produce the event series. This paper briefly outlines our approach and describes initial results of a case study of the VAO, one of the participating VREs. The case study is based on interviews with seven individuals participating in the VAO, and analysis of project documents and online resources. These sources are hand tagged to identify events related to the thematic tracks, to yield a narrative of the project. Results demonstrate the event series of an organization through traditional methods augmented by virtual sources.
Regression Analysis of Mixed Panel Count Data with Dependent Terminal Events
Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L.
2017-01-01
Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types above, may occur; furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrences of recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to a childhood cancer study that motivated this work. PMID:28098397
Repetitive deliberate fires: Development and validation of a methodology to detect series.
Bruenisholz, Eva; Delémont, Olivier; Ribaux, Olivier; Wilson-Wilde, Linzi
2017-08-01
The detection of repetitive deliberate fire events is challenging and still often ineffective due to a case-by-case approach. A previous study provided a critical review of the situation and an analysis of the main challenges. That study suggested that the intelligence process, integrating forensic data, could be a valid framework to provide follow-up and systematic analysis, provided it is adapted to the specificities of repetitive deliberate fires. In the current manuscript, a specific methodology to detect deliberate fire series, i.e. fires set by the same perpetrators, is presented and validated. It is based on case profiles relying on specific elements previously identified. The method was validated using a dataset of approximately 8000 deliberate fire events collected over 12 years in a Swiss state. Twenty possible series were detected, including 6 of 9 known series. These results are very promising and pave the way for a systematic implementation of this methodology in an intelligence framework, whilst demonstrating the need for, and benefit of, increasing the collection of forensic-specific information to strengthen the value of links between cases. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
Azim, Riyasat; Li, Fangxing; Xue, Yaosuo; ...
2017-07-14
Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on the Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios, eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.
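A minimal sketch of the passive stage is given below: simple attributes derived from local voltage measurements are used to train a decision tree classifier, and events on which the tree is not confident would be handed to the active (Sandia frequency shift) stage. The features, data and deferral threshold are synthetic illustrations, not the attribute set used in the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 1000

# Synthetic event records: simple attributes one might extract from local
# voltage measurements at the DG terminals (illustrative only).
freq_dev = rng.normal(0.0, 0.05, n)      # frequency deviation (Hz)
volt_dev = rng.normal(0.0, 0.02, n)      # voltage magnitude deviation (pu)
rocof    = rng.normal(0.0, 0.10, n)      # rate of change of frequency (Hz/s)
islanded = rng.binomial(1, 0.3, n)       # ground-truth label

# Make islanding events look different on these attributes
freq_dev[islanded == 1] += rng.normal(0.3, 0.1, (islanded == 1).sum())
rocof[islanded == 1]    += rng.normal(0.5, 0.2, (islanded == 1).sum())

X = np.column_stack([freq_dev, volt_dev, rocof])
X_tr, X_te, y_tr, y_te = train_test_split(X, islanded, test_size=0.3, random_state=0)

clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")

# Events the tree cannot classify confidently would be passed to the active
# (Sandia frequency shift) stage in the combined scheme described above.
proba = clf.predict_proba(X_te)[:, 1]
uncertain = np.mean((proba > 0.4) & (proba < 0.6))
print(f"fraction deferred to the active stage (illustrative threshold): {uncertain:.2f}")
```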
[IBEAS design: adverse events prevalence in Latin American hospitals].
Aranaz-Andrés, J M; Aibar-Remón, C; Limón-Ramírez, R; Amarilla, A; Restrepo, F R; Urroz, O; Sarabia, O; Inga, R; Santivañez, A; Gonseth-García, J; Larizgoitia-Jauregui, I; Agra-Varela, Y; Terol-García, E
2011-01-01
To describe the methodological characteristics of the IBEAS study: adverse events prevalence in Latin American hospitals, with the aim of analysing the magnitude, significance and impact of adverse events (AE); to identify the main patient safety problems associated with AE; to increase the capacity of professionals involved in patient safety; and to set up patient safety agendas in the participating countries. A patient safety study was launched in 35 Latin American hospitals in 5 countries (Argentina, Colombia, Costa Rica, Mexico and Peru) through the analysis of AE, using a cross-sectional design with a review of clinical records as the main method. The implications of using a cross-sectional design when studying AE are described, in terms of resources required, internal validity and usefulness related to risk management. The cross-sectional design seems an efficient methodology in terms of time and resources spent, as well as being easy to carry out. Although the cross-sectional design does not review all the hospital episodes, it is able to provide a reliable estimate of prevalence and to support a surveillance system. Because of a possible survival bias, it is likely that the AE which led to hospital admissions will be overestimated, as well as healthcare-related infections or those adverse events which are difficult to identify if the patient is not examined (e.g. contusions). Communication with the ward staff (if the patient is still hospitalised) helps in establishing causality and in prevention. Copyright © 2010 SECA. Published by Elsevier Espana. All rights reserved.
Uher, Jana
2015-12-01
Taxonomic "personality" models are widely used in research and applied fields. This article applies the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) to scrutinise the three methodological steps that are required for developing comprehensive "personality" taxonomies: 1) the approaches used to select the phenomena and events to be studied, 2) the methods used to generate data about the selected phenomena and events and 3) the reduction principles used to extract the "most important" individual-specific variations for constructing "personality" taxonomies. Analyses of some currently popular taxonomies reveal frequent mismatches between the researchers' explicit and implicit metatheories about "personality" and the abilities of previous methodologies to capture the particular kinds of phenomena toward which they are targeted. Serious deficiencies that preclude scientific quantifications are identified in standardised questionnaires, psychology's established standard method of investigation. These mismatches and deficiencies derive from the lack of an explicit formulation and critical reflection on the philosophical and metatheoretical assumptions being made by scientists and from the established practice of radically matching the methodological tools to researchers' preconceived ideas and to pre-existing statistical theories rather than to the particular phenomena and individuals under study. These findings raise serious doubts about the ability of previous taxonomies to appropriately and comprehensively reflect the phenomena towards which they are targeted and the structures of individual-specificity occurring in them. The article elaborates and illustrates with empirical examples methodological principles that allow researchers to appropriately meet the metatheoretical requirements and that are suitable for comprehensively exploring individuals' "personality".
Measuring individual differences in responses to date-rape vignettes using latent variable models.
Tuliao, Antover P; Hoffman, Lesa; McChargue, Dennis E
2017-01-01
Vignette methodology can be a flexible and powerful way to examine individual differences in response to dangerous real-life scenarios. However, most studies underutilize such methodology by analyzing only one outcome, which limits the ability to track event-related changes (e.g., vacillation in risk perception). The current study was designed to illustrate the dynamic influence of risk perception on exit point from a date-rape vignette. Our primary goal was to provide an illustrative example of how to use latent variable models for vignette methodology, including latent growth curve modeling with piecewise slopes, as well as latent variable measurement models. Through the combination of a step-by-step exposition in this text and corresponding model syntax available electronically, we detail an alternative statistical "blueprint" to enhance future violence research efforts using vignette methodology. Aggr. Behav. 43:60-73, 2017. © 2016 Wiley Periodicals, Inc.
Examining the social ecology of a bar-crawl: An exploratory pilot study.
Clapp, John D; Madden, Danielle R; Mooney, Douglas D; Dahlquist, Kristin E
2017-01-01
Many of the problems associated with alcohol occur after a single drinking event (e.g. drink driving, assault). These acute alcohol problems have a huge global impact and account for a large percentage of unintentional and intentional injuries in the world. Nonetheless, alcohol research and preventive interventions rarely focus on drinking at the event-level since drinking events are complex, dynamic, and methodologically challenging to observe. This exploratory study provides an example of how event-level data may be collected, analyzed, and interpreted. The drinking behavior of twenty undergraduate students enrolled at a large Midwestern public university was observed during a single bar crawl event that is organized by students annually. Alcohol use was monitored with transdermal alcohol devices coupled with ecological momentary assessments and geospatial data. "Small N, Big Data" studies have the potential to advance health behavior theory and to guide real-time interventions. However, such studies generate large amounts of within subject data that can be challenging to analyze and present. This study examined how to visually display event-level data and also explored the relationship between some basic indicators and alcohol consumption.
Egleston, Brian L.; Scharfstein, Daniel O.; MacKenzie, Ellen
2008-01-01
We focus on estimation of the causal effect of treatment on the functional status of individuals at a fixed point in time t* after they have experienced a catastrophic event, from observational data with the following features: (1) treatment is imposed shortly after the event and is non-randomized, (2) individuals who survive to t* are scheduled to be interviewed, (3) there is interview non-response, (4) individuals who die prior to t* are missing information on pre-event confounders, (5) medical records are abstracted on all individuals to obtain information on post-event, pre-treatment confounding factors. To address the issue of survivor bias, we seek to estimate the survivor average causal effect (SACE), the effect of treatment on functional status among the cohort of individuals who would survive to t* regardless of whether or not assigned to treatment. To estimate this effect from observational data, we need to impose untestable assumptions, which depend on the collection of all confounding factors. Since pre-event information is missing on those who die prior to t*, it is unlikely that these data are missing at random (MAR). We introduce a sensitivity analysis methodology to evaluate the robustness of SACE inferences to deviations from the MAR assumption. We apply our methodology to the evaluation of the effect of trauma center care on vitality outcomes using data from the National Study on Costs and Outcomes of Trauma Care. PMID:18759833
Cooke, Megan E; Meyers, Jacquelyn L; Latvala, Antti; Korhonen, Tellervo; Rose, Richard J; Kaprio, Jaakko; Salvatore, Jessica E; Dick, Danielle M
2015-10-01
The purpose of this study was to address two methodological issues that have called into question whether previously reported gene-environment interaction (GxE) effects for adolescent alcohol use are 'real'. These issues are (1) the potential correlation between the environmental moderator and the outcome across twins and (2) non-linear transformations of the behavioral outcome. Three environments that have been previously studied (peer deviance, parental knowledge, and potentially stressful life events) were examined here. For each moderator (peer deviance, parental knowledge, and potentially stressful life events), a series of models was fit to both a raw and transformed measure of monthly adolescent alcohol use in a sample that included 825 dizygotic (DZ) and 803 monozygotic (MZ) twin pairs. The results showed that the moderating effect of peer deviance was robust to transformation, and that although the significance of moderating effects of parental knowledge and potentially stressful life events were dependent on the scale of the adolescent alcohol use outcome, the overall results were consistent across transformation. In addition, the findings did not vary across statistical models. The consistency of the peer deviance results and the shift of the parental knowledge and potentially stressful life events results between trending and significant, shed some light on why previous findings for certain moderators have been inconsistent and emphasize the importance of considering both methodological issues and previous findings when conducting and interpreting GxE analyses.
Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor
Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.
2016-01-01
The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imagers, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired time encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275
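To make the stream-based idea concrete, here is a minimal sketch of a global, event-driven tone mapping that adapts to a running estimate of the dynamic range; it illustrates the general principle only and is not the operators proposed in the paper.

```python
# Minimal sketch of a global, event-driven tone mapping: each incoming event
# carries a high-dynamic-range gray level; we map its log-intensity into
# 8-bit display range using a running estimate of the scene's dynamic range.
# This operates on an event stream rather than on frames.
import math
from collections import deque

class GlobalToneMapper:
    def __init__(self, window=10_000):
        self.recent = deque(maxlen=window)   # recent log-intensities

    def map_event(self, x, y, t, intensity):
        log_i = math.log10(max(intensity, 1e-9))
        self.recent.append(log_i)
        lo, hi = min(self.recent), max(self.recent)
        span = max(hi - lo, 1e-6)
        gray8 = int(round(255 * (log_i - lo) / span))
        return x, y, t, gray8                # display-ready event

mapper = GlobalToneMapper()
stream = [(10, 5, 0.001, 3e2), (11, 5, 0.002, 8e5), (12, 6, 0.003, 1.2e1)]
print([mapper.map_event(*ev) for ev in stream])
```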
The Effects of Time Advance Mechanism on Simple Agent Behaviors in Combat Simulations
2011-12-01
modeling packages that illustrate the differences between discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat... DES) models, often referred to as "next-event" (Law and Kelton 2000) or discrete time simulation (DTS), commonly referred to as "time-step." DTS... discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat models use DTS as their simulation time advance mechanism
Vlayen, Annemie; Verelst, Sandra; Bekkering, Geertruida E; Schrooten, Ward; Hellings, Johan; Claes, Neree
2012-04-01
Adverse events are unintended patient injuries or complications that arise from health care management resulting in death, disability or prolonged hospital stay. Adverse events that require critical care are a considerable financial burden to the health care system, but also their global impact on patients and society is probably underestimated. The objectives of this systematic review were to synthesize the best available evidence regarding the estimates of the incidence and preventability of adverse events that necessitate intensive care admission, to determine the type and consequences [mortality, length of intensive care unit (ICU) stay and costs] of these adverse events. MEDLINE (from 1966 to present), EMBASE (from 1974 to present) and CENTRAL (version 1-2010) were searched for studies reporting on unplanned admissions on ICUs. Several other sources were searched for additional studies. Only quantitative studies that used chart review for the detection of adverse events requiring intensive care admission were considered for eligibility. For the purposes of this systematic review, ICUs were defined as specialized hospital facilities which provide continuous monitoring and intensive care for acutely ill patients. Studies that were published in the English, Dutch, German, French or Spanish language were eligible for inclusion. Two reviewers independently extracted data and assessed the methodological quality of the included studies. A total of 27 studies were reviewed. Meta-analysis of the data was not appropriate because of methodological and statistical heterogeneity between studies; therefore, results are presented in a descriptive way. The percentage of surgical and medical adverse events that required ICU admission ranged from 1.1% to 37.2%. ICU readmissions varied from 0% to 18.3%. Preventability of the adverse events varied from 17% to 76.5%. Preventable adverse events are further synthesized by type of event. Consequences of the adverse events included a mean length of ICU stay that ranged from 1.5 days to 10.4 days for the patient's first stay in ICU and mortality percentages between 0% and 58%. Adverse events are an important reason for (re)admission to the ICU and a considerable proportion of these are preventable. It was not possible to estimate an overall incidence and preventability rate of these events as we found considerable heterogeneity. To decrease adverse events that necessitate ICU admission, several systems are recommended such as early detection of patients with clinical instability on general wards and the implementation of rapid response teams. Step-down or intermediate care units could be a useful strategy for patients who require monitoring to avoid ICU readmissions. However, the effectiveness of such systems needs to be investigated. © 2011 Blackwell Publishing Ltd.
Standardized Analytical Methods for Environmental Restoration Following Homeland Security Events
USDA-ARS?s Scientific Manuscript database
Methodology was formulated for use in the event of a terrorist attack using a variety of chemical, radioactive, biological, and toxic agents. Standardized analysis procedures were determined for use should these events occur. This publication is annually updated....
Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B
2008-01-01
Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense that it will be capable of accommodating modeling of multiple risk factors considered in offshore operations and will have the ability to deal with different types of data that may come from different resources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and Bayesian Network (BN) is tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level-structure model to address latent failures within the causal sequence of events. The five levels include Root causes level, Trigger events level, Incidents level, Accidents level, and Consequences level. To analyze and model a specified offshore installation safety, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities regarding the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BN can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs that are capable of providing graphical demonstration of inter-relationships as well as calculating numerical values of occurrence likelihood for each failure event. Bayesian inference mechanism also makes it possible to monitor how a safety situation changes when information flow travel forwards and backwards within the networks. On the other hand, BN modeling relies heavily on experts' personal experiences and is therefore highly domain specific. "Swiss cheese" model is such a theoretic framework that it is based on solid behavioral theory and therefore can be used to provide industry with a roadmap for BN modeling and implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.
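A minimal numerical sketch of propagating occurrence probabilities through the five-level structure (root causes, trigger events, incidents, accidents, consequences); the conditional probabilities are illustrative placeholders rather than values elicited from experts, and no BN library is used.

```python
# Minimal sketch of forward propagation through the five-level causal chain
# (root causes -> trigger events -> incidents -> accidents -> consequences),
# using plain conditional probabilities rather than a full Bayesian Network
# library. All probability values are illustrative.
levels = ["root_cause", "trigger", "incident", "accident", "consequence"]

p_root = 0.10                      # P(latent root cause present)
# P(next level occurs | previous level present), P(next level | previous absent)
cond = {
    "trigger":     (0.40, 0.02),
    "incident":    (0.30, 0.01),
    "accident":    (0.20, 0.005),
    "consequence": (0.70, 0.001),
}

p_prev = p_root
print(f"P({levels[0]}) = {p_prev:.4f}")
for level in levels[1:]:
    p_given_yes, p_given_no = cond[level]
    # marginalize over whether the previous level occurred
    p_prev = p_given_yes * p_prev + p_given_no * (1 - p_prev)
    print(f"P({level}) = {p_prev:.4f}")
```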
Methodological issues in the study of violence against women
Ruiz‐Pérez, Isabel; Plazaola‐Castaño, Juncal; Vives‐Cases, Carmen
2007-01-01
The objective of this paper is to review the methodological issues that arise when studying violence against women as a public health problem, focusing on intimate partner violence (IPV), since this is the form of violence that has the greatest consequences at a social and political level. The paper focuses first on the problems of defining what is meant by IPV. Secondly, the paper describes the difficulties in assessing the magnitude of the problem. Obtaining reliable data on this type of violence is a complex task, because of the methodological issues derived from the very nature of the phenomenon, such as the private, intimate context in which this violence often takes place, which means the problem cannot be directly observed. Finally, the paper examines the limitations and bias in research on violence, including the lack of consensus with regard to measuring events that may or may not represent a risk factor for violence against women or the methodological problem related to the type of sampling used in both aetiological and prevalence studies. PMID:18000113
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought
NASA Astrophysics Data System (ADS)
Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.
2017-10-01
Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in the past years. The approach of "event attribution" compares the occurrence-probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
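As an illustration of the factual-versus-counterfactual comparison underlying event attribution, the sketch below estimates a risk ratio from two synthetic ensembles and bootstraps a confidence interval; the data, threshold, and drought index are placeholders, not the study's simulations.

```python
# Minimal sketch of the probability-based attribution statistic: compare how
# often the event threshold is exceeded in factual vs. counterfactual
# ensembles, and bootstrap a confidence interval for the risk ratio.
# The ensemble values below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
factual = rng.normal(-0.8, 1.0, 500)         # e.g. simulated summer moisture anomalies
counterfactual = rng.normal(-0.5, 1.0, 500)
threshold = -2.0                              # drought at least as severe as the event

def risk_ratio(f, c):
    p1 = np.mean(f <= threshold)              # probability in factual climate
    p0 = np.mean(c <= threshold)              # probability in counterfactual climate
    return p1 / max(p0, 1e-12)

rr = risk_ratio(factual, counterfactual)
boot = [risk_ratio(rng.choice(factual, factual.size),
                   rng.choice(counterfactual, counterfactual.size))
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"risk ratio = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```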
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become a guaranteed presence in many places. This growth has been accompanied by a rise in illicit events, and computer and network security has therefore become an essential concern in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a timeframe suited to the application.
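A minimal sketch of the kind of flow-level statistical detection described here: aggregate flows per minute, compare against a baseline, and flag large deviations; the field names and threshold rule are illustrative assumptions rather than the authors' exact method.

```python
# Minimal sketch of statistical anomaly detection on aggregated NetFlow data:
# flag minutes whose flow count deviates strongly from a historical baseline.
# The counts and the z-score threshold are illustrative, not the paper's method.
import numpy as np

rng = np.random.default_rng(2)
baseline = rng.poisson(1200, 24 * 60)        # flows per minute, one training day
current = rng.poisson(1200, 60).astype(float)
current[45] = 9500                           # injected burst (e.g. a scan or DoS)

mu, sigma = baseline.mean(), baseline.std()
z = (current - mu) / sigma
for minute, score in enumerate(z):
    if score > 4.0:                          # alert threshold in standard deviations
        print(f"minute {minute}: {int(current[minute])} flows (z = {score:.1f}) -> alert")
```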
Analyzing semi-competing risks data with missing cause of informative terminal event.
Zhou, Renke; Zhu, Hong; Bondy, Melissa; Ning, Jing
2017-02-28
Cancer studies frequently yield multiple event times that correspond to landmarks in disease progression, including non-terminal events (i.e., cancer recurrence) and an informative terminal event (i.e., cancer-related death). Hence, we often observe semi-competing risks data. Work on such data has focused on scenarios in which the cause of the terminal event is known. However, in some circumstances, the information on cause for patients who experience the terminal event is missing; consequently, we are not able to differentiate an informative terminal event from a non-informative terminal event. In this article, we propose a method to handle missing data regarding the cause of an informative terminal event when analyzing the semi-competing risks data. We first consider the nonparametric estimation of the survival function for the terminal event time given missing cause-of-failure data via the expectation-maximization algorithm. We then develop an estimation method for semi-competing risks data with missing cause of the terminal event, under a pre-specified semiparametric copula model. We conduct simulation studies to investigate the performance of the proposed method. We illustrate our methodology using data from a study of early-stage breast cancer. Copyright © 2016 John Wiley & Sons, Ltd.
2012-01-01
Background In recent years, biological event extraction has emerged as a key natural language processing task, aiming to address the information overload problem in accessing the molecular biology literature. The BioNLP shared task competitions have contributed to this recent interest considerably. The first competition (BioNLP'09) focused on extracting biological events from Medline abstracts from a narrow domain, while the theme of the latest competition (BioNLP-ST'11) was generalization and a wider range of text types, event types, and subject domains were considered. We view event extraction as a building block in larger discourse interpretation and propose a two-phase, linguistically-grounded, rule-based methodology. In the first phase, a general, underspecified semantic interpretation is composed from syntactic dependency relations in a bottom-up manner. The notion of embedding underpins this phase and it is informed by a trigger dictionary and argument identification rules. Coreference resolution is also performed at this step, allowing extraction of inter-sentential relations. The second phase is concerned with constraining the resulting semantic interpretation by shared task specifications. We evaluated our general methodology on core biological event extraction and speculation/negation tasks in three main tracks of BioNLP-ST'11 (GENIA, EPI, and ID). Results We achieved competitive results in GENIA and ID tracks, while our results in the EPI track leave room for improvement. One notable feature of our system is that its performance across abstracts and article bodies is stable. Coreference resolution results in minor improvement in system performance. Due to our interest in discourse-level elements, such as speculation/negation and coreference, we provide a more detailed analysis of our system performance in these subtasks. Conclusions The results demonstrate the viability of a robust, linguistically-oriented methodology, which clearly distinguishes general semantic interpretation from shared task specific aspects, for biological event extraction. Our error analysis pinpoints some shortcomings, which we plan to address in future work within our incremental system development methodology. PMID:22759461
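A toy sketch of the first-phase idea, matching a trigger dictionary and argument rules against dependency tuples; the dictionaries, relation labels, and sentence are invented examples, not the BioNLP-ST resources.

```python
# Minimal sketch of trigger-dictionary + argument-rule event extraction over
# syntactic dependencies, in the spirit of the two-phase rule-based approach
# described above. The dependency tuples and dictionaries are toy examples.
TRIGGERS = {"phosphorylation": "Phosphorylation", "expression": "Gene_expression"}
ARGUMENT_RULES = {"prep_of": "Theme", "agent": "Cause"}

# (head, relation, dependent) tuples from a parsed sentence
dependencies = [
    ("phosphorylation", "prep_of", "STAT1"),
    ("phosphorylation", "agent", "IL-4"),
    ("expression", "prep_of", "TNF"),
]

events = []
for head, rel, dep in dependencies:
    if head in TRIGGERS and rel in ARGUMENT_RULES:
        events.append({"type": TRIGGERS[head],
                       "trigger": head,
                       ARGUMENT_RULES[rel]: dep})
print(events)
```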
An analysis of post-event processing in social anxiety disorder.
Brozovich, Faith; Heimberg, Richard G
2008-07-01
Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construction of repetitive self-focused thoughts that pertain to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.
Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa
NASA Astrophysics Data System (ADS)
Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu
2013-04-01
Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.
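A minimal sketch of a GLUE-style evaluation under the stated uncertainties: sample parameters, weight each run by its fit to an uncertain high-water mark, and keep behavioral runs to bound the simulated level; the toy model merely stands in for LISFLOOD-FP and all numbers are assumptions.

```python
# Minimal sketch of a GLUE evaluation: sample model parameters, score each
# simulation against an uncertain observed high-water mark, keep "behavioral"
# runs, and form likelihood-weighted bounds on simulated water level.
import numpy as np

rng = np.random.default_rng(3)
obs_level, obs_sigma = 6.0, 0.8              # post-event high-water mark (m) and its uncertainty

def toy_model(roughness, inflow_scale):
    """Stand-in for a hydrodynamic model run returning peak water level (m)."""
    return 4.0 + 8.0 * roughness + 1.5 * inflow_scale

n = 5000
roughness = rng.uniform(0.02, 0.40, n)       # Manning's n
inflow_scale = rng.uniform(0.5, 1.5, n)      # scaling of the uncertain inflow
sim = toy_model(roughness, inflow_scale)

likelihood = np.exp(-0.5 * ((sim - obs_level) / obs_sigma) ** 2)
behavioral = likelihood > 0.1 * likelihood.max()    # GLUE behavioral threshold
w = likelihood[behavioral] / likelihood[behavioral].sum()

order = np.argsort(sim[behavioral])
cdf = np.cumsum(w[order])
lo = sim[behavioral][order][np.searchsorted(cdf, 0.05)]
hi = sim[behavioral][order][np.searchsorted(cdf, 0.95)]
print(f"{behavioral.sum()} behavioral runs, 90% GLUE bounds: {lo:.2f}-{hi:.2f} m")
```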
Detection of adverse events in general surgery using the "Trigger Tool" methodology.
Pérez Zapata, Ana Isabel; Gutiérrez Samaniego, María; Rodríguez Cuéllar, Elías; Andrés Esteban, Eva María; Gómez de la Cámara, Agustín; Ruiz López, Pedro
2015-02-01
Surgery is one of the high-risk areas for the occurrence of adverse events (AE). The purpose of this study is to determine the percentage of hospitalisation-related AE that are detected by the "Global Trigger Tool" methodology in surgical patients, their characteristics and the tool's validity. Retrospective, observational study on patients admitted to a general surgery department, who underwent a surgical operation in a third-level hospital during the year 2012. The identification of AE was carried out by patient record review using an adaptation of the "Global Trigger Tool" methodology. Once an AE was identified, a harm category was assigned, including the grade in which the AE could have been avoided and its relation with the surgical procedure. The prevalence of AE was 36.8%. There were 0.5 AE per patient. 56.2% were deemed preventable. 69.3% were directly related to the surgical procedure. The tool had a sensitivity of 86% and a specificity of 93.6%. The positive predictive value was 89% and the negative predictive value 92%. Prevalence of AE is greater than the estimates of other studies. In most cases the AE detected were related to the surgical procedure and more than half were also preventable. The adapted "Global Trigger Tool" methodology has demonstrated to be highly effective and efficient for detecting AE in surgical patients, identifying all the serious AE with few false negative results. Copyright © 2014 AEC. Published by Elsevier España, S.L.U. All rights reserved.
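As a quick consistency check (a standard calculation, not part of the study), the reported predictive values can be recovered from the reported sensitivity, specificity, and AE prevalence:

```python
# The abstract reports sensitivity, specificity, and an AE prevalence of 36.8%;
# the predictive values follow from those three numbers.
sens, spec, prev = 0.86, 0.936, 0.368

ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")   # ~0.89 and ~0.92, matching the reported values
```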
Tsatsoulis, C; Amthauer, H
2003-01-01
A novel methodological approach for identifying clusters of similar medical incidents by analyzing large databases of incident reports is described. The discovery of similar events allows the identification of patterns and trends, and makes possible the prediction of future events and the establishment of barriers and best practices. Two techniques from the fields of information science and artificial intelligence have been integrated—namely, case based reasoning and information retrieval—and very good clustering accuracies have been achieved on a test data set of incident reports from transfusion medicine. This work suggests that clustering should integrate the features of an incident captured in traditional form based records together with the detailed information found in the narrative included in event reports. PMID:14645892
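A minimal sketch of the hybrid idea: combine similarity over structured, form-based fields (the case-based reasoning side) with TF-IDF similarity over the narrative (the information-retrieval side); the incident records and the equal weighting are illustrative assumptions.

```python
# Minimal sketch of combining structured-field similarity (case-based
# reasoning style) with narrative similarity (information-retrieval style)
# before clustering incident reports. Records and weights are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
import numpy as np

incidents = [
    {"product": "RBC", "stage": "labeling",
     "narrative": "unit mislabeled at bedside before transfusion"},
    {"product": "RBC", "stage": "labeling",
     "narrative": "wrong label applied to red cell unit in lab"},
    {"product": "platelets", "stage": "storage",
     "narrative": "platelet unit stored at wrong temperature overnight"},
]

# IR part: cosine similarity of TF-IDF vectors of the narratives
tfidf = TfidfVectorizer().fit_transform(d["narrative"] for d in incidents)
text_sim = cosine_similarity(tfidf)

# CBR part: fraction of matching structured fields
fields = ["product", "stage"]
struct_sim = np.array([[sum(a[f] == b[f] for f in fields) / len(fields)
                        for b in incidents] for a in incidents])

combined = 0.5 * text_sim + 0.5 * struct_sim
print(np.round(combined, 2))       # incidents 0 and 1 form the obvious cluster
```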
Moser, Barry Kurt; Halabi, Susan
2013-01-01
In this paper we develop the methodology for designing clinical trials with any factorial arrangement when the primary outcome is time to event. We provide a matrix formulation for calculating the sample size and study duration necessary to test any effect with a pre-specified type I error rate and power. Assuming that a time to event follows an exponential distribution, we describe the relationships between the effect size, the power, and the sample size. We present examples for illustration purposes. We provide a simulation study to verify the numerical calculations of the expected number of events and the duration of the trial. The change in the power produced by a reduced number of observations or by accruing no patients to certain factorial combinations is also described. PMID:25530661
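For the simplest two-arm contrast, the standard events calculation under an exponential, proportional-hazards model looks like the sketch below; the paper's matrix formulation generalizes this to arbitrary factorial effects, and the hazard ratio used here is only an example.

```python
# Minimal sketch of the standard events calculation for a two-arm time-to-event
# comparison under an exponential (proportional hazards) model with 1:1
# allocation; the effect size is an illustrative assumption.
from math import log, ceil
from scipy.stats import norm

alpha, power = 0.05, 0.80
hazard_ratio = 0.75                      # effect size to detect

z_a = norm.ppf(1 - alpha / 2)
z_b = norm.ppf(power)
events = 4 * (z_a + z_b) ** 2 / log(hazard_ratio) ** 2
print(f"required events: {ceil(events)}")   # ~380
```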
A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events
NASA Astrophysics Data System (ADS)
Kholodovsky, V.
2017-12-01
Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of the climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification to benchmark extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers users the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
Employing Case Study Methodology in Special Educational Settings
ERIC Educational Resources Information Center
Rouse, Angelise M.
2016-01-01
In general, case studies are a preferred strategy when "how" or "why" questions are being posed, when the investigator has little control over events, and when the focus is on a contemporary phenomenon within some real-life context (Yin, 2009). This article will examine the advantages and disadvantages of employing case study…
Teacher Scaffolding of Oral Language Production
ERIC Educational Resources Information Center
George, May G.
2011-01-01
This research involved two observational studies. It explored the scaffolding processes as part of classroom pedagogy. The research shed light on the way a teacher's instructional methodology took shape in the classroom. The target event for this study was the time in which a novice learner was engaged publicly in uttering a sentence in Arabic in…
You Sound Taller on the Telephone: A Practitioner's View of the Principalship.
ERIC Educational Resources Information Center
Dunklee, Dennis R.
This book presents a comprehensive case study of the professional life of a fictional school principal. Based entirely on case-study methodology, all the episodes discussed in the book depict actual events in everyday education leadership practice and represent a real-life education leadership experience through episodic progression. The case…
Little, Charles M; McStay, Christopher; Oeth, Justin; Koehler, April; Bookman, Kelly
2018-02-01
The use of after-action reviews (AARs) following major emergency events, such as a disaster, is common and mandated for hospitals and similar organizations. A recurrent challenge is that identified problems are not resolved and recur in subsequent events. A process improvement technique called a rapid improvement event (RIE) was used to conduct an AAR following a complete information technology (IT) outage at a large urban hospital. Using RIE methodology to conduct the AAR allowed for the rapid development and implementation of major process improvements to prepare for future IT downtime events. Thus, process improvement methodology, particularly the RIE, is suited for conducting AARs following disasters and holds promise for improving outcomes in emergency management. Little CM, McStay C, Oeth J, Koehler A, Bookman K. Using rapid improvement events for disaster after-action reviews: experience in a hospital information technology outage and response. Prehosp Disaster Med. 2018;33(1):98-100.
Regression analysis of mixed panel count data with dependent terminal events.
Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L
2017-05-10
Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types described earlier, may occur, and furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrences of recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to a childhood cancer study that motivated this study. Copyright © 2017 John Wiley & Sons, Ltd.
Statistical analysis of mixed recurrent event data with application to cancer survivor study
Zhu, Liang; Tong, Xingwei; Zhao, Hui; Sun, Jianguo; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L.
2014-01-01
Event history studies occur in many fields including economics, medical studies and social science. In such studies concerning some recurrent events, two types of data have been extensively discussed in the literature. One is recurrent event data that arise if study subjects are monitored or observed continuously. In this case, the observed information provides the times of all occurrences of the recurrent events of interest. The other is panel count data, which occur if the subjects are monitored or observed only periodically. This can happen if the continuous observation is too expensive or not practical and in this case, only the numbers of occurrences of the events between subsequent observation times are available. In this paper, we discuss a third type of data, which is a mixture of recurrent event and panel count data and for which there exists little literature. For regression analysis of such data, a marginal mean model is presented and we propose an estimating equation-based approach for estimation of regression parameters. A simulation study is conducted to assess the finite sample performance of the proposed methodology and indicates that it works well for practical situations. Finally it is applied to a motivating study on childhood cancer survivors. PMID:23139023
What are the hydro-meteorological controls on flood characteristics?
NASA Astrophysics Data System (ADS)
Nied, Manuela; Schröter, Kai; Lüdtke, Stefan; Nguyen, Viet Dung; Merz, Bruno
2017-02-01
Flood events can be expressed by a variety of characteristics such as flood magnitude and extent, event duration or incurred loss. Flood estimation and management may benefit from understanding how the different flood characteristics relate to the hydrological catchment conditions preceding the event and to the meteorological conditions throughout the event. In this study, we therefore propose a methodology to investigate the hydro-meteorological controls on different flood characteristics, based on the simulation of the complete flood risk chain from the flood triggering precipitation event, through runoff generation in the catchment, flood routing and possible inundation in the river system and floodplains to flood loss. Conditional cumulative distribution functions and regression tree analysis delineate the seasonal varying flood processes and indicate that the effect of the hydrological pre-conditions, i.e. soil moisture patterns, and of the meteorological conditions, i.e. weather patterns, depends on the considered flood characteristic. The methodology is exemplified for the Elbe catchment. In this catchment, the length of the build-up period, the event duration and the number of gauges undergoing at least a 10-year flood are governed by weather patterns. The affected length and the number of gauges undergoing at least a 2-year flood are however governed by soil moisture patterns. In case of flood severity and loss, the controlling factor is less pronounced. Severity is slightly governed by soil moisture patterns whereas loss is slightly governed by weather patterns. The study highlights that flood magnitude and extent arise from different flood generation processes and concludes that soil moisture patterns as well as weather patterns are not only beneficial to inform on possible flood occurrence but also on the involved flood processes and resulting flood characteristics.
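A minimal sketch of the regression-tree step, relating one flood characteristic to the pre-event soil moisture state and the weather pattern during the event; the data and the rule they encode are synthetic placeholders.

```python
# Minimal sketch of the regression-tree step: relate a flood characteristic
# (here, event duration) to the pre-event soil moisture state and the weather
# pattern during the event. Data are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(4)
n = 300
soil_moisture = rng.uniform(0.1, 0.9, n)             # catchment wetness index
weather_pattern = rng.integers(0, 4, n)               # 4 circulation pattern classes

# synthetic rule: wet catchments plus pattern 2 give long events
duration = (3 + 10 * soil_moisture + 4 * (weather_pattern == 2)
            + rng.normal(0, 1, n))

X = np.column_stack([soil_moisture, weather_pattern])
tree = DecisionTreeRegressor(max_depth=3).fit(X, duration)
print(export_text(tree, feature_names=["soil_moisture", "weather_pattern"]))
```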
Decision modeling for analyzing fire action outcomes
Donald MacGregor; Armando Gonzalez-Caban
2008-01-01
A methodology for incident decomposition and reconstruction is developed based on the concept of an "event-frame model." The event-frame model characterizes a fire incident in terms of (a) environmental events that pertain to the fire and the fire context (e.g., fire behavior, weather, fuels) and (b) management events that represent responses to the fire...
BACKWARD ESTIMATION OF STOCHASTIC PROCESSES WITH FAILURE EVENTS AS TIME ORIGINS
Gary Chan, Kwun Chuen; Wang, Mei-Cheng
2011-01-01
Stochastic processes often exhibit sudden systematic changes in pattern a short time before certain failure events. Examples include increase in medical costs before death and decrease in CD4 counts before AIDS diagnosis. To study such terminal behavior of stochastic processes, a natural and direct way is to align the processes using failure events as time origins. This paper studies backward stochastic processes counting time backward from failure events, and proposes one-sample nonparametric estimation of the mean of backward processes when follow-up is subject to left truncation and right censoring. We will discuss benefits of including prevalent cohort data to enlarge the identifiable region and large sample properties of the proposed estimator with related extensions. A SEER–Medicare linked data set is used to illustrate the proposed methodologies. PMID:21359167
2013-01-01
Background Insomnia is a widespread human health problem, but conventional therapies currently available have limitations. Suanzaoren decoction (SZRD) is a well-known classic Chinese herbal prescription for insomnia and has been used to treat insomnia for more than a thousand years. The objective of this study was to evaluate the efficacy and safety of SZRD for insomnia. Methods A systematic literature search was performed in 6 databases up to July 2012 to identify randomized control trials (RCTs) involving SZRD for insomniac patients. The methodological quality of the RCTs was assessed independently using the Cochrane Handbook for Systematic Reviews of Interventions. Results Twelve RCTs with a total of 1376 adult participants were identified. The methodological quality of all included trials scored no more than 3/8. The majority of the RCTs concluded that SZRD was significantly more effective than benzodiazepines for treating insomnia. Despite these positive outcomes, there were many methodological shortcomings in the studies reviewed, including insufficient information about randomization generation and absence of allocation concealment, lack of blinding and no placebo control, absence of intention-to-treat analysis and lack of follow-up, selective publishing and reporting, and small sample sizes. A number of sources of clinical heterogeneity, such as diagnosis, intervention, control, and outcome measures, were also reviewed. Only 3 trials reported adverse events, whereas the other 9 trials did not provide safety information. Conclusions Despite the apparent reported positive findings, there is insufficient evidence to support the efficacy of SZRD for insomnia due to the poor methodological quality and the small number of included trials. SZRD seems generally safe, but there is insufficient evidence to draw conclusions on safety because few studies reported adverse events. Further large, well-designed RCTs are needed. PMID:23336848
Advances in early fetal loss research: importance for risk assessment.
Sweeney, A M; LaPorte, R E
1991-01-01
The assessment of early fetal losses (EFLs) in relationship to environmental agents offers unique advantages compared to other end points for hazard assessment. There is a high incidence (greater than 20% of all pregnancies end in an EFL), and the interval between exposure and end point is the short duration between conception and event, i.e., approximately 12 weeks. In contrast, cancer, which is the primary end point evaluated in risk assessment models, occurs with much lower frequency, and the latency period is measured in years or decades. EFLs have not been used effectively for risk assessment because most of the events are not detected. Prospective studies provide the only approach whereby it is possible to link exposure to EFLs. Recent methodologic advancements have demonstrated that it is now possible to conduct population-based studies of EFLs. It is likely that EFLs could serve as sentinels to monitor adverse health effects of many potential environmental hazards. The methodology will be demonstrated using lead exposure in utero as an example. PMID:2050056
Projecting adverse event incidence rates using empirical Bayes methodology.
Ma, Guoguang Julie; Ganju, Jitendra; Huang, Jing
2016-08-01
Although there is considerable interest in adverse events observed in clinical trials, projecting adverse event incidence rates in an extended period can be of interest when the trial duration is limited compared to clinical practice. A naïve method for making projections might involve modeling the observed rates into the future for each adverse event. However, such an approach overlooks the information that can be borrowed across all the adverse event data. We propose a method that weights each projection using a shrinkage factor; the adverse event-specific shrinkage is a probability, based on empirical Bayes methodology, estimated from all the adverse event data, reflecting evidence in support of the null or non-null hypotheses. Also proposed is a technique to estimate the proportion of true nulls, called the common area under the density curves, which is a critical step in arriving at the shrinkage factor. The performance of the method is evaluated by projecting from interim data and then comparing the projected results with observed results. The method is illustrated on two data sets. © The Author(s) 2013.
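A simplified, hedged illustration of the shrinkage idea: weight each adverse event's projection by a probability of being non-null estimated from all events together; the two-group mixture and the crude estimate of the proportion of true nulls below are stand-ins for the paper's empirical Bayes machinery.

```python
# Simplified illustration only: shrink each adverse event's observed rate
# toward a pooled background rate using a probability weight estimated from
# all events jointly. The pi0 estimate and the assumed alternative are crude
# stand-ins for the paper's "common area under the density curves" technique.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_events, exposure = 40, 500.0                 # 40 AE types, patient-years of exposure
true_rates = np.where(rng.random(n_events) < 0.8, 0.02, 0.08)   # 80% at background rate
counts = rng.poisson(true_rates * exposure)

background = 0.02                              # reference (null) incidence rate
obs_rate = counts / exposure

# evidence against the null for each AE, summarized as a z-score
z = (counts - background * exposure) / np.sqrt(background * exposure)
pi0 = min(1.0, 2 * np.mean(z < 0))             # crude estimate of proportion of true nulls

# posterior-style weight that the AE is non-null (simplified two-group mixture)
f_null = norm.pdf(z)
f_alt = norm.pdf(z, loc=3.0)                   # assumed alternative location
w_nonnull = (1 - pi0) * f_alt / ((1 - pi0) * f_alt + pi0 * f_null)

# projection: null-like events are pulled to the background rate
projected = w_nonnull * obs_rate + (1 - w_nonnull) * background
print(np.round(projected[:10], 4))
```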
Butow, P N; Hiller, J E; Price, M A; Thackway, S V; Kricker, A; Tennant, C C
2000-09-01
Review empirical evidence for a relationship between psychosocial factors and breast cancer development. Standardised quality assessment criteria were utilised to assess the evidence of psychosocial predictors of breast cancer development in the following domains: (a) stressful life events, (b) coping style, (c) social support, and (d) emotional and personality factors. Few well-designed studies report any association between life events and breast cancer, the exception being two small studies using the Life Events and Difficulties Schedule (LEDS) reporting an association between severely threatening events and breast cancer risk. Seven studies show anger repression or alexithymia are predictors, the strongest evidence suggesting younger women are at increased risk. There is no evidence that social support, chronic anxiety, or depression affects breast cancer development. With the exception of rationality/anti-emotionality, personality factors do not predict breast cancer risk. The evidence for a relationship between psychosocial factors and breast cancer is weak. The strongest predictors are emotional repression and severe life events. Future research would benefit from theoretical grounding and greater methodological rigour. Recommendations are given.
ASRS Reports on Wake Vortex Encounters
NASA Technical Reports Server (NTRS)
Connell, Linda J.; Taube, Elisa Ann; Drew, Charles Robert; Barclay, Tommy Earl
2010-01-01
ASRS is conducting a structured callback research project of wake vortex incidents reported to the ASRS at all US airports, as well as wake encounters in the enroute environment. This study has three objectives: (1) Utilize the established ASRS supplemental data collection methodology and provide ongoing analysis of wake vortex encounter reports; (2) Document event dynamics and contributing factors underlying wake vortex encounter events; and (3) Support ongoing FAA efforts to address pre-emptive wake vortex risk reduction by utilizing ASRS reporting contributions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angélil, Oliver; Stone, Dáithí; Wehner, Michael
2016-12-16
The annual "State of the Climate" report, published in the Bulletin of the American Meteorological Society (BAMS), has included a supplement since 2011 composed of brief analyses of the human influence on recent major extreme weather events. There are now several dozen extreme weather events examined in these supplements, but these studies have all differed in their data sources as well as their approaches to defining the events, analyzing the events, and the consideration of the role of anthropogenic emissions. This study reexamines most of these events using a single analytical approach and a single set of climate model and observational data sources. In response to recent studies recommending the importance of using multiple methods for extreme weather event attribution, results are compared from these analyses to those reported in the BAMS supplements collectively, with the aim of characterizing the degree to which the lack of a common methodological framework may or may not influence overall conclusions. Results are broadly similar to those reported earlier for extreme temperature events but disagree for a number of extreme precipitation events. Based on this, it is advised that the lack of comprehensive uncertainty analysis in recent extreme weather attribution studies is important and should be considered when interpreting results, but as yet it has not introduced a systematic bias across these studies.
NASA Astrophysics Data System (ADS)
Bandres Motola, Miguel A.
Essay one estimates changes in small business customer energy consumption (kWh) patterns resulting from a seasonally differentiated pricing structure. Econometric analysis leverages cross-sectional time series data across the entire population of affected customers, from 2007 through the present. Observations include: monthly energy usage (kWh), relevant customer segmentations, local daily temperature, energy price, and region-specific economic conditions, among other variables. The study identifies the determinants of responsiveness to seasonal price differentiation. In addition, estimated energy consumption changes occurring during the 2010 summer season are reported for the average customer and in aggregate grouped by relevant customer segments, climate zone, and total customer base. Essay two develops an econometric modeling methodology to evaluate load impacts for short duration demand response events. The study analyzes time series data from a season of direct load control program tests aimed at integrating demand response into the wholesale electricity market. I have combined "fuzzy logic" with binary variables to create "fuzzy indicator variables" that allow for measurement of short duration events while using industry standard model specifications. Typically, binary variables for every hour are applied in load impact analysis of programs dispatched in hourly intervals. As programs evolve towards integration with the wholesale market, event durations become irregular and often occur for periods of only a few minutes. This methodology is innovative in that it conserves the degrees of freedom in the model while allowing for analysis of high frequency data using fixed effects. Essay three examines the effects of strategies, intangibles, and FDA news on the stocks of young biopharmaceutical firms. An event study methodology is used to explore those effects. This study investigates 20,839 announcements from 1990 to 2005. Announcements on drug development, alliances, publications, presentations, and FDA approval have a positive effect on the short-term performance of young biopharmaceutical firms. Announcements on goals not met, FDA drug approval denied, and changes in structural organizations have a negative effect on the short-term performance of young biopharmaceutical firms.
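A minimal sketch of a "fuzzy indicator variable" as described in essay two: each hourly interval receives the fraction of the hour covered by a short dispatch rather than a 0/1 dummy, so events of a few minutes can still enter a standard fixed-effects regression; the timestamps are invented for illustration.

```python
# Minimal sketch of a "fuzzy indicator variable" for a short-duration demand
# response event: each hourly interval gets the fraction of that hour covered
# by the event instead of a binary dummy. Event times are illustrative.
import pandas as pd

hours = pd.date_range("2024-07-15 12:00", periods=6, freq="60min")
event_start = pd.Timestamp("2024-07-15 13:40")
event_end = pd.Timestamp("2024-07-15 14:25")       # a 45-minute dispatch

def fuzzy_indicator(hour_start):
    hour_end = hour_start + pd.Timedelta(hours=1)
    overlap = (min(hour_end, event_end) - max(hour_start, event_start)).total_seconds()
    return max(overlap, 0.0) / 3600.0

indicators = pd.Series([fuzzy_indicator(h) for h in hours], index=hours)
print(indicators)    # 0, 0.33, 0.42, 0, 0, 0 -> partial-hour event coverage
```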
Chen, Xin-Lin; Mo, Chuan-Wei; Lu, Li-Ya; Gao, Ri-Yang; Xu, Qian; Wu, Min-Feng; Zhou, Qian-Yi; Hu, Yue; Zhou, Xuan; Li, Xian-Tao
2017-11-01
To assess the methodological quality of systematic reviews and meta-analyses regarding acupuncture intervention for stroke and the primary studies within them. Two researchers searched PubMed, Cumulative index to Nursing and Allied Health Literature, Embase, ISI Web of Knowledge, Cochrane, Allied and Complementary Medicine, Ovid Medline, Chinese Biomedical Literature Database, China National Knowledge Infrastructure, Wanfang and Traditional Chinese Medical Database to identify systematic reviews and meta-analyses about acupuncture for stroke published from the inception to December 2016. Review characteristics and the criteria for assessing the primary studies within reviews were extracted. The methodological quality of the reviews was assessed using adapted Oxman and Guyatt Scale. The methodological quality of primary studies was also assessed. Thirty-two eligible reviews were identified, 15 in English and 17 in Chinese. The English reviews were scored higher than the Chinese reviews (P=0.025), especially in criteria for avoiding bias and the scope of search. All reviews used the quality criteria to evaluate the methodological quality of primary studies, but some criteria were not comprehensive. The primary studies, in particular the Chinese reviews, had problems with randomization, allocation concealment, blinding, dropouts and withdrawals, intent-to-treat analysis and adverse events. Important methodological flaws were found in Chinese systematic reviews and primary studies. It was necessary to improve the methodological quality and reporting quality of both the systematic reviews published in China and primary studies on acupuncture for stroke.
Peak flood estimation using gene expression programming
NASA Astrophysics Data System (ADS)
Zorn, Conrad R.; Shamseldin, Asaad Y.
2015-12-01
As a case study for the Auckland Region of New Zealand, this paper investigates the potential use of gene-expression programming (GEP) in predicting specific return period events in comparison to the established and widely used Regional Flood Estimation (RFE) method. Initially calibrated to 14 gauged sites, the GEP-derived model was further validated against 10- and 100-year flood events with relative errors of 29% and 18%, respectively. This is compared to the RFE method, which gave errors of 48% and 44% for the same flood events. While the effectiveness of GEP in predicting specific return period events is made apparent, it is argued that the derived equations should be used in conjunction with those existing methodologies rather than as a replacement.
NASA Astrophysics Data System (ADS)
Bruzzone, Agostino G.; Revetria, Roberto; Simeoni, Simone; Viazzo, Simone; Orsoni, Alessandra
2004-08-01
In logistics and industrial production, managers must deal with the impact of stochastic events to improve performance and reduce costs. In fact, production and logistics systems are generally designed considering some parameters as deterministically distributed. While this assumption is mostly used for preliminary prototyping, it is sometimes also retained during the final design stage, especially for estimated parameters (i.e. market request). The proposed methodology can determine the impact of stochastic events on the system by evaluating the chaotic threshold level. Such an approach, based on the application of a new and innovative methodology, can be implemented to find the condition under which chaos makes the system uncontrollable. Starting from problem identification and risk assessment, several classification techniques are used to carry out an effect analysis and contingency plan estimation. In this paper the authors illustrate the methodology with respect to a real industrial case: a production problem related to the logistics of distributed chemical processing.
Single Event Burnout in DC-DC Converters for the LHC Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Claudio H. Rivetta et al.
High voltage transistors in DC-DC converters are prone to catastrophic Single Event Burnout in the LHC radiation environment. This paper presents a systematic methodology to analyze single event effects sensitivity in converters and proposes solutions based on de-rating input voltage and output current or voltage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scrucca, Flavio; Severi, Claudio; Galvan, Nicola
Nowadays, public and private agencies are paying increasing attention to the sustainability performance of events, since it is recognized as a key issue in the context of sustainable development. Assessing the sustainability performance of events involves environmental, social and economic aspects; their impacts are complex and a quantitative assessment is often difficult. This paper presents a new quali-quantitative method developed to measure the sustainability of events, taking into account all their potential impacts. The 2014 World Orienteering Championship, held in Italy, was selected to test the proposed evaluation methodology. The total carbon footprint of the event was 165.34 tCO2eq and the avoided emissions were estimated as being 46 tCO2eq. The adopted quali-quantitative method proved to be efficient in assessing the sustainability impacts and can be applied for the evaluation of similar events. - Highlights: • A quali-quantitative method to assess events' sustainability is presented. • All the methodological issues related to the method are explained. • The method is used to evaluate the sustainability of an international sports event. • The method proved valid for assessing the event's sustainability level. • The carbon footprint of the event has been calculated.
Model Based Verification of Cyber Range Event Environments
2015-12-10
Model Based Verification of Cyber Range Event Environments. Suresh K. Damodaran, MIT Lincoln Laboratory, 244 Wood St., Lexington, MA, USA. ...apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment... Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error
The IMI PROTECT project: purpose, organizational structure, and procedures.
Reynolds, Robert F; Kurz, Xavier; de Groot, Mark C H; Schlienger, Raymond G; Grimaldi-Bensouda, Lamiae; Tcherny-Lessenot, Stephanie; Klungel, Olaf H
2016-03-01
The Pharmacoepidemiological Research on Outcomes of Therapeutics by a European ConsorTium (PROTECT) initiative was a collaborative European project that sought to address limitations of current methods in the field of pharmacoepidemiology and pharmacovigilance. Initiated in 2009 and ending in 2015, PROTECT was part of the Innovative Medicines Initiative, a joint undertaking by the European Union and pharmaceutical industry. Thirty-five partners including academics, regulators, small and medium enterprises, and European Federation of Pharmaceutical Industries and Associations companies contributed to PROTECT. Two work packages within PROTECT implemented research examining the extent to which differences in the study design, methodology, and choice of data source can contribute to producing discrepant results from observational studies on drug safety. To evaluate the effect of these differences, the project applied different designs and analytic methodology for six drug-adverse event pairs across several electronic healthcare databases and registries. This paper introduces the organizational structure and procedures of PROTECT, including how drug-adverse event pairs and data sources were selected, how study design and analysis documents were developed, and how results were managed centrally. Copyright © 2016 John Wiley & Sons, Ltd.
Investigating Nanoscale Electrochemistry with Surface- and Tip-Enhanced Raman Spectroscopy.
Zaleski, Stephanie; Wilson, Andrew J; Mattei, Michael; Chen, Xu; Goubert, Guillaume; Cardinal, M Fernanda; Willets, Katherine A; Van Duyne, Richard P
2016-09-20
The chemical sensitivity of surface-enhanced Raman spectroscopy (SERS) methodologies allows for the investigation of heterogeneous chemical reactions with high sensitivity. Specifically, SERS methodologies are well-suited to study electron transfer (ET) reactions, which lie at the heart of numerous fundamental processes: electrocatalysis, solar energy conversion, energy storage in batteries, and biological events such as photosynthesis. Heterogeneous ET reactions are commonly monitored by electrochemical methods such as cyclic voltammetry, observing billions of electrochemical events per second. Since the first proof of detecting single molecules by redox cycling, there has been growing interest in examining electrochemistry at the nanoscale and single-molecule levels. Doing so unravels details that would otherwise be obscured by an ensemble experiment. The use of optical spectroscopies, such as SERS, to elucidate nanoscale electrochemical behavior is an attractive alternative to traditional approaches such as scanning electrochemical microscopy (SECM). While techniques such as single-molecule fluorescence or electrogenerated chemiluminescence have been used to optically monitor electrochemical events, SERS methodologies, in particular, have shown great promise for exploring electrochemistry at the nanoscale. SERS is ideally suited to study nanoscale electrochemistry because the Raman-enhancing metallic, nanoscale substrate duly serves as the working electrode material. Moreover, SERS has the ability to directly probe single molecules without redox cycling and can achieve nanoscale spatial resolution in combination with super-resolution or scanning probe microscopies. This Account summarizes the latest progress from the Van Duyne and Willets groups toward understanding nanoelectrochemistry using Raman spectroscopic methodologies. The first half of this Account highlights three techniques that have been recently used to probe few- or single-molecule electrochemical events: single-molecule SERS (SMSERS), superlocalization SERS imaging, and tip-enhanced Raman spectroscopy (TERS). While all of the studies we discuss probe model redox dye systems, the experiments described herein push the study of nanoscale electrochemistry toward the fundamental limit, in terms of both chemical sensitivity and spatial resolution. The second half of this Account discusses current experimental strategies for studying nanoelectrochemistry with SERS techniques, which includes relevant electrochemically and optically active molecules, substrates, and substrate functionalization methods. In particular, we highlight the wide variety of SERS-active substrates and optically active molecules that can be implemented for EC-SERS, as well as the need to carefully characterize both the electrochemistry and resultant EC-SERS response of each new redox-active molecule studied. Finally, we conclude this Account with our perspective on the future directions of studying nanoscale electrochemistry with SERS/TERS, which includes the integration of SECM with TERS and the use of theoretical methods to further describe the fundamental intricacies of single-molecule, single-site electrochemistry at the nanoscale.
NASA Technical Reports Server (NTRS)
Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.; Kahut, P.; Sekely, R.; Weiler, J.
1979-01-01
The appendices for the cross impact methodology are presented. These include: user's guide, telecommunication events, cross impacts, projection of historical trends, and projection of trends in satellite communications.
ERIC Educational Resources Information Center
Whitelaw, Paul A.; Wrathall, Jeffrey
2015-01-01
Purpose: The purpose of this paper is to reflect upon the stakeholder, scholarly, academic and jurisdictional influences on course development for a vocationally oriented bachelor's degree. Design/methodology/approach: This paper takes the form of a case study. Findings: Vocationally oriented bachelor's courses can be developed, especially when…
Azoulay, Laurent; Suissa, Samy
2017-05-01
Recent randomized trials have compared the newer antidiabetic agents with treatments involving sulfonylureas, drugs that some observational studies, with conflicting results, have associated with increased cardiovascular risk and mortality. We reviewed the methodology of these observational studies by searching MEDLINE from inception to December 2015 for all studies of the association between sulfonylureas and cardiovascular events or mortality. Each study was appraised with respect to the comparator, the outcome, and study design-related sources of bias. A meta-regression analysis was used to evaluate heterogeneity. A total of 19 studies were identified, of which six had no major design-related biases. Sulfonylureas were associated with an increased risk of cardiovascular events and mortality in five of these studies (relative risks 1.16-1.55). Overall, the 19 studies resulted in 36 relative risks as some studies assessed multiple outcomes or comparators. Of the 36 analyses, metformin was the comparator in 27 (75%) and death was the outcome in 24 (67%). The relative risk was higher by 13% when the comparator was metformin, by 20% when death was the outcome, and by 7% when the studies had design-related biases. The lowest predicted relative risk was for studies with no major bias, comparator other than metformin, and cardiovascular outcome (1.06 [95% CI 0.92-1.23]), whereas the highest was for studies with bias, metformin comparator, and mortality outcome (1.53 [95% CI 1.43-1.65]). In summary, sulfonylureas were associated with an increased risk of cardiovascular events and mortality in the majority of studies with no major design-related biases. Among studies with important biases, the association varied significantly with respect to the comparator, the outcome, and the type of bias. With the introduction of new antidiabetic drugs, the use of appropriate design and analytical tools will provide a more accurate assessment of their cardiovascular safety in the real-world setting. © 2017 by the American Diabetes Association.
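As a rough illustration of the meta-regression step described above (not the authors' code, and with invented numbers), the sketch below regresses hypothetical log relative risks on study-level indicators for comparator, outcome, and design bias, weighting each estimate by its inverse variance.

```python
# A minimal sketch of an inverse-variance weighted meta-regression of log
# relative risks on study-level covariates; all estimates and standard errors
# below are hypothetical, not taken from the review.
import numpy as np
import statsmodels.api as sm

log_rr = np.log([1.16, 1.25, 1.55, 1.05, 1.40, 1.10])
se     = np.array([0.08, 0.10, 0.12, 0.09, 0.15, 0.07])
metformin = np.array([1, 1, 1, 0, 1, 0])   # comparator is metformin
death     = np.array([1, 0, 1, 0, 1, 1])   # outcome is mortality
biased    = np.array([0, 1, 1, 0, 1, 0])   # major design-related bias present

X = sm.add_constant(np.column_stack([metformin, death, biased]))
weights = 1.0 / se**2                      # inverse-variance weights
fit = sm.WLS(log_rr, X, weights=weights).fit()

# Exponentiated coefficients approximate the multiplicative change in the
# relative risk associated with each study characteristic.
print(np.exp(fit.params))
```

Exponentiating the fitted coefficients gives the approximate multiplicative difference in relative risk associated with each study characteristic, which is the form in which the abstract reports its 13%, 20% and 7% differences.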
DOT National Transportation Integrated Search
2016-09-01
This project applies a decision analytic methodology that takes extreme weather events into consideration to quantify and assess canopy investment options. The project collected data for two case studies in two different transit agencies: Chicago Tra...
The Relationship between State Policy Levers and Student Mobility
ERIC Educational Resources Information Center
Gross, Jacob P. K.; Berry, Matthew S.
2016-01-01
To address conceptual and methodological shortcomings in the extant literature on student mobility, this study employs event history modeling to describe and explain how state policy levers, specifically state grant aid, relates to mobility and baccalaureate degree completion. We find that state grant aid reduces mobility, but less so than…
Adjustment to Widowhood and Divorce: A Review.
ERIC Educational Resources Information Center
Kitson, Gay C.; And Others
1989-01-01
Examines studies of adjustment to widowhood and/or divorce and points out those places where findings are similar or different. Explores impact upon adjustment of cause of death or divorce, timing of event, demographic correlates, economic issues, social support, and attachment. Concludes with discussion of methodological issues and topics for…
Affective Responses of Students Who Witness Classroom Cheating
ERIC Educational Resources Information Center
Firmin, Michael W.; Burger, Amanda; Blosser, Matthew
2009-01-01
For this study, 82 general psychology students (51 females, 31 males) witnessed a peer cheating while completing a test. Following the incident, we tape recorded semi-structured interviews with each student who saw the cheating event for later analysis. Using qualitative coding and methodology, themes emerged regarding students' emotional…
NASA Astrophysics Data System (ADS)
Omira, R.; Matias, L.; Baptista, M. A.
2016-12-01
This study constitutes a preliminary assessment of probabilistic tsunami inundation in the NE Atlantic region. We developed an event-tree approach to calculate the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height for a given exposure time. Only tsunamis of tectonic origin are considered here, taking into account local, regional, and far-field sources. The approach used here consists of an event-tree method that gathers probability models for seismic sources, tsunami numerical modeling, and statistical methods. It also includes a treatment of aleatoric uncertainties related to source location and tidal stage. Epistemic uncertainties are not addressed in this study. The methodology is applied to the coastal test-site of Sines located in the NE Atlantic coast of Portugal. We derive probabilistic high-resolution maximum wave amplitudes and flood distributions for the study test-site considering 100- and 500-year exposure times. We find that the probability that maximum wave amplitude exceeds 1 m somewhere along the Sines coasts reaches about 60 % for an exposure time of 100 years and is up to 97 % for an exposure time of 500 years. The probability of inundation occurrence (flow depth >0 m) varies between 10 % and 57 %, and from 20 % up to 95 % for 100- and 500-year exposure times, respectively. No validation has been performed here with historical tsunamis. This paper illustrates a methodology through a case study, which is not an operational assessment.
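As a hedged aside on how exposure-time probabilities of this kind are commonly obtained, the sketch below assumes that exceedance-producing tsunami sources follow a Poisson process with an aggregate annual rate; the rate used here is illustrative and is not taken from the study's event tree.

```python
# A minimal sketch: if tsunami sources producing a wave above a given height
# at the site occur as a Poisson process with aggregate annual rate `lam`,
# the probability of at least one exceedance in exposure time T is
# 1 - exp(-lam * T).
import math

def prob_exceedance(annual_rate, exposure_years):
    """Probability of >= 1 exceedance during the exposure time (Poisson)."""
    return 1.0 - math.exp(-annual_rate * exposure_years)

lam = 0.009  # events per year exceeding 1 m amplitude (illustrative only)
for T in (100, 500):
    print(T, round(prob_exceedance(lam, T), 2))
```

With the illustrative rate chosen here the 100- and 500-year values happen to fall near the figures quoted above; this is only meant to show the shape of the calculation, not to reproduce the study's event-tree result.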
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth; Campola, Michael; Xapsos, Michael
2017-01-01
We are investigating the application of classical reliability performance metrics combined with standard single event upset (SEU) analysis data. We expect to relate SEU behavior to system performance requirements. Our proposed methodology will provide better prediction of SEU responses in harsh radiation environments with confidence metrics. Keywords: single event upset (SEU), single event effect (SEE), field programmable gate array devices (FPGAs).
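The abstract does not spell out the calculation, so the following is only a generic sketch of how an SEU rate is often combined with a classical reliability metric: a per-device upset rate formed from an assumed per-bit cross-section, particle flux, and exposed bit count is inserted into an exponential reliability model. All numbers are hypothetical.

```python
# A minimal sketch of folding an SEU rate into a classical reliability metric.
# Cross-section, flux, and bit count are assumed values, not from the paper.
import math

sigma_bit = 1e-8   # upset cross-section per bit, cm^2/bit (assumed)
flux      = 1e-3   # particle flux, particles/cm^2/s (assumed)
n_bits    = 5e6    # configuration/user bits exposed (assumed)

lam = sigma_bit * flux * n_bits          # upsets per second for the device
mtbu = 1.0 / lam                          # mean time between upsets, seconds

def reliability(t_seconds):
    """Probability of surviving t seconds with no upset (exponential model)."""
    return math.exp(-lam * t_seconds)

print(f"MTBU ~ {mtbu / 3600:.1f} h, R(24 h) = {reliability(24 * 3600):.3f}")
```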
Dimension Determination of Precursive Stall Events in a Single Stage High Speed Compressor
NASA Technical Reports Server (NTRS)
Bright, Michelle M.; Qammar, Helen K.; Hartley, Tom T.
1996-01-01
This paper presents a study of the dynamics for a single-stage, axial-flow, high speed compressor core, specifically, the NASA Lewis rotor stage 37. Due to the overall blading design for this advanced core compressor, each stage has considerable tip loading and higher speed than most compressor designs, thus, the compressor operates closer to the stall margin. The onset of rotating stall is explained as bifurcations in the dynamics of axial compressors. Data taken from the compressor during a rotating stall event is analyzed. Through the use of a box-assisted correlation dimension methodology, the attractor dimension is determined during the bifurcations leading to rotating stall. The intent of this study is to examine the behavior of precursive stall events so as to predict the entrance into rotating stall. This information may provide a better means to identify, avoid or control the undesirable event of rotating stall formation in high speed compressor cores.
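As a hedged sketch of the quantity being estimated (the paper uses a box-assisted algorithm for efficiency; this brute-force version is only illustrative), the code below delay-embeds a synthetic scalar series and estimates a correlation dimension from the slope of the log correlation sum.

```python
# A minimal sketch of the Grassberger-Procaccia correlation sum on a
# delay-embedded synthetic series. The paper's box-assisted variant is not
# reproduced; this brute-force version only illustrates the estimated quantity.
import numpy as np

def delay_embed(x, dim, tau):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def correlation_sum(points, r):
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    n = len(points)
    return (np.sum(d < r) - n) / (n * (n - 1))   # exclude self-pairs

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 40, 1000)) + 0.02 * rng.standard_normal(1000)
emb = delay_embed(x, dim=3, tau=5)

radii = np.logspace(-1.0, 0.0, 8)
C = np.array([correlation_sum(emb, r) for r in radii])
slope = np.polyfit(np.log(radii), np.log(C), 1)[0]
print(f"estimated correlation dimension ~ {slope:.2f}")
```

For this noisy near-periodic signal the estimate falls near 1, consistent with an attractor that is close to a closed curve; a rising dimension estimate during a stall precursor would indicate increasingly complex dynamics.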
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durant, W.S.; Robinette, R.J.; Kirchner, J.R.
1994-03-01
In essence, this study was envisioned as the "combination" of existing accident dose and risk calculations from safety analyses of individual facilities. However, because of the extended time period over which the safety analyses were prepared, calculational assumptions and methodologies differed between the analyses. The scope of this study therefore included the standardization of assumptions and calculations as necessary to ensure that the analytical logic was consistent for all the facilities. Each of the nonseismic external events considered in the analyses is addressed in an individual section of this report. In Section 2, extreme straight-line winds are examined. Section 3 addresses tornadoes, and Section 4 addresses other external events [floods, other extreme weather events (lightning, hail, and extremes in temperature or precipitation), vehicle impact, accidents involving adjacent facilities, aircraft impact, and meteorite impact]. Section 5 provides a summary of the general conclusions of the report.
Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.
2013-01-01
Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689
Issues in Researching Self-Regulated Learning as Patterns of Events
ERIC Educational Resources Information Center
Winne, Philip H.
2014-01-01
New methods for gathering and analyzing data about events that comprise self-regulated learning (SRL) support discoveries about patterns among events and tests of hypotheses about roles patterns play in learning. Five such methodologies are discussed in the context of four key questions that shape investigations into patterns in SRL. A framework…
Application of Ground Based Microwave Radiometry for Characterizing Tropical Convection
NASA Astrophysics Data System (ADS)
Renju, R.; Raju, C. S.
2016-12-01
The characterization of the microphysical and thermodynamical properties of convective events over the tropical coastal station Thiruvananthapuram (TVM, 8.5° N, 76.9° E) has been carried out by utilizing multiyear Microwave Radiometer Profiler (MRP) observations. The analyses have been extended to develop a methodology to identify convective events based on the radiometric brightness temperature (Tb) difference between the 30 GHz and 22.5 GHz channels, and the results are compared with reflectivity and rainfall intensity deduced from concurrent and collocated disdrometer measurements. In all, 84 such convective events were identified over the station for the period 2010-2013, during both pre- and post-monsoon months of the Indian summer monsoon, and were further evaluated by computing their stability indices. The occurrence of convection over this coastal station peaks in the afternoon and early morning hours, with genesis, respectively, over the land and the sea. Convective events were fewer during a monsoon-deficit year, and stronger and more numerous during a year of heavy monsoon rainfall. These findings are further evaluated against the percentage occurrence of fractional convective clouds derived from the microwave payload SAPHIR on the Megha-Tropiques satellite. Based on these analyses, the frequency of occurrence of convection can be related to the monsoon rainfall over the region. The analyses also indicate that the microwave radiometric brightness temperature of the humidity channels reflects the type of convection and responds about two hours prior to the occurrence of rainfall. In addition, the multi-angle observations of the microwave radiometer profiler have been utilized to study the propagation of convective systems. This study and the methodology developed for identifying convection have significance for microwave (Ka- and W-band) satellite propagation characterization, since convection and precipitation are the major hindrances to satellite communication over the tropical region.
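A minimal sketch of the channel-difference idea follows, with synthetic brightness temperatures and an illustrative threshold; the study's actual detection criterion, sign convention, and threshold value are not reproduced here.

```python
# A minimal sketch of flagging convective periods from the difference between
# two radiometer channels. Threshold and data are hypothetical.
import numpy as np

def flag_convection(tb_30ghz, tb_22ghz, threshold_k=15.0):
    """Return a boolean mask of times whose Tb difference exceeds threshold."""
    tb_30ghz = np.asarray(tb_30ghz, dtype=float)
    tb_22ghz = np.asarray(tb_22ghz, dtype=float)
    return (tb_30ghz - tb_22ghz) > threshold_k

# Synthetic example: a brief convective spike in the 30 GHz channel.
t = np.arange(100)
tb22 = 40 + 2 * np.sin(t / 10)
tb30 = tb22 + np.where((t > 60) & (t < 70), 25.0, 3.0)
print(np.flatnonzero(flag_convection(tb30, tb22)))
```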
A dimensionless approach for the runoff peak assessment: effects of the rainfall event structure
NASA Astrophysics Data System (ADS)
Gnecco, Ilaria; Palla, Anna; La Barbera, Paolo
2018-02-01
The present paper proposes a dimensionless analytical framework to investigate the impact of the rainfall event structure on the hydrograph peak. To this end a methodology to describe the rainfall event structure is proposed based on the similarity with the depth-duration-frequency (DDF) curves. The rainfall input consists of a constant hyetograph in which all the possible outcomes in the sample space of the rainfall structures can be condensed. Soil abstractions are modelled using the Soil Conservation Service method, and instantaneous unit hydrograph theory is used to determine the dimensionless form of the hydrograph; the two-parameter gamma distribution is selected to test the proposed methodology. The dimensionless approach is introduced in order to make the analytical framework applicable to any study case (i.e. natural catchment) for which the model assumptions are valid (i.e. linear causative and time-invariant system). A set of analytical expressions is derived in the case of a constant-intensity hyetograph to assess the maximum runoff peak with respect to a given rainfall event structure, irrespective of the specific catchment (such as the return period associated with the reference rainfall event). Looking at the results, the curve of the maximum values of the runoff peak reveals a local minimum point corresponding to the design hyetograph derived according to the statistical DDF curve. A specific catchment application is discussed in order to illustrate the implications of the dimensionless procedure and to provide some numerical examples of rainfall structures with respect to observed rainfall events; finally their effects on the hydrograph peak are examined.
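As a hedged, dimensional (not dimensionless) illustration of the modelling chain described above, the sketch below applies SCS Curve Number abstraction to a constant-intensity hyetograph and convolves the effective rainfall with a two-parameter gamma instantaneous unit hydrograph; all parameter values are hypothetical.

```python
# A minimal sketch of the SCS abstraction + gamma IUH convolution chain.
# Curve Number, storm intensity, and IUH parameters are assumed values.
import numpy as np
from scipy.stats import gamma

def scs_runoff_depth(P, CN):
    """Cumulative runoff depth (mm) for cumulative rainfall P (mm), SCS-CN."""
    S = 25400.0 / CN - 254.0          # potential maximum retention, mm
    Ia = 0.2 * S                       # initial abstraction, mm
    return np.where(P > Ia, (P - Ia) ** 2 / (P - Ia + S), 0.0)

dt = 0.1                               # time step, hours
t = np.arange(0, 24, dt)
intensity = np.where(t < 3.0, 20.0, 0.0)          # 20 mm/h for 3 h (assumed)
cum_runoff = scs_runoff_depth(np.cumsum(intensity) * dt, CN=75)
effective = np.diff(cum_runoff, prepend=0.0)      # effective rainfall per step, mm

# Two-parameter gamma IUH (shape 3, scale 1.5 h), unit area under the curve.
iuh = gamma(a=3.0, scale=1.5).pdf(t)              # 1/h
hydrograph = np.convolve(effective, iuh)[: len(t)]  # mm/h per unit catchment area
print(f"runoff peak ~ {hydrograph.max():.2f} mm/h at t = {t[hydrograph.argmax()]:.1f} h")
```

Repeating this calculation while redistributing the same rainfall depth over different durations (i.e. different event structures) is the kind of sweep from which a curve of maximum runoff peaks can be built.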
An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.
Nguyen, Ngan; Watson, William D; Dominguez, Edward
2016-01-01
Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called event-based approach to training (EBAT), to guide the design of simulation for teamwork training and discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., competencies being trained and learning objectives) and performance assessment. The EBAT process was applied as follows. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 knowledge, skills, and attitudes (KSAs) were defined based on review of published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on existing literature and input gathered from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately. It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) it provides formative and constructive feedback to bridge the gap between the learners' KSAs and the targeted KSAs. The EBAT methodology guides the design of simulation that incorporates these 4 features and, thus, enhances training effectiveness with simulation. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
The cost of nurse-sensitive adverse events.
Pappas, Sharon Holcombe
2008-05-01
The aim of this study was to describe the methodology for nursing leaders to determine the cost of adverse events and effective levels of nurse staffing. The growing transparency of quality and cost outcomes motivates healthcare leaders to optimize the effectiveness of nurse staffing. Most hospitals have robust cost accounting systems that provide actual patient-level direct costs. These systems allow an analysis of the cost consumed by patients during a hospital stay. By knowing the cost of complications, leaders have the ability to justify the cost of improved staffing when quality evidence shows that higher nurse staffing improves quality. An analysis was performed on financial and clinical data from hospital databases of 3,200 inpatients. The purpose was to establish a methodology to determine actual cost per case. Three diagnosis-related groups were the focus of the analysis. Five adverse events were analyzed along with the costs. A regression analysis reported that the actual direct cost of an adverse event was $1,029 per case in the congestive heart failure cases and $903 in the surgical cases. There was a significant increase in the cost per case in medical patients with urinary tract infection and pressure ulcers and in surgical patients with urinary tract infection and pneumonia. The odds of pneumonia occurring in surgical patients decreased with additional registered nurse hours per patient day. Hospital cost accounting systems are useful in determining the cost of adverse events and can aid in decision making about nurse staffing. Adverse events add costs to patient care and should be measured at the unit level to adjust staffing to reduce adverse events and avoid costs.
Edmundson, Sarah; Stuenkel, Diane L; Connolly, Phyllis M
2005-09-01
Anticoagulation therapy is a life-enhancing therapy for patients who are at risk for embolic events secondary to atrial fibrillation, valve replacement, and other comorbidities. Clinicians are motivated to decrease the amount of time that patients are either under- or over-anticoagulated, common conditions that decrease patient safety at either extreme. The primary purpose of this descriptive study was to examine the relationship between personal life event factors as measured by Norbeck's Life Events Questionnaire, core demographics such as age and income, and anticoagulation regulation. Although many factors affect anticoagulation therapy, the precise impact of life events, positive or negative, is unknown. The salient findings of this study (n = 202) showed a small, though statistically significant, inverse relationship (r = -0.184, P < .01) between negative life events and decreased time within therapeutic international normalized ratio. Total Life Event scores showed a statistically significant inverse relationship (r = -0.159, P < .05) to international normalized ratio time within therapeutic level. Lower income was inversely associated with higher negative Life Event scores (r = -0.192, P < .01). The findings demonstrate the need for strategies that address the potential impact of life events in conjunction with coexisting screening measures used in anticoagulation clinics. Implications for this study are limited by lack of methodology documenting concurrent social support factors and limitations of the research tool to reflect life event issues specific to outpatient seniors.
NASA Astrophysics Data System (ADS)
Lawler, D. M.
2008-01-01
In most episodic erosion and deposition systems, knowledge of the timing of geomorphological change, in relation to fluctuations in the driving forces, is crucial to strong erosion process inference, and model building, validation and development. A challenge for geomorphology, however, is that few studies have focused on geomorphological event structure (timing, magnitude, frequency and duration of individual erosion and deposition events), in relation to applied stresses, because of the absence of key monitoring methodologies. This paper therefore (a) presents full details of a new erosion and deposition measurement system — PEEP-3T — developed from the Photo-Electronic Erosion Pin sensor in five key areas, including the addition of nocturnal monitoring through the integration of the Thermal Consonance Timing (TCT) concept, to produce a continuous sensing system; (b) presents novel high-resolution datasets from the redesigned PEEP-3T system for river bank systems of the Rivers Nidd and Wharfe, northern England, UK; and (c) comments on their potential for wider application throughout geomorphology to address these key measurement challenges. Relative to manual methods of erosion and deposition quantification, continuous PEEP-3T methodologies increase the temporal resolution of erosion/deposition event detection by more than three orders of magnitude (better than 1-second resolution if required), and this facility can significantly enhance process inference. Results show that river banks are highly dynamic thermally and respond quickly to radiation inputs. Data on bank retreat timing, fixed with PEEP-3T TCT evidence, confirmed that retreat events were significantly delayed, occurring up to 55 h after flood peaks. One event occurred 13 h after emergence from the flow. This suggests that mass failure processes rather than fluid entrainment dominated the system. It is also shown how, by integrating turbidity instrumentation with TCT ideas, linkages between sediment supply and sediment flux can be forged at event timescales, and a lack of sediment exhaustion was evident here. Five challenges for wider geomorphological process investigation are discussed. This event-based dynamics approach, based on continuous monitoring methodologies, appears to have considerable wider potential for stronger process inference and model testing and validation in many areas of geomorphology.
Kirby, K C; Bickel, W K
1995-01-01
We review four articles from JEAB's March 1994 issue celebrating the contributions of Joseph V. Brady. These articles have implications for studying private events and for studying multiple operants. We suggest that regularly including self-reports about private events in behavioral pharmacological research has resulted in an accumulated knowledge that has facilitated examination of interesting relations among self-reports, environmental factors, and other observable behaviors. Methodological lessons that behavioral pharmacologists have learned regarding the study of multiple operants are also relayed. We provide examples of how these lessons could be useful to applied behavior analysts studying nonpharmacological issues. PMID:7706145
Network hydraulics inclusion in water quality event detection using multiple sensor stations data.
Oliker, Nurit; Ostfeld, Avi
2015-09-01
Event detection is one of the most challenging current topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines multiple sensor stations data with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and its dataset. Single sensor station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploration of the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.
Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.
2016-02-16
Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
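As a hedged sketch of the risk-ratio estimate and a bootstrap one-sided lower bound (the paper's specific quantile rescaling and its treatment of the infinite-risk-ratio case are not reproduced here), using synthetic stand-ins for the factual and counterfactual ensembles:

```python
# A minimal sketch of estimating a risk ratio between "factual" and
# "counterfactual" ensembles for a fixed event threshold, with a one-sided
# bootstrap lower confidence bound. Data and threshold are synthetic.
import numpy as np

rng = np.random.default_rng(1)
factual        = rng.normal(1.0, 1.0, size=400)   # e.g. seasonal anomalies
counterfactual = rng.normal(0.3, 1.0, size=400)
threshold = 2.0                                    # event definition (assumed)

def risk_ratio(x1, x0, thr):
    p1 = np.mean(x1 > thr)
    p0 = np.mean(x0 > thr)
    return np.inf if p0 == 0 else p1 / p0

rr_hat = risk_ratio(factual, counterfactual, threshold)

# One-sided 90% lower bound via bootstrap resampling of both ensembles.
boots = []
for _ in range(2000):
    f = rng.choice(factual, size=factual.size, replace=True)
    c = rng.choice(counterfactual, size=counterfactual.size, replace=True)
    boots.append(risk_ratio(f, c, threshold))
lower = np.quantile(np.array(boots), 0.10)
print(f"RR = {rr_hat:.2f}, one-sided 90% lower bound ~ {lower:.2f}")
```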
Cooke, Megan E.; Meyers, Jacquelyn L.; Latvala, Antti; Korhonen, Tellervo; Rose, Richard J.; Kaprio, Jaakko; Salvatore, Jessica E.; Dick, Danielle M.
2016-01-01
The purpose of this study was to address two methodological issues that have called into question whether previously reported gene-environment interaction (GxE) effects for adolescent alcohol use are “real.” These issues are (1) the potential correlation between the environmental moderator and the outcome across twins and (2) non-linear transformations of the behavioral outcome. Three environments that have been previously reported on (peer deviance, parental knowledge, and potentially stressful life events) were examined here. For each moderator (peer deviance, parental knowledge, and potentially stressful life events), a series of models was fit to both a raw and transformed measure of monthly adolescent alcohol use in a sample that included 825 DZ and 803 MZ twin pairs. The results showed that the moderating effect of peer deviance was robust to transformation, and that although the significance of moderating effects of parental knowledge and potentially stressful life events were dependent on the scale of the adolescent alcohol use outcome, the overall results were consistent across transformation. In addition, the findings did not vary across statistical models. The consistency of the peer deviance results and the shift of the parental knowledge and potentially stressful life events results between trending and significant, shed some light on why previous findings for certain moderators have been inconsistent and emphasize the importance of considering both methodological issues and previous findings when conducting and interpreting GxE analyses. PMID:26290350
The Effect of Interruptions on Part 121 Air Carrier Operations
NASA Technical Reports Server (NTRS)
Damos, Diane L.
1998-01-01
The primary purpose of this study was to determine the relative priorities of various events and activities by examining the probability that a given activity was interrupted by a given event. The analysis will begin by providing frequency of interruption data by crew position (captain versus first officer) and event type. Any differences in the pattern of interruptions between the first officers and the captains will be explored and interpreted in terms of standard operating procedures. Subsequent data analyses will focus on comparing the frequency of interruptions for different types of activities and for the same activities under normal versus emergency conditions. Briefings and checklists will receive particular attention. The frequency with which specific activities are interrupted under multiple- versus single-task conditions also will be examined; because the majority of multiple-task data were obtained under laboratory conditions, LOFT-type tapes offer a unique opportunity to examine concurrent task performance under 'real-world' conditions. A second purpose of this study is to examine the effects of the interruptions on performance. More specifically, when possible, the time to resume specific activities will be compared to determine if pilots are slower to resume certain types of activities. Errors in resumption or failures to resume specific activities will be noted and any patterns in these errors will be identified. Again, particular attention will be given to the effects of interruptions on the completion of checklists and briefings. Other types of errors and missed events (i.e., the crew should have responded to the event but did not) will be examined. Any methodology using interruptions to examine task prioritization must be able to identify when an interruption has occurred and describe the ongoing activities that were interrupted. Both of these methodological problems are discussed in detail in the following section.
Using NIAM to capture time dependencies in a domain of discourse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, S.D.
1994-07-01
This paper addresses the issues surrounding the use of NIAM to capture time dependencies in a domain of discourse. The NIAM concepts that support capturing time dependencies are in the event and process portions of the NIAM metamodel, which are the portions most poorly supported by a well-established methodology. This lack of methodological support is a potentially serious handicap in any attempt to apply NIAM to a domain of discourse in which time dependencies are a central issue. However, the capability that NIAM provides for validating and verifying the elementary facts in the domain may reduce the magnitude of the event/process-specification task to a level at which it could be effectively handled even without strong methodological support.
Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López
2014-12-01
This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distributions. Thus, this approach makes the methodology particularly useful in cases where the available data for quantification of hazardous events probabilities are scarce or nonexistent, there is dependence among events, or when nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
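The paper's hybrid Bayesian networks include continuous nodes; purely as a simplified illustration of the underlying idea, the sketch below evaluates a tiny discrete network for a hazardous-event chain by direct enumeration, with invented probabilities and event names.

```python
# A minimal sketch (not the paper's model): a small discrete Bayesian network
# leak -> ignition -> fire, evaluated by enumeration. Probabilities are
# purely illustrative.
from itertools import product

P_leak = {True: 0.02, False: 0.98}
P_ignition_given_leak = {True: {True: 0.10, False: 0.90},
                         False: {True: 0.001, False: 0.999}}
P_fire_given_ignition = {True: {True: 0.7, False: 0.3},
                         False: {True: 0.0, False: 1.0}}

def joint(leak, ignition, fire):
    return (P_leak[leak]
            * P_ignition_given_leak[leak][ignition]
            * P_fire_given_ignition[ignition][fire])

# Marginal probability of a fire event, and of a fire given that a leak occurred.
p_fire = sum(joint(l, i, True) for l, i in product([True, False], repeat=2))
p_fire_and_leak = sum(joint(True, i, True) for i in [True, False])
print(f"P(fire) = {p_fire:.5f}, P(fire | leak) = {p_fire_and_leak / P_leak[True]:.4f}")
```

In the hybrid setting described in the abstract, nodes such as leak size or release rate would instead carry continuous distributions, which is what makes the approach useful when discrete failure-rate data are scarce.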
Baselining PMU Data to Find Patterns and Anomalies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amidan, Brett G.; Follum, James D.; Freeman, Kimberly A.
This paper looks at the application of situational awareness methodologies with respect to power grid data. These methodologies establish baselines that look for typical patterns and atypical behavior in the data. The objectives of the baselining analyses are to provide: real-time analytics, the capability to look at historical trends and events, and reliable predictions of the near future state of the grid. Multivariate algorithms were created to establish normal baseline behavior and then score each moment in time according to its variance from the baseline. Detailed multivariate analytical techniques are described in this paper that produced ways to identify typical patterns and atypical behavior. In this case, atypical behavior is behavior that is unenvisioned. Visualizations were also produced to help explain the behavior that was identified mathematically. Examples are shown to help describe how to read and interpret the analyses and visualizations. Preliminary work has been performed on PMU data sets from BPA (Bonneville Power Administration) and EI (Eastern Interconnect). Actual results are not fully shown here because of confidentiality issues. Comparisons between atypical events found mathematically and actual events showed that many of the actual events are also atypical events; however there are many atypical events that do not correlate to any actual events. Additional work needs to be done to help classify the atypical events into actual events, so that the importance of the events can be better understood.
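As a minimal sketch of the baselining idea described above (not the authors' algorithms, and with synthetic stand-ins for PMU channels), a baseline mean and covariance are learned from a historical window and each new multivariate snapshot is scored by its Mahalanobis distance from that baseline.

```python
# A minimal sketch of multivariate baselining and atypicality scoring.
# The "PMU" channels here are synthetic random data.
import numpy as np

rng = np.random.default_rng(42)
history = rng.normal(size=(5000, 6))            # 6 synthetic PMU channels
mu = history.mean(axis=0)
cov = np.cov(history, rowvar=False)
cov_inv = np.linalg.pinv(cov)                   # pseudo-inverse for stability

def atypicality(snapshot):
    """Mahalanobis distance of one time-step vector from the baseline."""
    d = snapshot - mu
    return float(np.sqrt(d @ cov_inv @ d))

normal_snap = rng.normal(size=6)
event_snap = normal_snap + np.array([0, 0, 4.0, 0, -3.0, 0])  # injected anomaly
print(atypicality(normal_snap), atypicality(event_snap))
```

Thresholding this score over time yields the kind of "atypical event" flags that the paper compares against logged grid events.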
Wei, Ching-Yun; Quek, Ruben G W; Villa, Guillermo; Gandra, Shravanthi R; Forbes, Carol A; Ryder, Steve; Armstrong, Nigel; Deshpande, Sohan; Duffy, Steven; Kleijnen, Jos; Lindgren, Peter
2017-03-01
Previous reviews have evaluated economic analyses of lipid-lowering therapies using lipid levels as surrogate markers for cardiovascular disease. However, drug approval and health technology assessment agencies have stressed that surrogates should only be used in the absence of clinical endpoints. The aim of this systematic review was to identify and summarise the methodologies, weaknesses and strengths of economic models based on atherosclerotic cardiovascular disease event rates. Cost-effectiveness evaluations of lipid-lowering therapies using cardiovascular event rates in adults with hyperlipidaemia were sought in Medline, Embase, Medline In-Process, PubMed and NHS EED and conference proceedings. Search results were independently screened, extracted and quality checked by two reviewers. Searches until February 2016 retrieved 3443 records, from which 26 studies (29 publications) were selected. Twenty-two studies evaluated secondary prevention (four also assessed primary prevention), two considered only primary prevention and two included mixed primary and secondary prevention populations. Most studies (18) based treatment-effect estimates on single trials, although more recent evaluations deployed meta-analyses (5/10 over the last 10 years). Markov models (14 studies) were most commonly used and only one study employed discrete event simulation. Models varied particularly in terms of health states and treatment-effect duration. No studies used a systematic review to obtain utilities. Most studies took a healthcare perspective (21/26) and sourced resource use from key trials instead of local data. Overall, reporting quality was suboptimal. This review reveals methodological changes over time, but reporting weaknesses remain, particularly with respect to transparency of model reporting.
Economic Evaluations of the Health Impacts of Weather-Related Extreme Events: A Scoping Review
Schmitt, Laetitia H. M.; Graham, Hilary M.; White, Piran C. L.
2016-01-01
The frequency and severity of extreme events are expected to increase under climate change. There is a need to understand the economic consequences of human exposure to these extreme events, to underpin decisions on risk reduction. We undertook a scoping review of economic evaluations of the adverse health effects from exposure to weather-related extreme events. We searched PubMed, Embase and Web of Science databases with no restrictions to the type of evaluations. Twenty studies were included, most of which were recently published. Most studies have been undertaken in the U.S. (nine studies) or Asia (seven studies), whereas we found no studies in Africa, Central and Latin America, or the Middle East. Extreme temperatures accounted for more than a third of the pool of studies (seven studies), closely followed by flooding (six studies). No economic study was found on drought. Whilst studies were heterogeneous in terms of objectives and methodology, they clearly indicate that extreme events will become a pressing public health issue with strong welfare and distributional implications. The current body of evidence, however, provides little information to support decisions on the allocation of scarce resources between risk reduction options. In particular, the review highlights a significant lack of research attention to the potential cost-effectiveness of interventions that exploit the capacity of natural ecosystems to reduce our exposure to, or ameliorate the consequences of, extreme events. PMID:27834843
Model-Based Economic Evaluation of Treatments for Depression: A Systematic Literature Review.
Kolovos, Spyros; Bosmans, Judith E; Riper, Heleen; Chevreul, Karine; Coupé, Veerle M H; van Tulder, Maurits W
2017-09-01
An increasing number of model-based studies that evaluate the cost effectiveness of treatments for depression are being published. These studies have different characteristics and use different simulation methods. We aimed to systematically review model-based studies evaluating the cost effectiveness of treatments for depression and examine which modelling technique is most appropriate for simulating the natural course of depression. The literature search was conducted in the databases PubMed, EMBASE and PsycInfo between 1 January 2002 and 1 October 2016. Studies were eligible if they used a health economic model with quality-adjusted life-years or disability-adjusted life-years as an outcome measure. Data related to various methodological characteristics were extracted from the included studies. The available modelling techniques were evaluated based on 11 predefined criteria. This methodological review included 41 model-based studies, of which 21 used decision trees (DTs), 15 used cohort-based state-transition Markov models (CMMs), two used individual-based state-transition models (ISMs), and three used discrete-event simulation (DES) models. Just over half of the studies (54%) evaluated antidepressants compared with a control condition. The data sources, time horizons, cycle lengths, perspectives adopted and number of health states/events all varied widely between the included studies. DTs scored positively in four of the 11 criteria, CMMs in five, ISMs in six, and DES models in seven. There were substantial methodological differences between the studies. Since the individual history of each patient is important for the prognosis of depression, DES and ISM simulation methods may be more appropriate than the others for a pragmatic representation of the course of depression. However, direct comparisons between the available modelling techniques are necessary to yield firm conclusions.
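As a hedged illustration of the cohort-based state-transition (Markov) structure that most of the included studies used, the sketch below runs a three-state cohort model with invented transition probabilities, utilities, and discounting; it is not drawn from any of the reviewed models.

```python
# A minimal sketch of a cohort-based state-transition (Markov) model that
# accumulates discounted QALYs. All parameters are illustrative only.
import numpy as np

states = ["depressed", "remission", "dead"]
# Rows: current state; columns: next state (hypothetical annual probabilities).
P = np.array([[0.55, 0.40, 0.05],
              [0.25, 0.72, 0.03],
              [0.00, 0.00, 1.00]])
utility = np.array([0.6, 0.85, 0.0])   # QALY weight per year in each state
discount = 0.035

cohort = np.array([1.0, 0.0, 0.0])      # everyone starts in the depressed state
total_qalys = 0.0
for year in range(20):                  # 20-year horizon, 1-year cycles
    total_qalys += (cohort @ utility) / (1 + discount) ** year
    cohort = cohort @ P                 # advance the cohort one cycle
print(f"expected discounted QALYs per patient: {total_qalys:.2f}")
```

Individual-based state-transition and discrete-event simulation models replace the cohort vector with simulated patients whose transition probabilities can depend on their individual history, which is why the review judges them better suited to the prognosis of depression.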
Lessing, C; Schmitz, A; Schrappe, M
2012-02-01
The Harvard Medical Practice (HMP) Design is based on a multi-staged retrospective review of inpatient records and is used to assess the frequency of (preventable) adverse events ([P]AE) in large study populations. Up to now HMP studies have been conducted in 9 countries. Results differ widely, from 2.9% to 3.7% of patients with AE in the USA up to 16.6% in Australia. In our analysis we systematically compare the methodology of 9 HMP studies published in the English language and discuss possible impacts on reported frequencies. Modifications between HMP studies can be identified at each stage of planning, conducting, and reporting. In doing so, the 2 studies from the USA with the lowest rates of AE can be characterised by their liability context and the absence of screening for nosocomial infections. Studies with a high proportion of AE are marked by an intense training of reviewers. Further conclusions are hindered by divergences in defining periods of observation, by the presentation of frequencies as cumulative prevalences, and by differences in the reporting of study results. As a consequence, future HMP studies should aim for complete, consistent and transparent reporting. Further research should concentrate on advancing methods for collecting data on (P)AE. © Georg Thieme Verlag KG Stuttgart · New York.
Event-driven management algorithm of an Engineering documents circulation system
NASA Astrophysics Data System (ADS)
Kuzenkov, V.; Zebzeev, A.; Gromakov, E.
2015-04-01
A development methodology for an engineering document circulation system in a design company is reviewed. Discrete event-driven automaton models for describing project management algorithms are proposed, and the use of Petri nets for the dynamic design of projects is suggested.
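As a hedged illustration of the Petri-net idea mentioned above (the places, transitions, and document-review flow here are hypothetical, not the paper's model), the sketch below represents a marking as token counts and fires enabled transitions.

```python
# A minimal sketch of a marked Petri net for a document workflow. Place and
# transition names are invented for illustration.
from collections import Counter

marking = Counter({"draft_ready": 1, "reviewer_free": 1})

transitions = {
    "start_review": {"in": ["draft_ready", "reviewer_free"], "out": ["under_review"]},
    "approve":      {"in": ["under_review"], "out": ["approved", "reviewer_free"]},
}

def enabled(name):
    return all(marking[p] >= 1 for p in transitions[name]["in"])

def fire(name):
    if not enabled(name):
        raise ValueError(f"transition {name!r} is not enabled")
    for p in transitions[name]["in"]:
        marking[p] -= 1
    for p in transitions[name]["out"]:
        marking[p] += 1

fire("start_review")
fire("approve")
print(dict(marking))   # tokens now in 'approved' and 'reviewer_free'
```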
Impact Evaluation of the U.S. Department of Energy's Solar Decathlon Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnes, Harley
2012-12-01
This report presents the methodology and findings of an evaluation of DOE's Solar Decathlon event. The primary purpose of the evaluation is to determine how effective the Solar Decathlon event is in meeting its objectives.
Do prescription stimulants increase the risk of adverse cardiovascular events?: A systematic review
2012-01-01
Background There is increasing concern that prescription stimulants may be associated with adverse cardiovascular events such as stroke, myocardial infarction, and sudden death. Public health concerns are amplified by increasing use of prescription stimulants among adults. Methods The objective of this study was to conduct a systematic review of the evidence of an association between prescription stimulant use and adverse cardiovascular outcomes. PUBMED, MEDLINE, EMBASE and Google Scholar searches were conducted using key words related to these topics (MESH): ADHD; Adults; Amphetamine; Amphetamines; Arrhythmias, Cardiac; Cardiovascular Diseases; Cardiovascular System; Central Nervous Stimulants; Cerebrovascular; Cohort Studies; Case–control Studies; Death; Death, Sudden, Cardiac; Dextroamphetamine; Drug Toxicity; Methamphetamine; Methylphenidate; Myocardial Infarction; Stimulant; Stroke; Safety. Eligible studies were population-based studies of children, adolescents, or adults using prescription stimulant use as the independent variable and a hard cardiovascular outcome as the dependent variable. Results Ten population-based observational studies which evaluated prescription stimulant use with cardiovascular outcomes were reviewed. Six out of seven studies in children and adolescents did not show an association between stimulant use and adverse cardiovascular outcomes. In contrast, two out of three studies in adults found an association. Conclusions Findings of an association between prescription stimulant use and adverse cardiovascular outcomes are mixed. Studies of children and adolescents suggest that statistical power is limited in available study populations, and the absolute risk of an event is low. More suggestive of a safety signal, studies of adults found an increased risk for transient ischemic attack and sudden death/ventricular arrhythmia. Interpretation was limited due to differences in population, cardiovascular outcome selection/ascertainment, and methodology. Accounting for confounding and selection biases in these studies is of particular concern. Future studies should address this and other methodological issues. PMID:22682429
Modeling Single-Event Transient Propagation in a SiGe BiCMOS Direct-Conversion Receiver
NASA Astrophysics Data System (ADS)
Ildefonso, Adrian; Song, Ickhyun; Tzintzarov, George N.; Fleetwood, Zachary E.; Lourenco, Nelson E.; Wachter, Mason T.; Cressler, John D.
2017-08-01
The propagation of single-event transient (SET) signals in a silicon-germanium direct-conversion receiver carrying modulated data is explored. A theoretical analysis of transient propagation, verified by simulation, is presented. A new methodology to characterize and quantify the impact of SETs in communication systems carrying modulated data is proposed. The proposed methodology uses a pulsed radiation source to induce distortions in the signal constellation. The error vector magnitude due to SETs can then be calculated to quantify errors. Two different modulation schemes were simulated: QPSK and 16-QAM. The distortions in the constellation diagram agree with the presented circuit theory. Furthermore, the proposed methodology was applied to evaluate the improvements in the SET response due to a known radiation-hardening-by-design (RHBD) technique, where the common-base device of the low-noise amplifier was operated in inverse mode. The proposed methodology can be a valid technique to determine the most sensitive parts of a system carrying modulated data.
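As a hedged sketch of the error-vector-magnitude figure used in the proposed methodology (with synthetic QPSK symbols and an artificial burst standing in for a transient; not the authors' simulation setup):

```python
# A minimal sketch of computing EVM from ideal and received constellation
# symbols. The QPSK data and the transient-like distortion are synthetic.
import numpy as np

rng = np.random.default_rng(7)
bits = rng.integers(0, 2, size=(500, 2))
ideal = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)  # QPSK

# Received symbols: small noise everywhere plus a short burst of large error
# on a few symbols, mimicking the effect of a single-event transient.
received = ideal + 0.02 * (rng.standard_normal(500) + 1j * rng.standard_normal(500))
received[100:110] += 0.5 * np.exp(1j * np.pi / 3)

def evm_percent(ref, rx):
    """RMS error vector magnitude, normalized by RMS reference amplitude."""
    err = rx - ref
    return 100.0 * np.sqrt(np.mean(np.abs(err) ** 2) / np.mean(np.abs(ref) ** 2))

print(f"EVM = {evm_percent(ideal, received):.2f} %")
```

Comparing EVM with and without the injected burst, or across modulation schemes such as QPSK and 16-QAM, mirrors the way the abstract quantifies transient impact on modulated data.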
Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F.; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P.; McCrae, John; McCorkindale, Sheila; Leather, David
2016-01-01
Background: The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. Objective: The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. Achievements: The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Conclusion: Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd. PMID:27804174
Collier, Sue; Harvey, Catherine; Brewster, Jill; Bakerly, Nawar Diar; Elkhenini, Hanaa F; Stanciu, Roxana; Williams, Claire; Brereton, Jacqui; New, John P; McCrae, John; McCorkindale, Sheila; Leather, David
2017-03-01
The Salford Lung Study (SLS) programme, encompassing two phase III pragmatic randomised controlled trials, was designed to generate evidence on the effectiveness of a once-daily treatment for asthma and chronic obstructive pulmonary disease in routine primary care using electronic health records. The objective of this study was to describe and discuss the safety monitoring methodology and the challenges associated with ensuring patient safety in the SLS. Refinements to safety monitoring processes and infrastructure are also discussed. The study results are outside the remit of this paper. The results of the COPD study were published recently and a more in-depth exploration of the safety results will be the subject of future publications. The SLS used a linked database system to capture relevant data from primary care practices in Salford and South Manchester, two university hospitals and other national databases. Patient data were collated and analysed to create daily summaries that were used to alert a specialist safety team to potential safety events. Clinical research teams at participating general practitioner sites and pharmacies also captured safety events during routine consultations. Confidence in the safety monitoring processes over time allowed the methodology to be refined and streamlined without compromising patient safety or the timely collection of data. The information technology infrastructure also allowed additional details of safety information to be collected. Integration of multiple data sources in the SLS may provide more comprehensive safety information than usually collected in standard randomised controlled trials. Application of the principles of safety monitoring methodology from the SLS could facilitate safety monitoring processes for future pragmatic randomised controlled trials and yield important complementary safety and effectiveness data. © 2016 The Authors Pharmacoepidemiology and Drug Safety Published by John Wiley & Sons Ltd.
1981-01-01
Objective records of the occurrence of menstrual bleeding were compared with women's subjective assessments of the timing and duration of these events. The number of days a woman experienced bleeding during each episode was relatively constant; however, the length of the bleeding episode varied greatly among the 13 cultures studied. A greater understanding of menstrual patterns is possible if the pattern is seen as a succession of discrete events rather than as a whole. A more careful use of terminology relating to these discrete events would provide greater understanding of menstruation for the woman concerned and those advising her. The methodology employed in the collection of data about menstrual events among illiterate women is described and suggestions given as to how such information can be most efficiently obtained.
Regression Analysis of Mixed Recurrent-Event and Panel-Count Data with Additive Rate Models
Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L.
2015-01-01
Summary Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007; Zhao et al., 2011). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013). In this paper, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. PMID:25345405
Learning Authentic Leadership in New Zealand: A Learner-Centred Methodology and Evaluation
ERIC Educational Resources Information Center
Roche, Maree
2010-01-01
This study provides preliminary examination of the efficacy of the "Best Authentic Leadership Self" exercise. A field quasi-experimental design was conducted with a dual purpose: (1) to ascertain the value of interventions aimed at triggering events to enhance the learning (c.f. teaching) of "authentic leadership? and how this…
Expanding Downward: Innovation, Diffusion, and State Policy Adoptions of Universal Preschool
ERIC Educational Resources Information Center
Curran, F. Chris
2015-01-01
Framed within the theoretical framework of policy innovation and diffusion, this study explores both interstate (diffusion) and intrastate predictors of adoption of state universal preschool policies. Event history analysis methodology is applied to a state level dataset drawn from the Census, the NCES Common Core, the Book of the States, and…
A Longitudinal Analysis of PTSD Symptom Course: Delayed-Onset PTSD in Somalia Peacekeepers
ERIC Educational Resources Information Center
Gray, Matt J.; Bolton, Elisa E.; Litz, Brett T.
2004-01-01
Posttraumatic stress disorder (PTSD) typically follows an acute to chronic course. However, some trauma victims do not report significant symptoms until a period of time has elapsed after the event. Although originally dismissed as an artifact of retrospective methodologies, recent prospective studies document apparent instances of delayed-onset…
Morning blood pressure surge: pathophysiology, clinical relevance and therapeutic aspects
Bilo, Grzegorz; Grillo, Andrea; Guida, Valentina; Parati, Gianfranco
2018-01-01
Morning hours are the period of the day characterized by the highest incidence of major cardiovascular events including myocardial infarction, sudden death or stroke. They are also characterized by important neurohormonal changes, in particular, the activation of sympathetic nervous system which usually leads to a rapid increase in blood pressure (BP), known as morning blood pressure surge (MBPS). It was hypothesized that excessive MBPS may be causally involved in the pathogenesis of cardiovascular events occurring in the morning by inducing hemodynamic stress. A number of studies support an independent relationship of MBPS with organ damage, cerebrovascular complications and mortality, although some heterogeneity exists in the available evidence. This may be due to ethnic differences, methodological issues and the confounding relationship of MBPS with other features of 24-hour BP profile, such as nocturnal dipping or BP variability. Several studies are also available dealing with treatment effects on MBPS and indicating the importance of long-acting antihypertensive drugs in this regard. This paper provides an overview of pathophysiologic, methodological, prognostic and therapeutic aspects related to MBPS. PMID:29872338
Detecting Biosphere anomalies hotspots
NASA Astrophysics Data System (ADS)
Guanche-Garcia, Yanira; Mahecha, Miguel; Flach, Milan; Denzler, Joachim
2017-04-01
The amount of satellite remote sensing measurements now available allows data-driven methods to be applied to investigate environmental processes. The detection of anomalies or abnormal events is crucial to monitor the Earth system and to analyze their impacts on ecosystems and society. By means of a combination of statistical methods, this study proposes an intuitive and efficient methodology to detect those areas that present hotspots of anomalies, i.e. higher levels of abnormal or extreme events or more severe phases during our historical records. Biosphere variables from a preliminary version of the Earth System Data Cube developed within the CAB-LAB project (http://earthsystemdatacube.net/) have been used in this study. This database comprises several atmosphere and biosphere variables spanning 11 years (2001-2011) with 8-day temporal resolution and 0.25° global spatial resolution. In this study, we have used 10 variables that measure the biosphere. The methodology applied to detect abnormal events follows the intuitive idea that anomalies are assumed to be time steps that are not well represented by a previously estimated statistical model [1]. We combine the use of Autoregressive Moving Average (ARMA) models with a distance metric such as the Mahalanobis distance to detect abnormal events in multiple biosphere variables. In a first step we pre-treat the variables by removing the seasonality and normalizing them locally (μ=0, σ=1). Additionally we have regionalized the area of study into subregions of similar climate conditions, by using the Köppen climate classification. For each climate region and variable we have selected the best ARMA parameters by means of a Bayesian criterion. Then we have obtained the residuals by comparing the fitted models with the original data. To detect the extreme residuals from the 10 variables, we have computed the Mahalanobis distance to the data's mean (Hotelling's T²), which considers the covariance matrix of the joint distribution. The proposed methodology has been applied to different areas around the globe. The results show that the method is able to detect historic events and also provides a useful tool to define sensitive regions. This method and results have been developed within the framework of the project BACI (http://baci-h2020.eu/), which aims to integrate Earth Observation data to monitor the Earth system and to assess the impacts of terrestrial changes. [1] V. Chandola, A. Banerjee and V. Kumar. Anomaly detection: a survey. ACM Computing Surveys (CSUR), vol. 41, no. 3, 2009. [2] P. Mahalanobis. On the generalised distance in statistics. Proceedings of the National Institute of Sciences of India, vol. 2, pp. 49-55, 1936.
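Below is a minimal Python sketch of the residual-based detection idea described above: deseasonalize and normalize each variable, fit an ARMA model, then compute the Mahalanobis distance of the joint residual vector. It uses statsmodels for the ARMA fits on synthetic data; the seasonal cycle, the (p, q) search range and the 99th-percentile threshold are illustrative assumptions, not the CAB-LAB settings.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n, n_vars = 460, 3                       # 8-day steps, 3 toy biosphere variables
t = np.arange(n)
season = np.sin(2 * np.pi * t / 46.0)    # 46 steps of 8 days ~ one year

# synthetic data with anomalies injected around step 300
data = np.column_stack([season + 0.3 * rng.standard_normal(n) for _ in range(n_vars)])
data[300:305] += 1.5

residuals = np.empty_like(data)
for j in range(n_vars):
    x = data[:, j] - season                       # remove the (known) seasonal cycle
    x = (x - x.mean()) / x.std()                  # local normalisation
    best = min(((p, q) for p in range(3) for q in range(3)),
               key=lambda pq: ARIMA(x, order=(pq[0], 0, pq[1])).fit().bic)
    residuals[:, j] = ARIMA(x, order=(best[0], 0, best[1])).fit().resid

# Mahalanobis distance of the joint residual vector (Hotelling's T^2)
mu = residuals.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(residuals, rowvar=False))
d2 = np.einsum('ij,jk,ik->i', residuals - mu, cov_inv, residuals - mu)
print("flagged steps:", np.where(d2 > np.quantile(d2, 0.99))[0])
```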
Characterization of Single-Event Burnout in Power MOSFET Using Backside Laser Testing
NASA Astrophysics Data System (ADS)
Miller, F.; Luu, A.; Prud'homme, F.; Poirot, P.; Gaillard, R.; Buard, N.; Carrire, T.
2006-12-01
This paper presents a new methodology based upon backside laser irradiations to characterize the sensitivity of power devices towards Single-Event Burnout. It is shown that this technique can be used to define the safe operating area
Lenton-Brym, Ariella; Kurczek, Jake; Rosenbaum, R Shayna; Sheldon, Signy
2016-05-01
Constructing autobiographical events involves an initial phase of event selection, in which a memory or imagined future event is initially brought to mind, followed by a phase of elaboration, in which an individual accesses detailed knowledge specific to the event. While considerable research demonstrates the importance of the medial temporal lobes (MTL) in the later phase, its role in initial event selection is unknown. The present study is the first to investigate the role of the MTL in event selection by assessing whether individuals with MTL lesions select qualitatively different events for remembering and imagining than matched control participants. To do so, we created "event captions" that reflected the type of events selected for an autobiographical event narrative task by four individuals with MTL amnesia and control counterparts. Over 450 online raters assessed these event captions on qualitative dimensions known to vary with autobiographical recall (frequency, significance, emotionality, imageability, and uniqueness). Our critical finding was that individuals with MTL amnesia were more prone to select events that were rated as more frequently occurring than healthy control participants. We interpret this finding as evidence that people with impaired episodic memory from MTL damage compensate for their compromised ability to recall detailed information by relying more heavily on semantic memory processes to select generalized events. We discuss the implications for theoretical models of memory and methodological approaches to studying autobiographical memory. Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Saddhono, Kundharu; Rohmadi, Muhammad
2014-01-01
This study aims at describing the use of language in primary school grades 1, 2, and 3 in Surakarta. The study is descriptive qualitative research. It emphasizes field notes that depict real situations to support data presentation. Content analysis is used as the research methodology and is applied to the results of the observed speech events.…
A systematic review of health effects of electronic cigarettes.
Pisinger, Charlotta; Døssing, Martin
2014-12-01
To provide a systematic review of the existing literature on health consequences of vaping electronic cigarettes (ECs). Search in: PubMed, EMBASE and CINAHL. Original publications describing a health-related topic, published before 14 August 2014. PRISMA recommendations were followed. We identified 1101 studies; 271 relevant after screening; 94 eligible. We included 76 studies investigating content of fluid/vapor of ECs, reports on adverse events and human and animal experimental studies. Serious methodological problems were identified. In 34% of the articles the authors had a conflict of interest. Studies found fine/ultrafine particles, harmful metals, carcinogenic tobacco-specific nitrosamines, volatile organic compounds, carcinogenic carbonyls (some in high but most in low/trace concentrations), cytotoxicity and changed gene expression. Of special concern are compounds not found in conventional cigarettes, e.g. propylene glycol. Experimental studies found increased airway resistance after short-term exposure. Reports on short-term adverse events were often flawed by selection bias. Due to many methodological problems, severe conflicts of interest, the relatively few and often small studies, the inconsistencies and contradictions in results, and the lack of long-term follow-up, no firm conclusions can be drawn on the safety of ECs. However, they can hardly be considered harmless. Copyright © 2014. Published by Elsevier Inc.
Regression analysis of mixed recurrent-event and panel-count data with additive rate models.
Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L
2015-03-01
Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.
Synchronization of autonomous objects in discrete event simulation
NASA Technical Reports Server (NTRS)
Rogers, Ralph V.
1990-01-01
Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.
Hernández, Mauricio; Arboleda, Diana; Arce, Stephania; Benavides, Allan; Tejada, Paola Andrea; Ramírez, Sindy Vanessa; Cubides, Ángela
2015-12-07
Dengue is the fastest spreading disease in the world and a permanent threat to global public health. It is a viral illness for which approximately 2.5 million people are at high risk of infection. Given the severity of the disease at national and global levels, new predictive methodologies need to be generated to facilitate decision-making in public health. To characterize cases of dengue reported from 2009 to 2013 in Valle del Cauca department, Colombia, and to establish a methodology to develop endemic channels that can be applied to this event. This was a retrospective descriptive study. Notification forms were used as a secondary database to characterize dengue cases from 2009 to 2013. Two endemic channels were developed, one using running means and the other through exponential smoothing. Dengue in the department of Valle del Cauca showed a positive tendency, indicating that the number of cases had increased in the last five years. An important variation was observed that could be explained by a three-year cycle beginning in the first epidemiological period of the year. The development of the dengue endemic channel for Valle del Cauca illustrates the importance of applying these monitoring methodologies to events of public health interest. As can be seen from the results, there were some years in which the number of cases was very low and others in which the epidemic reached very high levels.
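A small sketch of one common way to build an endemic channel from historical counts per epidemiological period: a geometric-mean central line with confidence bands, smoothed by a running mean. The use of weekly periods, the log transform and the 3-week smoothing window are assumptions for illustration; the paper's exact running-mean and exponential-smoothing formulations are not reproduced here.

```python
import numpy as np

# toy history: weekly dengue counts, 5 years x 52 epidemiological weeks
rng = np.random.default_rng(1)
history = rng.poisson(lam=20 + 10 * np.sin(np.linspace(0, 2 * np.pi, 52)), size=(5, 52))

log_cases = np.log(history + 1)                  # geometric-mean formulation
centre = log_cases.mean(axis=0)
half_width = 1.96 * log_cases.std(axis=0, ddof=1) / np.sqrt(history.shape[0])

channel = {
    "safety":  np.exp(centre - half_width) - 1,  # below: fewer cases than expected
    "central": np.exp(centre) - 1,
    "alert":   np.exp(centre + half_width) - 1,  # above: possible epidemic activity
}

# smooth each band with a 3-week running mean, as in a running-means channel
kernel = np.ones(3) / 3
smoothed = {k: np.convolve(v, kernel, mode="same") for k, v in channel.items()}
print({k: np.round(v[:5], 1) for k, v in smoothed.items()})
```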
2017-09-01
the SETR entrance criteria of these events. Out of 30 evaluated SETR entrance criteria, 22 map to FAA elements. A case study of a military CDA program, the Presidential Helicopter Replacement Program…
Organizing Chaos: The Tactical Assault Kit Collaborative Mission Planner
2018-12-01
…rallied around the Tactical Assault Kit (TAK) as their mission command tool of choice. Case studies, such as the 2017 Presidential Inauguration Collective Security Event, Operation Flaming Sword 2017, and the counter-ISIS campaign…
NASA Astrophysics Data System (ADS)
Boschetti, Laurie; Provitolo, Damienne; Tric, Emmanuel
2017-04-01
Climate change and major crises have been increasing during the 21st century, and their impacts have led people to realize that they must protect themselves. That is why scientists, practitioners and institutions are increasingly exploring the resilience concept and its methodology (Dauphiné, Provitolo, 2013). Resilience originated in materials physics and has since been developed in different disciplines, such as psychology, ecology, economics and, more recently, geography, specifically natural risk analysis. The downside of this multidisciplinary interest is that resilience has become a polysemous concept, making it difficult for the scientific community to agree on a single definition (Reghezza et al. 2015). Our presentation proposes a resilience analysis model for a territory subject to natural hazard; this methodology is then demonstrated on a specific territory, the French Riviera, more precisely the Alpes-Maritimes. As the natural hazard for our study, we chose a tsunami that could impact the Alpes-Maritimes coast. This choice was made for several reasons: - This hazard is almost absent from existing studies and French official documents, whereas the risk is real in the Mediterranean. Two significant events have occurred in our study area: the first in 1887, following the Ligurian earthquake (Ioualalen et al. 2010); the second in 1979, off Nice airport, produced by a submarine landslide (Migeon, 2011b). These events share a crucial particularity: being near the source, the arrival time is quite short, making any planned evacuation impossible. We can describe them as flash risks. - The study area, containing the coastal cities of the Alpes-Maritimes, concentrates many key human and economic assets. - This region has a specific geography, with a territory developed between sea and mountains, high density along the coast, and anisotropic networks (infrastructure, communication, etc.). Yet we know how essential it is to maintain networks during recovery after disasters (Lhomme et al. 2010). For this purpose, we relied on the resilience assessment method suggested by the Resilience Alliance scientific group (2010), which comes from the human and social sciences. This methodology caught our interest because it takes a systemic approach and allows the temporal dimension of an event (prevention and crisis management) to be included. However, the model showed some limits when we translated it into the field of risks and disasters. In order to create a model fully functional in this domain, we suggest some changes. The resulting methodology not only provides an evaluation grid for a territory's and a population's reactions to an event, but also helps determine preventive strategies (ante-catastrophe) and post-disaster recovery strategies (post-catastrophe).
McMullen, Kathleen M; Boyer, Anthony F; Schoenberg, Noah; Babcock, Hilary M; Micek, Scott T; Kollef, Marin H
2015-06-01
The National Healthcare Safety Network (NHSN) has recently supported efforts to shift surveillance away from ventilator-associated pneumonia to ventilator-associated events (VAEs) to decrease subjectivity in surveillance and minimize concerns over clinical correlation. The goals of this study were to compare the results of an automated surveillance strategy using the new VAE definition with a prospectively performed clinical application of the definition. All patients ventilated for ≥2 days in a medical and surgical intensive care unit were evaluated by 2 methods: retrospective surveillance using an automated algorithm combined with manual chart review after the NHSN's VAE methodology and prospective surveillance by pulmonary physicians in collaboration with the clinical team administering care to the patient at the bedside. Overall, a similar number of events were called by each method (69 vs 67). Of the 1,209 patients, 56 were determined to have VAEs by both methods (κ = .81, P = .04). There were 24 patients considered to be a VAE by only 1 of the methods. Most discrepancies were the result of clinical disagreement with the NHSN's VAE methodology. There was good agreement between the study teams. Awareness of the limitations of the surveillance definition for VAE can help infection prevention personnel in discussions with critical care partners about optimal use of these data. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
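A short example of how the agreement between the two surveillance approaches can be quantified with Cohen's kappa, reconstructing a 2x2 table from the counts reported in the abstract (1,209 patients, 69 and 67 VAEs by the two methods, 56 called by both); the reconstruction itself is an assumption about how the published numbers combine.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# 2x2 table reconstructed from the counts reported in the abstract
both, auto_only, clin_only = 56, 69 - 56, 67 - 56
neither = 1209 - both - auto_only - clin_only

automated = np.array([1] * both + [1] * auto_only + [0] * clin_only + [0] * neither)
clinical  = np.array([1] * both + [0] * auto_only + [1] * clin_only + [0] * neither)

kappa = cohen_kappa_score(automated, clinical)
print(f"observed agreement: {(automated == clinical).mean():.3f}, kappa: {kappa:.2f}")
```

With these counts the script returns a kappa of about 0.81, consistent with the value reported above.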
Toward a Methodology of Death: Deleuze's "Event" as Method for Critical Ethnography
ERIC Educational Resources Information Center
Rodriguez, Sophia
2016-01-01
This article examines how qualitative researchers, specifically ethnographers, might utilize complex philosophical concepts in order to disrupt the normative truth-telling practices embedded in social science research. Drawing on my own research experiences, I move toward a methodology of death (for researcher/researched alike) grounded in…
A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks
Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos
2016-01-01
Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance, and during operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. During runtime, we can check the behavior of the WSN against the results obtained at design time and detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568
Structural health monitoring methodology for aircraft condition-based maintenance
NASA Astrophysics Data System (ADS)
Saniger, Jordi; Reithler, Livier; Guedra-Degeorges, Didier; Takeda, Nobuo; Dupuis, Jean Pierre
2001-06-01
Reducing maintenance costs while keeping a constant level of safety is a major issue for Air Forces and airlines. The long term perspective is to implement condition based maintenance to guarantee a constant safety level while decreasing maintenance costs. For this purpose, the development of a generalized Structural Health Monitoring System (SHMS) is needed. The objective of such a system is to localize the damages and to assess their severity, with enough accuracy to allow low cost corrective actions. The present paper describes a SHMS based on acoustic emission technology. This choice was driven by its reliability and wide use in the aerospace industry. The described SHMS uses a new learning methodology which relies on the generation of artificial acoustic emission events on the structure and an acoustic emission sensor network. The calibrated acoustic emission events picked up by the sensors constitute the knowledge set that the system relies on. With this methodology, the anisotropy of composite structures is taken into account, thus avoiding the major cause of errors of classical localization methods. Moreover, it is adaptive to different structures as it does not rely on any particular model but on measured data. The acquired data is processed and the event's location and corrected amplitude are computed. The methodology has been demonstrated, and experimental tests on elementary samples showed an accuracy of 1 cm.
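A minimal sketch of the learning idea described above, assuming amplitude-based signatures: calibrated artificial events at known positions form a knowledge set, and a new event is located by nearest-neighbour matching of its sensor signature. The sensor layout, the amplitude-decay model and the calibration grid spacing are illustrative assumptions, not the system described in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
sensors = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])  # positions in m

def signature(source_xy, noise=0.0):
    """Toy sensor signature: amplitudes decaying with distance to each sensor.
    A real system would use measured AE features (amplitudes, arrival times)."""
    d = np.linalg.norm(sensors - source_xy, axis=1)
    return 1.0 / (1.0 + d**2) + noise * rng.standard_normal(len(sensors))

# knowledge set: artificial AE events generated on a calibration grid
grid = np.array([[x, y] for x in np.linspace(0, 1, 21) for y in np.linspace(0, 1, 21)])
knowledge = np.array([signature(p) for p in grid])

def locate(measured):
    """Nearest-neighbour match of the measured signature against the knowledge set."""
    return grid[np.argmin(np.linalg.norm(knowledge - measured, axis=1))]

true_source = np.array([0.37, 0.62])
estimate = locate(signature(true_source, noise=0.01))
print("estimated source:", estimate, "error [m]:", np.linalg.norm(estimate - true_source))
```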
Impact of the FTSE4Good Index on firm price: an event study.
Martin Curran, M; Moran, Dominic
2007-03-01
This paper examines whether corporate financial performance is affected by public endorsement of environmental and social performance. Event study methodology, which relies on the notion of market efficiency, is used to examine the relationship between positive and negative announcements and changes in share prices or daily returns. Inclusion in and deletion from the FTSE4Good UK Index is used as a proxy measure for good (poor) corporate social responsibility. The abnormal or unexpected daily returns associated with an event are calculated and their significance tested. The results show a trend towards positive and negative announcements having the expected effects on daily returns. But these movements are not significant and the data do not suggest that a firm's presence on the index brings it any significant financial return for signalling its corporate social responsibility.
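A minimal sketch of the standard market-model event study on synthetic returns: estimate alpha and beta over an estimation window, compute abnormal returns over the event window, and test the cumulative abnormal return. Window lengths, the injected announcement effect and the simple t-statistic are illustrative assumptions, not details of the FTSE4Good study.

```python
import numpy as np

rng = np.random.default_rng(3)
n_est, n_event = 120, 11                     # estimation window, event window (days)
market = 0.0005 + 0.01 * rng.standard_normal(n_est + n_event)
firm = 0.0002 + 0.9 * market + 0.012 * rng.standard_normal(n_est + n_event)
firm[n_est + 5] += 0.03                      # illustrative announcement effect

# 1. market model fitted on the estimation window: R_firm = a + b * R_mkt + e
b, a = np.polyfit(market[:n_est], firm[:n_est], 1)
resid_sd = np.std(firm[:n_est] - (a + b * market[:n_est]), ddof=2)

# 2. abnormal returns over the event window
ar = firm[n_est:] - (a + b * market[n_est:])
car = ar.sum()

# 3. simple test statistic for the cumulative abnormal return
t_stat = car / (resid_sd * np.sqrt(n_event))
print(f"CAR = {car:.4f}, t = {t_stat:.2f}")
```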
Use of a trigger tool to detect adverse drug reactions in an emergency department.
de Almeida, Silvana Maria; Romualdo, Aruana; de Abreu Ferraresi, Andressa; Zelezoglo, Giovana Roberta; Marra, Alexandre R; Edmond, Michael B
2017-11-15
Although there are systems for reporting adverse drug reactions (ADR), these safety events remain under-reported. The low-cost, low-tech trigger tool method is based on the detection of events through clues, and it seems to increase the detection of adverse events compared to traditional methodologies. This study seeks to estimate the prevalence of adverse reactions to drugs in patients seeking care in the emergency department. Retrospective study from January to December, 2014, applying the Institute for Healthcare Improvement (IHI) trigger tool methodology for patients treated at the emergency room of a tertiary care hospital. The estimated prevalence of adverse reactions in patients presenting to the emergency department was 2.3% [95% CI 1.3% to 3.3%]; 28.6% of cases required hospitalization at an average cost of US$ 5698.44. The most common triggers were hydrocortisone (57% of the cases), diphenhydramine (14%) and fexofenadine (14%). Anti-infectives (19%), cardiovascular agents (14%), and musculoskeletal drugs (14%) were the most common causes of adverse reactions. According to the Naranjo Scale, 71% were classified as possible and 29% as probable. There was no association between adverse reactions and age or sex in the present study. Use of the trigger tool to identify adverse reactions in the emergency department made it possible to identify a prevalence of 2.3%. It proved to be a viable method that can provide a better understanding of adverse drug reactions in this patient population.
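A small sketch of how a prevalence estimate with a normal-approximation (Wald) confidence interval is computed; the event and chart counts below are assumed, chosen only so that the output roughly matches the interval reported in the abstract.

```python
import math

def prevalence_ci(events, n, z=1.96):
    """Point prevalence with a normal-approximation (Wald) 95% CI."""
    p = events / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# illustrative numbers only: ~20 ADRs in ~860 reviewed visits give roughly
# the 2.3% [1.3%-3.3%] interval reported in the abstract
p, lo, hi = prevalence_ci(events=20, n=860)
print(f"prevalence = {100*p:.1f}% [95% CI {100*lo:.1f}%-{100*hi:.1f}%]")
```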
Assessment of precursory information in seismo-electromagnetic phenomena
NASA Astrophysics Data System (ADS)
Han, P.; Hattori, K.; Zhuang, J.
2017-12-01
Previous statistical studies showed that there were correlations between seismo-electromagnetic phenomena and sizeable earthquakes in Japan. In this study, utilizing Molchan's error diagram, we evaluate whether these phenomena contain precursory information and discuss how they can be used in short-term forecasting of large earthquake events. In practice, for a given series of precursory signals and related earthquake events, each prediction strategy is characterized by the leading time of alarms, the length of the alarm window, the alarm radius (area) and magnitude. The leading time is the time length between a detected anomaly and its following alarm, and the alarm window is the duration that an alarm lasts. The alarm radius and magnitude are the maximum predictable distance and the minimum predictable magnitude of earthquake events, respectively. We introduce the modified probability gain (PG') and the probability difference (D') to quantify the forecasting performance and to explore the optimal prediction parameters for a given electromagnetic observation. The above methodology is first applied to ULF magnetic data and GPS-TEC data. The results show that the earthquake predictions based on electromagnetic anomalies are significantly better than random guesses, indicating the data contain potentially useful precursory information. Meanwhile, we reveal the optimal prediction parameters for both observations. The methodology proposed in this study could also be applied to other pre-earthquake phenomena to find out whether there is precursory information, and then, on this basis, explore the optimal alarm parameters for practical short-term forecasting.
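A toy sketch of the single-point Molchan evaluation underlying this kind of assessment: alarms are declared after each anomaly, and the fraction of time on alarm, the miss rate and the classical probability gain are computed. The alarm lead time, window length and the toy catalogues are assumptions; the paper's modified gain PG' and difference D' are not reproduced here.

```python
import numpy as np

def molchan_point(anomaly_times, event_times, total_days, lead=1.0, window=30.0):
    """Declare an alarm of `window` days starting `lead` days after each anomaly;
    return (fraction of time on alarm, miss rate, probability gain)."""
    alarms = [(t + lead, t + lead + window) for t in anomaly_times]
    merged = []                                   # merge overlapping alarm intervals
    for start, end in sorted(alarms):
        if merged and start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    tau = sum(e - s for s, e in merged) / total_days
    hits = sum(any(s <= t <= e for s, e in merged) for t in event_times)
    nu = 1.0 - hits / len(event_times)
    gain = (1.0 - nu) / tau if tau > 0 else float("nan")
    return tau, nu, gain

anomalies = [10, 95, 200, 310, 390]        # days with detected EM anomalies
events = [25, 120, 330, 600]               # days of target earthquakes
print(molchan_point(anomalies, events, total_days=730))
```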
NASA Technical Reports Server (NTRS)
Mehr, Ali Farhang; Sauvageon, Julien; Agogino, Alice M.; Tumer, Irem Y.
2006-01-01
Recent advances in micro electromechanical systems technology, digital electronics, and wireless communications have enabled development of low-cost, low-power, multifunctional miniature smart sensors. These sensors can be deployed throughout a region in an aerospace vehicle to build a network for measurement, detection and surveillance applications. Event detection using such centralized sensor networks is often regarded as one of the most promising health management technologies in aerospace applications where timely detection of local anomalies has a great impact on the safety of the mission. In this paper, we propose to conduct a qualitative comparison of several local event detection algorithms for centralized redundant sensor networks. The algorithms are compared with respect to their ability to locate and evaluate an event in the presence of noise and sensor failures for various node geometries and densities.
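As a hedged illustration of the kind of local detection schemes being compared, the sketch below contrasts two textbook strategies for a redundant sensor cluster: decision fusion (threshold at each node, then majority vote) and value fusion (average readings, then threshold), in the presence of noise and one failed sensor. The thresholds, noise level and stuck-at-zero failure are illustrative assumptions, not the algorithms evaluated in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n_nodes, n_trials, threshold = 7, 2000, 0.5
event_present = rng.random(n_trials) < 0.5
signal = np.where(event_present, 1.0, 0.0)

readings = signal[:, None] + 0.6 * rng.standard_normal((n_trials, n_nodes))
readings[:, 0] = 0.0                         # one failed (stuck-at-zero) sensor

# decision fusion: local threshold at each node, then majority vote
votes = (readings > threshold).sum(axis=1)
decision_fusion = votes > n_nodes // 2

# value fusion: average the raw readings, then threshold once
value_fusion = readings.mean(axis=1) > threshold

for name, det in [("decision fusion", decision_fusion), ("value fusion", value_fusion)]:
    print(f"{name}: accuracy = {(det == event_present).mean():.3f}")
```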
Spatial pattern recognition of seismic events in South West Colombia
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber
2013-09-01
Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data of the South West Colombian seismic database. In this research, clustering tendency analysis validates whether seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to centroid initial positions, we proposed a methodology to initialize centroids that generates stable partitions with respect to centroid initialization. As a result of this work, a public software tool provides the user with the routines developed for clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South-West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretations of seismic activities in South West Colombia.
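A minimal fuzzy c-means sketch on synthetic epicentre coordinates, run from several random initializations to illustrate the partition-stability concern the authors address; this is a generic implementation with assumed parameters (c = 3, m = 2), not the initialization methodology or the public software tool developed in the paper.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means; returns centroids and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))          # random initial memberships
    for _ in range(iters):
        W = U ** m
        centroids = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)        # standard FCM membership update
    return centroids, U

# toy epicentres (lon, lat) drawn around three seismogenic zones
rng = np.random.default_rng(5)
zones = [(-77.0, 3.5), (-76.2, 2.3), (-75.8, 4.1)]
X = np.vstack([np.array(z) + 0.15 * rng.standard_normal((60, 2)) for z in zones])

# run from several random initializations to check stability of the partition
for seed in range(3):
    centroids, U = fuzzy_cmeans(X, c=3, seed=seed)
    print(np.round(centroids[np.argsort(centroids[:, 0])], 2))
```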
Deep Borehole Emplacement Mode Hazard Analysis Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevougian, S. David
This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report is concentrated more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]
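A toy sketch of how accident sequence frequencies are typically obtained from an event tree: the initiating event frequency is multiplied by the success or failure probabilities along each branch. The initiating frequency, branch names and probabilities below are purely illustrative placeholders, not values from the DBEMHA report.

```python
from itertools import product

# purely illustrative event tree for a generic emplacement operation
initiating_frequency = 1.0e-2                      # initiating events per campaign
branch_names = ["brake engages", "package intact"]
success_probs = [0.999, 0.99]

for states in product(["success", "failure"], repeat=len(branch_names)):
    prob = 1.0
    for p_success, state in zip(success_probs, states):
        prob *= p_success if state == "success" else 1.0 - p_success
    label = ", ".join(f"{n}={s}" for n, s in zip(branch_names, states))
    print(f"{label:55s} {initiating_frequency * prob:.2e} per campaign")
```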
Sazonov, Edward S; Makeyev, Oleksandr; Schuckers, Stephanie; Lopez-Meyer, Paulo; Melanson, Edward L; Neuman, Michael R
2010-03-01
Our understanding of etiology of obesity and overweight is incomplete due to lack of objective and accurate methods for monitoring of ingestive behavior (MIB) in the free-living population. Our research has shown that frequency of swallowing may serve as a predictor for detecting food intake, differentiating liquids and solids, and estimating ingested mass. This paper proposes and compares two methods of acoustical swallowing detection from sounds contaminated by motion artifacts, speech, and external noise. Methods based on mel-scale Fourier spectrum, wavelet packets, and support vector machines are studied considering the effects of epoch size, level of decomposition, and lagging on classification accuracy. The methodology was tested on a large dataset (64.5 h with a total of 9966 swallows) collected from 20 human subjects with various degrees of adiposity. Average weighted epoch-recognition accuracy for intravisit individual models was 96.8%, which resulted in 84.7% average weighted accuracy in detection of swallowing events. These results suggest high efficiency of the proposed methodology in separation of swallowing sounds from artifacts that originate from respiration, intrinsic speech, head movements, food ingestion, and ambient noise. The recognition accuracy was not related to body mass index, suggesting that the methodology is suitable for obese individuals.
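A hedged sketch of the general pipeline described above (fixed-length epochs, mel-scale spectral features, SVM classification) on synthetic audio, using librosa MFCCs and scikit-learn's SVC. The sampling rate, epoch size, synthetic "swallow" burst and classifier settings are illustrative assumptions; the paper's actual features also include wavelet packets, which are omitted here.

```python
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
sr, epoch_s = 4000, 1.5                      # sample rate and epoch size (illustrative)
n_samples = int(sr * epoch_s)

def synth_epoch(swallow):
    """Synthetic 1.5 s epoch: broadband noise, plus a low-frequency burst if a
    swallow is present. Real data would be epochs from a throat microphone."""
    x = 0.1 * rng.standard_normal(n_samples)
    if swallow:
        t = np.arange(int(0.3 * sr)) / sr
        burst = np.sin(2 * np.pi * 120 * t) * np.hanning(len(t))
        start = rng.integers(0, n_samples - len(burst))
        x[start:start + len(burst)] += burst
    return x

labels = rng.integers(0, 2, size=200)
feats = np.array([librosa.feature.mfcc(y=synth_epoch(l), sr=sr, n_mfcc=13).mean(axis=1)
                  for l in labels])

X_tr, X_te, y_tr, y_te = train_test_split(feats, labels, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10.0).fit(X_tr, y_tr)
print("epoch recognition accuracy:", round(clf.score(X_te, y_te), 3))
```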
Classification of rainfall events for weather forecasting purposes in andean region of Colombia
NASA Astrophysics Data System (ADS)
Suárez Hincapié, Joan Nathalie; Romo Melo, Liliana; Vélez Upegui, Jorge Julian; Chang, Philippe
2016-04-01
This work presents a comparative analysis of the results of applying different methodologies for the identification and classification of rainfall events of different durations in meteorological records of the Colombian Andean region. The study area is the urban and rural territory of Manizales, which has a hydro-meteorological monitoring network. This network is composed of forty-five (45) strategically located automatic weather stations that record seven climate variables: air temperature, relative humidity, wind speed and direction, rainfall, solar radiation and barometric pressure. All this information is sent wirelessly every five (5) minutes to a data warehouse located at the Institute of Environmental Studies-IDEA. Using the rainfall series recorded by the Palogrande hydrometeorological station, operated by the National University of Colombia in Manizales (http://froac.manizales.unal.edu.co/bodegaIdea/), we analyze the behavior of other meteorological variables monitored at surface level that influence the occurrence of such rainfall events. Different methodologies were used to classify rainfall events: the first, following Monjo (2009), calculates the index n of heavy rainfall, through which various types of precipitation are defined according to intensity variability; the second produces a classification in terms of a parameter β introduced by Rice and Holmberg (1973) and adapted by Llasat and Puigcerver (1985, 1997); and the last classifies rainfall according to its intensity following Linsley (1977), whereby rain is considered light, moderate or strong for rates up to 2.5 mm/h, from 2.5 to 7.6 mm/h, and above this value, respectively. The main contribution of this research is to obtain elements to optimize and improve the spatial resolution of results from mesoscale models such as the Weather Research & Forecasting Model (WRF), used in Colombia for weather forecasting purposes and which also produces other tools used in current issues such as risk management.
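A small sketch of the simplest of the three classifications mentioned above, the Linsley-type intensity thresholds (light below 2.5 mm/h, moderate 2.5-7.6 mm/h, heavy above 7.6 mm/h), applied to a toy 5-minute gauge record; the n index and β-parameter classifications are not reproduced here.

```python
import numpy as np

def classify_intensity(i_mm_per_h):
    """Linsley-type classes: light < 2.5, moderate 2.5-7.6, heavy > 7.6 mm/h."""
    if i_mm_per_h <= 2.5:
        return "light"
    return "moderate" if i_mm_per_h <= 7.6 else "heavy"

# toy record: rainfall depth (mm) every 5 minutes over one event
five_min_depth = np.array([0.1, 0.3, 0.8, 1.2, 0.9, 0.5, 0.2, 0.1, 0.0, 0.0, 0.4, 0.6])
mean_intensity = five_min_depth.sum() * (60 / (5 * len(five_min_depth)))  # mm/h

print(f"mean intensity = {mean_intensity:.1f} mm/h ->", classify_intensity(mean_intensity))
```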
Maley, Christine M; Pagana, Nicole K; Velenger, Christa A; Humbert, Tamera Keiter
2016-01-01
This systematic literature review analyzed the construct of spirituality as perceived by people who have experienced or are experiencing a major life event or transition. The researchers investigated studies that used narrative analysis or a phenomenological methodology related to the topic. Thematic analysis resulted in three major themes: (1) avenues to and through spirituality, (2) the experience of spirituality, and (3) the meaning of spirituality. The results provide insights into the intersection of spirituality, meaning, and occupational engagement as understood by people experiencing a major life event or transition and suggest further research that addresses spirituality in occupational therapy and interdisciplinary intervention. Copyright © 2016 by the American Occupational Therapy Association, Inc.
NASA Astrophysics Data System (ADS)
Diffenbaugh, N. S.
2017-12-01
Severe heat provides one of the most direct, acute, and rapidly changing impacts of climate on people and ecosystems. Theory, historical observations, and climate model simulations all suggest that global warming should increase the probability of hot events that fall outside of our historical experience. Given the acute impacts of extreme heat, quantifying the probability of historically unprecedented hot events at different levels of climate forcing is critical for climate adaptation and mitigation decisions. However, in practice that quantification presents a number of methodological challenges. This presentation will review those methodological challenges, including the limitations of the observational record and of climate model fidelity. The presentation will detail a comprehensive approach to addressing these challenges. It will then demonstrate the application of that approach to quantifying uncertainty in the probability of record-setting hot events in the current climate, as well as periods with lower and higher greenhouse gas concentrations than the present.
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named as AS-II), which makes the application of FTA simpler, quicker, and cheaper; thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on the methodology we have developed a computer-automated tool. The details are presented in this paper.
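A minimal fault-tree sketch: basic-event probabilities are combined through AND/OR gates under an independence assumption to give the top-event probability. The gate structure and probabilities are illustrative placeholders for a generic process-industry release scenario, not the AS-II algorithm or any particular plant.

```python
from functools import reduce

def p_and(probs):
    """AND gate: all inputs must fail (independent basic events)."""
    return reduce(lambda a, b: a * b, probs, 1.0)

def p_or(probs):
    """OR gate: at least one input fails (independent basic events)."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# illustrative tree for a release scenario in a process unit:
# TOP = (seal fails OR pipe ruptures) AND (gas detector fails OR operator misses alarm)
leak = p_or([1e-3, 5e-5])              # initiating leak
mitigation_fails = p_or([1e-2, 5e-2])  # detection / response failure
top_event = p_and([leak, mitigation_fails])
print(f"P(top event) = {top_event:.2e} per demand")
```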
Implementation and adaptation of a macro-scale methodology to calculate direct economic losses
NASA Astrophysics Data System (ADS)
Natho, Stephanie; Thieken, Annegret
2017-04-01
As one of the 195 member countries of the United Nations, Germany signed the Sendai Framework for Disaster Risk Reduction 2015-2030 (SFDRR). With this, though voluntary and non-binding, Germany agreed to report on achievements to reduce disaster impacts. Among other targets, the SFDRR aims at reducing direct economic losses in relation to the global gross domestic product by 2030 - but how can this be measured without a standardized approach? The United Nations Office for Disaster Risk Reduction (UNISDR) has hence proposed a methodology to estimate direct economic losses per event and country on the basis of the number of damaged or destroyed items in different sectors. The method is based on experience from developing countries; however, its applicability in industrial countries has not been investigated so far. Therefore, this study presents the first implementation of this approach in Germany, tests its applicability for the costliest natural hazards and suggests adaptations. The approach proposed by UNISDR considers assets in the sectors agriculture, industry, commerce, housing, and infrastructure, the latter through roads and medical and educational facilities. The asset values are estimated on the basis of sector- and event-specific numbers of affected items, sector-specific mean sizes per item, their standardized construction costs per square meter and a loss ratio of 25%. The methodology was tested for the three costliest natural hazard types in Germany, i.e. floods, storms and hail storms, considering 13 case studies on the federal or state scale between 1984 and 2016. A complete calculation of all sectors necessary to describe the total direct economic loss was not possible for any event due to incomplete documentation. Therefore, the method was tested sector-wise. Three new modules were developed to better adapt this methodology to German conditions, covering private transport (cars), forestry and paved roads. Unpaved roads, in contrast, were integrated into the agricultural and forestry sectors. Furthermore, overheads are proposed to include the costs of housing contents as well as the overall costs of public infrastructure, one of the most important damage sectors. All constants for sector-specific mean sizes and construction costs were adapted, and loss ratios were adapted for each event. Whereas the original UNISDR method over- and underestimates the losses of the tested events, the adapted method is able to calculate losses in good accordance with documentation for river floods, hail storms and storms. For example, for the 2013 flood, economic losses of EUR 6.3 billion were calculated (UNISDR EUR 0.85 billion, documentation EUR 11 billion). For the hail storms in 2013, the calculated EUR 3.6 billion overestimates the documented losses of EUR 2.7 billion by less than the original UNISDR approach does (EUR 5.2 billion). Only for flash floods, where public infrastructure can account for more than 90% of total losses, is the method not applicable. The adapted methodology serves as a good starting point for macro-scale loss estimations by accounting for the most important damage sectors. By implementing this approach into damage and event documentation and reporting standards, consistent monitoring according to the SFDRR could be achieved.
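A toy sketch of the per-sector loss calculation described above: number of damaged items times mean size per item times standardized construction cost per square metre times a loss ratio. All numbers below, including the road-module line, are placeholders, not the calibrated German constants developed in the study.

```python
def sector_loss(n_items, mean_size_m2, cost_per_m2_eur, loss_ratio=0.25):
    """Direct economic loss for one sector, UNISDR-style."""
    return n_items * mean_size_m2 * cost_per_m2_eur * loss_ratio

# placeholder inputs for a hypothetical flood event (not calibrated German values)
sectors = {
    "housing":  sector_loss(n_items=12_000, mean_size_m2=110, cost_per_m2_eur=1_400),
    "commerce": sector_loss(n_items=800,    mean_size_m2=400, cost_per_m2_eur=1_100),
    # adapted road module: 250 km of paved road, 6 m wide, placeholder unit cost
    "roads":    250 * 1_000 * 6 * 90 * 0.25,
}

total = sum(sectors.values())
for name, loss in sectors.items():
    print(f"{name:10s} EUR {loss:,.0f}")
print(f"{'total':10s} EUR {total:,.0f}")
```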
ERIC Educational Resources Information Center
Rickard, Andrew
2006-01-01
Event tourism is accompanied by social, economic and environmental benefits and costs. The assessment of this form of tourism has however largely focused on the social and economic perspectives, while environmental assessments have been bound to a destination-based approach. The application of the Ecological Footprint methodology allows for these…
ERIC Educational Resources Information Center
Burton, Emily; Stice, Eric; Seeley, John R.
2004-01-01
The stress-buffering model posits that social support mitigates the relation between negative life events and onset of depression, but prospective studies have provided little support for this assertion. The authors sought to provide a more sensitive test of this model by addressing certain methodological and statistical limitations of past…
Satellite services system analysis study. Volume 2: Satellite and services user model
NASA Technical Reports Server (NTRS)
1981-01-01
Satellite services needs are analyzed. Topics include methodology: a satellite user model; representative servicing scenarios; potential service needs; manned, remote, and automated involvement; and inactive satellites/debris. Satellite and services user model development is considered. Groundrules and assumptions, servicing, events, and sensitivity analysis are included. Selection of references satellites is also discussed.
The Ticking of the Social Clock: Adults' Beliefs about the Timing of Transition Events.
ERIC Educational Resources Information Center
Peterson, Candida C.
1996-01-01
Two studies regarding beliefs about descriptive and prescriptive age norms for adults in developmental transitions were examined in a sample of 214 Australian university students ages 17 to 50. Discusses research methodology. The probable consequences for self-esteem, mental health, and life planning are discussed in the context of the research…
ERIC Educational Resources Information Center
Gaies, Stephen J.
Aims of classroom-centered research on second language learning and teaching are considered and contrasted with the experimental approach. Attention is briefly directed to methodological problems of experiments, such as controlling classroom events in various ways, and to conceptual weaknesses with study variables. In contrast, classroom-centered…
2015-03-12
…the Audiology Clinic, and the Optometry Clinic. Methodology Overview: The overarching research goal is to identify feasible solutions to…
Wu, Cai; Li, Liang
2018-05-15
This paper focuses on quantifying and estimating the predictive accuracy of prognostic models for time-to-event outcomes with competing events. We consider the time-dependent discrimination and calibration metrics, including the receiver operating characteristics curve and the Brier score, in the context of competing risks. To address censoring, we propose a unified nonparametric estimation framework for both discrimination and calibration measures, by weighting the censored subjects with the conditional probability of the event of interest given the observed data. The proposed method can be extended to time-dependent predictive accuracy metrics constructed from a general class of loss functions. We apply the methodology to a data set from the African American Study of Kidney Disease and Hypertension to evaluate the predictive accuracy of a prognostic risk score in predicting end-stage renal disease, accounting for the competing risk of pre-end-stage renal disease death, and evaluate its numerical performance in extensive simulation studies. Copyright © 2018 John Wiley & Sons, Ltd.
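A simplified numpy sketch of the censoring-weighted Brier score for the cause of interest at a fixed horizon: a Kaplan-Meier estimate of the censoring distribution supplies inverse-probability-of-censoring weights, subjects censored before the horizon get weight zero, and the weighted squared error between the observed cause-1 status and a predicted cumulative incidence is averaged. This is a stripped-down illustration on assumed synthetic data (no left-limit of G at event times, no tie handling), not the paper's general loss-function framework or its time-dependent ROC counterpart.

```python
import numpy as np

def km_censoring(times, events):
    """Kaplan-Meier estimate of the censoring survival G(t) (the 'event' here is censoring)."""
    order = np.argsort(times)
    t_sorted, censored = times[order], (events[order] == 0)
    at_risk = len(times) - np.arange(len(times))
    surv = np.cumprod(np.where(censored, 1.0 - 1.0 / at_risk, 1.0))
    return lambda t: np.concatenate(([1.0], surv))[np.searchsorted(t_sorted, t, side="right")]

def ipcw_brier(times, events, pred_cif, horizon):
    """Brier score for cause 1 at `horizon`; events: 0=censored, 1=cause 1, 2=competing."""
    G = km_censoring(times, events)
    g_at_t = np.clip(G(times), 1e-12, None)                      # G(T_i)
    g_at_h = np.clip(G(np.full_like(times, horizon)), 1e-12, None)
    observed = ((times <= horizon) & (events == 1)).astype(float)
    weights = np.where((times <= horizon) & (events > 0), 1.0 / g_at_t,
              np.where(times > horizon, 1.0 / g_at_h, 0.0))
    return np.mean(weights * (observed - pred_cif) ** 2)

rng = np.random.default_rng(7)
n = 500
times = rng.exponential(5.0, n)
events = rng.choice([0, 1, 2], size=n, p=[0.3, 0.4, 0.3])
pred = np.full(n, 0.35)              # constant predicted CIF of cause 1 at the horizon
print(round(ipcw_brier(times, events, pred, horizon=4.0), 3))
```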
Fussell, Elizabeth; Curran, Sara R; Dunbar, Matthew D; Babb, Michael A; Thompson, Luanne; Meijer-Irons, Jacqueline
2017-01-01
Environmental determinists predict that people move away from places experiencing frequent weather hazards, yet some of these areas have rapidly growing populations. This analysis examines the relationship between weather events and population change in all U.S. counties that experienced hurricanes and tropical storms between 1980 and 2012. Our database allows for more generalizable conclusions by accounting for heterogeneity in current and past hurricane events and losses and past population trends. We find that hurricanes and tropical storms affect future population growth only in counties with growing, high-density populations, which are only 2 percent of all counties. In those counties, current year hurricane events and related losses suppress future population growth, although cumulative hurricane-related losses actually elevate population growth. Low-density counties and counties with stable or declining populations experience no effect of these weather events. Our analysis provides a methodologically informed explanation for contradictory findings in prior studies.
NASA Technical Reports Server (NTRS)
Shooman, Martin L.
1994-01-01
This report presents the methodology and results of a subjective study done by Polytechnic University to investigate Electromagnetic Interference (EMI) events on aircraft. The results cover various types of EMI from on-board aircraft systems, passenger carry-on devices, and externally generated disturbances. The focus of the study, however, was on externally generated EMI, termed High Intensity Radiated Fields (HIRF), from radars, radio and television transmitters, and other man-made emitters of electromagnetic energy. The study methodology used an anonymous questionnaire distributed to experts to gather the data. This method is known as the Delphi or Consensus Estimation technique. The questionnaire was sent to an expert population of 230 and there were 57 respondents. Details of the questionnaire, a few anecdotes, and the statistical results of the study are presented.
Media Effects in Youth Exposed to Terrorist Incidents: a Historical Perspective.
Pfefferbaum, Betty; Tucker, Phebe; Pfefferbaum, Rose L; Nelson, Summer D; Nitiéma, Pascal; Newman, Elana
2018-03-05
This paper reviews the evidence on the relationship between contact with media coverage of terrorist incidents and psychological outcomes in children and adolescents while tracing the evolution in research methodology. Studies of recent events in the USA have moved from correlational cross-sectional studies examining primarily television coverage and posttraumatic stress reactions to longitudinal studies that address multiple media forms and a range of psychological outcomes including depression and anxiety. Studies of events in the USA (the 1995 Oklahoma City bombing, the September 11 attacks, and the 2013 Boston Marathon bombing) and elsewhere have used increasingly sophisticated research methods to document a relationship between contact with various media forms and adverse psychological outcomes in children with different event exposures. Although adverse outcomes are associated with reports of greater contact with terrorism coverage in cross-sectional studies, there is insufficient evidence at this time to assume a causal relationship. Additional research is needed to investigate a host of issues such as newer media forms, high-risk populations, and contextual factors.
NASA Astrophysics Data System (ADS)
Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.
2016-04-01
Droughts and related water scarcity can have large impacts on societies and consist of interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to impact these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25km over Europe, are being fed into hydrological models of various complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (10'000s) of Global Climate Model (GCM) simulations, downscaled at 25km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of sequences of spatio-temporally consistent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its meteorological drivers, and how it can be expected to change in the future. Finally, we assess the applicability of this methodology to other regions. [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.
Logic flowgraph methodology - A tool for modeling embedded systems
NASA Technical Reports Server (NTRS)
Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.
1991-01-01
The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
Addressing Angular Single-Event Effects in the Estimation of On-Orbit Error Rates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, David S.; Swift, Gary M.; Wirthlin, Michael J.
2015-12-01
Our study describes complications introduced by angular direct ionization events on space error rate predictions. In particular, prevalence of multiple-cell upsets and a breakdown in the application of effective linear energy transfer in modern-scale devices can skew error rates approximated from currently available estimation models. Moreover, this paper highlights the importance of angular testing and proposes a methodology to extend existing error estimation tools to properly consider angular strikes in modern-scale devices. Finally, these techniques are illustrated with test data provided from a modern 28 nm SRAM-based device.
Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan
2017-01-01
In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user-generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle: in order to solve the puzzle and understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real time and often completely missing sub-events or pieces of the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events or pieces allows us to get closer to solving the event puzzle. PMID:29107976
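A toy sketch of the temporal-burst principle the framework builds on: tweet counts in fixed time bins are compared against a trailing-mean baseline, and bins exceeding it by a few standard deviations are flagged as candidate sub-events. The bin size, window length and z-score threshold are assumptions; this is not the STRIM stream-division or adaptive-crawling algorithm itself.

```python
import numpy as np

def detect_bursts(counts, window=12, z_thresh=3.0):
    """Flag time bins whose tweet volume exceeds a trailing-mean baseline by
    z_thresh standard deviations (a simple temporal burst detector)."""
    bursts = []
    for i in range(window, len(counts)):
        base = counts[i - window:i]
        mu, sd = base.mean(), base.std() + 1e-9
        if (counts[i] - mu) / sd > z_thresh:
            bursts.append(i)
    return bursts

rng = np.random.default_rng(8)
counts = rng.poisson(20, size=200).astype(float)   # 5-minute bins of event-related tweets
counts[80:83] += 90                                # a sub-event (e.g., a goal, an incident)
counts[150] += 60
print("burst bins:", detect_bursts(counts))
```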
2017-01-05
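The STRIM abstract above reports evaluation in terms of event recall and event precision. As a rough illustration of how such metrics can be computed, a minimal Python sketch follows; the keyword-overlap matching rule and the toy data are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of event recall / event precision scoring for sub-event
# detection. The keyword-overlap matching rule and the toy data are
# assumptions for illustration, not the STRIM authors' procedure.

def matches(detected_event, true_event, min_overlap=2):
    """A detected sub-event 'matches' a ground-truth sub-event if they
    share at least min_overlap keywords (illustrative criterion)."""
    return len(set(detected_event) & set(true_event)) >= min_overlap

def event_scores(detected, ground_truth):
    """Event recall: fraction of true sub-events found.
    Event precision: fraction of detected sub-events that are valid."""
    hits = [t for t in ground_truth if any(matches(d, t) for d in detected)]
    valid = [d for d in detected if any(matches(d, t) for t in ground_truth)]
    recall = len(hits) / len(ground_truth) if ground_truth else 0.0
    precision = len(valid) / len(detected) if detected else 0.0
    return recall, precision

# Toy usage: each sub-event is represented by a set of descriptive keywords.
truth = [{"goal", "messi", "90th"}, {"red", "card", "defender"}]
found = [{"goal", "messi", "penalty"}, {"halftime", "show"}]
print(event_scores(found, truth))  # -> (0.5, 0.5)
```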
Subject terms: logistics, attrition, discrete event simulation, Simkit, LBC. ... stochastics, and discrete event model programmed in Java building largely on the Simkit library. The primary purpose of the LBC model is to support ... equations makes them incompatible with the discrete event construct of LBC. Bullard further advances this methodology by developing a stochastic ...
NASA Astrophysics Data System (ADS)
Sperotto, Anna; Torresan, Silvia; Gallina, Valentina; Coppola, Erika; Critto, Andrea; Marcomini, Antonio
2015-04-01
Global climate change is expected to affect the intensity and frequency of extreme events (e.g. heat waves, droughts, heavy precipitation events), leading to increasing natural disasters and damaging events (e.g. storms, pluvial floods and coastal flooding) worldwide. Especially in urban areas, disaster risks can be exacerbated by changes in exposure and vulnerability patterns (i.e. urbanization, population growth) and should be addressed by adopting a multi-disciplinary approach. A Regional Risk Assessment (RRA) methodology integrating climate and environmental sciences with bottom-up participative processes was developed and applied to the urban territory of the municipality of Venice in order to evaluate the potential consequences of climate change on pluvial flood risk in urban areas. Based on the consecutive analysis of hazard, exposure, vulnerability and risk, the RRA methodology is a screening risk tool to identify and prioritize major elements at risk (e.g. residential and commercial areas and infrastructures) and to localize sub-areas that are more likely to be affected by flood risk due to heavy precipitation events in the future scenario (2041-2050). From the early stages of its development and application, the RRA followed a bottom-up approach to select and score site-specific vulnerability factors (e.g. slope, permeability of the soil, past flooded areas) and to consider the requests and perspectives of local stakeholders of the North Adriatic region, by means of interactive workshops, surveys and discussions. The main outputs of the assessment are risk and vulnerability maps and statistics aimed at increasing awareness of the potential effects of climate change on pluvial flood risk and at identifying hot-spot areas where future adaptation actions will be required to decrease physical-environmental vulnerabilities or to build the resilience and coping capacity of human society to climate change. The overall risk assessment methodology and the results of its application to the territory of the municipality of Venice are presented and discussed here.
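The RRA abstract above describes a screening tool built from consecutive hazard, exposure and vulnerability analyses. A minimal sketch of that kind of screening-level combination is given below; the rescaling, the multiplicative aggregation and the toy values are illustrative assumptions, not the scoring functions actually used in the Venice application.

```python
import numpy as np

# Illustrative sketch of a screening-level regional risk ranking in the
# spirit of hazard x exposure x vulnerability aggregation. The linear
# rescaling and the multiplicative combination are assumptions for
# illustration; the actual RRA uses its own site-specific scoring.

def rescale(x):
    """Rescale values to the 0-1 range (returns zeros if constant)."""
    x = np.asarray(x, dtype=float)
    rng = x.max() - x.min()
    return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

def relative_risk(hazard, exposure, vulnerability):
    """Return a 0-1 relative risk score per spatial unit."""
    return rescale(hazard) * rescale(exposure) * rescale(vulnerability)

# Toy example: three sub-areas of an urban territory.
hazard = [30, 80, 55]            # e.g. projected heavy-precipitation index
exposure = [0.9, 0.4, 0.7]       # e.g. share of built-up / residential area
vulnerability = [0.2, 0.8, 0.5]  # e.g. low permeability, past flooding
print(relative_risk(hazard, exposure, vulnerability))
```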
Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yau, M.; Motamed, M.; Guarro, S.
2006-07-01
Current Probabilistic Risk Assessment (PRA) methodology is well established for analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and for accounting for the digital system contribution to the overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov Methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on an earlier ASCA work with the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)
The interaction of antibodies with lipid membranes unraveled by fluorescence methodologies
NASA Astrophysics Data System (ADS)
Figueira, Tiago N.; Veiga, Ana Salomé; Castanho, Miguel A. R. B.
2014-12-01
The interest and investment in antibody therapies has reached an overwhelming scale in the last decade. Yet, little attention has been paid within the scientific community to unraveling important interactions of antibodies with biological structures other than their respective epitopes. Lipid membranes are particularly relevant in this regard, as they set the stage for protein-protein recognition, a concept potentially inclusive of antibody-antigen recognition. Fluorescence techniques allow experimental monitoring of protein partition between aqueous and lipid phases, deciphering events of adsorption, insertion and diffusion. This review focuses on the available fluorescence spectroscopy methodologies directed to the study of antibody-membrane interactions.
Exploring unplanned ICU admissions: a systematic review.
Vlayen, Annemie; Verelst, Sandra; Bekkering, Geertruida E; Schrooten, Ward; Hellings, Johan; Claes, Nerée
Adverse events are unintended patient injuries or complications that arise from healthcare management and result in death, disability or prolonged hospital stay. Adverse events that require critical care are a considerable financial burden to the healthcare system. Medical record review seems to be a reliable method for detecting adverse events. The objectives were to synthesize the best available evidence regarding the estimates of the incidence and preventability of adverse events that necessitate intensive care admission, and to determine the type and consequences (patient harm, mortality, length of ICU stay and direct medical costs) of these adverse events. MEDLINE (from 1966 to present), EMBASE (from 1974 to present) and CENTRAL (version 1-2010) were searched for studies reporting on unplanned admissions to intensive care units (ICUs). Databases of reports, conference proceedings, grey literature, ongoing research, relevant patient safety organizations and two journals were searched for additional studies. Reference lists of retrieved papers were searched and authors were contacted in an attempt to find any further published or unpublished work. Only quantitative studies that used chart review for the detection of adverse events requiring intensive care admission were considered for eligibility. Studies published in English, Dutch, German, French or Spanish were included. Two reviewers independently extracted data and assessed the methodological quality of the included studies. 28 studies in English and one study in French were included. Of these, two were considered duplicate publications, and therefore 27 studies were reviewed. Meta-analysis of the data was not appropriate owing to statistical heterogeneity between studies; therefore, results are presented descriptively. Studies were categorized according to the population and the providers of care. 1) The majority of the included studies investigated unplanned intensive care admissions after anesthetic procedures (UIA). 2) Only a few studies examined patients on general wards at risk of clinical deterioration. The overall incidence of surgical and medical adverse events, relative to ICU admissions, ranged from 1.1% to 37.2%. 3) The third category of studies examined patients who were readmitted to ICUs. ICU readmission rates varied from 0% to 18.3%. Nine studies explicitly reported on the preventability of adverse outcomes. The preventability rates of the adverse events varied from 17% to 76.5%. Preventable adverse events were further synthesized by type of event, and patterns of preventability were formulated. Consequences of the adverse events included a mean length of ICU stay that ranged from 1.5 days to 10.4 days for the patient's first stay in the ICU. Mortality rates varied between 0% and 58%. Adverse events are a persistent and important reason for admission to the ICU. However, there is relatively weak evidence on which to estimate an overall incidence and preventability rate for these events. In addition, estimates of preventability are prone to subjective judgment. Variability in methodology and definitions, and poor reporting in studies, may be the main reasons for study heterogeneity. Unplanned intensive care admission within 24 hours of a procedure with an anesthetist in attendance (UIA) is a recommended clinical indicator in surgical patients. Several authors recommend early detection of patients with clinical instability on general wards and the implementation of rapid response teams. Step-down or intermediate care units could be a useful strategy for patients who require monitoring in order to avoid ICU readmissions. There is a need for further studies on the detection of adverse events. The poor quality of current research evidence and the heterogeneity across studies require that future studies be planned to standardize measures of outcomes and so allow comparisons across studies. This area of research is important in order to identify and explain failures of healthcare systems leading to patient harm, with the ultimate aim of improving the quality and safety of care.
Artani, Azmina; Bhamani, Shireen Shehzad; Azam, Iqbal; AbdulSultan, Moiz; Khoja, Adeel; Kamal, Ayeesha K
2017-05-05
Contextually relevant stressful life events are integral to the quantification of stress, but no such measure has been adapted for the Pakistani population. The RLCQ developed by Richard Rahe measures an individual's stress by recording the experience of life-changing events. We used qualitative methodology to identify contextually relevant stressors in an open-ended format, using serial in-depth interviews until thematic saturation of reported stressful life events was achieved. In the next phase of adaptation, our objective was to scale each item on the questionnaire, so as to weight each of the identified events in terms of severity of stress. This scaling exercise was performed with 200 randomly selected participants residing in four communities of Karachi, namely Kharadar, Dhorajee, Gulshan and Garden. For analysis of the scaled tool, exploratory factor analysis was used to inform its structuring. Finally, to complete the process of adaptation, content and face validity exercises were performed: content validity through subject expert review, and face validity through translation and back-translation of the adapted RLCQ. This yielded our final adapted tool. Stressful life events emerging from the qualitative phase of the study reflect daily life stressors arising from an unstable socio-political environment. Such events included public harassment, robbery/theft, missed life opportunities due to nepotism, extortion and threats, being a victim of state-sponsored brutality, lack of electricity, water, sanitation and fuel, destruction due to natural disasters, and direct or media-based exposure to suicide bombings in the city. Personal or societal stressors included male-child preference, having an unmarried middle-aged daughter, and the lack of empowerment and respect reported by females. The finally adapted RLCQ incorporated "Environmental Stress" as a new category. The processes of qualitative methodology, in-depth interviews, community-based scaling, and face and content validity yielded an adapted RLCQ that represents contextually relevant life stress for adults residing in urban Pakistan. Clinicaltrials.gov NCT02356263. Registered January 28, 2015. (Observational Study Only).
Not the last word: dissemination strategies for patient-centred research in nursing
Hagan, Teresa L.; Schmidt, Karen; Ackison, Guyanna R.; Murphy, Megan; Jones, Jennifer R.
2017-01-01
Introduction Research results hold value for many stakeholders, including researchers, patient populations, advocacy organizations, and community groups. The aim of this study is to describe our research team's systematic process for designing a dissemination strategy for a completed research study. Methodology We organized a dissemination event to share the results of our study with participants and stakeholders and to collect their feedback on the study. We applied the Agency for Healthcare Research and Quality's dissemination framework to guide the development of the event and collected participant feedback during the event. Results We describe our dissemination strategy along with attendees' feedback and suggestions as an example of a way to design patient- and community-focused dissemination. We explain the details of our dissemination strategy, including (a) our process of translating a large research study into a stakeholder event, (b) stakeholder feedback collected at the event, and (c) the translation of that feedback into our research team's work. We also describe challenges encountered during the dissemination process and ways to handle issues such as logistics, funding, and staffing. Conclusions This analysis provides key insights and practical advice for researchers looking for innovative ways to disseminate their findings within the lay and scientific communities. PMID:29081824
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Farooq, Mohammad U.
1986-01-01
The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.
Using Rapid-Response Scenario-Building Methodology for Climate Change Adaptation Planning
NASA Astrophysics Data System (ADS)
Ludwig, K. A.; Stoepler, T. M.; Schuster, R.
2015-12-01
Rapid-response scenario-building methodology can be modified to develop scenarios for slow-onset disasters associated with climate change such as drought. Results of a collaboration between the Department of the Interior (DOI) Strategic Sciences Group (SSG) and the Southwest Colorado Social-Ecological Climate Resilience Project are presented in which SSG scenario-building methods were revised and applied to climate change adaptation planning in Colorado's Gunnison Basin, United States. The SSG provides the DOI with the capacity to rapidly assemble multidisciplinary teams of experts to develop scenarios of the potential environmental, social, and economic cascading consequences of environmental crises, and to analyze these chains to determine actionable intervention points. By design, the SSG responds to acute events of a relatively defined duration. As a capacity-building exercise, the SSG explored how its scenario-building methodology could be applied to outlining the cascading consequences of slow-onset events related to climate change. SSG staff facilitated two workshops to analyze the impacts of drought, wildfire, and insect outbreak in the sagebrush and spruce-fir ecosystems. Participants included local land managers, natural and social scientists, ranchers, and other stakeholders. Key findings were: 1) scenario framing must be adjusted to accommodate the multiple, synergistic components and longer time frames of slow-onset events; 2) the development of slow-onset event scenarios is likely influenced by participants having had more time to consider potential consequences, relative to acute events; 3) participants who are from the affected area may have a more vested interest in the outcome and/or may be able to directly implement interventions.
An archival study of eyewitness memory of the Titanic's final plunge.
Riniolo, Todd C; Koledin, Myriah; Drakulic, Gregory M; Payne, Robin A
2003-01-01
A handful of real-life studies demonstrate that most eyewitnesses accurately recall central details (i.e., the gist of what happened) from traumatic events. The authors evaluated the accuracy of archival eyewitness testimony from survivors of the Titanic disaster who witnessed the ship's final plunge. The results indicate that most eyewitness testimony (15 eyewitnesses of 20) is consistent with forensic evidence that demonstrates that the Titanic was breaking apart while it was still on the ocean's surface. Despite the methodological limitations of archival research, the authors provide evidence from a single-occurrence traumatic event (with a large-scale loss of life) that the majority of eyewitnesses accurately recall central details.
Is pain suffering? A case study.
Black, Helen K
2007-01-01
In this article, the case study of an elderly woman shows how bodily pain and suffering meld in her narrative, not as the subjective and objective sides of the same event, but as distinct experiences in which both constructs emerge separately or come together based on the meaning she imputes to the event. The case study shows the clear methodological fit of qualitative narrative research with the lived experiences of pain and suffering. The narrator recalled the "tremendous" pain she experienced almost 60 years previously as both suffering and not-suffering, depending on the outcome of the circumstances that surrounded her pain. This case shows how a significant aspect of the aging experience, suffering, is medicalized, yet remains resistant to both categorization and medicine.
ERIC Educational Resources Information Center
Azevedo, Roger; Moos, Daniel C.; Johnson, Amy M.; Chauncey, Amber D.
2010-01-01
Self-regulated learning (SRL) with hypermedia environments involves a complex cycle of temporally unfolding cognitive and metacognitive processes that impact students' learning. We present several methodological issues related to treating SRL as an event and strengths and challenges of using online trace methodologies to detect, trace, model, and…
[Application of root cause analysis in healthcare].
Hsu, Tsung-Fu
2007-12-01
The main purpose of this study was to explore various aspects of root cause analysis (RCA), including its definition, underlying rationale, main objective, implementation procedures, most common analysis methodology (fault tree analysis, FTA), and its advantages and methodologic limitations in healthcare. Several adverse events that occurred at a certain hospital were also analyzed by the author using FTA as part of this study. RCA is a process employed to identify the basic and contributing causal factors underlying performance variations associated with adverse events. The underlying rationale of RCA offers a systemic approach to improving patient safety that does not assign blame or liability to individuals. The four-step process involved in conducting an RCA includes: RCA preparation, proximate cause identification, root cause identification, and recommendation generation and implementation. FTA is a logical, structured process that can help identify potential causes of system failure before actual failures occur. Some advantages and significant methodologic limitations of RCA are discussed. Finally, we emphasize that errors stem principally from faults attributable to system design, practice guidelines, work conditions, and other human factors, which can lead health professionals to commit negligence or mistakes in healthcare. We must explore the root causes of medical errors in order to eliminate potential system failure factors. A systemic approach is also needed to resolve medical errors and move beyond a culture centered on assigning fault to individuals. By constructing a genuinely patient-centered safety environment in healthcare, we can encourage clients to accept state-of-the-art healthcare services.
ERIC Educational Resources Information Center
Fallon, Barbara; Trocme, Nico; MacLaurin, Bruce; Sinha, Vandna; Black, Tara
2011-01-01
This paper describes the methodological changes that occurred across cycles of the Canadian Incidence Study of Reported Child Abuse and Neglect (CIS), specifically outlining the rationale for tracking investigations of families with children at risk of maltreatment in the CIS-2008 cycle. This paper also presents analysis of data from the CIS-2008…
Wang, Weihao; Xing, Zhihua
2014-01-01
Objective. Xingnaojing injection (XNJ) is a well-known traditional Chinese patent medicine (TCPM) for stroke. The aim of this study is to assess the efficacy of XNJ for stroke, including ischemic stroke, intracerebral hemorrhage (ICH), and subarachnoid hemorrhage (SAH). Methods. An extensive search was performed using eight databases up to November 2013. Randomized controlled trials (RCTs) on XNJ for the treatment of stroke were collected. Study selection, data extraction, quality assessment, and meta-analysis were conducted according to the Cochrane standards, and RevMan5.0 was used for meta-analysis. Results. This review included 13 RCTs and a total of 1,514 subjects. The overall methodological quality was poor. The meta-analysis showed that XNJ combined with conventional treatment was more effective for total efficacy, neurological deficit improvement, and reduction of TNF-α levels than conventional treatment alone. Three trials reported adverse events; of these, one trial reported mild impairment of kidney and liver function, whereas the other two failed to report specific adverse events. Conclusion. Despite the limitations of this review, we suggest that XNJ in combination with conventional medicines might be beneficial for the treatment of stroke. Currently there are various methodological problems in the studies. Therefore, high-quality, large-scale RCTs are urgently needed. PMID:24707306
Imaging screening of catastrophic neurological events using a software tool: preliminary results.
Fernandes, A P; Gomes, A; Veiga, J; Ermida, D; Vardasca, T
2015-05-01
In Portugal, as in most countries, the most frequent organ donors are brain-dead donors. To answer the increasing need for transplants, donation programs have been implemented, with the goal of recognizing virtually all the possible and potential brain-dead donors admitted to hospitals. The aim of this work was to describe preliminary results of a software application designed to identify victims of devastating neurological injury who may progress to brain death and could be possible organ donors. This was an observational, longitudinal study with retrospective data collection. The software application is an automatic algorithm, based on natural language processing of selected keywords/expressions present in cranio-encephalic computerized tomography (CE CT) scan reports, that identifies catastrophic neurological situations and sends an e-mail notification to the Transplant Coordinator (TC). The first 7 months of this application were analyzed and compared with the standard clinical evaluation methodology. The imaging identification tool showed a sensitivity of 77% and a specificity of 66%; the positive predictive value (PPV) was 0.8 and the negative predictive value (NPV) was 0.7 for the identification of catastrophic neurological events. The methodology proposed in this work seems promising for improving the screening efficiency of critical neurological events. Copyright © 2015 Elsevier Inc. All rights reserved.
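The abstract above describes keyword/expression matching on CT-scan reports and evaluation by sensitivity, specificity, PPV and NPV. A hedged sketch of both steps follows; the keyword list, the example reports and the English-language matching are invented for illustration (the actual tool processes Portuguese CE CT reports).

```python
# Hedged sketch of keyword-based screening of CT-scan reports and of the
# reported screening metrics (sensitivity, specificity, PPV, NPV). The
# keyword list and example reports are invented; the actual tool applies
# natural language processing to Portuguese CE CT reports.

KEYWORDS = ["massive hemorrhage", "diffuse edema", "herniation", "devastating"]

def flags_catastrophic(report: str) -> bool:
    """Flag a report if any trigger expression appears in it."""
    text = report.lower()
    return any(keyword in text for keyword in KEYWORDS)

def screening_metrics(predictions, truth):
    """Confusion-matrix summary of the screening tool versus clinical review."""
    tp = sum(p and t for p, t in zip(predictions, truth))
    tn = sum((not p) and (not t) for p, t in zip(predictions, truth))
    fp = sum(p and (not t) for p, t in zip(predictions, truth))
    fn = sum((not p) and t for p, t in zip(predictions, truth))
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
            "PPV": tp / (tp + fp), "NPV": tn / (tn + fn)}

reports = ["Diffuse edema with uncal herniation.", "No acute abnormality."]
truth = [True, False]   # clinician-confirmed catastrophic neurological event
preds = [flags_catastrophic(r) for r in reports]
print(preds, screening_metrics(preds, truth))
```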
Object permanence in marine mammals using the violation of expectation procedure.
Singer, Rebecca; Henderson, Elizabeth
2015-03-01
Object permanence refers to the ability to process information about objects even when they are not visible. One stage of object permanence, called visible displacement, involves being able to find an object that has been fully hidden from view. Visible displacement has been demonstrated in many animal species, yet very little is known about object permanence in marine mammals. In addition, the methodology for testing visible displacement has sometimes been called into question because alternative explanations could account for subjects' success. The current study investigated visible displacement in Atlantic bottlenose dolphins and California sea lions using a methodology called violation of expectation, in which the animal's fish bucket was placed on a table surrounded on three sides by curtains. A solid screen placed in front of the bucket was then rotated in an arc from front to back. The screen was rotated either 120° (possible event) or 180° (surprising event), appearing as if the bucket disappeared. Both dolphins and sea lions looked significantly longer during the 180°, unexpected, trials than the expected event trials. Results suggest that both dolphins and sea lions pass visible displacement tests without the use of perceptual cues. This article is part of a Special Issue entitled: Tribute to Tom Zentall. Copyright © 2014 Elsevier B.V. All rights reserved.
Kalia, Sumeet; Klar, Neil; Donner, Allan
2016-12-30
Cluster randomized trials (CRTs) involve the random assignment of intact social units rather than independent subjects to intervention groups. Time-to-event outcomes often are endpoints in CRTs. Analyses of such data need to account for the correlation among cluster members. The intracluster correlation coefficient (ICC) is used to assess the similarity among binary and continuous outcomes that belong to the same cluster. However, estimating the ICC in CRTs with time-to-event outcomes is a challenge because of the presence of censored observations. The literature suggests that the ICC may be estimated using either censoring indicators or observed event times. A simulation study explores the effect of administrative censoring on estimating the ICC. Results show that ICC estimators derived from censoring indicators or observed event times are negatively biased. Analytic work further supports these results. Observed event times are preferred for estimating the ICC when the frequency of administrative censoring is minimal. To our knowledge, the existing literature provides no practical guidance on the estimation of the ICC when a substantial amount of administrative censoring is present. The results from this study corroborate the need for further methodological research on estimating the ICC for correlated time-to-event outcomes. Copyright © 2016 John Wiley & Sons, Ltd.
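The abstract above concerns ICC estimation for clustered time-to-event data under administrative censoring. A minimal simulation sketch of the issue is shown below, using a gamma-frailty model to induce within-cluster correlation and the one-way ANOVA ICC estimator; the parameters and the choice of estimator are illustrative assumptions rather than the study's actual simulation design.

```python
import numpy as np

# Hedged sketch: simulate clustered event times with administrative
# censoring and estimate the ICC by the one-way ANOVA estimator, once from
# observed (possibly censored) times and once from censoring indicators.
# All parameters are invented for illustration.

rng = np.random.default_rng(11)
K, M = 50, 20                      # clusters, subjects per cluster
CENSOR_TIME = 1.0                  # administrative censoring at end of study

def anova_icc(values):
    """One-way ANOVA ICC for a (clusters x subjects) array."""
    grand = values.mean()
    msb = M * ((values.mean(axis=1) - grand) ** 2).sum() / (K - 1)
    msw = ((values - values.mean(axis=1, keepdims=True)) ** 2).sum() / (K * (M - 1))
    return (msb - msw) / (msb + (M - 1) * msw)

frailty = rng.gamma(shape=2.0, scale=0.5, size=(K, 1))   # shared cluster effect
times = rng.exponential(1.0 / frailty, size=(K, M))      # true event times
observed = np.minimum(times, CENSOR_TIME)                # censored observation
indicator = (times <= CENSOR_TIME).astype(float)         # event indicator

print("ICC from observed times:      ", round(anova_icc(observed), 3))
print("ICC from censoring indicators:", round(anova_icc(indicator), 3))
```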
Risk assessment techniques with applicability in marine engineering
NASA Astrophysics Data System (ADS)
Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.
2015-11-01
Nowadays risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since in order to manage risk it is necessary first to analyze and evaluate it. There are many definitions of this notion, but in general risk assessment refers to the systematic process of identifying the factors and types of risk and their quantitative assessment; that is, risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are: to understand the purpose of FTA, to understand and apply the rules of Boolean algebra, to analyse a simple system using FTA, and to discuss the advantages and disadvantages of FTA. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and then to evaluate the probability of the top event. The steps of this analysis are: examination of the system from top to down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the cause of the top event. Results: The study yields the critical areas, the fault tree logic diagrams and the probability of the top event. These results can be used for risk assessment analyses.
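Following the FTA description above, here is a minimal sketch of evaluating the probability of a top event from basic events combined through AND/OR gates, assuming independent basic events; the tree structure and the probabilities are invented for illustration.

```python
# Minimal fault tree sketch: basic events with assumed independent failure
# probabilities are combined through AND / OR gates up to the top event.
# The tree and the probabilities are invented for illustration.

def gate_and(*probs):
    """P(all inputs occur) under independence."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(*probs):
    """P(at least one input occurs) = 1 - product of non-occurrence."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Basic events (illustrative annual failure probabilities)
pump_failure = 0.02
valve_stuck = 0.01
alarm_failure = 0.05
operator_misses = 0.10

# Intermediate event: loss of cooling if the pump fails OR the valve sticks
loss_of_cooling = gate_or(pump_failure, valve_stuck)
# Protection fails only if the alarm fails AND the operator misses it
protection_fails = gate_and(alarm_failure, operator_misses)
# Top event requires loss of cooling AND failed protection
top_event = gate_and(loss_of_cooling, protection_fails)
print(f"P(top event) = {top_event:.6f}")
```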
Applied Use of Safety Event Occurrence Control Charts of Harm and Non-Harm Events: A Case Study.
Robinson, Susan N; Neyens, David M; Diller, Thomas
Most hospitals use occurrence reporting systems that facilitate identifying serious events that lead to root cause investigations. Thus, the events catalyze improvement efforts to mitigate patient harm. A serious limitation is that only a few of the occurrences are investigated. A challenge is leveraging the data to generate knowledge. The goal is to present a methodology to supplement these incident assessment efforts. The framework affords an enhanced understanding of patient safety through the use of control charts to monitor non-harm and harm incidents simultaneously. This approach can identify harm and non-harm reporting rates and also can facilitate monitoring occurrence trends. This method also can expedite identifying changes in workflow, processes, or safety culture. Although unable to identify root causes, this approach can identify changes in near real time. This approach also supports evaluating safety or policy interventions that may not be observable in annual safety climate surveys.
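As an illustration of the control-chart monitoring described above, the sketch below builds a simple p-chart with three-sigma limits that could be run separately for harm and non-harm occurrence rates; the monthly counts and the binomial model are illustrative assumptions, not the case-study data.

```python
import math

# Hedged sketch of a p-chart for monthly occurrence reports, run
# separately for harm and non-harm events as the case study suggests.
# The counts below are invented; three-sigma limits assume a binomial model.

def p_chart(event_counts, denominators):
    """Return the centre line and per-period 3-sigma control limits."""
    p_bar = sum(event_counts) / sum(denominators)
    limits = []
    for n in denominators:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma))
    return p_bar, limits

# Example: harm events per month, with patient-days as the denominator.
harm = [4, 6, 5, 16, 5]
patient_days = [1000, 1100, 950, 1000, 1050]
centre, limits = p_chart(harm, patient_days)
for count, n, (lcl, ucl) in zip(harm, patient_days, limits):
    rate = count / n
    flag = "SIGNAL" if not (lcl <= rate <= ucl) else ""
    print(f"rate={rate:.4f} LCL={lcl:.4f} UCL={ucl:.4f} {flag}")
```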
Risk assessment of oil and gas well drilling activities in Iran - a case study: human factors.
Amir-Heidari, Payam; Farahani, Hadi; Ebrahemzadih, Mehrzad
2015-01-01
Oil and gas well drilling activities are associated with numerous hazards which have the potential to cause injury or harm to people, property and the environment. These hazards are also a threat to the reputation of drilling companies. To prevent accidents and undesired events in drilling operations, it is essential to identify, evaluate, assess and control the attendant risks. In this work, a structured methodology is proposed for risk assessment of drilling activities. A case study is performed to identify, analyze and assess the risks arising from human factors at one of the onshore drilling sites in southern Iran. A total of 17 major hazards were identified and analyzed using the proposed methodology. The results showed that the residual risks of 100% of these hazards were in the acceptable or transitional zone, and their levels were expected to be lowered further by proper controls. This structured methodology may also be used in other drilling sites and companies for assessing risks.
Zysberg, Leehu; Kimhi, Shaul; Eshel, Yochanan
2013-01-01
This study examined the role of trust in the national armed and security forces in Israel as a potential protective factor against post-war stress symptoms, alongside other known correlates such as exposure to war events, sense of danger, and demographics. A cluster sample of 870 residents of the town of Kiryat-Shemona in Israel participated in this correlational study. The town was under heavy bombing during the second Lebanon war, and data collection took place about a year after the end of the war. Our results suggest that while sense of danger and exposure to war events are the strongest correlates of stress-related symptoms, trust in the armed forces was negatively correlated with stress even after controlling for demographics, thereby supporting our hypothesis. Theoretical, methodological and practical implications are discussed in light of our findings.
Syed, Hamzah; Jorgensen, Andrea L; Morris, Andrew P
2016-06-01
To evaluate the power to detect associations between SNPs and time-to-event outcomes across a range of pharmacogenomic study designs while comparing alternative regression approaches. Simulations were conducted to compare Cox proportional hazards modeling accounting for censoring and logistic regression modeling of a dichotomized outcome at the end of the study. The Cox proportional hazards model was demonstrated to be more powerful than the logistic regression analysis. The difference in power between the approaches was highly dependent on the rate of censoring. Initial evaluation of single-nucleotide polymorphism association signals using computationally efficient software with dichotomized outcomes provides an effective screening tool for some design scenarios, and thus has important implications for the development of analytical protocols in pharmacogenomic studies.
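A hedged sketch of the kind of simulation comparison described above: censored event times analysed with a Cox proportional hazards model (via the lifelines package) versus logistic regression of the dichotomized end-of-study outcome (via statsmodels). The sample size, allele frequency, hazard ratio, follow-up time and number of replicates are invented parameters, not those of the cited study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

# Hedged simulation sketch: power of Cox PH analysis of censored event
# times versus logistic regression of a dichotomized end-of-study outcome.
# All parameters below are invented for illustration.

rng = np.random.default_rng(1)
N, MAF, HR, FOLLOW_UP, N_SIM = 500, 0.3, 1.5, 2.0, 200

def one_replicate():
    snp = rng.binomial(2, MAF, N)                    # additive genotype 0/1/2
    baseline = 0.3                                   # baseline hazard rate
    times = rng.exponential(1.0 / (baseline * HR ** snp))
    event = times <= FOLLOW_UP                       # administrative censoring
    obs_time = np.minimum(times, FOLLOW_UP)

    # Cox proportional hazards model on (time, event)
    df = pd.DataFrame({"time": obs_time, "event": event.astype(int), "snp": snp})
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    p_cox = cph.summary.loc["snp", "p"]

    # Logistic regression on the dichotomized outcome (event by end of study)
    logit = sm.Logit(event.astype(int), sm.add_constant(snp.astype(float))).fit(disp=0)
    p_logit = logit.pvalues[1]
    return p_cox < 0.05, p_logit < 0.05

results = np.array([one_replicate() for _ in range(N_SIM)])
print("Cox power:     ", results[:, 0].mean())
print("Logistic power:", results[:, 1].mean())
```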
NASA Astrophysics Data System (ADS)
Velasco-Forero, Carlos A.; Sempere-Torres, Daniel; Cassiraga, Eduardo F.; Jaime Gómez-Hernández, J.
2009-07-01
Quantitative estimation of rainfall fields has been a crucial objective since early studies of the hydrological applications of weather radar. Previous studies have suggested that flow estimations are improved when radar and rain gauge data are combined to estimate input rainfall fields. This paper reports new research carried out in this field. Classical approaches for the selection and fitting of a theoretical correlogram (or semivariogram) model (needed to apply geostatistical estimators) are avoided in this study. Instead, a non-parametric technique based on FFT is used to obtain two-dimensional positive-definite correlograms directly from radar observations, dealing with both the natural anisotropy and the temporal variation of the spatial structure of the rainfall in the estimated fields. Because these correlation maps can be obtained automatically at each time step of a given rainfall event, this technique might easily be used in operational (real-time) applications. This paper describes the development of the non-parametric estimator exploiting the advantages of FFT for the automatic computation of correlograms and provides examples of its application in a case study using six rainfall events. This methodology is applied to three different alternatives for incorporating the radar information (as a secondary variable), and a comparison of performances is provided. In particular, their ability to reproduce in the estimated rainfall fields (i) the rain gauge observations (in a cross-validation analysis) and (ii) the spatial patterns of the radar fields is analyzed. Results seem to indicate that the methodology of kriging with external drift [KED], in combination with the technique of automatically computing 2-D spatial correlograms, provides merged rainfall fields in good agreement with rain gauges and with the closest reproduction of the spatial tendencies observed in the radar rainfall fields, when compared with the other alternatives analyzed.
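The core idea above, obtaining a two-dimensional correlogram directly from a gridded field by FFT rather than fitting a semivariogram model, can be sketched in a few lines. The synthetic anisotropic field below stands in for a radar scan, and the periodic-boundary (circular) autocorrelation is a simplification for brevity.

```python
import numpy as np

# Minimal sketch of the non-parametric idea described above: obtain a
# two-dimensional correlogram of a (radar) rainfall field directly via FFT
# (Wiener-Khinchin theorem), with no fitted semivariogram model.
# Periodic boundaries are assumed for brevity.

def fft_correlogram(field):
    """2-D spatial correlogram of a rectangular field via FFT."""
    x = field - field.mean()
    spectrum = np.fft.fft2(x)
    acov = np.real(np.fft.ifft2(spectrum * np.conj(spectrum))) / x.size
    corr = acov / acov[0, 0]            # normalise so that lag (0,0) -> 1
    return np.fft.fftshift(corr)        # put the zero lag at the centre

# Illustrative anisotropic field: smooth noise stretched along one axis.
rng = np.random.default_rng(0)
noise = rng.normal(size=(128, 128))
kernel = np.outer(np.hanning(31), np.hanning(7))
field = np.real(np.fft.ifft2(np.fft.fft2(noise) *
                             np.fft.fft2(kernel, s=noise.shape)))
rho = fft_correlogram(field)
print(rho.shape, rho[64, 64])           # centre (zero-lag) value is 1.0
```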
Riga, Marina; Vozikis, Athanassios; Pollalis, Yannis; Souliotis, Kyriakos
2015-04-01
The economic crisis in Greece makes it necessary to resolve problems concerning both the spiralling cost of, and quality assurance in, the health system. The detection and analysis of patient adverse events and medical errors are considered crucial elements of this course. The implementation of MERIS embodies a mandatory module, which adopts the trigger tool methodology for measuring adverse events and medical errors in an intensive care unit (ICU) environment, and a voluntary one with a web-based public reporting methodology. A pilot implementation of MERIS running in a public hospital identified 35 adverse events, with approximately 12 additional hospital days and an extra healthcare cost of €12,000 per adverse event, or about €312,000 per annum for ICU costs only. At the same time, the voluntary module unveiled 510 reports on adverse events submitted by citizens or patients. MERIS has been evaluated as a comprehensive and effective system; it succeeded in detecting the main factors that cause adverse events and discloses severe omissions of the Greek health system. MERIS may be incorporated and run efficiently nationally, adapted to the needs and peculiarities of each hospital or clinic. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Allen, Gregory; Edmonds, Larry D.; Swift, Gary; Carmichael, Carl; Tseng, Chen Wei; Heldt, Kevin; Anderson, Scott Arlo; Coe, Michael
2010-01-01
We present a test methodology for estimating system error rates of Field Programmable Gate Arrays (FPGAs) mitigated with Triple Modular Redundancy (TMR). The test methodology is founded on a mathematical model, which is also presented. Accelerator data from a 90 nm Xilinx Military/Aerospace grade FPGA are shown to fit the model. Fault injection (FI) results are discussed and related to the test data. Design implementation and the corresponding impact of multiple bit upsets (MBU) are also discussed.
Shifting between Third and First Person Points of View in EFL Narratives
ERIC Educational Resources Information Center
Shokouhi, Hossein; Daram, Mahmood; Sabah, Somayeh
2011-01-01
This article reports on the difference between points of view in narrating a short story. The EFL learners taking part in the control group were required to recount the events from the third person perspective and the subjects in the experimental group from the first person perspective. The methodological frame of the study was based on Koven's…
ERIC Educational Resources Information Center
Carandang, Carlo; Santor, Darcy; Gardner, David M.; Carrey, Normand; Kutcher, Stan
2007-01-01
The underlying proposition for any experimental/therapeutic trial is the uncertainty that the risks of treatment will be outweighed by its benefits. For some therapeutic interventions (e.g., exercise programs, vitamin supplementation), the potential for treatment-emergent adverse events may prima facie be low or negligible, whereas for others…
The development of an inherent safety approach to the prevention of domino accidents.
Cozzani, Valerio; Tugnoli, Alessandro; Salzano, Ernesto
2009-11-01
The severity of industrial accidents in which a domino effect takes place is well known in the chemical and process industry. The application of an inherent safety approach to the prevention of escalation events leading to domino accidents was explored in the present study. Reference primary scenarios were analyzed and escalation vectors were defined. Inherent safety distances were defined and proposed as a metric to express the intensity of the escalation vectors. Simple rules of thumb were presented for a preliminary screening of these distances. Swift reference indices for layout screening with respect to escalation hazard were also defined. Two case studies derived from existing layouts of oil refineries were selected to understand the potential of applying the methodology. The results showed that the approach allows a first comparative assessment of the actual domino hazard in a layout, and the identification of critical primary units with respect to escalation events. The methodology developed also represents a useful screening tool to identify where to dedicate major efforts in the design of add-on measures, optimizing conventional passive and active measures for the prevention of severe domino accidents.
Spatiotemporal Detection of Unusual Human Population Behavior Using Mobile Phone Data
Dobra, Adrian; Williams, Nathalie E.; Eagle, Nathan
2015-01-01
With the aim to contribute to humanitarian response to disasters and violent events, scientists have proposed the development of analytical tools that could identify emergency events in real-time, using mobile phone data. The assumption is that dramatic and discrete changes in behavior, measured with mobile phone data, will indicate extreme events. In this study, we propose an efficient system for spatiotemporal detection of behavioral anomalies from mobile phone data and compare sites with behavioral anomalies to an extensive database of emergency and non-emergency events in Rwanda. Our methodology successfully captures anomalous behavioral patterns associated with a broad range of events, from religious and official holidays to earthquakes, floods, violence against civilians and protests. Our results suggest that human behavioral responses to extreme events are complex and multi-dimensional, including extreme increases and decreases in both calling and movement behaviors. We also find significant temporal and spatial variance in responses to extreme events. Our behavioral anomaly detection system and extensive discussion of results are a significant contribution to the long-term project of creating an effective real-time event detection system with mobile phone data and we discuss the implications of our findings for future research to this end. PMID:25806954
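As a rough illustration of the behavioral-anomaly detection described above, the sketch below flags site-level call volumes that deviate strongly, in either direction, from the history for the same site and hour of day; the data, the z-score rule and the threshold are invented assumptions, not the study's detection system.

```python
import numpy as np

# Illustrative sketch of flagging behavioral anomalies in site-level call
# volumes: compare each observation with the history for the same site and
# hour of day and flag large deviations in either direction, since the
# study finds both extreme increases and decreases. Data are invented, and
# the baseline is computed over all days (including the anomaly) for brevity.

def anomaly_flags(calls, threshold=4.0):
    """calls: array (n_days, 24, n_sites). Returns boolean flags and z-scores."""
    mean = calls.mean(axis=0, keepdims=True)        # typical value per hour/site
    std = calls.std(axis=0, keepdims=True) + 1e-9
    z = (calls - mean) / std
    return np.abs(z) > threshold, z

rng = np.random.default_rng(7)
calls = rng.poisson(lam=50, size=(60, 24, 5)).astype(float)
calls[45, 10, 2] = 220          # inject a sudden calling spike at one site
flags, z = anomaly_flags(calls)
for d, h, s in zip(*np.where(flags)):
    print(f"day={d} hour={h} site={s} z={z[d, h, s]:+.1f}")
```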
[Demonstrating patient safety requires acceptance of a broader scientific palette].
Leistikow, I
2017-01-01
It is high time the medical community recognised that patient-safety research can be assessed using scientific methods other than the traditional medical ones. There is often a fundamental mismatch between the methodology of patient-safety research and the methodology used to assess the quality of this research. One example is research into the reliability and validity of record review as a method for detecting adverse events. This type of research is based on logical positivism, while record review itself is based on social constructivism. Record review does not lead to "one truth": adverse events are not measured directly from the records themselves, but by weighing the probability of certain situations being classifiable as adverse events. Healthcare should welcome the behavioural and social sciences to its scientific palette. Restricting ourselves to the randomised controlled trial paradigm is short-sighted and dangerous; it deprives patients of much-needed improvements in safety.
ERIC Educational Resources Information Center
Torres Valdés, Rosa María; Santa Soriano, Alba; Lorenzo Álvarez, Carolina
2018-01-01
This paper presents the findings of a training programme based on an Action-Research methodology that has been applied in two subjects of Event Organization, Protocol, and Institutional Relations undergraduate and Master's degrees. Through a teaching methodology called "learning by doing," students are encouraged to understand,…
Toward sensor-based context aware systems.
Sakurai, Yoshitaka; Takada, Kouhei; Anisetti, Marco; Bellandi, Valerio; Ceravolo, Paolo; Damiani, Ernesto; Tsuruta, Setsuo
2012-01-01
This paper proposes a methodology for sensor data interpretation that can combine sensor outputs with contexts represented as sets of annotated business rules. Sensor readings are interpreted to generate events labeled with the appropriate type and level of uncertainty. Then, the appropriate context is selected. Reconciliation of different uncertainty types is achieved by a simple technique that moves uncertainty from events to business rules by generating combs of standard Boolean predicates. Finally, context rules are evaluated together with the events to take a decision. The feasibility of our idea is demonstrated via a case study where a context-reasoning engine has been connected to simulated heartbeat sensors using prerecorded experimental data. We use sensor outputs to identify the proper context of operation of a system and trigger decision-making based on context information.
The economic burden of patient safety targets in acute care: a systematic review
Mittmann, Nicole; Koo, Marika; Daneman, Nick; McDonald, Andrew; Baker, Michael; Matlow, Anne; Krahn, Murray; Shojania, Kaveh G; Etchells, Edward
2012-01-01
Background Our objective was to determine the quality of literature in costing of the economic burden of patient safety. Methods We selected 15 types of patient safety targets for our systematic review. We searched the literature published between 2000 and 2010 using the following terms: “costs and cost analysis,” “cost-effectiveness,” “cost,” and “financial management, hospital.” We appraised the methodologic quality of potentially relevant studies using standard economic methods. We recorded results in the original currency, adjusted for inflation, and then converted to 2010 US dollars for comparative purposes (2010 US$1.00 = 2010 €0.76). The quality of each costing study per patient safety target was also evaluated. Results We screened 1948 abstracts, and identified 158 potentially eligible studies, of which only 61 (39%) reported any costing methodology. In these 61 studies, we found wide estimates of the attributable costs of patient safety events ranging from $2830 to $10,074. In general hospital populations, the cost per case of hospital-acquired infection ranged from $2132 to $15,018. Nosocomial bloodstream infection was associated with costs ranging from $2604 to $22,414. Conclusion There are wide variations in the estimates of economic burden due to differences in study methods and methodologic quality. Greater attention to methodologic standards for economic evaluations in patient safety is needed. PMID:23097615
ERIC Educational Resources Information Center
Schnell, Jim
This paper describes the methodology employed to study videotapes of presentations made by President George Bush during the crisis in the Persian Gulf. Analysis of President Bush's language in relation to the events of the Gulf War was undertaken. Videotapes were used because they allowed for analysis of nonverbal communication as well as verbal…
Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies
Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong
2013-01-01
Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear, dose-response, heterogeneity, publication bias, subgroup, and meta-regression analysis were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD events risk (0.91, 0.85 to 0.97) per 0.1 mEq/L (P for nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P for nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD events risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
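For readers unfamiliar with how pooled relative risks such as 0.85 (0.78 to 0.92) are produced, the sketch below shows generic fixed-effect inverse-variance pooling on the log relative-risk scale; the three study results are invented, and the actual meta-analysis used its own, more elaborate, models.

```python
import math

# Hedged sketch of fixed-effect inverse-variance pooling on the log
# relative-risk scale, the generic machinery behind pooled estimates such
# as 0.85 (0.78 to 0.92). The three study results below are invented.

def pool_relative_risks(rrs, lowers, uppers):
    """Pool study RRs given their 95% CIs (fixed-effect, inverse variance)."""
    weights, weighted = [], []
    for rr, lo, hi in zip(rrs, lowers, uppers):
        log_rr = math.log(rr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE from the CI width
        w = 1.0 / se ** 2
        weights.append(w)
        weighted.append(w * log_rr)
    pooled = sum(weighted) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci

print(pool_relative_risks([0.80, 0.90, 0.86],
                          [0.65, 0.78, 0.70],
                          [0.98, 1.04, 1.06]))
```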
[Natural disasters and health: an analysis of the situation in Brazil].
Freitas, Carlos Machado de; Silva, Diego Ricardo Xavier; Sena, Aderita Ricarda Martins de; Silva, Eliane Lima; Sales, Luiz Belino Ferreira; Carvalho, Mauren Lopes de; Mazoto, Maíra Lopes; Barcellos, Christovam; Costa, André Monteiro; Oliveira, Mara Lúcia Carneiro; Corvalán, Carlos
2014-09-01
Natural disasters are still insufficiently studied and understood within the scope of public health in this country, with impacts in the short and long term. The scope of this article is to analyze the relationship between disasters and their impact on health based on disaster data recorded in the country. The methodology involved the systematization of data and information contained in the Brazilian Atlas of Natural Disasters 1991-2010 and directly from the National Department of Civil Defense (NSCD). Disasters were organized into four categories of events (meteorological; hydrological; climatological; geophysical/geological) and for each of the latter, the data for morbidity, mortality and exposure of those affected were examined, revealing different types of impacts. Three categories of disasters stood out: the hydrological events showed higher percentages of mortality, morbidity and exposure; climatological events had higher percentages of incidents and people affected; the geophysical/geological events had a higher average of exposure and deaths per event. Lastly, a more active participation of the health sector in the post-2015 global political agenda is proposed, particularly events related to sustainable development, climate change and disaster risk reduction.
Dretzke, Janine; Ensor, Joie; Bayliss, Sue; Hodgkinson, James; Lordkipanidzé, Marie; Riley, Richard D; Fitzmaurice, David; Moore, David
2014-12-03
Prognostic factors are associated with the risk of future health outcomes in individuals with a particular health condition. The prognostic ability of such factors is increasingly being assessed in both primary research and systematic reviews. Systematic review methodology in this area is continuing to evolve, reflected in variable approaches to key methodological aspects. The aims of this article were (i) to explore and compare the methodology of systematic reviews of prognostic factors undertaken for the same clinical question, (ii) to discuss the implications for review findings, and (iii) to present recommendations on what might be considered 'good practice' approaches. The sample comprised eight systematic reviews addressing the same clinical question, namely whether 'aspirin resistance' (a potential prognostic factor) has prognostic utility relative to future vascular events in patients on aspirin therapy for secondary prevention. A detailed comparison of methods for study identification, study selection, quality assessment, approaches to analysis, and reporting of findings was undertaken and the implications discussed. These were summarised into key considerations that may be transferable to future systematic reviews of prognostic factors. Across systematic reviews addressing the same clinical question, there were considerable differences in the numbers of studies identified and in the overlap between included studies, which could only partially be explained by different study eligibility criteria. Incomplete reporting and differences in terminology within primary studies hampered the study identification and selection process across reviews. Quality assessment was highly variable, and only one systematic review considered a checklist for studies of prognostic questions. There was inconsistency between reviews in the approaches towards analysis, synthesis, addressing heterogeneity and reporting of results. Different methodological approaches may ultimately affect the findings and interpretation of systematic reviews of prognostic research, with implications for clinical decision-making.
Shaw, Joanne M; O'Brien, Jane; Chua, Susan; De Boer, Richard; Dear, Rachel; Murray, Nicholas; Boyle, Fran
2018-01-01
Chemotherapy-induced alopecia is a common and distressing adverse event for patients. Scalp cooling to reduce this alopecia has been available in Europe for more than a decade, but was only recently introduced in Australia. The aim of this study was to qualitatively explore health professionals' perceptions of the barriers and enablers to the implementation of scalp cooling in Australian cancer centres. Using a qualitative methodology, telephone interviews were conducted with 21 health professionals working in a tumour stream where chemotherapy-induced alopecia is an adverse event of treatment. Participants were recruited from five centres in Australia where scalp cooling is currently available and one centre without access to the technology. Four interrelated themes were identified: (1) health professional attitudes, (2) concerns for patient equity, (3) logistical considerations and (4) organisational support. This qualitative study provides the first methodological exploration of Australian health professionals' perceptions of barriers and enablers to scalp cooling uptake. The results highlighted that health professional support drives the introduction of scalp cooling. Integration of the technology requires adjustments to nursing practice to manage the increased time, workload and change in patient flow. Strategies to manage the change in practice, and organisational support for the change in workflow, are essential for successful implementation into routine care.
Valuation effects of health cost containment measures.
Strange, M L; Ezzell, J R
2000-01-01
This study reports the findings of research into the valuation effects of health cost containment activities by publicly traded corporations. The motivation for this study was employers' increasing cost of providing health care insurance to their employees and employers' efforts to contain those costs. A 1990 survey of corporate health benefits indicated that these costs represented 25 percent of employers' net earnings and would rise by the year 2000 if no actions were taken to reduce them. Health cost containment programs implemented by firms should be seen by shareholders as a wealth-maximizing effort and, as such, should be reflected in share price. This study employed standard event study methodology, where the event is a media announcement or report regarding an attempt by a firm to contain the costs of providing health insurance and other health-related benefits to employees. It examined abnormal returns on a number of event days and over a number of event intervals. Of the daily and interval returns that are at least significant at the 10 percent level, virtually all are negative. Cross-sectional analysis shows that the abnormal returns are related negatively to a unionization variable.
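A minimal sketch of the standard event-study mechanics referred to above: fit a market model over an estimation window, then compute abnormal returns and the cumulative abnormal return (CAR) over the event window. The simulated return series, the window lengths and the injected announcement-day reaction are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of standard event-study mechanics: estimate a market model
# over a pre-event window, then compute abnormal returns and the cumulative
# abnormal return (CAR) over the event window. All returns are simulated.

def market_model_car(stock, market, est_slice, event_slice):
    """OLS market model on the estimation window; CAR on the event window."""
    beta, alpha = np.polyfit(market[est_slice], stock[est_slice], 1)
    expected = alpha + beta * market[event_slice]
    abnormal = stock[event_slice] - expected
    return abnormal, abnormal.sum()

rng = np.random.default_rng(3)
market = rng.normal(0.0005, 0.01, 270)
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.012, 270)
stock[255] += 0.03        # simulated price reaction on the announcement day
ar, car = market_model_car(stock, market,
                           est_slice=slice(0, 250),        # estimation window
                           event_slice=slice(254, 257))    # days -1, 0, +1
print("abnormal returns:", np.round(ar, 4), "CAR:", round(car, 4))
```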
Can discrete event simulation be of use in modelling major depression?
Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard
2006-01-01
Background Depression is among the major contributors to worldwide disease burden, and adequate modelling requires a framework designed to depict real-world disease progression, as well as its economic implications, as closely as possible. Objectives In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on the course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. Methods We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results The major drawback of Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model; to do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order, while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitudes towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). Conclusion DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes. PMID:17147790
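A minimal sketch of the discrete event simulation style argued for above, in which each simulated patient carries attributes (age, disease history, treatment status) that modify the time to the next event, rather than moving through fixed Markov cycles; all rates, attributes and event logic are invented for illustration.

```python
import heapq
import random

# Minimal discrete event simulation sketch: each simulated patient carries
# individual attributes that modify the time to the next event. All rates,
# attributes and event logic below are invented for illustration.

random.seed(42)
HORIZON = 10.0          # years of follow-up

def time_to_relapse(patient):
    """Draw a time to relapse whose rate depends on patient attributes."""
    rate = 0.4 * (1.5 if patient["prior_episodes"] > 2 else 1.0) \
               * (0.7 if patient["on_treatment"] else 1.0)
    return random.expovariate(rate)

def simulate(patient):
    events, queue = [], []
    heapq.heappush(queue, (time_to_relapse(patient), "relapse"))
    while queue:
        clock, kind = heapq.heappop(queue)
        if clock > HORIZON:
            break
        events.append((round(clock, 2), kind))
        if kind == "relapse":
            patient["prior_episodes"] += 1          # history alters future risk
            heapq.heappush(queue, (clock + 0.25, "remission"))
        elif kind == "remission":
            heapq.heappush(queue, (clock + time_to_relapse(patient), "relapse"))
    return events

patient = {"age": 42, "prior_episodes": 1, "on_treatment": True}
print(simulate(patient))
```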
NEPP Update of Independent Single Event Upset Field Programmable Gate Array Testing
NASA Technical Reports Server (NTRS)
Berg, Melanie; Label, Kenneth; Campola, Michael; Pellish, Jonathan
2017-01-01
This presentation provides a NASA Electronic Parts and Packaging (NEPP) Program update of independent Single Event Upset (SEU) Field Programmable Gate Array (FPGA) testing including FPGA test guidelines, Microsemi RTG4 heavy-ion results, Xilinx Kintex-UltraScale heavy-ion results, Xilinx UltraScale+ single event effect (SEE) test plans, development of a new methodology for characterizing SEU system response, and NEPP involvement with FPGA security and trust.
Personalized Clinical Diagnosis in Data Bases for Treatment Support in Phthisiology.
Lugovkina, T K; Skornyakov, S N; Golubev, D N; Egorov, E A; Medvinsky, I D
2016-01-01
Decision-making is a key event in clinical practice. Software products that combine clinical decision support models in an electronic database with the fixed decision points of real clinical practice and treatment results are highly relevant instruments for improving phthisiological practice, and may be especially useful in severe cases caused by resistant strains of Mycobacterium tuberculosis. The methodology for gathering and structuring useful information (critical clinical signals for decisions) is described. Additional coding of clinical diagnosis characteristics was implemented to represent individual patient situations numerically. The resulting methodology for systematizing and coding Clinical Events made it possible to improve the clinical decision models and achieve better clinical results.
Modeling Repeatable Events Using Discrete-Time Data: Predicting Marital Dissolution
ERIC Educational Resources Information Center
Teachman, Jay
2011-01-01
I join two methodologies by illustrating the application of multilevel modeling principles to hazard-rate models with an emphasis on procedures for discrete-time data that contain repeatable events. I demonstrate this application using data taken from the 1995 National Survey of Family Growth (NSFG) to ascertain the relationship between multiple…
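The excerpt above is truncated; purely as an illustration of the general technique it names (discrete-time hazard modelling of repeatable events, not Teachman's NSFG analysis), spells can be expanded into person-period records and the hazard fit with a logistic model. The spell data, covariates and formula below are invented.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical spell data: one row per marriage spell, with duration in years
# and whether it ended in dissolution (event=1) or was censored (event=0).
spells = pd.DataFrame({
    "person": [1, 1, 2, 3],
    "spell":  [1, 2, 1, 1],
    "years":  [4, 2, 6, 3],
    "event":  [1, 1, 0, 1],
})

# Expand to person-period records: one row per spell-year at risk.
rows = []
for _, s in spells.iterrows():
    for year in range(1, int(s["years"]) + 1):
        rows.append({"person": s["person"], "prior_spells": s["spell"] - 1,
                     "year": year, "y": int(s["event"] and year == s["years"])})
pp = pd.DataFrame(rows)

# Discrete-time hazard: logit of dissolution in each year at risk,
# with duration and number of prior spells as covariates.
model = smf.logit("y ~ year + prior_spells", data=pp).fit(disp=False)
print(model.params)
```

A multilevel extension would add a person-level random effect so that repeated spells from the same respondent share unobserved risk.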
Single-Event Rapid Word Collection Workshops: Efficient, Effective, Empowering
ERIC Educational Resources Information Center
Boerger, Brenda H.; Stutzman, Verna
2018-01-01
In this paper we describe single-event Rapid Word Collection (RWC) workshop results in 12 languages, and compare these results to fieldwork lexicons collected by other means. We show that this methodology of collecting words by semantic domain by community engagement leads to obtaining more words in less time than conventional collection methods.…
Using Risk Assessment Methodologies to Meet Management Objectives
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2015-01-01
Current decision making involves numerous possible combinations of technology elements, safety and health issues, operational aspects and process considerations to satisfy program goals. Identifying potential risk considerations as part of the management decision making process provides additional tools for making more informed management decisions. Adapting and using risk assessment methodologies can generate new perspectives on various risk and safety concerns that are not immediately apparent. Safety and operational risks can be identified, and final decisions can balance these considerations with cost and schedule risks. Additional assessments can also show likelihood of event occurrence and event consequence to provide a more informed basis for decision making, as well as cost-effective mitigation strategies. Methodologies available to perform Risk Assessments range from qualitative identification of risk potential to detailed assessments where quantitative probabilities are calculated. The methodology used should be based on factors that include: 1) type of industry and industry standards, 2) tasks, tools, and environment, 3) type and availability of data, and 4) industry views and requirements regarding risk & reliability. Risk Assessments are a tool for decision makers to understand potential consequences and be in a position to reduce, mitigate or eliminate costly mistakes or catastrophic failures.
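At the qualitative end of that spectrum, a likelihood-consequence risk matrix is a common scoring device. The sketch below is illustrative only; the category labels, 5x5 scale and thresholds are assumptions, not content from the presentation.

```python
# Qualitative 5x5 risk matrix: score = likelihood rank x consequence rank (scales assumed).
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "frequent": 5}
CONSEQUENCE = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}

def risk_level(likelihood: str, consequence: str) -> str:
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        return "high - mitigate or eliminate before proceeding"
    if score >= 6:
        return "medium - mitigate where cost effective"
    return "low - accept and monitor"

# Example: a likely event with major consequences scores 20 and ranks as high risk.
print(risk_level("likely", "major"))
```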
Measuring Aggregation of Events about a Mass Using Spatial Point Pattern Methods
Smith, Michael O.; Ball, Jackson; Holloway, Benjamin B.; Erdelyi, Ferenc; Szabo, Gabor; Stone, Emily; Graham, Jonathan; Lawrence, J. Josh
2017-01-01
We present a methodology that detects event aggregation about a mass surface using 3-dimensional study regions with a point pattern and a mass present. The Aggregation about a Mass function determines aggregation, randomness, or repulsion of events with respect to the mass surface. Our method closely resembles Ripley’s K function but is modified to discern the pattern about the mass surface. We briefly state the definition and derivation of Ripley’s K function and explain how the Aggregation about a Mass function is different. We develop the novel function according to the definition: the Aggregation about a Mass function times the intensity is the expected number of events within a distance h of a mass. Special consideration of edge effects is taken in order to make the function invariant to the location of the mass within the study region. Significance of aggregation or repulsion is determined using simulation envelopes. A simulation study is performed to inform researchers how the Aggregation about a Mass function performs under different types of aggregation. Finally, we apply the Aggregation about a Mass function to neuroscience as a novel analysis tool by examining the spatial pattern of neurotransmitter release sites as events about a neuron. PMID:29046865
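A rough sketch of the estimation idea follows (not the authors' implementation; the unit-cube study region, spherical mass, uniformly drawn "observed" points and envelope count are simplifying assumptions): count events within distance h of the mass surface, scale by the estimated intensity, and compare against envelopes simulated under complete spatial randomness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed geometry: unit-cube study region with a spherical "mass" of radius 0.1 at its centre.
CENTRE, RADIUS = np.array([0.5, 0.5, 0.5]), 0.1

def dist_to_mass(points):
    """Distance from each point to the mass surface (negative values lie inside the sphere)."""
    return np.linalg.norm(points - CENTRE, axis=1) - RADIUS

def am_hat(points, h):
    """Estimate of the Aggregation-about-a-Mass function at distance h:
    observed events within h of the surface, divided by the estimated intensity."""
    intensity = len(points)                  # events per unit volume in a unit cube
    return np.sum(dist_to_mass(points) <= h) / intensity

# 'Observed' pattern (drawn uniformly here purely for illustration; a real analysis
# would use, e.g., detected neurotransmitter release sites around a neuron).
events = rng.uniform(0, 1, size=(200, 3))

# Pointwise simulation envelope under complete spatial randomness.
h = 0.15
sims = [am_hat(rng.uniform(0, 1, size=(200, 3)), h) for _ in range(99)]
lo, hi = np.percentile(sims, [2.5, 97.5])
print(am_hat(events, h), (lo, hi))
```

Estimates above the envelope indicate aggregation toward the mass surface, estimates below it indicate repulsion; the paper's edge-corrected version additionally removes the dependence on where the mass sits in the study region.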
New Briefing Methodology for the Brazilian Study and Monitoring of Space Weather (Embrace) Program.
NASA Astrophysics Data System (ADS)
Dal Lago, A.; Cecatto, J. R.; Costa, J. E. R.; Da Silva, L. A.; Rockenbach, M.; Braga, C. R.; Mendonca, R. R. S.; Mendes, O., Jr.; Koga, D.; Alves, L. R.; Becker-Guedes, F.; Wrasse, C. M.; Takahashi, H.; Resende, L.; Banik de Padua, M.; De Nardin, C. M.
2016-12-01
The Brazilian Study and Monitoring of Space Weather (Embrace) Program has been conducted by the National Institute for Space Research (INPE, Brazil) since 2008. Among the several activities of the EMBRACE program are weekly briefings, held since 2012, where an evaluation is made of all space weather events that occurred in the past week. At the beginning, an intuitive methodology was used, in which scientists were invited to present reports on their subjects of expertise: solar, interplanetary space, geomagnetism, ionosphere and upper atmosphere. Later on, an additional subject was introduced, with the inclusion of a separate report on the Earth's magnetosphere, with special attention to the dynamics of the Earth's radiation belts. Since late 2015, the EMBRACE program has felt the need for a more efficient methodology, inspired by practices long used in meteorological weather and climate forecasting. In that sense, an attempt was made to develop scales of disturbances. The aim is to represent the level of space weather activity in all reported subjects more rapidly. A substantial effort was put together to produce sound indices, based on the statistical significance of the occurrence of distinct disturbance levels. This methodology has been partially under practical evaluation since early 2016. In this work we present a report on the progress of the new methodology for EMBRACE program briefing meetings.
Traditional Chinese medicine for knee osteoarthritis: An overview of systematic review.
Yang, Min; Jiang, Li; Wang, Qing; Chen, Hao; Xu, Guihua
2017-01-01
Traditional Chinese medicine (TCM) has been accepted as a complementary therapy for knee osteoarthritis. However, the evidence on the efficacy and safety of these interventions remains conflicting and uncertain, and the quality of methodology and evidence in the field was unknown. The aim was to summarize the characteristics and critically evaluate the quality of methodology, as well as the evidence, of systematic reviews (SRs) on TCM for knee osteoarthritis. Five electronic databases were searched from inception to April 2016. The methodological quality of the included studies was assessed with AMSTAR and ROBIS. The quality of the evidence was determined using the GRADE approach. Ten SRs were included. The conclusions suggest that TCM provides potential benefits for patients with knee osteoarthritis, including pain relief, functional improvement, and few adverse events. Limitations of the methodological quality mainly included the lack of an a priori protocol or protocol registration, incomprehensive literature searches, and failure to provide a list of excluded studies. The overall quality of evidence in the SRs was poor, ranging from "very low" to "low," mainly because of the serious risk of bias of the original trials, inconsistencies, and imprecision in the outcomes. TCM generally appears to be effective for knee osteoarthritis treatment. However, the evidence is not robust enough because of the methodological flaws in the SRs. Hence, conclusions drawn from the available SRs should be treated with caution in clinical practice.
Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest
NASA Technical Reports Server (NTRS)
Rohloff, Kurt
2010-01-01
The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of SMEs, automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology - the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data, supporting a deeper understanding of societal behaviors while remaining tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic human identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
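To give a flavour of what such a precursor pattern looks like in practice (the event codes, window length and matching rule below are invented for illustration and are not the methodology described above), a candidate sequence can be checked against a nation-state's coded factor observations:

```python
def pattern_precedes(observations, pattern, outbreak_day, window=30):
    """True if the items of `pattern` occur in order within `window` days before the outbreak.
    `observations` is a list of (day, code) factor observations for one nation-state (toy format)."""
    recent = [code for day, code in sorted(observations)
              if outbreak_day - window <= day < outbreak_day]
    i = 0
    for code in recent:
        if code == pattern[i]:
            i += 1
            if i == len(pattern):
                return True
    return False

# Toy coded observations preceding a riot on day 100.
obs = [(72, "protest"), (80, "curfew"), (91, "price_spike"), (95, "protest")]
print(pattern_precedes(obs, ["curfew", "price_spike"], outbreak_day=100))  # True
```

Patterns that precede outbreaks far more often than they precede quiet periods can then be retained as forecasting rules.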
Moore, Bethany; Bone, Eric A
2017-01-01
The concept of triage in healthcare has been around for centuries and continues to be applied today so that scarce resources are allocated according to need. A business impact analysis (BIA) is a form of triage in that it identifies which processes are most critical, which to address first and how to allocate limited resources. On its own, however, the BIA provides only a roadmap of the impacts and interdependencies of an event. When disaster strikes, organisational decision-makers often face difficult decisions with regard to allocating limited resources between multiple 'mission-critical' functions. Applying the concept of triage to business continuity provides those decision-makers navigating a rapidly evolving and unpredictable event with a path that protects the fundamental priorities of the organisation. A business triage methodology aids decision-makers in times of crisis by providing a simplified framework for decision-making based on objective, evidence-based criteria, which is universally accepted and understood. When disaster strikes, the survival of the organisation depends on critical decision-making and quick actions to stabilise the incident. This paper argues that organisations need to supplement BIA processes with a decision-making triage methodology that can be quickly applied during the chaos of an actual event.
Temperature dynamics of stormwater runoff in Australia and the USA.
Hathaway, J M; Winston, R J; Brown, R A; Hunt, W F; McCarthy, D T
2016-07-15
Thermal pollution of surface waters by urban stormwater runoff is an often overlooked by-product of urbanization. Elevated stream temperatures due to an influx of stormwater runoff can be detrimental to stream biota, in particular for cold water systems. However, few studies have examined temperature trends throughout storm events to determine how these thermal inputs are temporally distributed. In this study, six diverse catchments in two continents are evaluated for thermal dynamics. Summary statistics from the data showed larger catchments have lower maximum runoff temperatures, minimum runoff temperatures, and temperature variability. This reinforces the understanding that subsurface drainage infrastructure in urban catchments acts to moderate runoff temperatures. The catchments were also evaluated for the presence of a thermal first flush using two methodologies. Results showed the lack of a first flush under traditional assessment methodologies across all six catchments, supporting the results from a limited number of studies in literature. However, the time to peak temperature was not always coincident with the time to peak flow, highlighting the variability of thermal load over time. When a new first flush methodology was applied, significant differences in temperature were noted with increasing runoff depth for five of the six sites. This study is the first to identify a runoff temperature first flush, and highlights the need to carefully consider the appropriate methodology for such analyses. Copyright © 2016 Elsevier B.V. All rights reserved.
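The abstract does not reproduce the first-flush formulas, but the traditional test it refers to can be sketched as comparing the cumulative fraction of thermal load delivered with the cumulative fraction of runoff volume; the intra-event series and the 30% cut-off below are illustrative assumptions, not data or criteria from the study.

```python
import numpy as np

# Illustrative intra-event series at 5-min steps: flow (L/s) and runoff temperature (deg C).
flow = np.array([2.0, 8.0, 15.0, 10.0, 5.0, 2.0, 1.0])
temp = np.array([24.0, 26.5, 25.0, 23.5, 22.0, 21.5, 21.0])

volume_frac = np.cumsum(flow) / flow.sum()                 # cumulative runoff fraction
# Thermal load proxied by flow x temperature; a full analysis would use temperature
# above a reference (e.g. the receiving stream temperature).
thermal_frac = np.cumsum(flow * temp) / (flow * temp).sum()

# One common first-flush convention: does the first 30% of runoff volume carry
# substantially more than 30% of the thermal load?
print(np.interp(0.30, volume_frac, thermal_frac))
```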
Analysis of economic and social costs of adverse events associated with blood transfusions in Spain.
Ribed-Sánchez, Borja; González-Gaya, Cristina; Varea-Díaz, Sara; Corbacho-Fabregat, Carlos; Bule-Farto, Isabel; Pérez de-Oteyza, Jaime
To calculate, for the first time, the direct and social costs of transfusion-related adverse events in order to include them in the National Healthcare System's budget, calculation and studies. In Spain more than 1,500 patients yearly are diagnosed with such adverse events. Blood transfusion-related adverse events recorded yearly in Spanish haemovigilance reports were studied retrospectively (2010-2015). The adverse events were coded according to the classification of Diagnosis-Related Groups. The direct healthcare costs were obtained from public information sources. The productivity loss (social cost) associated with adverse events was calculated using the human capital and hedonic salary methodologies. In 2015, 1,588 patients had adverse events that resulted in direct health care costs (4,568,914€) and social costs due to hospitalization (200,724€). Three adverse reactions resulted in patient death (at a social cost of 1,364,805€). In total, the cost of blood transfusion-related adverse events was 6,134,443€ in Spain. For the period 2010-2015: the trends show a reduction in the total amount of transfusions (2 vs. 1.91M€; -4.4%). The number of adverse events increased (822 vs. 1,588; +93%), as well as their related direct healthcare cost (3.22 vs. 4.57M€; +42%) and the social cost of hospitalization (110 vs 200M€; +83%). Mortality costs decreased (2.65 vs. 1.36M€; -48%). This is the first time that the costs of post-transfusion adverse events have been calculated in Spain. These new figures and trends should be taken into consideration in any cost-effectiveness study or trial of new surgical techniques or sanitary policies that influence blood transfusion activities. Copyright © 2018 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.
A Comprehensive Study of Mass Murder Precipitants and Motivations of Offenders.
Taylor, Melanie A
2018-02-01
Much speculation has been made in the media as to the causes of mass murder in the United States, yet little empirical research exists to verify factors leading to violence. Prior research primarily relies on case study methodologies or small data sets, but none have focused on the underlying issues observed in a comprehensive national sample. Data for the current study include 152 mass murders reported through the FBI's Supplementary Homicide Reports and USA Today from 2007 to 2011, which were then matched with media reports for each event. The current study shows that mass murders typically occur following a triggering event, are committed by non-strangers, and are rarely committed by persons with mental illnesses. A more realistic image of these incidents is critical, as misperceptions of offenders and case characteristics can improperly shape public policies.
Fluence-based and microdosimetric event-based methods for radiation protection in space
NASA Technical Reports Server (NTRS)
Curtis, Stanley B.; Meinhold, C. B. (Principal Investigator)
2002-01-01
The National Council on Radiation Protection and Measurements (NCRP) has recently published a report (Report #137) that discusses various aspects of the concepts used in radiation protection and the difficulties in measuring the radiation environment in spacecraft for the estimation of radiation risk to space travelers. Two novel dosimetric methodologies, fluence-based and microdosimetric event-based methods, are discussed and evaluated, along with the more conventional quality factor/LET method. It was concluded that, for the present, there is no compelling reason to switch to a new methodology, but that because of certain drawbacks in the presently used conventional method, these alternative methodologies should be kept in mind. As new data become available and dosimetric techniques become more refined, the question should be revisited, and significant improvement might be realized in the future. In addition, such concepts as equivalent dose and organ dose equivalent are discussed, and various problems regarding the measurement/estimation of these quantities are presented.
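For orientation, the conventional quality factor/LET method referred to here weights the absorbed dose by a LET-dependent quality factor; in the usual notation (a standard formulation, not an equation quoted from the NCRP report):

```latex
H \;=\; \int_{0}^{\infty} Q(L)\, D_L(L)\, \mathrm{d}L
```

where D_L(L) dL is the absorbed dose delivered by particles with linear energy transfer between L and L + dL and Q(L) is the assigned quality factor; the fluence-based and microdosimetric event-based alternatives instead define their weighting on particle fluence or on measured microdosimetric event (lineal energy) spectra.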
Mauricio-Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane L; Sin, Gürkan
2015-05-15
The design of sewer system control is a complex task given the large size of sewer networks, the transient dynamics of the water flow and the stochastic nature of rainfall. This contribution presents a generic methodology for the design of a self-optimising controller in sewer systems. Such a controller aims to keep the system close to optimal performance thanks to an optimal selection of controlled variables. Optimal performance was defined by a two-stage optimisation (stochastic and deterministic) that takes into account both the overflow during the current rain event and the expected overflow given the probability of a future rain event. The methodology is successfully applied to design an optimising control strategy for a subcatchment area in Copenhagen. The results are promising and are expected to contribute to advancing the operation and control of sewer systems. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Casola, J.; Johanson, E.; Groth, P.; Snow, C.; Choate, A.
2012-12-01
Southeastern Pennsylvania Transportation Authority (SEPTA), with support from the Federal Transit Administration, has been investigating its agency's vulnerability to weather-related disruption and damages as a way to inform an overall adaptation strategy for climate variability and change. Exploiting daily rail service records maintained by SEPTA and observations from nearby weather stations, we have developed a methodology for quantifying the sensitivity of SEPTA's Manayunk/Norristown rail line to various weather events (e.g., snow storms, heat waves, heavy rainfall and flooding, tropical storms). For each type of event, sensitivity is equated to the frequency and extent of service disruptions associated with the event, and includes the identification of thresholds beyond which impacts are observed. In addition, we have estimated the monetary costs associated with repair and replacement of infrastructure following these events. Our results have facilitated discussions with SEPTA operational staff, who have outlined the institutional aspects of their preparation and response processes for these weather events. We envision the methodology as being useful for resource and infrastructure managers across the public and private sector, and potentially scalable to smaller or larger operations. There are several advantageous aspects of the method: 1) the quantification of sensitivity, and the coupling of that sensitivity to cost information, provides credible input to SEPTA decision-makers as they establish the priorities and level of investment associated with their adaptation actions for addressing extreme weather; 2) the method provides a conceptual foundation for estimating the magnitude, frequency, and costs of potential future impacts at a local scale, especially with regard to heat waves; 3) the sensitivity information serves as an excellent discussion tool, enabling further research and information gathering about institutional relationships and procedures. These relationships and procedures are critical to the effectiveness of preparation for and responses to extreme weather events, but are often not explicitly documented.
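A schematic of the sensitivity calculation described above (the column names, the 90 °F threshold and the toy values are assumptions for illustration, not SEPTA's data): daily service records are joined to weather observations, and the disruption frequency above a candidate threshold is compared with the baseline frequency.

```python
import pandas as pd

# Hypothetical daily records: a SEPTA-style service log and nearby weather observations.
service = pd.DataFrame({"date": pd.date_range("2011-06-01", periods=5),
                        "disrupted": [0, 1, 1, 0, 1]})
weather = pd.DataFrame({"date": pd.date_range("2011-06-01", periods=5),
                        "tmax_f": [88, 97, 101, 85, 99]})

daily = service.merge(weather, on="date")
hot = daily[daily["tmax_f"] >= 90]          # candidate heat-event threshold (assumed)

# Sensitivity: disruption frequency on days above the threshold vs. the overall baseline.
print(hot["disrupted"].mean(), daily["disrupted"].mean())
```

Repeating the comparison over a range of thresholds is one way to locate the point beyond which impacts begin to appear, and repair-cost records can then be attached to the flagged days.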
The study of insect blood-feeding behaviour. 2. Recording techniques and the use of flow charts.
Smith, J J; Friend, W G
1987-01-01
This paper continues a discussion of approaches and methodologies we have used in our studies of feeding in haematophagous insects. Described are techniques for directly monitoring behaviour: electrical recording of feeding behaviour via resistance changes in the food canal, optical methods for monitoring mouthpart activity, and a computer technique for behavioural event recording. Also described is the use of "flow charts" or "decision diagrams" to model interrelated sequences of behaviours.
A GIS-based time-dependent seismic source modeling of Northern Iran
NASA Astrophysics Data System (ADS)
Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza
2017-01-01
The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear or fault sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used in this study to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 by 400 kilometers around Tehran. Previous research and reports were studied to compile an earthquake/fault catalog that is as complete as possible. All events were transformed to a uniform magnitude scale, and duplicate events and dependent shocks were removed. The completeness and time distribution of the compiled catalog were taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
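The time-independent frequency-magnitude relationships mentioned for the area sources are typically of the Gutenberg-Richter form (standard notation; the fitted coefficients for each source are not quoted in the abstract):

```latex
\log_{10} N(M) \;=\; a - bM
```

where N(M) is the annual number of events with magnitude at or above M, and the constants a and b are fitted per source from the declustered, completeness-checked catalog.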
Anywaine, Zacchaeus; Abaasa, Andrew; Levin, Jonathan; Kasirye, Ronnie; Kamali, Anatoli; Grosskurth, Heiner; Munderi, Paula; Nunn, Andrew
2015-07-01
Cotrimoxazole (CTX) prophylaxis is recommended by the World Health Organisation for HIV-infected persons. However, once HIV-infected patients have commenced ART in resource-limited settings, the benefits of continued CTX prophylaxis are not known. The few studies that investigated the safety of discontinuing CTX prophylaxis in these settings had limitations due to their design. COSTOP is a randomised double-blind placebo-controlled non-inferiority trial among HIV-infected Ugandan adults stabilised on anti-retroviral treatment (ART). Participants with a CD4 count of 250 or more cells/mm³ are randomised to two arms: the intervention arm, in which CTX is discontinued, and the control arm, in which CTX prophylaxis is continued. The study aims to assess whether the intervention regimen is not inferior, with respect to the incidence of pre-defined CTX-preventable events, to the control regimen, and superior with respect to the incidence of haematological adverse events. Studies that have previously evaluated the safety of discontinuing CTX prophylaxis among HIV-infected adults in resource-limited settings have provided moderate to low quality evidence owing in part to methodological limitations. COSTOP is designed and conducted with sufficient rigour to answer this question. The results of the trial will assist in guiding policy recommendations. This paper describes the design and methodological considerations important for the conduct of CTX cessation studies. Copyright © 2015. Published by Elsevier Inc.
Rahman, Md Motiur; Alatawi, Yasser; Cheng, Ning; Qian, Jingjing; Peissig, Peggy L; Berg, Richard L; Page, David C; Hansen, Richard A
2017-12-01
The US Food and Drug Administration Adverse Event Reporting System (FAERS), a post-marketing safety database, can be used to differentiate brand versus generic safety signals. To explore the methods for identifying and analyzing brand versus generic adverse event (AE) reports. Public release FAERS data from January 2004 to March 2015 were analyzed using alendronate and carbamazepine as examples. Reports were classified as brand, generic, and authorized generic (AG). Disproportionality analyses compared reporting odds ratios (RORs) of selected known labeled serious adverse events stratifying by brand, generic, and AG. The homogeneity of these RORs was compared using the Breslow-Day test. The AG versus generic was the primary focus since the AG is identical to brand but marketed as a generic, therefore minimizing generic perception bias. Sensitivity analyses explored how methodological approach influenced results. Based on 17,521 US event reports involving alendronate and 3733 US event reports involving carbamazepine (immediate and extended release), no consistently significant differences were observed across RORs for the AGs versus generics. Similar results were obtained when comparing reporting patterns over all time and just after generic entry. The most restrictive approach for classifying AE reports yielded smaller report counts but similar results. Differentiation of FAERS reports as brand versus generic requires careful attention to risk of product misclassification, but the relative stability of findings across varying assumptions supports the utility of these approaches for potential signal detection.
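A minimal sketch of the disproportionality measure used follows; the 2x2 counts are invented, and the study's stratification by brand, generic and authorized generic and its Breslow-Day homogeneity test are not reproduced here.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR from a 2x2 table: a = target AE reports for the product of interest,
    b = all other AE reports for that product, c = target AE reports for comparator
    products, d = all other AE reports for comparators."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)        # standard error of log(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, (lo, hi)

# Hypothetical counts for one labeled serious adverse event, authorized-generic reports only.
print(reporting_odds_ratio(a=40, b=960, c=300, d=19700))
```

Computing the ROR separately within the brand, generic and authorized-generic strata and testing whether the stratum-specific estimates differ is the essence of the comparison described above.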
Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L
2014-01-01
Lower extremity artery disease (LEAD) is a sign of wide spread atherosclerosis also affecting coronary, cerebral and renal arteries and is associated with increased risk of cardiovascular events. Many economic evaluations have been published for LEAD due to its clinical, social and economic importance. The aim of this systematic review was to assess modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure and methodological quality of published models. Electronic databases MEDLINE and EMBASE were searched until February 2013 via OVID interface. Cochrane database of systematic reviews, Health Technology Assessment database hosted by National Institute for Health research and National Health Services Economic Evaluation Database (NHSEED) were also searched. The methodological quality of the included studies was assessed by using the Philips' checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies; three models compared diagnostic tests and two models compared a combination of diagnostic and therapeutic options for LEAD. Results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on the society.
Methodological considerations for studying social processes.
Patterson, Barbara; Morin, Karen
2012-01-01
To discuss the nature of and considerations in the study of social processes. Social processes include the elements of time, change and human interaction and many phenomena of interest to nurse researchers. Despite the significance of social processes for nursing practice and the labelling in many studies of phenomena as processes, there seems to be an inability to describe processes fully. The paper includes a presentation of two methodological approaches for illuminating the dynamics of social processes: participant observation and prospective time-series designs. Strengths and limitations of the two paradigmatically different approaches are offered. The method an investigator chooses should be considered selectively and appropriately according to the nature of the problem, what is known about the phenomena to be studied, and the investigator's world view and theoretical perspective. The conceptualisation of process can also influence the methodological choice. Capturing a social process in its entirety with either a qualitative or quantitative approach can be a difficult task. The focus of this paper is an initiation and expansion of the dialogue about which methods provide the best insight into social processes. This knowledge may offer opportunities for nurse researchers to design and implement interventions for individuals as they progress through life events.
Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henneke, Dennis W.; Robinson, James
In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review, to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review, to understand the risk at a high level from operating modes other than at-power; and risk insights, to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus-areas for future R&D, and conclusions about the PRISM design.
Emotional reactivity to daily events in youth with anxiety disorders.
Herres, Joanna; Caporino, Nicole E; Cummings, Colleen M; Kendall, Philip C
2018-05-07
Although research supports associations between anxiety and emotional reactivity in adults (Cisler, J. M., Olatunji, B. O., Feldner, M. T., & Forsyth, J. P. (2010). Emotion regulation and the anxiety disorders: an integrative review. Journal of Psychopathology and Behavioral Assessment, 32(1), 68-82.), few studies have examined emotional reactivity in anxious youth (e.g., Carthy et al., 2010; Tan, P. Z., Forbes, E. E., Dahl, R. E., Ryan, N. D., Siegle, G. J., Ladouceur, C. D., & Silk, J. S. (2012). Emotional reactivity and regulation in anxious and nonanxious youth: a cell-phone ecological momentary assessment study. Journal of Child Psychology and Psychiatry, 53(2), 197-206.). Using daily diary methodology, this study examined both negative affect (NA) and positive affect (PA) reactivity to daily events in youth diagnosed with anxiety (N = 68; 60% female; 78% non-Hispanic White; M age = 11.18 years, SD = 3.17). We also examined whether parent-reported emotion regulation would predict emotional reactivity. Participants reported more NA on days they experienced more negative parent and teacher events and less PA on days that they experienced more negative peer events. Additionally, better emotion regulation was associated with less NA reactivity to negative teacher events and to both negative and positive academic events. Interpersonal events have a salient effect on daily affect for anxious youth. Youth anxiety therapists should target emotion regulation associated with negative events involving adults and address barriers to developing and maintaining positive peer relationships.
ERIC Educational Resources Information Center
Taft, Laritza M.
2010-01-01
In its report "To Err is Human", The Institute of Medicine recommended the implementation of internal and external voluntary and mandatory automatic reporting systems to increase detection of adverse events. Knowledge Discovery in Databases (KDD) allows the detection of patterns and trends that would be hidden or less detectable if analyzed by…
ERIC Educational Resources Information Center
Elhai, Jon D.; Engdahl, Ryan M.; Palmieri, Patrick A.; Naifeh, James A.; Schweinle, Amy; Jacobs, Gerard A.
2009-01-01
The authors examined the effects of a methodological manipulation on the Posttraumatic Stress Disorder (PTSD) Checklist's factor structure: specifically, whether respondents were instructed to reference a single worst traumatic event when rating PTSD symptoms. Nonclinical, trauma-exposed participants were randomly assigned to 1 of 2 PTSD…
NASA Astrophysics Data System (ADS)
Pellarin, Thierry; Brocca, Luca; Crow, Wade; Kerr, Yann; Massari, Christian; Román-Cascón, Carlos; Fernández, Diego
2017-04-01
Recent studies have demonstrated the usefulness of soil moisture retrieved from satellite for improving rainfall estimates from satellite-based precipitation products (SBPP). The real-time versions of these products are known to be biased with respect to the precipitation observed at the ground. The information contained in soil moisture can therefore be used to correct the inaccuracy and uncertainty of these products, since the value and behavior of this soil variable preserve the information of a rain event even for several days. In this work, we take advantage of the soil moisture data from the Soil Moisture and Ocean Salinity (SMOS) satellite, which provides information with a quite appropriate temporal and spatial resolution for correcting rainfall events. Specifically, we test and compare the ability of three different methodologies for this aim: 1) SM2RAIN, which directly relates changes in soil moisture to rainfall quantities; 2) the LMAA methodology, which is based on the assimilation of soil moisture in two models of different complexity (see EGU2017-5324 in this same session); 3) the SMART method, based on the assimilation of soil moisture in a simple hydrological model with a different assimilation/modelling technique. The results are tested over 6 years at 10 sites around the world with different features (land surface, rainfall climatology, orography complexity, etc.). These preliminary and promising results are shown here for the first time to the scientific community, as well as the observed limitations of the different methodologies. Specific remarks on the technical configurations, filtering/smoothing of SMOS soil moisture, and re-scaling techniques are also provided from the results of different sensitivity experiments.
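As an illustration of the first, "bottom-up" idea (rainfall inferred from soil moisture increments), a heavily simplified SM2RAIN-style inversion is sketched below; the water-holding capacity Z, the loss term and all parameter values are assumptions, and the operational algorithm includes additional terms and site calibration.

```python
import numpy as np

def sm2rain_simplified(sat, dt_hours=6.0, Z=80.0, a=2.0, b=3.0):
    """Rainfall per interval [mm] ~ Z*(S[t+1]-S[t]) + loss, with loss = a*Sbar**b*dt.
    S is relative saturation (0-1); Z, a, b are assumed here, not calibrated values."""
    sat = np.asarray(sat, dtype=float)
    dS = np.diff(sat)
    loss = a * ((sat[:-1] + sat[1:]) / 2.0) ** b * dt_hours
    return np.clip(Z * dS + loss, 0.0, None)     # negative estimates truncated to zero

# Example with 6-hourly SMOS-like relative saturation values.
print(sm2rain_simplified([0.30, 0.31, 0.45, 0.44, 0.40]))
```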
Luo, Jing; Xu, Hao; Yang, Guoyan; Qiu, Yu; Liu, Jianping; Chen, Keji
2014-08-01
Oral Chinese proprietary medicine (CPM) is commonly used to treat angina pectoris, and many relevant systematic reviews/meta-analyses are available. However, these reviews have not been systematically summarized and evaluated. We conducted an overview of these reviews, and explored their methodological and reporting quality to inform both practice and further research. We included systematic reviews/meta-analyses on oral CPM in treating angina until March 2013 by searching PubMed, Embase, the Cochrane Library and four Chinese databases. We extracted data according to a pre-designed form, and assessed the methodological and reporting characteristics of the reviews in terms of AMSTAR and PRISMA respectively. Most of the data analyses were descriptive. 36 systematic reviews/meta-analyses involving over 82,105 participants with angina reviewing 13 kinds of oral CPM were included. The main outcomes assessed in the reviews were surrogate outcomes (34/36, 94.4%), adverse events (31/36, 86.1%), and symptoms (30/36, 83.3%). Six reviews (6/36, 16.7%) drew definitely positive conclusions, while the others suggested potential benefits in the symptoms, electrocardiogram, and adverse events. The overall methodological and reporting quality of the reviews was limited, with many serious flaws such as the lack of review protocol and incomprehensive literature searches. Though many systematic reviews/meta-analyses on oral CPM for angina suggested potential benefits or definitely positive effects, stakeholders should interpret the findings of these reviews with caution, considering the overall limited methodological and reporting quality. We recommend further studies should be appropriately conducted and systematic reviews reported according to PRISMA standard. Copyright © 2014 Elsevier Ltd. All rights reserved.
Methodology for Collision Risk Assessment of an Airspace Flow Corridor Concept
NASA Astrophysics Data System (ADS)
Zhang, Yimin
This dissertation presents a methodology to estimate the collision risk associated with a future air-transportation concept called the flow corridor. The flow corridor is a Next Generation Air Transportation System (NextGen) concept to reduce congestion and increase throughput in en-route airspace. The flow corridor has the potential to increase throughput by reducing the controller workload required to manage aircraft outside the corridor and by reducing separation of aircraft within the corridor. The analysis in this dissertation is a starting point for the safety analysis required by the Federal Aviation Administration (FAA) to eventually approve and implement the corridor concept. This dissertation develops a hybrid risk analysis methodology that combines Monte Carlo simulation with dynamic event tree analysis. The analysis captures the unique characteristics of the flow corridor concept, including self-separation within the corridor, lane change maneuvers, speed adjustments, and the automated separation assurance system. Monte Carlo simulation is used to model the movement of aircraft in the flow corridor and to identify precursor events that might lead to a collision. Since these precursor events are not rare, standard Monte Carlo simulation can be used to estimate their occurrence rates. Dynamic event trees are then used to model the subsequent series of events that may lead to collision. When two aircraft are on course for a near-mid-air collision (NMAC), the on-board automated separation assurance system provides a series of safety layers to prevent the impending NMAC or collision. Dynamic event trees are used to evaluate the potential failures of these layers in order to estimate the rare-event collision probabilities. The results show that throughput can be increased by reducing separation to 2 nautical miles while maintaining the current level of safety. A sensitivity analysis shows that the parameters most critical to the overall collision probability are the minimum separation, the probability that both flights fail to respond to the traffic collision avoidance system, the probability that an NMAC results in a collision, the failure probability of the automatic dependent surveillance-broadcast (ADS-B) In receiver, and the conflict detection probability.
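Schematically, the hybrid calculation chains a Monte Carlo-estimated precursor rate through the conditional failure probabilities of the safety layers listed in the sensitivity analysis. All numbers and the simplified tree structure below are placeholders, not the dissertation's estimates or its actual dynamic event trees, which branch on timing and aircraft state.

```python
# All values are placeholders for illustration only.
nmac_precursors_per_flight_hour = 1e-4   # precursor conflict rate from Monte Carlo simulation
p_conflict_not_detected = 1e-3           # automated separation assurance misses the conflict
p_adsb_in_failure = 1e-4                 # ADS-B In receiver unavailable
p_tcas_no_response_both = 1e-3           # both flights fail to respond to TCAS
p_collision_given_nmac = 0.1             # an unresolved NMAC actually becomes a collision

# Simplified event-tree chaining: the conflict escapes the automated layers either because
# it is never detected, or because the surveillance and TCAS layers fail after detection.
p_unresolved = (p_conflict_not_detected
                + (1 - p_conflict_not_detected) * p_adsb_in_failure * p_tcas_no_response_both)
collision_rate = nmac_precursors_per_flight_hour * p_unresolved * p_collision_given_nmac
print(f"collisions per flight hour ~ {collision_rate:.2e}")
```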
Sadatsafavi, Mohsen; Xie, Hui; Etminan, Mahyar; Johnson, Kate; FitzGerald, J Mark
2018-01-01
There is minimal evidence on the extent to which the occurrence of a severe acute exacerbation of COPD that results in hospitalization affects the subsequent disease course. Previous studies on this topic did not generate causally-interpretable estimates. Our aim was to use corrected methodology to update previously reported estimates of the associations between previous and future exacerbations in these patients. Using administrative health data in British Columbia, Canada (1997-2012), we constructed a cohort of patients with at least one severe exacerbation, defined as an episode of inpatient care with the main diagnosis of COPD based on international classification of diseases (ICD) codes. We applied a random-effects 'joint frailty' survival model that is particularly developed for the analysis of recurrent events in the presence of competing risk of death and heterogeneity among individuals in their rate of events. Previous severe exacerbations entered the model as dummy-coded time-dependent covariates, and the model was adjusted for several observable patient and disease characteristics. 35,994 individuals (mean age at baseline 73.7, 49.8% female, average follow-up 3.21 years) contributed 34,271 severe exacerbations during follow-up. The first event was associated with a hazard ratio (HR) of 1.75 (95%CI 1.69-1.82) for the risk of future severe exacerbations. This risk decreased to HR = 1.36 (95%CI 1.30-1.42) for the second event and to 1.18 (95%CI 1.12-1.25) for the third event. The first two severe exacerbations that occurred during follow-up were also significantly associated with increased risk of all-cause mortality. There was substantial heterogeneity in the individual-specific rate of severe exacerbations. Even after adjusting for observable characteristics, individuals in the 97.5th percentile of exacerbation rate had 5.6 times higher rate of severe exacerbations than those in the 2.5th percentile. Using robust statistical methodology that controlled for heterogeneity in exacerbation rates among individuals, we demonstrated potential causal associations among past and future severe exacerbations, albeit the magnitude of association was noticeably lower than previously reported. The prevention of severe exacerbations has the potential to modify the disease trajectory.
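For readers unfamiliar with the model class, the "joint frailty" model typically links the recurrent-event intensity and the death hazard of individual i through a shared random effect u_i (a common specification given for orientation; the paper's exact parameterization may differ):

```latex
r_i(t \mid u_i) = u_i\, r_0(t)\, \exp(X_i^{\top}\beta), \qquad
\lambda_i(t \mid u_i) = u_i^{\alpha}\, \lambda_0(t)\, \exp(X_i^{\top}\gamma)
```

where r_0 and \lambda_0 are the baseline hazards for severe exacerbations and death, X_i are the adjusted patient and disease characteristics, and \alpha governs how strongly the individual frailty that raises exacerbation rates also raises mortality.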
Xu, Stanley; Clarke, Christina L; Newcomer, Sophia R; Daley, Matthew F; Glanz, Jason M
2018-05-16
Vaccine safety studies are often electronic health record (EHR)-based observational studies. These studies often face significant methodological challenges, including confounding and misclassification of adverse events. Vaccine safety researchers use the self-controlled case series (SCCS) study design to handle confounding and employ medical chart review to ascertain cases that are identified using EHR data. However, for common adverse events, limited resources often make it impossible to adjudicate all adverse events observed in electronic data. In this paper, we considered four approaches for analyzing SCCS data with confirmation rates estimated from an internal validation sample: (1) observed cases, (2) confirmed cases only, (3) known confirmation rate, and (4) multiple imputation (MI). We conducted a simulation study to evaluate these four approaches using type I error rates, percent bias, and empirical power. Our simulation results suggest that when misclassification of adverse events is present, approaches such as observed cases, confirmed cases only, and known confirmation rate may inflate the type I error, yield biased point estimates, and affect statistical power. The multiple imputation approach considers the uncertainty of the confirmation rates estimated from an internal validation sample and yields a proper type I error rate, a largely unbiased point estimate, a proper variance estimate, and adequate statistical power. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
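The MI approach can be sketched as follows (a toy illustration, not the authors' simulation: the crude relative-incidence estimator, window lengths, counts and confirmation rates are all assumed, and a real SCCS analysis would use a conditional Poisson model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical single-event SCCS data identified from EHR codes: each case has one adverse
# event, flagged as occurring in a 30-day post-vaccination risk window or in the remaining
# 335 days of a one-year observation period (all numbers are illustrative).
RISK_DAYS, CONTROL_DAYS = 30.0, 335.0
in_risk = np.array([1] * 40 + [0] * 160)                 # 200 EHR-identified events
conf_rate = np.where(in_risk == 1, 0.75, 0.65)           # confirmation rates from chart review

def log_relative_incidence(flags):
    """Crude SCCS relative incidence (log scale) and its approximate variance."""
    k_risk, k_ctrl = flags.sum(), len(flags) - flags.sum()
    est = np.log((k_risk / RISK_DAYS) / (k_ctrl / CONTROL_DAYS))
    var = 1.0 / k_risk + 1.0 / k_ctrl                    # Poisson approximation
    return est, var

# Multiple imputation: draw the true case status from the confirmation probabilities,
# analyse each completed dataset, then pool with Rubin's rules.
m, ests, wvars = 50, [], []
for _ in range(m):
    confirmed = rng.random(len(in_risk)) < conf_rate     # impute which events are true cases
    est, var = log_relative_incidence(in_risk[confirmed])
    ests.append(est)
    wvars.append(var)

ests, wvars = np.array(ests), np.array(wvars)
pooled = ests.mean()
total_var = wvars.mean() + (1 + 1 / m) * ests.var(ddof=1)  # within- plus between-imputation
print(np.exp(pooled), total_var)
```

The between-imputation term is what carries the extra uncertainty from estimating the confirmation rates in a finite validation sample.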
Skjæret, Nina; Nawaz, Ather; Morat, Tobias; Schoene, Daniel; Helbostad, Jorunn Lægdheim; Vereijken, Beatrix
2016-01-01
There has been a rapid increase in research on the use of virtual reality (VR) and gaming technology as a complementary tool in exercise and rehabilitation in the elderly population. Although a few recent studies have evaluated their efficacy, there is currently no in-depth description and discussion of different game technologies, physical functions targeted, and safety issues related to older adults playing exergames. This integrative review provides an overview of the technologies and games used, progression, safety measurements and associated adverse events, adherence to exergaming, outcome measures used, and their effect on physical function. We undertook systematic searches of SCOPUS and PubMed databases. Key search terms included "game", "exercise", and "aged", and were adapted to each database. To be included, studies had to involve older adults aged 65 years or above, have a pre-post training or intervention design, include ICT-implemented games with weight-bearing exercises, and have outcome measures that included physical activity variables and/or clinical tests of physical function. Sixty studies fulfilled the inclusion criteria. The studies had a broad range of aims and intervention designs and mostly focused on community-dwelling healthy older adults. The majority of the studies used commercially available gaming technologies that targeted a number of different physical functions. Most studies reported that they had used some form of safety measure during intervention. None of the studies reported serious adverse events. However, only 21 studies (35%) reported on whether adverse events occurred. Twenty-four studies reported on adherence, but only seven studies (12%) compared adherence to exergaming with other forms of exercise. Clinical measures of balance were the most frequently used outcome measures. PEDro scores indicated that most studies had several methodological problems, with only 4 studies fulfilling 6 or more criteria out of 10. Several studies found positive effects of exergaming on balance and gait, while none reported negative effects. Exergames show promise as an intervention to improve physical function in older adults, with few reported adverse events. As there is large variability between studies in terms of intervention protocols and outcome measures, as well as several methodological limitations, recommendations for both practice and further research are provided in order to successfully establish exergames as an exercise and rehabilitation tool for older adults. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
The KULTURisk Regional Risk Assessment methodology for flood risk: the case of Sihl river in Zurich
NASA Astrophysics Data System (ADS)
Ronco, Paolo; Bullo, Martina; Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Zabeo, Alex; Semenzin, Elena; Buchecker, Matthias; Marcomini, Antonio
2014-05-01
In recent years, the frequency of catastrophes induced by natural hazards has increased, and flood events in particular have been recognized as one of the most threatening water-related disasters. Severe floods have occurred in Europe over the last decade, causing loss of life, displacement of people and heavy economic losses. Flood disasters are growing as a consequence of many factors, both climatic and non-climatic. Indeed, the current increase in water-related disasters can be mainly attributed to the increase of exposure (elements potentially at risk in floodplain areas) and vulnerability (i.e. economic, social, geographic, cultural, and physical/environmental characteristics of the exposure). Besides these factors, climate change is projected to radically modify the usual pattern of the hydrological cycle by intensifying the frequency and severity of flood events at local, regional and global scales. Within this context, it is necessary to develop effective and pro-active strategies, tools and actions which allow the risk of floods to be assessed and (possibly) reduced. In light of the recent European Flood Directive (FD), the KULTURisk-FP7 Project developed a state-of-the-art Regional Risk Assessment (RRA) methodology for assessing the risk posed by flood events. The KULTURisk RRA methodology is based on the concept of risk being a function of hazard, exposure and vulnerability. It is a flexible approach that can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale), and it integrates the outputs of various hydrodynamic models (hazard) with site-specific geophysical and socio-economic indicators (exposure and vulnerability factors such as land cover, slope, soil permeability, population density, economic activities, etc.). The main outputs of the methodology are GIS-based risk maps that identify and prioritize relative hot-spot areas and targets at risk (i.e. people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritage) in the considered region by comparing the baseline scenario with alternative scenarios in which different structural and/or non-structural mitigation measures are planned. Risk maps, along with related statistics, provide crucial information about flood risk patterns and allow the development of relevant and strategic mitigation and prevention measures to minimize flood risk in urban areas. The present study applied and validated the KULTURisk RRA methodology on the Sihl river case study in Zurich (Switzerland). Through a tuning process of the methodology to the site-specific context and features, flood-related risks have been assessed for different receptors lying in the Sihl river valley, which represents a typical case of river flooding in an urban area. The total risk maps obtained under a 300-year return period scenario (selected as the reference one) have highlighted that the area as a whole falls into the lower risk classes. Moreover, the relative risk is higher in the Zurich city centre, in the few residential areas around the city centre, and within the districts lying just beside the Sihl river course.
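In its simplest raster form, the RRA combination of hazard, exposure and vulnerability can be sketched as a cell-by-cell product of normalised layer scores; the layers, values and class breaks below are invented for illustration and are not the receptor-specific KULTURisk functions.

```python
import numpy as np

# Normalised raster layers on a common grid (0-1 scores; values invented for illustration).
hazard        = np.array([[0.9, 0.6], [0.3, 0.1]])   # e.g. flood depth/velocity score per cell
exposure      = np.array([[1.0, 0.8], [0.4, 0.2]])   # e.g. population or asset density score
vulnerability = np.array([[0.7, 0.5], [0.5, 0.3]])   # e.g. building type, permeability score

risk = hazard * exposure * vulnerability             # relative risk score per cell

# Classify into relative risk classes for mapping (class breaks assumed).
classes = np.digitize(risk, bins=[0.05, 0.2, 0.5])   # 0 = lowest ... 3 = highest
print(risk)
print(classes)
```

Repeating the calculation for each receptor and for baseline versus mitigation scenarios yields the comparative hot-spot maps described above.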
Hwang, Thomas J
2013-01-01
For biopharmaceutical companies, investments in research and development are risky, and the results from clinical trials are key inflection points in the process. Few studies have explored how and to what extent the public equity market values clinical trial results. Our study dataset matched announcements of clinical trial results for investigational compounds from January 2011 to May 2013 with daily stock market returns of large United States-listed pharmaceutical and biotechnology companies. Event study methodology was used to examine the relationship between clinical research events and changes in stock returns. We identified public announcements for clinical trials of 24 investigational compounds, including 16 (67%) positive and 8 (33%) negative events. The majority of announcements were for Phase 3 clinical trials (N = 13, 54%), and for oncologic (N = 7, 29%) and neurologic (N = 6, 24%) indications. The median cumulative abnormal returns on the day of the announcement were 0.8% (95% confidence interval [CI]: -2.3, 13.4%; P = 0.02) for positive events and -2.0% (95% CI: -9.1, 0.7%; P = 0.04) for negative events, with statistically significant differences from zero. In the day immediately following the announcement, firms with positive events were associated with stock price corrections, with median cumulative abnormal returns falling to 0.4% (95% CI: -3.8, 12.3%; P = 0.33). For firms with negative announcements, the median cumulative abnormal returns were -1.7% (95% CI: -9.5, 1.0%; P = 0.03), and remained significantly negative over the two day event window. The magnitude of abnormal returns did not differ statistically by indication, by trial phase, or between biotechnology and pharmaceutical firms. The release of clinical trial results is an economically significant event and has meaningful effects on market value for large biopharmaceutical companies. Stock return underperformance due to negative events is greater in magnitude and persists longer than abnormal returns due to positive events, suggesting asymmetric market reactions.
Is shared misery double misery?
Mervin, Merehau Cindy; Frijters, Paul
2014-04-01
The literature has shown strong associations between health, financial and social life events and mental health. However, no studies as yet have looked at the temporal nature of the effects of life events on stated mental health nor have they included the effects of the events befalling partners within a household. This paper looks at the spillover in mental health, measured with the SF-36 scale, from one partner to the other, using life events to identify this relationship. We propose a new model that allows for both a temporal spacing of effects (anticipation and adaptation) as well as a spillover factor, which we define as the degree to which the events that are experienced by the partner affect us in the same way as if these events were to happen to us. We use data from 51,380 person-year observations of the Household, Income and Labour Dynamics in Australia survey (2002-10) which consistently measures nine distinct events, including illnesses, social shocks and financial shocks. We find that the events befalling a partner on average have an effect about 15% as large as the effect of own events. We use the estimates to compute the compensation required to offset own and partner's life events. The methodology in this paper is potentially useful for estimating other spillover parameters such as the effects of others in the family or in the neighbourhood. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric
2017-04-01
Rain time series records are generally studied using rainfall rate or accumulation parameters, which are estimated for a fixed duration (typically 1 min, 1 h or 1 day). In this study we use the concept of rain events. The aim of the first part of this paper is to establish a parsimonious characterization of rain events, using a minimal set of variables selected among those normally used for the characterization of these events. A methodology is proposed, based on the combined use of a genetic algorithm (GA) and self-organizing maps (SOMs). It can be advantageous to use an SOM, since it allows a high-dimensional data space to be mapped onto a two-dimensional space while preserving, in an unsupervised manner, most of the information contained in the initial space topology. The 2-D maps obtained in this way allow the relationships between variables to be determined and redundant variables to be removed, thus leading to a minimal subset of variables. We verify that such 2-D maps make it possible to determine the characteristics of all events, on the basis of only five features (the event duration, the peak rain rate, the rain event depth, the standard deviation of the rain rate event and the absolute rain rate variation of the order of 0.5). From this minimal subset of variables, hierarchical cluster analyses were carried out. We show that clustering into two classes allows the conventional convective and stratiform classes to be determined, whereas classification into five classes allows this convective-stratiform classification to be further refined. Finally, our study made it possible to reveal the presence of some specific relationships between these five classes and the microphysics of their associated rain events.
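The clustering step can be illustrated with standard tools; the sketch below uses hypothetical values for the five retained variables and ordinary Ward hierarchical clustering, not the authors' SOM/GA pipeline.

```python
# Hedged sketch of the clustering step only (hypothetical feature values): events
# described by the five retained variables are standardized and grouped with
# hierarchical (Ward) clustering into 2 and then 5 classes.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# Columns: duration (min), peak rain rate (mm/h), event depth (mm),
# std of rain rate (mm/h), absolute rain-rate variation (order 0.5).
events = np.column_stack([
    rng.lognormal(3.5, 0.8, 200),
    rng.lognormal(2.0, 0.9, 200),
    rng.lognormal(1.5, 1.0, 200),
    rng.lognormal(1.0, 0.8, 200),
    rng.lognormal(0.5, 0.7, 200),
])
z = (events - events.mean(axis=0)) / events.std(axis=0)   # standardize features

tree = linkage(z, method="ward")
two_classes = fcluster(tree, t=2, criterion="maxclust")   # ~convective/stratiform
five_classes = fcluster(tree, t=5, criterion="maxclust")  # refined classification
print(np.bincount(two_classes)[1:], np.bincount(five_classes)[1:])
```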
Mobile mapping of sporting event spectators using Bluetooth sensors: Tour of Flanders 2011.
Versichele, Mathias; Neutens, Tijs; Goudeseune, Stephanie; van Bossche, Frederik; van de Weghe, Nico
2012-10-22
Accurate spatiotemporal information on crowds is a necessity for a better management in general and for the mitigation of potential security risks. The large numbers of individuals involved and their mobility, however, make generation of this information non-trivial. This paper proposes a novel methodology to estimate and map crowd sizes using mobile Bluetooth sensors and examines to what extent this methodology represents a valuable alternative to existing traditional crowd density estimation methods. The proposed methodology is applied in a unique case study that uses Bluetooth technology for the mobile mapping of spectators of the Tour of Flanders 2011 road cycling race. The locations of nearly 16,000 cell phones of spectators along the race course were registered and detailed views of the spatiotemporal distribution of the crowd were generated. Comparison with visual head counts from camera footage delivered a detection ratio of 13.0 ± 2.3%, making it possible to estimate the crowd size. To our knowledge, this is the first study that uses mobile Bluetooth sensors to count and map a crowd over space and time.
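The crowd-size extrapolation implied by the reported detection ratio is simple division; the helper below uses the published figures (roughly 16,000 detections, 13.0 ± 2.3%), but the uncertainty band is a naive one-standard-deviation bracket, not the paper's error analysis.

```python
# Minimal sketch of the crowd-size extrapolation implied by the paper: divide the
# number of unique Bluetooth detections by the detection ratio estimated from
# visual head counts (error handling deliberately naive).
def estimate_crowd_size(unique_detections, detection_ratio, ratio_sd):
    """Return (estimate, lower, upper) using a +/- one-SD band on the ratio."""
    estimate = unique_detections / detection_ratio
    lower = unique_detections / (detection_ratio + ratio_sd)
    upper = unique_detections / (detection_ratio - ratio_sd)
    return estimate, lower, upper

est, lo, hi = estimate_crowd_size(16000, 0.130, 0.023)
print(f"~{est:,.0f} spectators (roughly {lo:,.0f}-{hi:,.0f})")
```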
Jani, Vinod; Sonavane, Uddhavesh; Joshi, Rajendra
2016-07-01
Protein folding is a multi-microsecond-timescale event and involves many conformational transitions. Crucial conformational transitions responsible for biological functions of biomolecules are difficult to capture using current state-of-the-art molecular dynamics (MD) simulations. Protein folding, being a stochastic process, witnesses these transitions as rare events. Many new methodologies have been proposed for observing these rare events. In this work, a temperature-aided cascade MD is proposed as a technique for studying conformational transitions. Folding studies of the Engrailed homeodomain and the immunoglobulin-binding domain B of protein A have been carried out. Using this methodology, unfolded structures with an RMSD of 20 Å were folded to structures with an RMSD of 2 Å. Three sets of cascade MD runs were carried out using implicit solvation, explicit solvation, and a charge updation scheme. In the charge updation scheme, charges based on the conformation obtained are calculated and updated in the topology file. In all the simulations, the 2 Å structure was reached within a few nanoseconds. Umbrella sampling was performed using snapshots from the temperature-aided cascade MD simulation trajectory to build an entire conformational transition pathway. The advantage of the method is that possible pathways for a particular transition can be explored within a short simulation time; the disadvantage is that knowledge of the start and end states is required. The charge updation scheme adds polarization effects to the force field, improving the electrostatic interactions among atoms, which may help the protein fold faster.
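Folding progress in such studies is typically tracked with an RMSD to the reference structure after optimal superposition; the helper below is a generic Kabsch-based RMSD on hypothetical coordinates, not the authors' analysis code.

```python
# Illustrative helper (not the authors' pipeline): RMSD between two conformations
# after optimal superposition with the Kabsch algorithm, the kind of metric used
# to track folding progress (e.g. from ~20 A down to ~2 A).
import numpy as np

def kabsch_rmsd(P, Q):
    """P, Q: (N, 3) coordinate arrays for the same atoms in two conformations."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(V @ Wt))      # correct for possible reflection
    R = V @ np.diag([1.0, 1.0, d]) @ Wt     # optimal rotation of P onto Q
    diff = P @ R - Q
    return np.sqrt((diff ** 2).sum() / len(P))

rng = np.random.default_rng(3)
ref = rng.normal(size=(50, 3)) * 10                    # hypothetical reference
model = ref + rng.normal(scale=0.5, size=ref.shape)    # perturbed conformation
print(f"RMSD = {kabsch_rmsd(model, ref):.2f} A")
```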
Knowlton, Kim; Rotkin-Ellman, Miriam; Geballe, Linda; Max, Wendy; Solomon, Gina M
2011-11-01
The future health costs associated with predicted climate change-related events such as hurricanes, heat waves, and floods are projected to be enormous. This article estimates the health costs associated with six climate change-related events that struck the United States between 2000 and 2009. The six case studies came from categories of climate change-related events projected to worsen with continued global warming-ozone pollution, heat waves, hurricanes, infectious disease outbreaks, river flooding, and wildfires. We estimate that the health costs exceeded $14 billion, with 95 percent due to the value of lives lost prematurely. Actual health care costs were an estimated $740 million. This reflects more than 760,000 encounters with the health care system. Our analysis provides scientists and policy makers with a methodology to use in estimating future health costs related to climate change and highlights the growing need for public health preparedness.
Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent
2015-01-01
The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity ("residence times") of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales.
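The survival-analysis idea can be illustrated with a plain product-limit (Kaplan-Meier) estimate of continuous residence times, treating times cut short by the end of monitoring as right-censored; the data below are hypothetical and the sketch omits the paper's multi-receiver and timescale machinery.

```python
# Hedged sketch of the survival-analysis idea (hypothetical data, not the authors'
# code): a product-limit (Kaplan-Meier) estimate of continuous residence times at
# a receiver, where times truncated by the end of monitoring are right-censored.
import numpy as np

def kaplan_meier(times, observed):
    """times: residence durations; observed: 1 if departure seen, 0 if censored."""
    order = np.argsort(times)
    times, observed = np.asarray(times)[order], np.asarray(observed)[order]
    at_risk = len(times)
    survival, curve = 1.0, []
    for t, d in zip(times, observed):
        if d:                                   # departure event at time t
            survival *= (at_risk - 1) / at_risk
        curve.append((t, survival))
        at_risk -= 1
    return curve

rng = np.random.default_rng(4)
durations = rng.exponential(30.0, 40)   # minutes at a receiver (hypothetical)
seen = rng.binomial(1, 0.85, 40)        # ~15% right-censored
for t, s in kaplan_meier(durations, seen)[::10]:
    print(f"t={t:6.1f} min  S(t)={s:.2f}")
```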
Oral and sublingual immunotherapy for egg allergy.
Romantsik, Olga; Tosca, Maria Angela; Zappettini, Simona; Calevo, Maria Grazia
2018-04-20
Clinical egg allergy is a common food allergy. Current management relies upon strict allergen avoidance. Oral immunotherapy might be an optional treatment, through desensitization to egg allergen. To determine the efficacy and safety of oral and sublingual immunotherapy in children and adults with immunoglobulin E (IgE)-mediated egg allergy as compared to a placebo treatment or an avoidance strategy. We searched 13 databases for journal articles, conference proceedings, theses and trials registers using a combination of subject headings and text words (last search 31 March 2017). We included randomized controlled trials (RCTs) comparing oral immunotherapy or sublingual immunotherapy administered by any protocol with placebo or an elimination diet. Participants were children or adults with clinical egg allergy. We retrieved 97 studies from the electronic searches. We selected studies, extracted data and assessed the methodological quality. We attempted to contact the study investigators to obtain the unpublished data, wherever possible. We used the I² statistic to assess statistical heterogeneity. We estimated a pooled risk ratio (RR) with 95% confidence interval (CI) for each outcome using a Mantel-Haenzel fixed-effect model if statistical heterogeneity was low (I² value less than 50%). We rated the quality of evidence for all outcomes using GRADE. We included 10 RCTs that met our inclusion criteria, that involved a total of 439 children (oral immunotherapy 249; control intervention 190), aged 1 year to 18 years. Each study used a different oral immunotherapy protocol; none used sublingual immunotherapy. Three studies used placebo and seven used an egg avoidance diet as the control. Primary outcomes were: an increased amount of egg that can be ingested and tolerated without adverse events while receiving allergen-specific oral immunotherapy or sublingual immunotherapy, compared to control; and a complete recovery from egg allergy after completion of oral immunotherapy or sublingual immunotherapy, compared to control. Most children (82%) in the oral immunotherapy group could ingest a partial serving of egg (1 g to 7.5 g) compared to 10% of control group children (RR 7.48, 95% CI 4.91 to 11.38; RD 0.73, 95% CI 0.67 to 0.80). Fewer than half (45%) of children receiving oral immunotherapy were able to tolerate a full serving of egg compared to 10% of the control group (RR 4.25, 95% CI 2.77 to 6.53; RD 0.35, 95% CI 0.28 to 0.43). All 10 trials reported numbers of children with serious adverse events (SAEs) and numbers of children with mild-to-severe adverse events. SAEs requiring epinephrine/adrenaline presented in 21/249 (8.4%) of children in the oral immunotherapy group, and none in the control group. Mild-to-severe adverse events were frequent; 75% of children presented mild-to-severe adverse events during oral immunotherapy treatment versus 6.8% of the control group (RR 8.35, 95% CI 5.31 to 13.12). Of note, seven studies used an egg avoidance diet as the control. Adverse events occurred in 4.2% of children, which may relate to accidental ingestion of egg-containing food. Three studies used a placebo control with adverse events present in 2.6% of children. Overall, there was inconsistent methodological rigour in the trials. All studies enrolled small numbers of children and used different methods to provide oral immunotherapy. Eight included studies were judged to be at high risk of bias in at least one domain. 
Furthermore, the quality of evidence was judged to be low because of the small numbers of participants and events and possible biases. Frequent and increasing exposure to egg over one to two years builds tolerance in people who are allergic to egg: almost everyone became more tolerant, compared with a minority in the control group, and almost half became totally tolerant of egg by the end of treatment, compared with 1 in 10 people who avoided egg. However, nearly all who received treatment experienced adverse events, mainly allergy-related. We found that 1 in 12 children had serious allergic reactions requiring adrenaline, and some people gave up oral immunotherapy. Oral immunotherapy for egg allergy appears to be effective, but confidence in the trade-off between benefits and harms is low because there was only a small number of trials with few participants, and some had methodological problems.
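For readers unfamiliar with the effect sizes quoted above, the risk-ratio arithmetic is straightforward; the sketch below uses hypothetical pooled counts of the same order as the partial-serving outcome and a simple log-scale confidence interval rather than the review's Mantel-Haenszel pooling.

```python
# Minimal sketch of the effect-size arithmetic behind figures like "RR 7.48":
# risk ratio and its log-scale 95% CI from pooled 2x2 counts (numbers hypothetical,
# chosen only to be of the same order as the review's partial-serving outcome).
import math

def risk_ratio(events_tx, n_tx, events_ctrl, n_ctrl):
    rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
    se_log = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctrl - 1/n_ctrl)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

rr, lo, hi = risk_ratio(events_tx=204, n_tx=249, events_ctrl=19, n_ctrl=190)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```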
From event analysis to global lessons: disaster forensics for building resilience
NASA Astrophysics Data System (ADS)
Keating, Adriana; Venkateswaran, Kanmani; Szoenyi, Michael; MacClune, Karen; Mechler, Reinhard
2016-04-01
With unprecedented growth in disaster risk, there is an urgent need for enhanced learning about and understanding disasters, particularly in relation to the trends in the drivers of increasing risk. Building on the disaster forensics field, we introduce the Post Event Review Capability (PERC) methodology for systematically and holistically analyzing disaster events, and identifying actionable recommendations. PERC responds to a need for learning about the successes and failures in disaster risk management and resilience, and uncovers the underlying drivers of increasing risk. We draw generalizable insights identified from seven applications of the methodology to date, where we find that across the globe policy makers and practitioners in disaster risk management face strikingly similar challenges despite variations in context, indicating encouraging potential for mutual learning. These lessons highlight the importance of integrated risk reduction strategies. We invite others to utilize the freely available PERC approach and contribute to building a repository of learnings on disaster risk management and resilience.
Improving the Accuracy of Estimation of Climate Extremes
NASA Astrophysics Data System (ADS)
Zolina, Olga; Detemmerman, Valery; Trenberth, Kevin E.
2010-12-01
Workshop on Metrics and Methodologies of Estimation of Extreme Climate Events; Paris, France, 27-29 September 2010; Climate projections point toward more frequent and intense weather and climate extremes such as heat waves, droughts, and floods, in a warmer climate. These projections, together with recent extreme climate events, including flooding in Pakistan and the heat wave and wildfires in Russia, highlight the need for improved risk assessments to help decision makers and the public. But accurate analysis and prediction of risk of extreme climate events require new methodologies and information from diverse disciplines. A recent workshop sponsored by the World Climate Research Programme (WCRP) and hosted at United Nations Educational, Scientific and Cultural Organization (UNESCO) headquarters in France brought together, for the first time, a unique mix of climatologists, statisticians, meteorologists, oceanographers, social scientists, and risk managers (such as those from insurance companies) who sought ways to improve scientists' ability to characterize and predict climate extremes in a changing climate.
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is applied here to the problem of anomaly detection for a video annotation system.
An Overview of Recent Advances in Event-Triggered Consensus of Multiagent Systems.
Ding, Lei; Han, Qing-Long; Ge, Xiaohua; Zhang, Xian-Ming
2018-04-01
Event-triggered consensus of multiagent systems (MASs) has attracted tremendous attention from both theoretical and practical perspectives due to the fact that it enables all agents eventually to reach an agreement upon a common quantity of interest while significantly alleviating utilization of communication and computation resources. This paper aims to provide an overview of recent advances in event-triggered consensus of MASs. First, a basic framework of multiagent event-triggered operational mechanisms is established. Second, representative results and methodologies reported in the literature are reviewed and some in-depth analysis is made on several event-triggered schemes, including event-based sampling schemes, model-based event-triggered schemes, sampled-data-based event-triggered schemes, and self-triggered sampling schemes. Third, two examples are outlined to show applicability of event-triggered consensus in power sharing of microgrids and formation control of multirobot systems, respectively. Finally, some challenging issues on event-triggered consensus are proposed for future research.
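The basic mechanism, in which agents exchange information only when a local measurement error crosses a threshold, can be shown in a toy simulation; the static-threshold rule and four-agent ring below are illustrative assumptions, not any specific scheme from the survey.

```python
# Hedged toy simulation of the basic idea (a static threshold rule, not any specific
# scheme surveyed): single-integrator agents update using the last broadcast states,
# and an agent re-broadcasts only when its measurement error exceeds a threshold.
import numpy as np

A = np.array([[0, 1, 0, 1],               # adjacency of a 4-agent ring graph
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], float)
x = np.array([1.0, -2.0, 3.0, 0.5])       # initial states
xb = x.copy()                             # last broadcast states
dt, threshold, triggers = 0.01, 0.05, 0

for _ in range(2000):
    u = np.array([sum(A[i, j] * (xb[j] - xb[i]) for j in range(4)) for i in range(4)])
    x = x + dt * u                        # continuous dynamics, Euler step
    stale = np.abs(x - xb) > threshold    # event condition per agent
    xb[stale] = x[stale]                  # broadcast only when triggered
    triggers += int(stale.sum())

print("final states:", np.round(x, 3), "broadcasts:", triggers)
```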
ERIC Educational Resources Information Center
Abramov, Ruslan A.; Sokolov, Maxim S.
2016-01-01
Relevance of the study lies in the fact that modern higher education in the Russian Federation is increasingly approaching a critical state: despite attempts at reform and the use of successful foreign practices, the country is still lagging behind in this respect. The aim of the article is the formation, based on the events that occurred in the country over…
Willems, Sjw; Schat, A; van Noorden, M S; Fiocco, M
2018-02-01
Censored data make survival analysis more complicated because exact event times are not observed. Statistical methodology developed to account for censored observations assumes that patients' withdrawal from a study is independent of the event of interest. However, in practice, some covariates might be associated with both the lifetime and the censoring mechanism, inducing dependent censoring. In this case, standard survival techniques, such as the Kaplan-Meier estimator, give biased results. The inverse probability censoring weighted estimator was developed to correct for bias due to dependent censoring. In this article, we explore the use of the inverse probability censoring weighting methodology and describe why it is effective in removing the bias. Since implementing this method is highly time-consuming and requires programming and mathematical skills, we propose a user-friendly algorithm in R. Applications to a toy example and to a medical data set illustrate how the algorithm works. A simulation study was carried out to investigate the performance of the inverse probability censoring weighted estimators in situations where dependent censoring is present in the data. In the simulation process, different sample sizes, strengths of the censoring model, and percentages of censored individuals were chosen. Results show that in each scenario inverse probability censoring weighting reduces the bias induced in the traditional Kaplan-Meier approach where dependent censoring is ignored.
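The weighting idea can be sketched compactly (the article itself supplies an R algorithm; the Python fragment below is not that implementation). Observed events by a horizon are up-weighted by the inverse of a covariate-specific probability of remaining uncensored, here modelled with a deliberately simple exponential censoring fit.

```python
# Illustrative sketch of inverse-probability-of-censoring weighting (IPCW).
# Event probabilities by a horizon tau are estimated by up-weighting observed
# events with 1/G(t|x), where G is a covariate-specific model of staying uncensored.
import numpy as np

rng = np.random.default_rng(5)
n, tau = 2000, 2.0
x = rng.binomial(1, 0.5, n)                       # covariate tied to both processes
event_time = rng.exponential(np.where(x == 1, 2.0, 4.0))
censor_time = rng.exponential(np.where(x == 1, 1.5, 6.0))   # dependent censoring
t_obs = np.minimum(event_time, censor_time)
d_obs = (event_time <= censor_time).astype(int)

# Censoring model: exponential rate per covariate level (a deliberately simple choice).
rate = {g: d_obs[(x == g) & (d_obs == 0)].size /  # number censored ...
           t_obs[x == g].sum()                    # ... over total follow-up time
        for g in (0, 1)}
G = np.exp(-np.array([rate[g] for g in x]) * np.minimum(t_obs, tau))

naive = np.mean((t_obs <= tau) & (d_obs == 1))
ipcw = np.mean(((t_obs <= tau) & (d_obs == 1)) / G)
true = np.mean(event_time <= tau)
print(f"true={true:.3f}  naive={naive:.3f}  IPCW={ipcw:.3f}")
```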
[Period-tripling in Multiscale Physical and Biological Events].
Bondar, A T; Fedorov, M V; Kolombet, V A
2015-01-01
A recent paper by S.J. Puetz et al. (Chaos, Solitons & Fractals, v. 62-63, p. 55, 2014) described a fundamental period-tripled model. It involves periods of different astronomical (quasars, Sun), geophysical (geomagnetic, climatic, volcanic) and some biological processes. This work contains statistics for sixteen pairs of a period-tripled sequence. These periods range from ~50 years to 1.5 billion years, and no signs of timescale limitations are found. We believe that the universal scope of the fundamental period-tripled model can be used for the development of a new methodology for research data analysis: the main idea is that the spectrum of the periods of the studied event should be tested for similarity with the spectrum of the fundamental period-tripling pattern (because of the fundamental nature of the period-tripled model). Using this method, in this study we complement the already described period-tripled model with periods of human memory performance ranging from one minute to one month, also adding seven relevant periods/frequencies of the period-tripled model in the range of human hearing. We conclude that these characteristic frequencies may form the basis for music and singing phenomena. The new methodology is particularly appropriate for application in medicine and engineering.
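The proposed test, comparing observed periods against a period-tripled reference sequence, amounts to checking distances on a log scale; the anchor period and observed periods below are hypothetical.

```python
# Minimal sketch of the proposed test: build a period-tripled reference spectrum
# (each period three times the previous) and check how closely observed periods of
# some process fall to members of that sequence (all numbers here are hypothetical).
import numpy as np

base_minutes = 1.0                                    # hypothetical anchor period
reference = base_minutes * 3.0 ** np.arange(0, 16)    # 1 min ... ~3e7 min

observed = np.array([2.9, 27.5, 250.0, 2200.0])       # hypothetical observed periods
for p in observed:
    k = np.argmin(np.abs(np.log(reference) - np.log(p)))
    ratio = p / reference[k]
    print(f"period {p:8.1f} min ~ 3^{k} x base (off by factor {ratio:.2f})")
```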
Sanger, Kevanne Louise; Dorjee, Dusana
2015-09-01
Mindfulness training is increasingly being introduced in schools, yet studies examining its impact on the developing brain have been scarce. A neurodevelopmental perspective on mindfulness has been advocated as a powerful tool to enhance our understanding of underlying neurocognitive changes that have implications for developmental well-being research and the implementation of mindfulness in education. To stimulate more research in the developmental cognitive neuroscience of mindfulness, this article outlines possible indexes of mindfulness-based change in adolescence, with a focus on event-related brain potential (ERP) markers. We provide methodological recommendations for future studies and offer examples of research paradigms. We also discuss how mindfulness practice could impact on the development of prefrontal brain structures and enhance attention control and emotion regulation skills in adolescents, impacting in turn on their self-regulation and coping skills. We highlight advantages of the ERP methodology in neurodevelopmental research of mindfulness. It is proposed that research using established experimental tasks targeting ERP components such as the contingent negative variability, N200, error-related negativity and error positivity, P300, and late positive potential could elucidate developmentally salient shifts in the neural plasticity of the adolescent brain induced by mindfulness practice.
Identification of novel loci for the generation of reporter mice
Rebecchi, Monica; Levandis, Giovanna
2017-01-01
Deciphering the etiology of complex pathologies at the molecular level requires longitudinal studies encompassing multiple biochemical pathways (apoptosis, proliferation, inflammation, oxidative stress). In vivo imaging of current reporter animals has enabled the spatio-temporal analysis of specific molecular events; however, the lack of a multiplicity of loci for the generalized and regulated expression of integrated transgenes hampers the creation of systems for the simultaneous analysis of more than one biochemical pathway at a time. Here we developed and tested an in vivo-based methodology for the identification of multiple insertional loci suitable for the generation of reliable reporter mice. The validity of the methodology was tested by generating novel mice useful for reporting on inflammation and oxidative stress.
Systematic Review: FDA-Approved Prescription Medications for Adults With Constipation
Lacy, Brian E.
2006-01-01
Constipation is a common, often chronic, gastrointestinal disorder that can negatively impact the lives of those it affects and can be difficult to treat satisfactorily. The objective of this systematic review is to identify and analyze the available published literature on US Food and Drug Administration–approved prescription therapies for adults with constipation (episodic and chronic) and to assess their place in therapy, based on the methodologic strength and results of identified clinical trials. Ovid MEDLINE, PubMed, and EMBASE databases were used to search the published literature. Studies were included if they were randomized and prospective, conducted in adults (age ≥18), published as full-length manuscripts in English, and compared the test agent with placebo or a comparator(s). Studies were excluded if they involved patients with constipation attributed to secondary causes. Because fully published manuscripts from phase III efficacy trials involving the recently approved medication lubiprostone were not available, a manual search was performed of abstracts from the two annual major gastroenterology meetings (American College of Gastroenterology and Digestive Disease Week) from the past 4 years. Data on study design; number, age, and sex of patients; duration of treatment period; primary efficacy variable; secondary efficacy variables; adverse events; and discontinuations because of adverse events were abstracted from eligible articles. Eligible studies were assessed using well-established recommendations and a preformatted standardized form. A scoring system, with scores ranging from 1 to 15, was used to individually and separately assess the methodologic quality of the studies. Results of this analysis indicate a general lack of methodologically high-quality clinical trials supporting the use of lactulose and PEG 3350 to treat patients with chronic constipation, but data support their use in acute, episodic constipation. Conversely, high-quality evidence for tegaserod and lubiprostone in patients with chronic constipation does exist, though conclusions regarding the role in therapy for lubiprostone are still in development. PMID:28325992
The CORSAGE Programme: Continuous Orbital Remote Sensing of Archipelagic Geochemical Effects
NASA Technical Reports Server (NTRS)
Acker, J. G.; Brown, C. W.; Hine, A. C.
1997-01-01
Current and pending oceanographic remote sensing technology allows the conceptualization of a programme designed to investigate ocean island interactions that could induce short-term nearshore fluxes of particulate organic carbon and biogenic calcium carbonate from pelagic island archipelagoes. These events will influence the geochemistry of adjacent waters, particularly the marine carbon system. Justification and design are provided for a study that would combine oceanographic satellite remote sensing (visible and infrared radiometry, altimetry and scatterometry) with shore-based facilities. A programme incorporating the methodology outlined here would seek to identify the mechanisms that cause such events, assess their geochemical significance, and provide both analytical and predictive capabilities for observations on greater temporal and spatial scales.
Goodman, L A; Corcoran, C; Turner, K; Yuan, N; Green, B L
1998-07-01
This article reviews the psychometric properties of the Stressful Life Events Screening Questionnaire (SLESQ), a recently developed trauma history screening measure, and discusses the complexities involved in assessing trauma exposure. There are relatively few general measures of exposure to a variety of types of traumatic events, and most of those that exist have not been subjected to rigorous psychometric evaluation. The SLESQ showed good test-retest reliability, with a median kappa of .73, adequate convergent validity (with a lengthier interview) with a median kappa of .64, and good discrimination between Criterion A and non-Criterion A events. The discussion addresses some of the challenges of assessing traumatic event exposure along the dimensions of defining traumatic events, assessment methodologies, reporting consistency, and incident validation.
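The agreement statistics reported for the SLESQ are Cohen's kappa values; a minimal implementation for a single binary item is sketched below with hypothetical test-retest responses.

```python
# Small helper illustrating the agreement statistic reported for the SLESQ
# (Cohen's kappa for two binary ratings of the same trauma item); data hypothetical.
import numpy as np

def cohens_kappa(r1, r2):
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                          # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c)    # chance agreement
             for c in np.union1d(r1, r2))
    return (po - pe) / (1 - pe)

# Endorsement of one event type at test and retest for 20 respondents (hypothetical).
test   = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0]
retest = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0]
print(f"kappa = {cohens_kappa(test, retest):.2f}")
```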
Benier, Kathryn
2017-08-01
Many studies into the antecedents of hate crime in the neighborhood combine offense categories, meaning that it is unclear whether or not there are distinct contextual factors associated with violent and property hate offenses. This study uses rare events modeling to examine the household and neighborhood factors associated with violent and property offenses. Using the Australian Community Capacity Study, the study focuses on the neighborhood characteristics influencing self-reported violent and property hate crime for 4,396 residents in Brisbane. Findings demonstrate important differences between the offense types. Violence is predicted by household renting and non-English language, whereas property offenses are predicted by household non-English language, neighborhood median income, and change in non-English-speaking residents. In both offense types, neighborhood place attachment acts as a protective factor. These findings highlight the theoretical implications of combining distinct hate crime types for methodological reasons.
NASA Astrophysics Data System (ADS)
Orczykowski, Tomasz; Tiukało, Andrzej
2016-03-01
Land use is considered a non-structural, ecologically beneficial flood protection measure. Forest, as one land use type, has many useful applications, which are described in detail on the www.nwrm.eu project website. It has been scientifically demonstrated that afforestation influences flood events with a high probability of occurrence. However, it remains to be established how the impact of land use on the hydrological response of a watershed should be measured in an efficient and quantifiable way. With a tool for such impact measurement, efficient land management strategies can be built. It is difficult to observe the impact of land use on flood events in the field. Therefore, one possible solution is to observe this impact indirectly by means of hydrological rainfall-runoff models as a proxy for reality. Such experiments were conducted in the past. Our study aims to assess the viability of, and to develop the methodology and tools for, observing this impact with selected hydrological models and readily available data in Poland. Our first research site is located within the headwaters of the Kamienna river watershed. This watershed has been affected by an ecological disaster, which resulted in the loss of 65% of its forest coverage. Our proposed methodology is to observe this transformation and its effect on the watershed response to heavy precipitation and, therefore, the change in flood risk.
Raponi, Matteo; Damiani, Gianfranco; Vincenti, Sara; Wachocka, Malgorzata; Boninti, Federica; Bruno, Stefania; Quaranta, Gianluigi; Moscato, Umberto; Boccia, Stefania; Ficarra, Maria Giovanna; Specchia, Maria Lucia; Posteraro, Brunella; Berloco, Filippo; Celani, Fabrizio; Ricciardi, Walter; Laurenti, Patrizia
2014-01-01
The purpose of this research is to identify and formalize the Hospital Hygiene Service activities and products, evaluating them from a cost accounting management perspective. The ultimate aim is to evaluate the financial impact of adverse event prevention within Hospital Hygiene Service management. A three-step methodology based on affinity grouping of activities was employed. This methodology led us to identify 4 action areas, with 23 related productive processes and 86 available safety packages. Owing to this new methodology, we were able to implement a systematic evaluation of the services provided.
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system, and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and a solution algorithm, and supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
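The event-tree arithmetic that such a tool automates can be illustrated with a toy example; the initiating-event frequency, the two pivot systems, and the small AND-gate fault tree below are hypothetical, and this is not the DEFT algorithm itself.

```python
# Hedged toy example of event-tree arithmetic (not the DEFT algorithm itself):
# sequence frequencies from an initiating event and two pivot "systems", one of
# which is modelled as a tiny static fault tree (an AND of two redundant pumps).
freq_initiator = 1e-3                 # initiating events per year (hypothetical)

p_pump_fails = 0.05
p_system_a_fails = p_pump_fails * p_pump_fails    # AND gate: both pumps fail
p_system_b_fails = 0.02                           # simple basic-event pivot

sequences = {
    "A ok,   B ok  ": (1 - p_system_a_fails) * (1 - p_system_b_fails),
    "A ok,   B fail": (1 - p_system_a_fails) * p_system_b_fails,
    "A fail, B ok  ": p_system_a_fails * (1 - p_system_b_fails),
    "A fail, B fail": p_system_a_fails * p_system_b_fails,
}
for name, p in sequences.items():
    print(f"{name}: {freq_initiator * p:.3e} per year")
```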
NASA Astrophysics Data System (ADS)
Peidou, Athina C.; Fotopoulos, Georgia; Pagiatakis, Spiros
2017-10-01
The main focus of this paper is to assess the feasibility of utilizing dedicated satellite gravity missions in order to detect large-scale solid mass transfer events (e.g. landslides). Specifically, a sensitivity analysis of Gravity Recovery and Climate Experiment (GRACE) gravity field solutions in conjunction with simulated case studies is employed to predict gravity changes due to past subaerial and submarine mass transfer events, namely the Agulhas slump in southeastern Africa and the Heart Mountain Landslide in northwestern Wyoming. The detectability of these events is evaluated by taking into account the expected noise level in the GRACE gravity field solutions and simulating their impact on the gravity field through forward modelling of the mass transfer. The spectral content of the estimated gravity changes induced by a simulated large-scale landslide event is estimated for the known spatial resolution of the GRACE observations using wavelet multiresolution analysis. The results indicate that both the Agulhas slump and the Heart Mountain Landslide could have been detected by GRACE, resulting in |0.4| and |0.18| mGal changes in the GRACE solutions, respectively. The suggested methodology is further extended to the case studies of the submarine landslide in Tohoku, Japan, and the Grand Banks landslide in Newfoundland, Canada. The detectability of these events using GRACE solutions is assessed through their impact on the gravity field.
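The forward-modelling step can be approximated to first order by treating the displaced mass as a point source; the sketch below is a back-of-the-envelope estimate with hypothetical slide volume, density, and sensing distance, not the paper's forward model or wavelet analysis.

```python
# Back-of-the-envelope estimate in the spirit of the simulations (not the paper's
# code): the gravity change from moving a mass M, treated as a point source at
# distance r, is of order G*M/r^2; all input values are illustrative.
G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2

def delta_g_mgal(mass_kg, distance_m):
    return G * mass_kg / distance_m**2 * 1e5    # 1 mGal = 1e-5 m/s^2

# A hypothetical slide: ~2000 km^3 of material (density ~2000 kg/m^3) sensed at
# ~300 km, roughly the spatial scale GRACE can resolve.
volume_m3 = 2.0e12
mass = 2000.0 * volume_m3
print(f"~{delta_g_mgal(mass, 3.0e5):.2f} mGal")
```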
Traditional Chinese medicine for knee osteoarthritis: An overview of systematic review
Wang, Qing; Chen, Hao
2017-01-01
Background: Traditional Chinese medicine (TCM) has been accepted as a complementary therapy for knee osteoarthritis. However, the efficacy and safety of the intervention were still conflicting and uncertain. Meanwhile, the quality of methodology and evidence in the field was unknown. Objective: To summarize the characteristics and critically evaluate the quality of methodology, as well as the evidence, of systematic reviews (SRs) on TCM for knee osteoarthritis. Methods: Five electronic databases were searched from inception to April 2016. The methodological quality of the included studies was assessed by AMSTAR and ROBIS. The quality of the evidence was determined using the GRADE approach. Results: Ten SRs were included. The conclusions suggest that TCM provides potential benefits for patients with knee osteoarthritis. These benefits include pain relief, functional improvement, and few adverse events. Limitations of the methodological quality mainly included the lack of an a priori protocol or protocol registration and a literature search that was not comprehensive. A list of excluded studies was also not provided. The overall quality of evidence in the SRs was poor, ranging from “very low” to “low,” mainly because of the serious risk of bias of original trials, inconsistencies, and imprecision in the outcomes. Conclusions: TCM generally appears to be effective for knee osteoarthritis treatment. However, the evidence is not robust enough because of the methodological flaws in SRs. Hence, these conclusions on available SRs should be treated with caution for clinical practice.
The Effects of Twitter Sentiment on Stock Price Returns.
Ranco, Gabriele; Aleksovski, Darko; Caldarelli, Guido; Grčar, Miha; Mozetič, Igor
2015-01-01
Social media are increasingly reflecting and influencing the behavior of other complex systems. In this paper we investigate the relations between the well-known micro-blogging platform Twitter and financial markets. In particular, we consider, over a period of 15 months, the Twitter volume and sentiment about the 30 stock companies that form the Dow Jones Industrial Average (DJIA) index. We find a relatively low Pearson correlation and Granger causality between the corresponding time series over the entire time period. However, we find a significant dependence between the Twitter sentiment and abnormal returns during the peaks of Twitter volume. This is valid not only for the expected Twitter volume peaks (e.g., quarterly announcements), but also for peaks corresponding to less obvious events. We formalize the procedure by adapting the well-known "event study" from economics and finance to the analysis of Twitter data. The procedure automatically identifies events as Twitter volume peaks, computes the prevailing sentiment (positive or negative) expressed in tweets at these peaks, and finally applies the "event study" methodology to relate them to stock returns. We show that the sentiment polarity of Twitter peaks implies the direction of cumulative abnormal returns. The magnitude of the cumulative abnormal returns is relatively low (about 1-2%), but the dependence is statistically significant for several days after the events.
Cascadia Slow Earthquakes: Strategies for Time Independent Inversion of Displacement Fields
NASA Astrophysics Data System (ADS)
Szeliga, W. M.; Melbourne, T. I.; Miller, M. M.; Santillan, V. M.
2004-12-01
Continuous observations using Global Positioning System geodesy (CGPS) have revealed periodic slow or silent earthquakes along the Cascadia subduction zone with a spectrum of timing and periodicity. These creep events perturb time series of GPS observations and yield coherent displacement fields that relate to the extent and magnitude of fault displacement. In this study, time-independent inversions of the surface displacement fields that accompany eight slow earthquakes characterize slip distributions along the plate interface for each event. The inversions employed in this study utilize Okada's elastic dislocation model and a non-negative least squares approach. Methodologies for optimizing the slip distribution smoothing parameter for a particular station distribution have also been investigated, significantly reducing the number of possible slip distributions and the range of estimates for total moment release for each event. The discretized slip distribution calculated for multiple creep events identifies areas of the Cascadia plate interface where slip persistently recurs. The current hypothesis, that slow earthquakes are modulated by forced fluid flow, leads to the possibility that some regions of the Cascadia plate interface may display fault patches preferentially exploited by fluid flow. Thus, the identification of regions of the plate interface that repeatedly slip during slow events may yield important information regarding the identification of these fluid pathways.
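The inversion style described, non-negative least squares with a smoothing constraint, can be sketched with synthetic Green's functions; the matrix below is a stand-in rather than an Okada kernel, and the smoothing weight is an arbitrary illustrative choice.

```python
# Hedged sketch of the inversion style described (synthetic Green's functions, not
# Okada kernels): solve d = G s for non-negative slip with a Laplacian smoothing
# block appended, using scipy's non-negative least squares.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(6)
n_sta, n_patch = 30, 20
G = np.exp(-np.abs(rng.normal(1.0, 0.5, (n_sta, n_patch))))  # stand-in Green's matrix

true_slip = np.zeros(n_patch)
true_slip[8:13] = 0.02                          # 2 cm of slip on a few patches
d = G @ true_slip + rng.normal(0, 1e-4, n_sta)  # displacements with noise

# Second-difference (Laplacian) operator penalizes rough slip distributions.
L = np.diff(np.eye(n_patch), n=2, axis=0)
smoothing = 0.1                                 # trade-off parameter to be tuned

G_aug = np.vstack([G, smoothing * L])
d_aug = np.concatenate([d, np.zeros(L.shape[0])])
slip, _ = nnls(G_aug, d_aug)
print(np.round(slip, 4))
```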
Risk and safety of pediatric sedation/anesthesia for procedures outside the operating room.
Cravero, Joseph P
2009-08-01
Sedation and anesthesia outside the operating room represents a rapidly growing field of practice that involves a number of different specialty providers including anesthesiology. The literature surrounding this work is found in a variety of journals - many outside anesthesiology. This review is intended to inform readers about the current status of risk and safety involving sedation/anesthesia for tests and minor procedures utilizing a wide range of sources. Two large database studies have helped to define the frequency and nature of adverse events in pediatric sedation/anesthesia practice from a multispecialty perspective. A number of papers describing respiratory and hemodynamic aspects of dexmedetomidine sedation have also been published. Finally, a number of studies relating to training sedation providers, reporting of sedation adverse events, sedation for vulnerable populations, and (in particular) ketamine sedation adverse respiratory events have also come to light. The latest publications continue to document a relatively low risk to pediatric sedation yet also warn us about the potential adverse events in this field. The results help to define competencies required to deliver pediatric sedation and make this practice even safer. Particularly interesting are new jargon and methodologies for defining adverse events and the use of new methods for training sedation providers.
Evaluation of a community arts installation event in support of public health.
Philipp, Robin; Gibbons, Nigel; Thorne, Pam; Wiltshire, Laura; Burrough, June; Easterby, John
2015-01-01
This study is set in the context of recent arts and health developments. It evaluates the worth for public health of a ten-day community arts installation event held in Bristol, England, in support of new immigrants, refugees and asylum seekers. Action research methods were used by members of a creative writing group to elicit, among 434 public visitors, their free-text reflections on the project and/or their reactions to the event. Based on the three themes of the event, 'Homes', 'Histories' and 'Hope', three independent researchers coded the material for analysis. Participants addressed the theme of hope much more frequently than home or histories. Responses to all three themes were mostly positive. What participants hoped for was principally opportunities for themselves and others to enjoy life more, and in non-material ways. In all, 45% of them expressed appreciation for the event helping to raise awareness and understanding of the roles of arts and culture in the community. Despite its methodological limitations, the study identified non-material ways in which individuals can be enabled to feel better supported in society and more positive in their outlooks. From the findings, to help strengthen social capital, community cohesion and constructive citizenship, the outline of a proposed educational tool is presented. © Royal Society for Public Health 2014.
Incorporation of lean methodology into pharmacy residency programs.
John, Natalie; Snider, Holly; Edgerton, Lisa; Whalin, Laurie
2017-03-15
The implementation of lean methodology into pharmacy residency programs at a community teaching hospital is described. New Hanover Regional Medical Center, a community teaching hospital in southeastern North Carolina, fully adopted a lean culture in 2010. Given the success of lean strategies organizationally, this methodology was used to assist with the evaluation and development of its pharmacy residency programs in 2014. Lean tools and activities have also been incorporated into residency requirements and rotation learning activities. The majority of lean events correspond to the required competency areas evaluating leadership and management, teaching, and education. These events have included participation in and facilitation of various lean problem-solving and communication tools. The application of the 4 rules of lean has resulted in enhanced management of the programs and provides a set of tools by which continual quality improvement can be ensured. Regular communication and direct involvement of all invested parties have been critical in developing and sustaining new improvements. In addition to program enhancements, lean methodology offers novel methods by which residents may be incorporated into leadership activities. The incorporation of lean methodology into pharmacy residency programs has translated into a variety of realized and potential benefits for the programs, the preceptors and residents, and the health system. Specific areas of growth have included quality-improvement processes, the expansion of leadership opportunities for residents, and improved communication among program directors, preceptors, and residents. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Preventing Harm in the ICU-Building a Culture of Safety and Engaging Patients and Families.
Thornton, Kevin C; Schwarz, Jennifer J; Gross, A Kendall; Anderson, Wendy G; Liu, Kathleen D; Romig, Mark C; Schell-Chaple, Hildy; Pronovost, Peter J; Sapirstein, Adam; Gropper, Michael A; Lipshutz, Angela K M
2017-09-01
Preventing harm remains a persistent challenge in the ICU despite evidence-based practices known to reduce the prevalence of adverse events. This review seeks to describe the critical role of safety culture and patient and family engagement in successful quality improvement initiatives in the ICU. We review the evidence supporting the impact of safety culture and provide practical guidance for those wishing to implement initiatives aimed at improving safety culture and more effectively integrate patients and families in such efforts. Literature review using PubMed including evaluation of key studies assessing large-scale quality improvement efforts in the ICU, impact of safety culture on patient outcomes, methodologies for quality improvement commonly used in healthcare, and patient and family engagement. Print and web-based resources from leading patient safety organizations were also searched. Our group completed a review of original studies, review articles, book chapters, and recommendations from leading patient safety organizations. Our group determined by consensus which resources would best inform this review. A strong safety culture is associated with reduced adverse events, lower mortality rates, and lower costs. Quality improvement efforts have been shown to be more effective and sustainable when paired with a strong safety culture. Different methodologies exist for quality improvement in the ICU; a thoughtful approach to implementation that engages frontline providers and administrative leadership is essential for success. Efforts to substantively include patients and families in the processes of quality improvement work in the ICU should be expanded. Efforts to establish a culture of safety and meaningfully engage patients and families should form the foundation for all safety interventions in the ICU. This review describes an approach that integrates components of several proven quality improvement methodologies to enhance safety culture in the ICU and highlights opportunities to include patients and families.
Tene, A; Tobin, B; Dyckmans, J; Ray, D; Black, K; Nieuwenhuis, M
2011-03-01
A thinning experiment stand at Avoca, Ballinvalley, on the east coast of the Republic of Ireland was used to test a developed methodology aimed at monitoring drought stress, based on the analysis of growth rings obtained by coring. The stand incorporated six plots representing three thinning regimes (light, moderate and heavy) and was planted in the spring of 1943 on a brown earth soil. Radial growth (early- and latewood) was measured for the purpose of this study. A multidisciplinary approach was used to assess historic tree response to climate: specifically, the application of statistical tools such as principal component and canonical correlation analysis to dendrochronology, stable isotopes, ring density proxy, blue reflectance and forest biometrics. Results showed that radial growth was a good proxy for monitoring changes to moisture deficit, while maximum density and blue reflectance were appropriate for assessing changes in accumulated temperature for the growing season. Rainfall also influenced radial growth changes but not significantly, and was a major factor in stable carbon and oxygen discrimination, mostly in the latewood formation phase. Stable oxygen isotope analysis was more accurate than radial growth analysis in drought detection, as it helped detect drought signals in both early- and latewood while radial growth analysis only detected the drought signal in earlywood. Many studies have shown that tree rings provide vital information for marking past climatic events. This work provides a methodology to better identify and understand how commonly measured tree proxies relate to environmental parameters, and can best be used to characterize and pinpoint drought events (variously described using parameters such as moisture deficit, accumulated temperature, rainfall and potential evaporation).
Methodological and Ethical Issues in Pediatric Medication Safety Research.
Carpenter, Delesha; Gonzalez, Daniel; Retsch-Bogart, George; Sleath, Betsy; Wilfond, Benjamin
2017-09-01
In May 2016, the Eshelman School of Pharmacy at The University of North Carolina at Chapel Hill convened the PharmSci conference to address the topic of "methodological and ethical issues in pediatric medication safety research." A multidisciplinary group of experts representing a diverse array of perspectives, including those of the US Food and Drug Administration, children's hospitals, and academia, identified important considerations for pediatric medication safety research and opportunities to advance the field. This executive summary describes current challenges that clinicians and researchers encounter related to pediatric medication safety research and identifies innovative and ethically sound methodologies to address these challenges to improve children's health. This article addresses 5 areas: (1) pediatric drug development and drug trials; (2) conducting comparative effectiveness research in pediatric populations; (3) child and parent engagement on study teams; (4) improving communication with children and parents; and (5) assessing child-reported outcomes and adverse drug events. Copyright © 2017 by the American Academy of Pediatrics.
Lee, Ching-Chih; Ho, Hsu-Chueh; Su, Yu-Chieh; Chiu, Brian C-H; Su, Yung-Cheng; Lee, Yi-Da; Chou, Pesus
2012-01-01
Background: Dizziness and vertigo symptoms are commonly seen in the emergency room (ER). However, these patients are often discharged without a definite diagnosis. Conflicting data regarding the vascular event risk among dizziness or vertigo patients have been reported. This study aims to determine the risk of developing stroke or cardiovascular events in ER patients discharged home with a diagnosis of dizziness or vertigo. Methodology: A total of 25,757 subjects with at least one ER visit in 2004 were identified. Of those, 1,118 patients were discharged home with a diagnosis of vertigo or dizziness. A Cox proportional hazard model was performed to compare the three-year vascular event-free survival rates between the dizziness/vertigo patients and those without dizziness/vertigo after adjusting for confounding and risk factors. Results: We identified 52 (4.7%) vascular events in patients with dizziness/vertigo and 454 (1.8%) vascular events in patients without dizziness/vertigo. ER patients discharged home with a diagnosis of vertigo or dizziness had a 2-fold (95% confidence interval [CI], 1.35–2.96; p<0.001) higher risk of stroke or cardiovascular events after adjusting for patient characteristics, co-morbidities, urbanization level of residence, individual socio-economic status, and initially taking medications after the onset of dizziness or vertigo during the first year. Conclusions: ER patients discharged home with a diagnosis of dizziness or vertigo were at an increased risk of developing subsequent vascular events compared with those without dizziness/vertigo. Further studies are warranted to develop better diagnostic and follow-up strategies for these increased-risk patients.
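A Cox proportional-hazards comparison of this kind can be sketched with standard survival tooling; the column names, covariates, and simulated data below are hypothetical, and the lifelines package is used only for illustration, not as the study's software.

```python
# Hedged sketch of a Cox proportional-hazards comparison like the one described
# (hypothetical columns and simulated data; not the authors' analysis).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "dizzy_vertigo": rng.binomial(1, 0.05, n),      # ER discharge diagnosis flag
    "age": rng.normal(55, 12, n).round(),
    "diabetes": rng.binomial(1, 0.2, n),
})
# Hypothetical time-to-vascular-event with a higher hazard for the exposed group.
hazard = 0.01 * np.exp(0.7 * df.dizzy_vertigo + 0.02 * (df.age - 55) + 0.3 * df.diabetes)
event_time = rng.exponential(1 / hazard)
df["years"] = np.minimum(event_time, 3.0)           # administrative censoring at 3 years
df["event"] = (event_time <= 3.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
cph.print_summary()   # hazard ratios with 95% CIs, cf. the ~2-fold risk reported
```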
Yilmaz, Arzu Akman; Ilce, Arzu; Can Cicek, Saadet; Yuzden, Ganime Esra; Yigit, Ummuhan
2016-04-01
Students' conceptualizations of nursing and their reasons for choosing the profession motivate them and affect their education, work performance and career plans. Nursing educators should support them in planning their careers consciously during their education. The study aimed to investigate the effect of a career-planning event for nursing students on their conceptualizations of the nursing profession and their career plans. The study was a single-group experimental study using a pre-test and post-test. The career-planning event was held in the conference hall of the university involved in the current study and was open to all students of the nursing school. The sample of the study consisted of 105 students who participated in the "Nursing Career Symposium" held on 27 March 2015. At the event, the importance of career planning and the opportunities of the nursing profession were presented. The data were collected using a questionnaire consisting of two sections, covering descriptive characteristics and the students' opinions regarding their career plans, together with the Perception of Nursing Profession Scale. The students completed the first section of the questionnaire before the career event began, and the second section of the questionnaire and the scale both before and after the event. The participants had positive conceptualizations of the profession. Following the career event, the participants' opinions of professional qualities and professional status, as measured through the Perception of Nursing Profession Scale, showed a significant increase, and the event made an important contribution to their career plans. In the light of these results, it is possible to suggest that such events have an important place during education in that they introduce the nursing profession and develop students' positive thoughts regarding the profession, in terms of both course content and teaching methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.
Effects of traditional Chinese patent medicine on essential hypertension: a systematic review.
Xiong, Xingjiang; Wang, Pengqian; Zhang, Yuqing; Li, Xiaoke
2015-02-01
Traditional Chinese patent medicine (TCPM) is widely used for essential hypertension (EH) in China. However, there is no critically appraised evidence, such as systematic reviews or meta-analyses, regarding the potential benefits and disadvantages of TCPM to justify their clinical use and recommendation. The aim of this review was to systematically evaluate and meta-analyze the effects of TCPM for EH. Seven databases, the Cochrane Library, PubMed, EMBASE, the China National Knowledge Infrastructure, the Chinese Scientific Journal Database, the Chinese Biomedical Literature Database, and the Wanfang Database, were searched from their inception to August 2014 for relevant studies that compared one TCPM plus antihypertensive drugs versus antihypertensive drugs alone. The methodological quality of the included trials was assessed using the Cochrane risk-of-bias tool. The primary outcome measures were mortality or progression to severe complications and adverse events. The secondary outcome measures were blood pressure (BP) and quality of life (QOL). Seventy-three trials, which included 8138 patients, on 17 TCPMs were included. In general, the methodological quality was low. Two trials evaluated the effects of TCPMs on mortality and the progression to severe complications after treatment, and no significant difference was identified compared with antihypertensive drugs alone. No severe adverse events were reported. Thirteen TCPMs used in complementary therapy significantly decreased systolic BP by 3.94 to 13.50 mmHg and diastolic BP by 2.28 to 11.25 mmHg. QOL was significantly improved by TCPM plus antihypertensive drugs compared with antihypertensive drugs alone. This systematic review provided the first classification of clinical evidence for the effectiveness of TCPM for EH. The usage of TCPMs for EH was supported by evidence of class level III. As a result of the methodological drawbacks of the included studies, more rigorously designed randomized controlled trials that focus on mortality and cardiovascular events during long-term follow-up are warranted before TCPM can be recommended for hypertensive patients. Two TCPMs, Song ling xue mai kang capsules and Yang xue qing nao granules, should be prioritized for further research.
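To make the pooling step concrete, the next sketch shows a DerSimonian-Laird random-effects meta-analysis of mean blood-pressure differences in plain NumPy; the trial values are invented and are not taken from the review.

```python
# Sketch of a DerSimonian-Laird random-effects pooling of mean differences
# (e.g., change in systolic BP); the per-trial values below are invented.
import numpy as np

yi = np.array([-6.0, -10.5, -4.2, -8.8])   # mean difference per trial (mmHg)
vi = np.array([ 2.1,   3.4,  1.8,  2.9])   # within-trial variance of yi

w_fixed = 1.0 / vi
y_fixed = np.sum(w_fixed * yi) / np.sum(w_fixed)
q = np.sum(w_fixed * (yi - y_fixed) ** 2)             # Cochran's Q
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(yi) - 1)) / c)              # between-trial heterogeneity

w = 1.0 / (vi + tau2)                                 # random-effects weights
pooled = np.sum(w * yi) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled difference {pooled:.2f} mmHg, 95% CI "
      f"({pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f}), tau^2 = {tau2:.2f}")
```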
Hirst, William; Phelps, Elizabeth A; Meksin, Robert; Vaidya, Chandan J; Johnson, Marcia K; Mitchell, Karen J; Buckner, Randy L; Budson, Andrew E; Gabrieli, John D E; Lustig, Cindy; Mather, Mara; Ochsner, Kevin N; Schacter, Daniel; Simons, Jon S; Lyle, Keith B; Cuc, Alexandru F; Olsson, Andreas
2015-06-01
Within a week of the attack of September 11, 2001, a consortium of researchers from across the United States distributed a survey asking about the circumstances in which respondents learned of the attack (their flashbulb memories) and the facts about the attack itself (their event memories). Follow-up surveys were distributed 11, 25, and 119 months after the attack. The study, therefore, examines retention of flashbulb memories and event memories at a substantially longer retention interval than any previous study using a test-retest methodology, allowing for the study of such memories over the long term. There was rapid forgetting of both flashbulb and event memories within the first year, but the forgetting curves leveled off after that, not significantly changing even after a 10-year delay. Despite the initial rapid forgetting, confidence remained high throughout the 10-year period. Five putative factors affecting flashbulb memory consistency and event memory accuracy were examined: (a) attention to media, (b) the amount of discussion, (c) residency, (d) personal loss and/or inconvenience, and (e) emotional intensity. After 10 years, none of these factors predicted flashbulb memory consistency; media attention and ensuing conversation predicted event memory accuracy. Inconsistent flashbulb memories were more likely to be repeated rather than corrected over the 10-year period; inaccurate event memories, however, were more likely to be corrected. The findings suggest that even traumatic memories and those implicated in a community's collective identity may be inconsistent over time and these inconsistencies can persist without the corrective force of external influences. (c) 2015 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Foulser-Piggott, R.; Saito, K.; Spence, R.
2012-04-01
Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk, however the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake event. This enables the results of the models to be compared with real data and the relative performance of the different methodologies to be evaluated. A sensitivity analysis is also conducted for two main reasons. Firstly, to determine the key input variables in the methodology that have the most significant impact on the resulting loss estimate. Secondly, to enable the uncertainty in the different approaches to be quantified and therefore provide a range of uncertainty in the loss estimates.
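A toy sketch of the basic disaggregation idea follows: an administrative-unit building count is allocated to grid cells in proportion to a remotely sensed weight and split across assumed typologies. The weights and typology shares are illustrative assumptions, not the study's calibrated inputs.

```python
# Toy sketch of disaggregating an aggregated building count onto grid cells
# in proportion to a remotely sensed weight (e.g., built-up area per cell).
import numpy as np

total_buildings = 12000                                # known only at admin-unit level
builtup_area = np.array([0.8, 2.4, 0.1, 1.2, 3.5])     # km^2 per grid cell (assumed)

weights = builtup_area / builtup_area.sum()
buildings_per_cell = total_buildings * weights

# Apply an assumed typology distribution (e.g., from census or survey data).
typology_share = {"RC_frame": 0.45, "masonry": 0.40, "timber": 0.15}
by_type = {t: buildings_per_cell * s for t, s in typology_share.items()}
print(np.round(buildings_per_cell, 0))
print({t: v.round(0) for t, v in by_type.items()})
```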
Neural network approach in multichannel auditory event-related potential analysis.
Wu, F Y; Slater, J D; Ramsay, R E
1994-04-01
Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.
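As a hedged illustration of an ANN classifier applied to ERP features, the sketch below trains a small multilayer perceptron on a placeholder feature matrix (e.g., P300 amplitude and latency per channel); the data and labels are random stand-ins, not the paper's measurements.

```python
# Sketch of an ANN classifier for multichannel P300 features; the feature
# matrix and labels are random placeholders, not real ERP data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))            # 60 subjects x 8 ERP features (placeholder)
y = np.r_[np.zeros(30), np.ones(30)]    # 0 = control, 1 = MS patient (placeholder)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", round(scores.mean(), 2))
```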
Making historic loss data comparable over time and place
NASA Astrophysics Data System (ADS)
Eichner, Jan; Steuer, Markus; Löw, Petra
2017-04-01
When utilizing historic loss data for present day risk assessment, it is necessary to make the data comparable over time and place. To achieve this, the assessment of costs from natural hazard events requires consistent and homogeneous methodologies for loss estimation as well as a robust treatment of loss data to estimate and/or reduce distorting effects due to a temporal bias in the reporting of small-scale loss events. Here we introduce Munich Re's NatCatSERVICE loss database and present a novel methodology of peril-specific normalization of the historic losses (to account for socio-economic growth of assets over time), and we introduce a metric of severity classification (called CatClass) that allows for a global comparison of impact severity across countries of different stages of economic development.
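The core of any normalization is a rescaling of the original loss by the growth of exposed values between the event year and today. The minimal sketch below uses nominal GDP as a stand-in proxy for asset growth; it illustrates the mechanics only and is not the NatCatSERVICE peril-specific procedure.

```python
# Minimal sketch of normalizing a historic event loss to present-day values by
# scaling with the growth of exposed assets (here proxied by nominal GDP of the
# affected region); the factors are illustrative assumptions.
def normalize_loss(loss_at_event, gdp_at_event, gdp_today):
    """Scale an original loss to today's socio-economic conditions."""
    return loss_at_event * (gdp_today / gdp_at_event)

loss_1995 = 2.0e9          # original loss in USD (assumed)
print(normalize_loss(loss_1995, gdp_at_event=1.1e12, gdp_today=2.6e12))
```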
Tsunami hazard assessments with consideration of uncertain earthquake characteristics
NASA Astrophysics Data System (ADS)
Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.
2017-12-01
The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, it must adopt an uncertainty propagation method to determine tsunami uncertainties with a feasible computational cost. In this study we propose a new methodology, which improves the existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by Le Veque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution, by avoiding post sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied on a real case. We study tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
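To make the Karhunen-Loeve sampling step tangible, the following sketch draws correlated slip samples from an eigendecomposition of a spatial covariance over subfaults. The one-dimensional geometry, exponential covariance, and parameter values are illustrative assumptions, not the paper's calibrated model.

```python
# Sketch of drawing slip samples from a Karhunen-Loeve expansion of a spatial
# covariance; geometry, covariance model, and parameters are illustrative.
import numpy as np

x = np.linspace(0.0, 100.0, 50)                 # subfault centroids (km), 1-D toy
mean_slip = np.full_like(x, 5.0)                # mean slip (m), assumed
corr_len, sigma = 20.0, 2.0                     # correlation length (km), std dev (m)

cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
eigval, eigvec = np.linalg.eigh(cov)
idx = np.argsort(eigval)[::-1][:10]             # keep the 10 leading K-L modes
lam, phi = eigval[idx], eigvec[:, idx]

rng = np.random.default_rng(1)
z = rng.standard_normal(10)                     # independent standard normals
slip_sample = mean_slip + phi @ (np.sqrt(lam) * z)
print(slip_sample.round(2))
```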
A Metastatistical Approach to Satellite Estimates of Extreme Rainfall Events
NASA Astrophysics Data System (ADS)
Zorzetto, E.; Marani, M.
2017-12-01
The estimation of the average recurrence interval of intense rainfall events is a central issue for both hydrologic modeling and engineering design. These estimates require the inference of the properties of the right tail of the statistical distribution of precipitation, a task often performed using the Generalized Extreme Value (GEV) distribution, estimated either from samples of annual maxima (AM) or with a peaks-over-threshold (POT) approach. However, these approaches require long and homogeneous rainfall records, which often are not available, especially in the case of remotely sensed rainfall datasets. Here we use an alternative approach, tailored to remotely sensed rainfall estimates, based on the metastatistical extreme value distribution (MEVD), which produces estimates of rainfall extremes from the probability distribution function (pdf) of all measured 'ordinary' rainfall events. This methodology also accounts for the interannual variations observed in the pdf of daily rainfall by integrating over the sample space of its random parameters. We illustrate the application of this framework to the TRMM Multi-satellite Precipitation Analysis rainfall dataset, where the MEVD optimally exploits the relatively short records of satellite-sensed rainfall, while taking full advantage of their high spatial resolution and quasi-global coverage. The accuracy of TRMM precipitation estimates and scale issues are investigated for a case study located in the Little Washita watershed, Oklahoma, using a dense network of rain gauges for independent ground validation. The methodology contributes to our understanding of the risk of extreme rainfall events, as it allows i) an optimal use of the TRMM datasets in estimating the tail of the probability distribution of daily rainfall, and ii) a global mapping of daily rainfall extremes and distributional tail properties, bridging the existing gaps in rain gauge networks.
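A minimal sketch of the MEV construction follows: a Weibull distribution is fitted to the 'ordinary' wet-day rainfall of each year, and the compound CDF is averaged across years to obtain return levels. The synthetic record and the wet-day threshold are placeholders, not TRMM values.

```python
# Sketch of a metastatistical extreme value (MEV) estimate with yearly Weibull
# fits to ordinary wet-day rainfall; the synthetic data are placeholders.
import numpy as np
from scipy.stats import weibull_min

def mev_cdf(x, yearly_daily_rain, wet_threshold=1.0):
    terms = []
    for daily in yearly_daily_rain:              # one array of daily totals per year
        wet = daily[daily > wet_threshold]
        n_j = len(wet)                           # number of ordinary events that year
        c, _, scale = weibull_min.fit(wet, floc=0.0)
        terms.append(weibull_min.cdf(x, c, loc=0.0, scale=scale) ** n_j)
    return np.mean(terms, axis=0)

rng = np.random.default_rng(2)
years = [rng.weibull(0.8, size=120) * 12.0 for _ in range(15)]   # synthetic record
x = np.linspace(1, 300, 500)
F = mev_cdf(x, years)
T = 1.0 / (1.0 - F)                              # return period (years)
print("approx. 50-year daily rainfall (mm):", round(float(np.interp(50, T, x)), 1))
```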
Dynamics of Metabolism and Decision Making During Alcohol Consumption: Modeling and Analysis.
Giraldo, Luis Felipe; Passino, Kevin M; Clapp, John D; Ruderman, Danielle
2017-11-01
Heavy alcohol consumption is considered an important public health issue in the United States as over 88 000 people die every year from alcohol-related causes. Research is being conducted to understand the etiology of alcohol consumption and to develop strategies to decrease high-risk consumption and its consequences, but there are still important gaps in determining the main factors that influence the consumption behaviors throughout the drinking event. There is a need for methodologies that allow us not only to identify such factors but also to have a comprehensive understanding of how they are connected and how they affect the dynamical evolution of a drinking event. In this paper, we use previous empirical findings from laboratory and field studies to build a mathematical model of the blood alcohol concentration dynamics in individuals that are in drinking events. We characterize these dynamics as the result of the interaction between a decision-making system and the metabolic process for alcohol. We provide a model of the metabolic process for arbitrary alcohol intake patterns and a characterization of the mechanisms that drive the decision-making process of a drinker during the drinking event. We use computational simulations and Lyapunov stability theory to analyze the effects of the parameters of the model on the blood alcohol concentration dynamics that are characterized. Also, we propose a methodology to inform the model using data collected in situ and to make estimations that provide additional information to the analysis. We show how this model allows us to analyze and predict previously observed behaviors, to design new approaches for the collection of data that improves the construction of the model, and help with the design of interventions.
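As a toy illustration of the coupled decision/metabolism dynamics described above, the sketch below drives a blood alcohol concentration state with discrete drinking decisions, first-order absorption, and Michaelis-Menten elimination. All parameter values and the drink schedule are illustrative assumptions, not the paper's fitted model.

```python
# Toy sketch of drinking-event dynamics: discrete drink decisions feed a gut
# compartment, which is absorbed into blood and eliminated by Michaelis-Menten
# kinetics. Parameters are illustrative, not the paper's model values.
import numpy as np

def simulate_bac(drink_times_h, dose_g_per_dl=0.02, vmax=0.018, km=0.005,
                 k_abs=6.0, dt_h=0.01, t_end_h=8.0):
    t = np.arange(0.0, t_end_h, dt_h)
    drinks = {round(float(d), 2) for d in drink_times_h}
    gut, bac = 0.0, 0.0
    out = np.zeros_like(t)
    for i, ti in enumerate(t):
        if round(float(ti), 2) in drinks:        # decision process: take a drink
            gut += dose_g_per_dl
        absorbed = k_abs * gut * dt_h            # first-order absorption from gut
        gut -= absorbed
        bac += absorbed - vmax * bac / (km + bac) * dt_h   # Michaelis-Menten elimination
        bac = max(bac, 0.0)
        out[i] = bac
    return t, out

t, bac = simulate_bac([0.0, 0.75, 1.5, 2.25])
print("peak BAC (g/dL):", round(float(bac.max()), 3))
```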
Seoane-Pillado, María Teresa; Pita-Fernández, Salvador; Valdés-Cañedo, Francisco; Seijo-Bestilleiro, Rocio; Pértega-Díaz, Sonia; Fernández-Rivera, Constantino; Alonso-Hernández, Ángel; González-Martín, Cristina; Balboa-Barreiro, Vanesa
2017-03-07
The high prevalence of cardiovascular risk factors among the renal transplant population accounts for increased mortality. The aim of this study is to determine the incidence of cardiovascular events and the factors associated with these events in renal transplant patients. An observational ambispective follow-up study of renal transplant recipients (n = 2029) in the health district of A Coruña (Spain) during the period 1981-2011 was completed. Competing risk survival analysis methods were applied to estimate the cumulative incidence of developing cardiovascular events over time and to identify which characteristics were associated with the risk of these events. Post-transplant cardiovascular events are defined as the presence of myocardial infarction, invasive coronary artery therapy, cerebral vascular events, new-onset angina, congestive heart failure, rhythm disturbances, peripheral vascular disease and cardiovascular disease and death. The cause of death was identified through the medical history and death certificate using ICD9 (390-459, except: 427.5, 435, 446, 459.0). The mean age of patients at the time of transplantation was 47.0 ± 14.2 years; 62% were male. 16.5% had suffered some cardiovascular disease prior to transplantation and 9.7% had suffered a cardiovascular event. The mean follow-up period for the patients with a cardiovascular event was 3.5 ± 4.3 years. Applying competing risk methodology, it was observed that the cumulative incidence of the event was 5.0% one year after transplantation, 8.1% after five years, and 11.9% after ten years. After applying multivariate models, the variables with an independent effect for predicting cardiovascular events are: male sex, age of recipient, previous cardiovascular disorders, pre-transplant smoking and post-transplant diabetes. This study makes it possible to determine, taking competing events into account, the incidence of post-transplant cardiovascular events in kidney transplant patients and the risk factors for these events. Modifiable risk factors are identified; changes in these factors would therefore have a bearing on the incidence of events.
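The following sketch shows the basic nonparametric cumulative incidence computation used in competing-risk analysis, with death without a cardiovascular event treated as the competing risk; the toy data (one subject per time, no ties) are illustrative only.

```python
# Sketch of a nonparametric cumulative incidence function (CIF) under a
# competing risk; the tiny dataset is illustrative and has no tied times.
import numpy as np

# time (years); status: 0 = censored, 1 = cardiovascular event, 2 = competing death
time   = np.array([0.5, 1.0, 1.2, 2.0, 3.0, 3.5, 4.0, 5.0, 6.0, 7.0])
status = np.array([1,   0,   2,   1,   0,   2,   1,   0,   0,   1 ])

order = np.argsort(time)
time, status = time[order], status[order]

n_at_risk = len(time)
surv = 1.0            # all-cause Kaplan-Meier survival just before each time
cif = 0.0
for s in status:
    if s == 1:
        cif += surv * (1.0 / n_at_risk)       # increment CIF for the event of interest
    if s in (1, 2):
        surv *= 1.0 - 1.0 / n_at_risk         # any event removes subjects from risk
    n_at_risk -= 1
print("cumulative incidence of CV events by end of follow-up:", round(cif, 3))
```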
Investigating accident causation through information network modelling.
Griffin, T G C; Young, M S; Stanton, N A
2010-02-01
Management of risk in complex domains such as aviation relies heavily on post-event investigations, requiring complex approaches to fully understand the integration of multi-causal, multi-agent and multi-linear accident sequences. The Event Analysis of Systemic Teamwork methodology (EAST; Stanton et al. 2008) offers such an approach based on network models. In this paper, we apply EAST to a well-known aviation accident case study, highlighting communication between agents as a central theme and investigating the potential for identifying agents who were key to the accident. Ultimately, this work aims to develop a new model based on distributed situation awareness (DSA) to demonstrate that the risk inherent in a complex system depends on the information flowing within it. By identifying key agents and information elements, we can propose proactive design strategies to optimize the flow of information and help work towards avoiding aviation accidents. Statement of Relevance: This paper introduces a novel application of a holistic methodology for understanding aviation accidents. Furthermore, it introduces an ongoing project developing a nonlinear and prospective method that centralises distributed situation awareness and communication as themes. The relevance of the findings is discussed in the context of current ergonomic and aviation issues of design, training and human-system interaction.
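One way to operationalise "key agents" in a communication network is through graph centrality measures, as in the sketch below; the small directed graph is a made-up example, not the accident case study's actual EAST network.

```python
# Sketch of flagging potentially key agents in an EAST-style communication
# network via centrality; the graph below is an invented example.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("Captain", "First Officer"), ("First Officer", "Captain"),
    ("ATC", "Captain"), ("Captain", "ATC"),
    ("Cabin crew", "First Officer"), ("Maintenance", "ATC"),
])

betweenness = nx.betweenness_centrality(G)   # who brokers information flow
key_agents = sorted(betweenness, key=betweenness.get, reverse=True)[:2]
print("highest-betweenness agents:", key_agents)
print(betweenness)
```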
Quinn, Patrick D.; Stappenbeck, Cynthia A.; Fromme, Kim
2013-01-01
Laboratory-based experimental research has demonstrated that the pharmacological effects of alcohol can increase aggressive responding. Given mixed findings and concerns regarding task validity, however, it remains uncertain whether this effect holds constant across men and women and whether variability in subjective alcohol intoxication contributes to alcohol-related aggression. In the present investigation, we used four years of event-level data in a sample of 1,775 college students (140,618 total observations) to provide a test of laboratory-derived findings on the link between alcohol and aggression in an alternative methodology. We found support for several such findings: 1) Within-person increases in alcohol intoxication, as assessed by estimated blood alcohol concentrations (eBACs), were associated with increases in the probability of aggression at the drinking-episode level; 2) This association was significantly stronger among men than among women; and 3) Within-person variability and between-persons individual differences in levels of subjective alcohol intoxication were associated with aggression over and beyond eBACs. Cross-methodological replication can reduce the impact of constraints specific to experimental studies on conclusions regarding alcohol’s relation with aggression. PMID:23421356
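A simplified version of an event-level analysis like the one above separates within-person and between-person components of intoxication before modelling episode-level aggression. The sketch below uses within-person centering and a plain logistic regression; a full analysis would use a multilevel model, and the tiny data frame is a placeholder.

```python
# Sketch of an event-level analysis: within-person centering of eBAC and a
# logistic model for episode-level aggression; data are placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "person":     [1, 1, 1, 2, 2, 2, 3, 3],
    "ebac":       [0.02, 0.08, 0.12, 0.05, 0.15, 0.03, 0.10, 0.20],
    "aggression": [0, 0, 1, 0, 1, 0, 0, 1],
})
df["ebac_person_mean"] = df.groupby("person")["ebac"].transform("mean")
df["ebac_within"] = df["ebac"] - df["ebac_person_mean"]   # episode-level deviation

X = df[["ebac_within", "ebac_person_mean"]]
model = LogisticRegression().fit(X, df["aggression"])
print(dict(zip(X.columns, model.coef_[0].round(2))))
```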
Transient Mass-loss Analysis of Solar Observations Using Stellar Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crosley, M. K.; Norman, C.; Osten, R. A.
Low-frequency dynamic spectra of radio bursts from nearby stars offer the best chance to directly detect the stellar signature of transient mass loss on low-mass stars. Crosley et al. (2016) propose a multi-wavelength methodology to determine coronal mass ejection (CME) parameters, such as speed, mass, and kinetic energy. We test the validity and accuracy of the results derived from the methodology by using Geostationary Operational Environmental Satellite X-ray observations and Bruny Island Radio Spectrometer radio observations. These observations are analogous to those that would be available in stellar studies. Derived results from these observations are compared to direct white-light measurements from the Large Angle and Spectrometric Coronagraph. We find that, when a pre-event temperature can be determined, the derived CME speeds are accurate to within a few hundred km s^-1 and are reliable when specific criteria have been met. CME mass and kinetic energy estimates are useful only as order-of-magnitude measurements, considering the large errors associated with them. These results will be directly applicable to the interpretation of any detected stellar events and the derivation of stellar CME properties.
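A rough sketch of one standard way to turn a radio burst's frequency drift into a radial speed is shown below, using a coronal density model to map plasma frequency to height. The Newkirk one-fold density model and the two (frequency, time) points are illustrative assumptions and are not the paper's calibrated procedure.

```python
# Rough sketch: convert a type II burst frequency drift into a shock speed via
# an assumed coronal density model (Newkirk 1961, one-fold); values illustrative.
import numpy as np

R_SUN_KM = 6.957e5

def height_from_freq(f_mhz, fold=1.0):
    """Heliocentric distance (solar radii) where the plasma frequency equals f."""
    n_e = (f_mhz * 1e6 / 8980.0) ** 2                 # electron density, cm^-3
    return 4.32 / np.log10(n_e / (fold * 4.2e4))      # invert n = 4.2e4 * 10**(4.32/R)

# Fundamental emission drifting from 60 MHz to 30 MHz over 10 minutes (assumed).
r1, r2 = height_from_freq(60.0), height_from_freq(30.0)
speed_km_s = (r2 - r1) * R_SUN_KM / (10 * 60)
print(f"shock height {r1:.2f} -> {r2:.2f} R_sun, speed ~ {speed_km_s:.0f} km/s")
```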
ERPs to Monitor Non-conscious Mentation
NASA Technical Reports Server (NTRS)
Donchin, E.
1984-01-01
Event Related Brain Potentials (or ERPs) are extracted from the EEG that can be recorded between a pair of electrodes placed on a person's scalp. The EEG is recorded as a continual fluctuation in voltage. It is the result of the integration of the potential fields generated by a multitude of neuronal ensembles that are active as the brain goes about its business. Within this ongoing signal it is possible to distinguish voltage fluctuations that are triggered in neural structures by the occurrence of specific events. This activity, evoked as it is by an external event, is known as the Evoked, or Event Related, Potential. ERPs provide a unique opportunity to monitor non-conscious mentation. The inferences that can be based on ERP data are described and the limits of these inferences are emphasized. This, however, will not be an exhaustive review of the use of ERPs in Engineering Psychology. The application, its scope, and its limitations will be illustrated by means of one example. This example is preceded by a brief technical introduction to the methodology used in the study of ERPs. The manner in which ERPs are used to study cognition is described.
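The extraction step itself is simple averaging of stimulus-locked epochs, as in the minimal sketch below; the synthetic EEG and the injected P300-like deflection are placeholders used only to show the mechanics.

```python
# Minimal sketch of ERP extraction: average EEG epochs time-locked to stimulus
# onsets so that activity unrelated to the event cancels out. Synthetic data.
import numpy as np

fs = 250                                     # sampling rate (Hz)
rng = np.random.default_rng(3)
eeg = rng.normal(0, 10, size=60 * fs)        # 60 s of noisy single-channel EEG (uV)
onsets = np.arange(2 * fs, 55 * fs, fs)      # one stimulus per second (sample indices)

# Add a small P300-like deflection ~300 ms after each onset (for illustration only).
for o in onsets:
    eeg[o + int(0.3 * fs): o + int(0.4 * fs)] += 5.0

epoch_len = int(0.8 * fs)                    # 0 to 800 ms post-stimulus
epochs = np.stack([eeg[o:o + epoch_len] for o in onsets])
erp = epochs.mean(axis=0)                    # averaging reveals the evoked response
print("peak ERP amplitude (uV):", round(float(erp.max()), 1))
```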
ERIC Educational Resources Information Center
Park, Crystal L.
2010-01-01
Interest in meaning and meaning making in the context of stressful life events continues to grow, but research is hampered by conceptual and methodological limitations. Drawing on current theories, the author first presents an integrated model of meaning making. This model distinguishes between the constructs of global and situational meaning and…
Statistical Properties of SEE Rate Calculation in the Limits of Large and Small Event Counts
NASA Technical Reports Server (NTRS)
Ladbury, Ray
2007-01-01
This viewgraph presentation reviews the statistical properties of Single Event Effects (SEE) rate calculations. The goal of SEE rate calculation is to bound the SEE rate, though the question is by how much. The presentation covers: (1) understanding errors on SEE cross sections, (2) methodology: maximum likelihood and confidence contours, (3) tests with simulated data, and (4) applications.
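A common ingredient in bounding an SEE rate from a small event count is an exact Poisson upper limit on the mean, as sketched below; the fluence value is an assumption used only to show the arithmetic.

```python
# Sketch of bounding an SEE cross-section: exact (chi-square based) Poisson
# upper limit on the mean, divided by the fluence delivered. Values illustrative.
from scipy.stats import chi2

def poisson_upper_limit(n_events, confidence=0.95):
    """Upper confidence bound on the Poisson mean given n observed events."""
    return 0.5 * chi2.ppf(confidence, 2 * (n_events + 1))

n_seen = 0                       # e.g., zero upsets observed during the test
fluence = 1.0e7                  # ions / cm^2 delivered (assumed)
sigma_limit = poisson_upper_limit(n_seen) / fluence   # ~3.0 / fluence ("rule of three")
print(f"95% upper limit on cross-section: {sigma_limit:.2e} cm^2")
```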
Ebbeling, Laura G; Goralnick, Eric; Bivens, Matthew J; Femino, Meg; Berube, Claire G; Sears, Bryan; Sanchez, Leon D
2016-01-01
Disaster exercises often simulate rare, worst-case scenario events that range from mass casualty incidents to severe weather events. In actuality, situations such as information system downtimes and physical plant failures may affect hospital continuity of operations far more significantly. The objective of this study is to evaluate disaster drills at two academic and one community hospital to compare the frequency of planned drills versus real-world events that led to emergency management command center activation. Emergency management exercise and command center activation data from January 1, 2013 to October 1, 2015 were collected from a database. The activations and drills were categorized according to the nature of the event. Frequency of each type of event was compared to determine if the drills were representative of actual activations. From 2013 to 2015, there were a total of 136 command center activations and 126 drills at the three hospital sites. The most common reasons for command center activations included severe weather (25 percent, n = 34), maintenance failure (19.9 percent, n = 27), and planned mass gathering events (16.9 percent, n = 23). The most frequent drills were process tests (32.5 percent, n = 41), hazardous material-related events (22.2 percent, n = 28), and in-house fires (15.10 percent, n = 19). Further study of the reasons behind why hospitals activate emergency management plans may inform better preparedness drills. There is no clear methodology used among all hospitals to create drills and their descriptions are often vague. There is an opportunity to better design drills to address specific purposes and events.
Joint Models of Longitudinal and Time-to-Event Data with More Than One Event Time Outcome: A Review.
Hickey, Graeme L; Philipson, Pete; Jorgensen, Andrea; Kolamunnage-Dona, Ruwanthi
2018-01-31
Methodological development and clinical application of joint models of longitudinal and time-to-event outcomes have grown substantially over the past two decades. However, much of this research has concentrated on a single longitudinal outcome and a single event time outcome. In clinical and public health research, patients who are followed up over time may often experience multiple, recurrent, or a succession of clinical events. Models that utilise such multivariate event time outcomes are quite valuable in clinical decision-making. We comprehensively review the literature for implementation of joint models involving more than a single event time per subject. We consider the distributional and modelling assumptions, including the association structure, estimation approaches, software implementations, and clinical applications. Research into this area is proving highly promising, but to-date remains in its infancy.
Wavelet maxima curves of surface latent heat flux associated with two recent Greek earthquakes
NASA Astrophysics Data System (ADS)
Cervone, G.; Kafatos, M.; Napoletani, D.; Singh, R. P.
2004-05-01
Multi-sensor data available through remote sensing satellites provide information about changes in the state of the oceans, land and atmosphere. Recent studies have shown anomalous changes in oceanic, land, atmospheric and ionospheric parameters prior to earthquake events. This paper introduces an innovative data mining technique to identify precursory signals associated with earthquakes. The proposed methodology is a multi-strategy approach which employs one-dimensional wavelet transformations to identify singularities in the data, and an analysis of the continuity of the wavelet maxima in time and space to identify the singularities associated with earthquakes. The proposed methodology has been applied to Surface Latent Heat Flux (SLHF) data to study the earthquakes which occurred on 14 August 2003 and on 1 March 2004 in Greece. A single prominent SLHF anomaly was found about two weeks prior to each of the earthquakes.
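The sketch below illustrates the general idea of flagging singularities in an SLHF time series with a continuous wavelet transform and a simple maxima threshold; the synthetic series, the Morlet wavelet, and the threshold rule are assumptions, not the paper's exact procedure.

```python
# Sketch of detecting singularities in an SLHF series with a continuous wavelet
# transform (pywt) and a simple modulus-maxima threshold; synthetic data.
import numpy as np
import pywt

rng = np.random.default_rng(4)
slhf = rng.normal(120.0, 15.0, size=180)       # ~6 months of daily SLHF (W/m^2)
slhf[140:144] += 80.0                          # injected anomaly, for illustration

scales = np.arange(1, 32)
coef, _ = pywt.cwt(slhf, scales, "morl")
power = np.abs(coef)

mid = power[8]                                 # one representative scale
thresh = np.median(mid) + 3 * np.std(mid)      # crude maxima threshold
print("candidate anomaly days:", np.where(mid > thresh)[0])
```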
Anderson, Rachel J; Boland, Jennifer; Garner, Sarah R
2016-01-01
Overgeneral memory, where individuals exhibit difficulties in retrieving specific episodes from autobiographical memory, has been consistently linked with emotional disorders. However, the majority of this literature has relied upon a single methodology, in which participants respond to emotional cue words with explicit instructions to retrieve/simulate specific events. Through the use of sentence completion tasks the current studies explored whether overgenerality represents a habitual pattern of thinking that extends to how individuals naturally consider their personal past and future life story. In both studies, when compared with controls, dysphoric individuals evidenced overgeneral thinking style with respect to their personal past. However, overgeneral future thinking was only evident when the sentence stems included emotional words. These findings highlight the importance of investigating the overgenerality phenomenon using a variety of cueing techniques and results are discussed with reference to the previous literature exploring overgenerality and cognitive models of depression.
Golder, Su; Wright, Kath
2016-01-01
Background We performed a systematic review to assess whether we can quantify the underreporting of adverse events (AEs) in the published medical literature documenting the results of clinical trials as compared with other nonpublished sources, and whether we can measure the impact this underreporting has on systematic reviews of adverse events. Methods and Findings Studies were identified from 15 databases (including MEDLINE and Embase) and by handsearching, reference checking, internet searches, and contacting experts. The last database searches were conducted in July 2016. There were 28 methodological evaluations that met the inclusion criteria. Of these, 9 studies compared the proportion of trials reporting adverse events by publication status. The median percentage of published documents with adverse events information was 46% compared to 95% in the corresponding unpublished documents. There was a similar pattern with unmatched studies, for which 43% of published studies contained adverse events information compared to 83% of unpublished studies. A total of 11 studies compared the numbers of adverse events in matched published and unpublished documents. The percentage of adverse events that would have been missed had each analysis relied only on the published versions varied between 43% and 100%, with a median of 64%. Within these 11 studies, 24 comparisons of named adverse events such as death, suicide, or respiratory adverse events were undertaken. In 18 of the 24 comparisons, the number of named adverse events was higher in unpublished than published documents. Additionally, 2 other studies demonstrated that there are substantially more types of adverse events reported in matched unpublished than published documents. There were 20 meta-analyses that reported the odds ratios (ORs) and/or risk ratios (RRs) for adverse events with and without unpublished data. Inclusion of unpublished data increased the precision of the pooled estimates (narrower 95% confidence intervals) in 15 of the 20 pooled analyses, but did not markedly change the direction or statistical significance of the risk in most cases. The main limitations of this review are that the included case examples represent only a small number amongst thousands of meta-analyses of harms and that the included studies may suffer from publication bias, whereby substantial differences between published and unpublished data are more likely to be published. Conclusions There is strong evidence that much of the information on adverse events remains unpublished and that the number and range of adverse events is higher in unpublished than in published versions of the same study. The inclusion of unpublished data can also reduce the imprecision of pooled effect estimates during meta-analysis of adverse events. PMID:27649528
NASA Technical Reports Server (NTRS)
Binienda, Wieslaw K.; Sancaktar, Erol; Roberts, Gary D. (Technical Monitor)
2002-01-01
An effective design methodology was established for composite jet engine containment structures. The methodology included the development of the full and reduced size prototypes, and FEA models of the containment structure, experimental and numerical examination of the modes of failure clue to turbine blade out event, identification of materials and design candidates for future industrial applications, and design and building of prototypes for testing and evaluation purposes.
The Extreme Climate Index: a novel and multi-hazard index for extreme weather events.
NASA Astrophysics Data System (ADS)
Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro
2017-04-01
In this presentation we introduce the Extreme Climate Index (ECI): an objective, multi-hazard index capable of tracking changes in the frequency or magnitude of extreme weather events in African countries, thus indicating that a shift to a new climate regime is underway in a particular area. This index has been developed in the context of the XCF (eXtreme Climate Facilities) project led by ARC (African Risk Capacity, a specialised agency of the African Union), and will be used in the payout-triggering mechanism of an insurance programme against risks related to increases in the frequency and magnitude of extreme weather events due to changes in climate regimes. The main hazards covered by the ECI will be extreme dry, wet and heat events, with the possibility of adding region-specific risk events such as tropical cyclones for the most vulnerable areas. It will be based on data coming from consistent, sufficiently long, high-quality historical records and will be standardized across broad geographical regions, so that extreme events occurring under different climatic regimes in Africa can be compared. The first step in constructing such an index is to define single-hazard indicators. In this first study we focused on extreme dry/wet and heat events, described respectively by the well-known SPI (Standardized Precipitation Index) and by an index we developed, the SHI (Standardized Heat-waves Index). The second step consists in the development of a computational strategy to combine these, and possibly other, indices so that the ECI can describe, by means of a single indicator, different types of climatic extremes. According to the methodology proposed here, the ECI is defined by two statistical components: the ECI intensity, which indicates whether an event is extreme or not, and the angular component, which represents the contribution of each hazard to the overall intensity of the index. The ECI can thus be used to identify extremes after defining a suitable threshold above which events are regarded as extreme. In this presentation, after describing the methodology used for the construction of the ECI, we present results obtained for different African regions, using the NCEP Reanalysis dataset for air temperature at the sig995 level and the CHIRP dataset for precipitation. Particular attention is devoted to the 2015/2016 Malawi drought, which received some media attention due to the failure of the risk assessment model used to trigger due payouts: it will be shown how, in contrast, the combination of hydrological and temperature data used in the ECI succeeds in capturing the extremeness of this event.
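A possible reading of the two-component construction is sketched below: two standardized hazard indicators are combined into an overall intensity (their Euclidean magnitude) and an angle indicating which hazard dominates. The index values, the sign convention for dryness, and the threshold are invented for illustration and are not the ECI's actual specification.

```python
# Sketch of combining two standardized hazard indicators into an intensity plus
# an angular component; values, sign convention, and threshold are assumptions.
import numpy as np

spi = np.array([-0.3, -1.8, -2.6, 0.4])   # drought dimension (negative = dry)
shi = np.array([ 0.2,  1.1,  2.3, 0.1])   # heat-wave dimension

eci_intensity = np.hypot(spi, shi)             # overall magnitude of the combined extreme
eci_angle = np.degrees(np.arctan2(shi, -spi))  # which hazard dominates (dry vs. hot)

threshold = 2.0                                # assumed cut-off defining an "extreme"
print("extreme flags:", eci_intensity > threshold)
print("intensity:", eci_intensity.round(2), "angle (deg):", eci_angle.round(1))
```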
Adaptive Sampling using Support Vector Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Mandelli; C. Smith
2012-11-01
Reliability/safety analysis of stochastic dynamic systems (e.g., nuclear power plants, airplanes, chemical plants) is currently performed through a combination of Event-Trees and Fault-Trees. However, these conventional methods suffer from certain drawbacks: the timing of events is not explicitly modeled, the ordering of events is preset by the analyst, and the modeling of complex accident scenarios is driven by expert judgment. For these reasons, there is currently an increasing interest in the development of dynamic PRA methodologies, since they can be used to address the deficiencies of the conventional methods listed above.
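The core idea of SVM-based adaptive sampling is sketched below: fit a classifier to already-simulated scenarios (safe vs. failure) and choose the next simulations closest to the estimated limit surface. The toy limit-state function stands in for an expensive dynamic simulation and is an assumption of this sketch.

```python
# Sketch of SVM-based adaptive sampling for reliability analysis; the toy
# limit-state function is a placeholder for a full plant simulation.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(5)

def limit_state(x):                  # placeholder for an expensive simulation
    return (x[:, 0] + 0.8 * x[:, 1] > 1.8).astype(int)   # 1 = failure

X = rng.uniform(0, 2, size=(40, 2))            # initial design (e.g., event timing, magnitude)
y = limit_state(X)

svm = SVC(kernel="rbf", C=10.0).fit(X, y)

candidates = rng.uniform(0, 2, size=(2000, 2))
dist = np.abs(svm.decision_function(candidates))
next_runs = candidates[np.argsort(dist)[:5]]   # points nearest the decision boundary
print("next scenarios to simulate:\n", next_runs.round(2))
```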
Onasanya, Oluwadamilola; Iyer, Geetha; Lucas, Eleanor; Lin, Dora; Singh, Sonal; Alexander, G Caleb
2016-11-01
Given the conflicting evidence regarding the association between exogenous testosterone and cardiovascular events, we systematically assessed published systematic reviews for evidence of the association between exogenous testosterone and cardiovascular events. We searched PubMed, MEDLINE, Embase, Cochrane Collaboration Clinical Trials, ClinicalTrials.gov, and the US Food and Drug Administration website for systematic reviews of randomised controlled trials published up to July 19, 2016. Two independent reviewers screened 954 full texts from 29 335 abstracts to identify systematic reviews of randomised controlled trials in which the cardiovascular effects of exogenous testosterone on men aged 18 years or older were examined. We extracted data for study characteristics, analytic methods, and key findings, and applied the AMSTAR (A Measurement Tool to Assess Systematic Reviews) checklist to assess methodological quality of each review. Our primary outcome measure was the direction and magnitude of association between exogenous testosterone and cardiovascular events. We identified seven reviews and meta-analyses, which had substantial clinical heterogeneity, differing statistical methods, and variable methodological quality and quality of data abstraction. AMSTAR scores ranged from 3 to 9 out of 11. Six systematic reviews that each included a meta-analysis showed no significant association between exogenous testosterone and cardiovascular events, with summary estimates ranging from 1·07 to 1·82 and imprecise confidence intervals. Two of these six meta-analyses showed increased risk in subgroup analyses of oral testosterone and men aged 65 years or older during their first treatment year. One meta-analysis showed a significant association between exogenous testosterone and cardiovascular events, in men aged 18 years or older generally, with a summary estimate of 1·54 (95% CI 1·09-2·18). Our optimal information size analysis showed that any randomised controlled trial aiming to detect a true difference in cardiovascular risk between treatment groups receiving exogenous testosterone and their controls (with a two-sided p value of 0·05 and a power of 80%) would require at least 17 664 participants in each trial group. Therefore, given the challenge of adequately powering clinical trials for rare outcomes, rigorous observational studies are needed to clarify the association between testosterone-replacement therapy and major adverse cardiovascular outcomes. Copyright © 2016 Elsevier Ltd. All rights reserved.
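The optimal-information-size logic can be reproduced with a standard two-proportion power calculation, as in the sketch below; the baseline and comparator event rates are assumptions chosen only to illustrate the mechanics, not the review's inputs (so the resulting sample size will differ from the figure quoted above).

```python
# Sketch of a sample-size (optimal information size) calculation for comparing
# two small event proportions at alpha = 0.05 and 80% power; rates are assumed.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p_control, p_treated = 0.020, 0.026            # assumed cardiovascular event rates

es = proportion_effectsize(p_treated, p_control)
n_per_group = NormalIndPower().solve_power(effect_size=es, alpha=0.05, power=0.80,
                                           ratio=1.0, alternative="two-sided")
print("required participants per group:", int(round(n_per_group)))
```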
Leighton, Caroline; Botto, Alberto; Silva, Jaime R; Jiménez, Juan Pablo; Luyten, Patrick
2017-01-01
Research on the potential role of gene-environment interactions (GxE) in explaining vulnerability to psychopathology in humans has witnessed a shift from a diathesis-stress perspective to differential susceptibility approaches. This paper critically reviews methodological issues and trends in this body of research. Databases were screened for studies of GxE in the prediction of personality traits, behavior, and mental health disorders in humans published between January 2002 and January 2015. In total, 315 papers were included. Results showed that 34 candidate genes have been included in GxE studies. Independent of the type of environment studied (early or recent life events, positive or negative environments), about 67-83% of studies have reported significant GxE interactions, which is consistent with a social susceptibility model. The percentage of positive results does not seem to differ depending on the gene studied, although publication bias might be involved. However, the number of positive findings differs depending on the population studied (i.e., young adults vs. older adults). Methodological considerations limit the ability to draw strong conclusions, particularly as almost 90% (n = 283/315) of published papers are based on samples from North America and Europe, and about 70% of published studies (219/315) are based on samples that were also used in other reports. At the same time, there are clear indications of methodological improvements over time, as shown by a significant increase in longitudinal and experimental studies as well as in improved minimum genotyping. Recommendations for future research, such as minimum quality assessment of genes and environmental factors, specifying the theoretical models guiding the study, and taking into account cultural, ethnic, and lifetime perspectives, are formulated.
Attribution of extreme weather and climate-related events.
Stott, Peter A; Christidis, Nikolaos; Otto, Friederike E L; Sun, Ying; Vanderlinden, Jean-Paul; van Oldenborgh, Geert Jan; Vautard, Robert; von Storch, Hans; Walton, Peter; Yiou, Pascal; Zwiers, Francis W
2016-01-01
Extreme weather and climate-related events occur in a particular place, by definition, infrequently. It is therefore challenging to detect systematic changes in their occurrence given the relative shortness of observational records. However, there is a clear interest from outside the climate science community in the extent to which recent damaging extreme events can be linked to human-induced climate change or natural climate variability. Event attribution studies seek to determine to what extent anthropogenic climate change has altered the probability or magnitude of particular events. They have shown clear evidence for human influence having increased the probability of many extremely warm seasonal temperatures and reduced the probability of extremely cold seasonal temperatures in many parts of the world. The evidence for human influence on the probability of extreme precipitation events, droughts, and storms is more mixed. Although the science of event attribution has developed rapidly in recent years, geographical coverage of events remains patchy and based on the interests and capabilities of individual research groups. The development of operational event attribution would allow a more timely and methodical production of attribution assessments than currently obtained on an ad hoc basis. For event attribution assessments to be most useful, remaining scientific uncertainties need to be robustly assessed and the results clearly communicated. This requires the continuing development of methodologies to assess the reliability of event attribution results and further work to understand the potential utility of event attribution for stakeholder groups and decision makers. WIREs Clim Change 2016, 7:23-41. doi: 10.1002/wcc.380 For further resources related to this article, please visit the WIREs website.
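The probability-based statistic at the heart of many attribution studies compares exceedance probabilities in "factual" and "counterfactual" ensembles, as in the minimal sketch below; the ensemble values and threshold are placeholders, not results from any attribution study.

```python
# Minimal sketch of the fraction of attributable risk (FAR) and risk ratio:
# compare exceedance probabilities with (p1) and without (p0) anthropogenic
# forcing. Ensemble values are random placeholders.
import numpy as np

rng = np.random.default_rng(6)
factual = rng.normal(31.0, 1.5, size=5000)         # e.g., seasonal T, "world as it is"
counterfactual = rng.normal(30.0, 1.5, size=5000)  # "world that might have been"
threshold = 33.5                                   # observed extreme value (assumed)

p1 = np.mean(factual > threshold)
p0 = np.mean(counterfactual > threshold)
far = 1.0 - p0 / p1                                # fraction of attributable risk
print(f"p1={p1:.4f}, p0={p0:.4f}, risk ratio={p1/p0:.1f}, FAR={far:.2f}")
```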
Hempel, Susanne; Maggard-Gibbons, Melinda; Nguyen, David K; Dawes, Aaron J; Miake-Lye, Isomi; Beroes, Jessica M; Booth, Marika J; Miles, Jeremy N V; Shanman, Roberta; Shekelle, Paul G
2015-08-01
Serious, preventable surgical events, termed never events, continue to occur despite considerable patient safety efforts. To examine the incidence and root causes of and interventions to prevent wrong-site surgery, retained surgical items, and surgical fires in the era after the implementation of the Universal Protocol in 2004. We searched 9 electronic databases for entries from 2004 through June 30, 2014, screened references, and consulted experts. Two independent reviewers identified relevant publications in June 2014. One reviewer used a standardized form to extract data and a second reviewer checked the data. Strength of evidence was established by the review team. Data extraction was completed in January 2015. Incidence of wrong-site surgery, retained surgical items, and surgical fires. We found 138 empirical studies that met our inclusion criteria. Incidence estimates for wrong-site surgery in US settings varied by data source and procedure (median estimate, 0.09 events per 10,000 surgical procedures). The median estimate for retained surgical items was 1.32 events per 10,000 procedures, but estimates varied by item and procedure. The per-procedure surgical fire incidence is unknown. A frequently reported root cause was inadequate communication. Methodologic challenges associated with investigating changes in rare events limit the conclusions of 78 intervention evaluations. Limited evidence supported the Universal Protocol (5 studies), education (4 studies), and team training (4 studies) interventions to prevent wrong-site surgery. Limited evidence exists to prevent retained surgical items by using data-matrix-coded sponge-counting systems (5 pertinent studies). Evidence for preventing surgical fires was insufficient, and intervention effects were not estimable. Current estimates for wrong-site surgery and retained surgical items are 1 event per 100,000 and 1 event per 10,000 procedures, respectively, but the precision is uncertain, and the per-procedure prevalence of surgical fires is not known. Root-cause analyses suggest the need for improved communication. Despite promising approaches and global Universal Protocol evaluations, empirical evidence for interventions is limited.
Cross-situational statistical word learning in young children.
Suanda, Sumarga H; Mugwanya, Nassali; Namy, Laura L
2014-10-01
Recent empirical work has highlighted the potential role of cross-situational statistical word learning in children's early vocabulary development. In the current study, we tested 5- to 7-year-old children's cross-situational learning by presenting children with a series of ambiguous naming events containing multiple words and multiple referents. Children rapidly learned word-to-object mappings by attending to the co-occurrence regularities across these ambiguous naming events. The current study begins to address the mechanisms underlying children's learning by demonstrating that the diversity of learning contexts affects performance. The implications of the current findings for the role of cross-situational word learning at different points in development are discussed along with the methodological implications of employing school-aged children to test hypotheses regarding the mechanisms supporting early word learning. Copyright © 2014 Elsevier Inc. All rights reserved.
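A toy model of the statistical mechanism follows: word-referent co-occurrence counts accumulate over individually ambiguous naming events until each word's most frequent co-occurring referent emerges. The scenes and pseudo-words are invented for illustration.

```python
# Toy sketch of cross-situational word learning: accumulate word-referent
# co-occurrence counts over ambiguous scenes; scenes and words are invented.
from collections import defaultdict

scenes = [
    ({"blick", "dax"},  {"ball", "dog"}),
    ({"blick", "toma"}, {"ball", "cup"}),
    ({"dax",   "toma"}, {"dog", "cup"}),
    ({"blick", "dax"},  {"ball", "dog"}),
]

counts = defaultdict(int)
for words, referents in scenes:
    for w in words:
        for r in referents:
            counts[(w, r)] += 1

for w in ("blick", "dax", "toma"):
    best = max((r for x, r in counts if x == w), key=lambda r: counts[(w, r)])
    print(w, "->", best, counts[(w, best)])
```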
Zhang, Juan; Meng, Yaxuan; McBride, Catherine; Fan, Xitao; Yuan, Zhen
2018-01-01
The present study investigated the impact of Chinese dialects on McGurk effect using behavioral and event-related potential (ERP) methodologies. Specifically, intra-language comparison of McGurk effect was conducted between Mandarin and Cantonese speakers. The behavioral results showed that Cantonese speakers exhibited a stronger McGurk effect in audiovisual speech perception compared to Mandarin speakers, although both groups performed equally in the auditory and visual conditions. ERP results revealed that Cantonese speakers were more sensitive to visual cues than Mandarin speakers, though this was not the case for the auditory cues. Taken together, the current findings suggest that the McGurk effect generated by Chinese speakers is mainly influenced by segmental phonology during audiovisual speech integration.
Bijou, Sidney W.; Peterson, Robert F.; Ault, Marion H.
1968-01-01
It is the thesis of this paper that data from descriptive and experimental field studies can be interrelated at the level of data and empirical concepts if both sets are derived from frequency-of-occurrence measures. The methodology proposed for a descriptive field study is predicated on three assumptions: (1) The primary data of psychology are the observable interactions of a biological organism and environmental events, past and present. (2) Theoretical concepts and laws are derived from empirical concepts and laws, which in turn are derived from the raw data. (3) Descriptive field studies describe interactions between behavioral and environmental events; experimental field studies provide information on their functional relationships. The ingredients of a descriptive field investigation using frequency measures consist of: (1) specifying in objective terms the situation in which the study is conducted, (2) defining and recording behavioral and environmental events in observable terms, and (3) measuring observer reliability. Field descriptive studies following the procedures suggested here would reveal interesting new relationships in the usual ecological settings and would also provide provocative cues for experimental studies. On the other hand, field-experimental studies using frequency measures would probably yield findings that would suggest the need for describing new interactions in specific natural situations. PMID:16795175
Tan, Francisca M; Caballero-Gaudes, César; Mullinger, Karen J; Cho, Siu-Yeung; Zhang, Yaping; Dryden, Ian L; Francis, Susan T; Gowland, Penny A
2017-11-01
Most functional MRI (fMRI) studies map task-driven brain activity using a block or event-related paradigm. Sparse paradigm free mapping (SPFM) can detect the onset and spatial distribution of BOLD events in the brain without prior timing information, but relating the detected events to brain function remains a challenge. In this study, we developed a decoding method for SPFM using a coordinate-based meta-analysis method of activation likelihood estimation (ALE). We defined meta-maps of statistically significant ALE values that correspond to types of events and calculated a summation overlap between the normalized meta-maps and SPFM maps. As a proof of concept, this framework was applied to relate SPFM-detected events in the sensorimotor network (SMN) to six motor functions (left/right fingers, left/right toes, swallowing, and eye blinks). We validated the framework using simultaneous electromyography (EMG)-fMRI experiments and motor tasks with short and long duration, and random interstimulus interval. The decoding scores were considerably lower for eye movements relative to other movement types tested. The average successful rate for short and long motor events were 77 ± 13% and 74 ± 16%, respectively, excluding eye movements. We found good agreement between the decoding results and EMG for most events and subjects, with a range in sensitivity between 55% and 100%, excluding eye movements. The proposed method was then used to classify the movement types of spontaneous single-trial events in the SMN during resting state, which produced an average successful rate of 22 ± 12%. Finally, this article discusses methodological implications and improvements to increase the decoding performance. Hum Brain Mapp 38:5778-5794, 2017. © 2017 Wiley Periodicals, Inc. © 2017 Wiley Periodicals, Inc.
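One plausible implementation of the overlap scoring described above is sketched below: each meta-map is normalized and a detected event map is scored against every movement-type meta-map by a summed voxelwise overlap. The 1-D arrays stand in for 3-D volumes, the overlap definition is one of several possibilities, and nothing here is derived from real ALE analyses.

```python
# Sketch of scoring a detected SPFM event map against normalized meta-maps by a
# summed voxelwise overlap; arrays are placeholders for 3-D volumes.
import numpy as np

rng = np.random.default_rng(7)
n_vox = 1000
meta_maps = {
    "right_fingers": np.clip(rng.normal(0, 1, n_vox), 0, None),
    "left_toes":     np.clip(rng.normal(0, 1, n_vox), 0, None),
    "swallowing":    np.clip(rng.normal(0, 1, n_vox), 0, None),
}
event_map = meta_maps["right_fingers"] * 0.7 + np.clip(rng.normal(0, 0.3, n_vox), 0, None)

def overlap_score(event, meta):
    return float(np.sum(np.minimum(event / event.sum(), meta / meta.sum())))

scores = {k: overlap_score(event_map, m) for k, m in meta_maps.items()}
print("decoded as:", max(scores, key=scores.get))
print({k: round(v, 3) for k, v in scores.items()})
```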
Simulating Impacts of Disruptions to Liquid Fuels Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Michael; Corbet, Thomas F.; Baker, Arnold B.
This report presents a methodology for estimating the impacts of events that damage or disrupt liquid fuels infrastructure. The impact of a disruption depends on which components of the infrastructure are damaged, the time required for repairs, and the position of the disrupted components in the fuels supply network. Impacts are estimated for seven stressing events in regions of the United States, which were selected to represent a range of disruption types. For most of these events the analysis is carried out using the National Transportation Fuels Model (NTFM) to simulate the system-level liquid fuels sector response. Results are presented for each event, and a brief cross-comparison of event simulation results is provided.
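As a toy illustration of how a disruption's impact on a supply network could be quantified, the sketch below computes deliverable supply as a maximum flow before and after removing a damaged component. The small network, component names, and capacities are invented and are not the NTFM representation.

```python
# Toy sketch: disruption impact as the drop in maximum deliverable flow after
# removing a damaged component; the network and capacities are invented.
import networkx as nx

def deliverable(g):
    return nx.maximum_flow_value(g, "refinery", "demand_hub", capacity="capacity")

G = nx.DiGraph()
G.add_edge("refinery", "pipeline_A", capacity=300)   # kbbl/day (assumed)
G.add_edge("refinery", "barge",      capacity=120)
G.add_edge("pipeline_A", "terminal", capacity=300)
G.add_edge("barge",      "terminal", capacity=120)
G.add_edge("terminal", "demand_hub", capacity=380)

baseline = deliverable(G)
G.remove_node("pipeline_A")                          # stressing event: pipeline outage
shortfall = baseline - deliverable(G)
print(f"baseline {baseline} kbbl/day, unserved after outage {shortfall} kbbl/day")
```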
Fair, Damien A; Choi, Alexander H; Dosenbach, Yannic B L; Coalson, Rebecca S; Miezin, Francis M; Petersen, Steven E; Schlaggar, Bradley L
2010-08-01
Children with congenital left hemisphere damage due to perinatal stroke are capable of acquiring relatively normal language functions despite experiencing a cortical insult that in adults often leads to devastating lifetime disabilities. Although this observed phenomenon is accepted, its neurobiological mechanisms are not well characterized. In this paper we examined the functional neuroanatomy of lexical processing in 13 children/adolescents with perinatal left hemispheric damage. In contrast to many previous perinatal infarct fMRI studies, we used an event-related design, which allowed us to isolate trial-related activity and examine correct and error trials separately. Using both group and single subject analysis techniques we attempt to address several methodological factors that may contribute to some discrepancies in the perinatal lesion literature. These methodological factors include making direct statistical comparisons, using common stereotactic space, using both single subject and group analyses, and accounting for performance differences. Our group analysis, investigating correct trial-related activity (separately from error trials), showed very few statistical differences in the non-involved right hemisphere between patients and performance matched controls. The single subject analysis revealed atypical regional activation patterns in several patients; however, the location of these regions identified in individual patients often varied across subjects. These results are consistent with the idea that alternative functional organization of trial-related activity after left hemisphere lesions is in large part unique to the individual. In addition, reported differences between results obtained with event-related designs and blocked designs may suggest diverging organizing principles for sustained and trial-related activity after early childhood brain injuries. 2009 Elsevier Inc. All rights reserved.
Espié, Stéphane; Boubezoul, Abderrahmane; Aupetit, Samuel; Bouaziz, Samir
2013-09-01
Instrumented vehicles are key tools for in-depth understanding of drivers' behaviours, and thus for the design of scientifically based countermeasures to reduce fatalities and injuries. The instrumentation of Powered Two-Wheelers (PTW) has been less widely implemented than for vehicles, in part due to the technical challenges involved. The last decade has seen the development in Europe of several tools and methodologies to study motorcycle riders' behaviours and motorcycle dynamics for a range of situations, including crash events involving falls. Thanks to these tools, a broad-ranging research programme has been conducted, from the design and tuning of real-time falls detection to the study of riding training systems, as well as studies focusing on naturalistic riding situations such as filtering and line splitting. The methodology designed for the in-depth study of riders' behaviours in naturalistic situations can be based upon the combination of several sources of data such as: PTW sensors, context-based video retrieval system, Global Positioning System (GPS) and verbal data on the riders' decision-making process. The goals of this paper are: (1) to present the methodological tools developed and used by INRETS-MSIS (now Ifsttar-TS2/Simu) in the last decade for the study of riders' behaviours in real-world environments as well as on track for situations up to falls, (2) to illustrate the kind of results that can be gained from the conducted studies, (3) to identify the advantages and limitations of the proposed methodology to conduct large scale naturalistic riding studies, and (4) to highlight how the knowledge gained from this approach will fill many of the knowledge gaps about PTW-riders' behaviours and risk factors. Copyright © 2013 Elsevier Ltd. All rights reserved.
Data mining of atmospheric parameters associated with coastal earthquakes
NASA Astrophysics Data System (ADS)
Cervone, Guido
Earthquakes are natural hazards that pose a serious threat to society and the environment. A single earthquake can claim thousands of lives, cause damages for billions of dollars, destroy natural landmarks and render large territories uninhabitable. Studying earthquakes and the processes that govern their occurrence is of fundamental importance to protect lives, properties and the environment. Recent studies have shown that anomalous changes in land, ocean and atmospheric parameters occur prior to earthquakes. The present dissertation introduces an innovative methodology and its implementation to identify anomalous changes in atmospheric parameters associated with large coastal earthquakes. Possible geophysical mechanisms are discussed in view of the close interaction between the lithosphere, the hydrosphere and the atmosphere. The proposed methodology is a multi-strategy data mining approach which combines wavelet transformations, evolutionary algorithms, and statistical analysis of atmospheric data to analyze possible precursory signals. One-dimensional wavelet transformations and statistical tests are employed to identify significant singularities in the data, which may correspond to anomalous peaks due to the earthquake preparatory processes. Evolutionary algorithms and other localized search strategies are used to analyze the spatial and temporal continuity of the anomalies detected over a large area (about 2000 km2), to discriminate signals that are most likely associated with earthquakes from those due to other, mostly atmospheric, phenomena. Only statistically significant singularities occurring within a very short time of each other, and which track a rigorous geometrical path related to the geological properties of the epicentral area, are considered to be associated with a seismic event. A program called CQuake was developed to implement and validate the proposed methodology. CQuake is a fully automated, real-time, semi-operational system, developed to study precursory signals associated with earthquakes. CQuake can be used for the retrospective analysis of past earthquakes, and for detecting early warning information about impending events. Using CQuake, more than 300 earthquakes have been analyzed. In the case of coastal earthquakes with magnitude larger than 5.0, prominent anomalies are found up to two weeks prior to the main event. In the case of earthquakes occurring away from the coast, no strong anomaly is detected. The identified anomalies provide a potentially reliable means to mitigate earthquake risks in the future, and can be used to develop a fully operational forecasting system.
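The wavelet-plus-threshold step lends itself to a brief sketch. The snippet below, which assumes the PyWavelets library and a simple median-absolute-deviation threshold (the dissertation's actual statistical tests and evolutionary search are not reproduced here), flags candidate singularities in a one-dimensional atmospheric time series.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_singularities(series, wavelet="db4", level=3, k=4.0):
    """Flag candidate anomalies in an atmospheric time series.

    Detail coefficients whose magnitude exceeds k robust standard deviations
    (estimated via the median absolute deviation) are treated as singularities.
    Returns approximate sample indices of flagged points at each detail level.
    """
    coeffs = pywt.wavedec(series, wavelet, level=level)
    flagged = {}
    for lev, d in enumerate(coeffs[1:], start=1):          # skip approximation coeffs
        mad = np.median(np.abs(d - np.median(d))) + 1e-12
        idx = np.where(np.abs(d) > k * 1.4826 * mad)[0]
        scale = len(series) / len(d)                        # map back to sample index
        flagged[lev] = (idx * scale).astype(int)
    return flagged

# synthetic example: a smooth signal with one sharp anomaly
t = np.linspace(0, 10, 1024)
x = np.sin(t) + 0.05 * np.random.randn(t.size)
x[700] += 3.0
print(wavelet_singularities(x))
```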
Total Risk Integrated Methodology (TRIM) - TRIM.Expo
The Exposure Event module of TRIM (TRIM.Expo), similar to most human exposure models, provides an analysis of the relationships between various chemical concentrations in the environment and exposure levels of humans.
Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Joseph Daniel; Anderson, Robert Stephen
Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.
Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)
NASA Astrophysics Data System (ADS)
Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.
2016-10-01
Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean and overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions, the return period of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehlen, Mark Andrew; Vugrin, Eric D.; Warren, Drake E.
In recent years, the nation has recognized that critical infrastructure protection should consider not only the prevention of disruptive events, but also the processes that infrastructure systems undergo to maintain functionality following disruptions. This more comprehensive approach has been termed critical infrastructure resilience (CIR). Given the occurrence of a particular disruptive event, the resilience of a system to that event is the system's ability to efficiently reduce both the magnitude and duration of the deviation from targeted system performance levels. Sandia National Laboratories (Sandia) has developed a comprehensive resilience assessment framework for evaluating the resilience of infrastructure and economic systems. The framework includes a quantitative methodology that measures resilience costs that result from a disruption to infrastructure function. The framework also includes a qualitative analysis methodology that assesses system characteristics that affect resilience in order to provide insight and direction for potential improvements to resilience. This paper describes the resilience assessment framework. This paper further demonstrates the utility of the assessment framework through application to a hypothetical scenario involving the disruption of a petrochemical supply chain by a hurricane.
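The definition of resilience used here, reducing the magnitude and duration of the deviation from targeted performance, suggests a simple quantitative reading: the resilience cost of a disruption grows with the area between the target and the delivered performance over time. The sketch below is a minimal illustration of that idea, assuming an hourly performance trace and a hypothetical unit cost; it is not Sandia's actual costing methodology.

```python
import numpy as np

def resilience_cost(t_hours, performance, target=1.0, unit_cost=1.0):
    """Systemic impact as the time integral of the shortfall from target performance.

    t_hours     : sample times (hours)
    performance : delivered fraction of targeted system function at each time
    unit_cost   : hypothetical cost per unit of lost function per hour
    """
    deficit = np.clip(target - np.asarray(performance), 0.0, None)
    return unit_cost * np.trapz(deficit, t_hours)

# toy disruption: function drops to 40% and recovers over 72 hours
t = np.array([0, 6, 12, 24, 48, 72], dtype=float)
p = np.array([1.0, 0.4, 0.5, 0.7, 0.9, 1.0])
print(f"resilience cost ~ {resilience_cost(t, p):.1f} function-hours lost")
```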
Hwang, Thomas J.
2013-01-01
Background: For biopharmaceutical companies, investments in research and development are risky, and the results from clinical trials are key inflection points in the process. Few studies have explored how and to what extent the public equity market values clinical trial results. Methods: Our study dataset matched announcements of clinical trial results for investigational compounds from January 2011 to May 2013 with daily stock market returns of large United States-listed pharmaceutical and biotechnology companies. Event study methodology was used to examine the relationship between clinical research events and changes in stock returns. Results: We identified public announcements for clinical trials of 24 investigational compounds, including 16 (67%) positive and 8 (33%) negative events. The majority of announcements were for Phase 3 clinical trials (N = 13, 54%), and for oncologic (N = 7, 29%) and neurologic (N = 6, 24%) indications. The median cumulative abnormal returns on the day of the announcement were 0.8% (95% confidence interval [CI]: –2.3, 13.4%; P = 0.02) for positive events and –2.0% (95% CI: –9.1, 0.7%; P = 0.04) for negative events, with statistically significant differences from zero. In the day immediately following the announcement, firms with positive events were associated with stock price corrections, with median cumulative abnormal returns falling to 0.4% (95% CI: –3.8, 12.3%; P = 0.33). For firms with negative announcements, the median cumulative abnormal returns were –1.7% (95% CI: –9.5, 1.0%; P = 0.03), and remained significantly negative over the two day event window. The magnitude of abnormal returns did not differ statistically by indication, by trial phase, or between biotechnology and pharmaceutical firms. Conclusions: The release of clinical trial results is an economically significant event and has meaningful effects on market value for large biopharmaceutical companies. Stock return underperformance due to negative events is greater in magnitude and persists longer than abnormal returns due to positive events, suggesting asymmetric market reactions. PMID:23951273
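Event study methodology of this kind typically estimates a market model over a pre-event window and cumulates the abnormal returns over the event window. The sketch below shows that calculation on synthetic data; the estimation-window length, the (0, +1) event window and all return series are assumptions for illustration, not the study's dataset.

```python
import numpy as np
import pandas as pd

def cumulative_abnormal_return(stock_ret, market_ret, event_day, est_win=120, event_win=(0, 1)):
    """Market-model event study sketch.

    stock_ret, market_ret : pd.Series of daily returns indexed by trading day number
    event_day             : index of the announcement day
    est_win               : length of the estimation window ending before the event
    event_win             : (start, end) offsets around the event, inclusive
    """
    est = slice(event_day - est_win, event_day)
    beta, alpha = np.polyfit(market_ret[est], stock_ret[est], 1)   # market-model fit
    days = range(event_day + event_win[0], event_day + event_win[1] + 1)
    ar = [stock_ret[d] - (alpha + beta * market_ret[d]) for d in days]
    return float(np.sum(ar))                                       # CAR over the window

# synthetic illustration
rng = np.random.default_rng(0)
mkt = pd.Series(rng.normal(0, 0.01, 260))
stk = 1.2 * mkt + pd.Series(rng.normal(0, 0.01, 260))
stk.iloc[200] += 0.05                                              # positive trial announcement
print(f"CAR(0,+1) = {cumulative_abnormal_return(stk, mkt, 200):.3%}")
```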
Chang, Joshua C; Leung, Mark; Gokozan, Hamza Numan; Gygli, Patrick Edwin; Catacutan, Fay Patsy; Czeisler, Catherine; Otero, José Javier
2015-03-01
Late embryonic and postnatal cerebellar folial surface area expansion promotes cerebellar cortical cytoarchitectural lamination. We developed a streamlined sampling scheme to generate unbiased estimates of murine cerebellar surface area and volume using stereologic principles. We demonstrate that, during the proliferative phase of the external granular layer (EGL) and folial surface area expansion, EGL thickness does not change and thus is a topological proxy for progenitor self-renewal. The topological constraints indicate that, during proliferative phases, migration out of the EGL is balanced by self-renewal. Progenitor self-renewal must, therefore, include mitotic events yielding 2 cells in the same layer to increase surface area (β events) and mitotic events yielding 2 cells, with 1 cell in a superficial layer and 1 cell in a deeper layer (α events). As the cerebellum grows, therefore, β events lie upstream of α events. Using a mathematical model constrained by the measurements of volume and surface area, we could quantify intermitotic times for β events on a per-cell basis in postnatal mouse cerebellum. Furthermore, we found that loss of CCNA2, which decreases EGL proliferation and secondarily induces cerebellar cortical dyslamination, shows preserved α-type events. Thus, CCNA2-null cerebellar granule progenitor cells are capable of self-renewal of the EGL stem cell niche; this is concordant with prior findings of extensive apoptosis in CCNA2-null mice. Similar methodologies may provide another layer of depth to the interpretation of results from stereologic studies.
2010-09-01
MULTIPLE-ARRAY DETECTION, ASSOCIATION AND LOCATION OF INFRASOUND AND SEISMO-ACOUSTIC EVENTS – UTILIZATION OF GROUND TRUTH INFORMATION
Stephen J. ...
...and infrasound data from seismo-acoustic arrays and apply the methodology to regional networks for validation with ground truth information. In the ... initial year of the project automated techniques for detecting, associating and locating infrasound signals were developed. Recently, the location ...
Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul
2011-01-01
Background: Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective: To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results: The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion: The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
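The chemical-similarity component rests on comparing molecular fingerprints. As a hedged illustration (assuming the RDKit library, Morgan fingerprints of radius 2, and a well-known NSAID pair rather than the drugs analysed in the paper), the snippet below computes the Tanimoto similarity that could be used to propagate an AERS signal between structurally similar drugs.

```python
from rdkit import Chem
from rdkit.Chem import AllChem, DataStructs

def tanimoto(smiles_a, smiles_b, radius=2, n_bits=2048):
    """Tanimoto similarity between Morgan (circular) fingerprints of two drugs."""
    fps = []
    for smi in (smiles_a, smiles_b):
        mol = Chem.MolFromSmiles(smi)
        fps.append(AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits))
    return DataStructs.TanimotoSimilarity(fps[0], fps[1])

# illustrative pair: a signal for one drug could be strengthened when a
# structurally similar drug shows the same AERS signal
ibuprofen = "CC(C)Cc1ccc(cc1)C(C)C(=O)O"
naproxen  = "COc1ccc2cc(ccc2c1)C(C)C(=O)O"
print(f"Tanimoto similarity = {tanimoto(ibuprofen, naproxen):.2f}")
```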
Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia B.; Adler, Robert; Hone, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur
2010-01-01
A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that the landslide forecasting may be more feasible at a regional scale. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship and results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for the algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited nonlandslide event data for more comprehensive evaluation. Additional factors that may improve algorithm performance accuracy include incorporating additional triggering factors such as tectonic activity, anthropogenic impacts and soil moisture into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.
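The triggering side of such algorithms is usually a rainfall intensity-duration (I-D) threshold combined with a susceptibility cutoff. The sketch below illustrates that decision rule; the power-law coefficients and the susceptibility cutoff are placeholders, not the values used in this study.

```python
def exceeds_id_threshold(intensity_mm_per_h, duration_h, a=12.0, b=0.6):
    """Check a rainfall intensity-duration (I-D) triggering threshold of the
    power-law form I = a * D**(-b).  The coefficients here are placeholders;
    in practice a and b are fitted to regional landslide-triggering storms."""
    return intensity_mm_per_h >= a * duration_h ** (-b)

def landslide_nowcast(susceptibility, intensity_mm_per_h, duration_h, s_min=0.7):
    """Issue a landslide nowcast where susceptibility is high AND the observed
    rainfall exceeds the I-D threshold (both cutoffs are illustrative)."""
    return susceptibility >= s_min and exceeds_id_threshold(intensity_mm_per_h, duration_h)

print(landslide_nowcast(susceptibility=0.85, intensity_mm_per_h=8.0, duration_h=24))
```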
DeMayo, Marilena M; Song, Yun Ju C; Hickie, Ian B; Guastella, Adam J
2017-10-01
In this article, we conduct a comprehensive review of existing evidence for the safety and therapeutic potential of intranasal oxytocin in pediatric populations. Unique considerations for dosing and delivery of oxytocin to the nasal passageway in pediatric populations and methods to promote adherence are reviewed. Intranasal oxytocin has been administered to 261 children in three open-label studies and eight randomized controlled trials. To date, the only published results in pediatric populations have focused on autism spectrum disorder (ASD) and Prader-Willi syndrome (PWS). Results regarding efficacy for improving social impairment in ASD are equivocal, partially due to mixed methodological designs, dosing regimens, and outcome measures. At present, there is no randomized controlled evidence that oxytocin provides benefit to individuals with PWS. There is no clear evidence of a link between oxytocin administration and any specific adverse event. Adverse events have been assessed using medical interviews, open reports, checklists, and physiological assessments. Adverse event reports have been largely classified as mild (n = 93), with few moderate (n = 9) or severe (n = 3) events reported. There were 35 additional adverse events reported, without severity ratings. The severe events, hyperactivity and irritability, occurred at first administration in both placebo and oxytocin groups, and subsided after discontinuation. We note that adverse event monitoring is inconsistent and often lacking, and reporting of its relationship to the study drug is poor. Only one study reported adherence data to suggest high adherence. Recommendations are then provided for the delivery of nasal sprays to the nasal passageway, monitoring, and reporting of efficacy, safety, and adherence for oxytocin nasal spray trials in pediatric populations.
How Safe Are Common Analgesics for the Treatment of Acute Pain for Children? A Systematic Review.
Hartling, Lisa; Ali, Samina; Dryden, Donna M; Chordiya, Pritam; Johnson, David W; Plint, Amy C; Stang, Antonia; McGrath, Patrick J; Drendel, Amy L
2016-01-01
Background. Fear of adverse events and occurrence of side effects are commonly cited by families and physicians as obstructive to appropriate use of pain medication in children. We examined evidence comparing the safety profiles of three groups of oral medications, acetaminophen, nonsteroidal anti-inflammatory drugs, and opioids, to manage acute nonsurgical pain in children (<18 years) treated in ambulatory settings. Methods. A comprehensive search was performed to July 2015, including review of national data registries. Two reviewers screened articles for inclusion, assessed methodological quality, and extracted data. Risks (incidence rates) were pooled using a random effects model. Results. Forty-four studies were included; 23 reported on adverse events. Based on limited current evidence, acetaminophen, ibuprofen, and opioids have similar nausea and vomiting profiles. Opioids have the greatest risk of central nervous system adverse events. Dual therapy with a nonopioid/opioid combination resulted in a lower risk of adverse events than opioids alone. Conclusions. Ibuprofen and acetaminophen have similar reported adverse effects and notably fewer adverse events than opioids. Dual therapy with a nonopioid/opioid combination confers a protective effect for adverse events over opioids alone. This research highlights challenges in assessing medication safety, including lack of more detailed information in registry data, and inconsistent reporting in trials.
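Pooling incidence rates with a random effects model, as described above, is often done on the log-rate scale with DerSimonian-Laird weighting. The sketch below shows that calculation on made-up study counts; the variance approximation (1/events) and the example data are assumptions for illustration, not the review's actual method or results.

```python
import numpy as np

def pool_random_effects(events, person_time):
    """DerSimonian-Laird random-effects pooling of incidence rates (sketch).

    Works on the log-rate scale with approximate within-study variance 1/events.
    Returns the pooled rate and a 95% confidence interval.
    """
    events = np.asarray(events, dtype=float)
    y = np.log(events / np.asarray(person_time, dtype=float))   # log incidence rates
    v = 1.0 / events                                             # approx. within-study variance
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                           # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

# hypothetical studies: vomiting events per child-days of exposure
print(pool_random_effects(events=[12, 30, 7], person_time=[1500, 4200, 800]))
```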
76 FR 45804 - Agency Information Collection Request; 60-Day Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-01
... an algorithm that enables reliable prediction of a certain event. A responder could submit the correct algorithm, but without the methodology, the evaluation process could not be adequately performed...
Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P
Surgery carries a high risk for the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registration of Discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to general surgery of a tertiary hospital, and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS records for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which these could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools. The Hanley and McNeil test was used to compare both tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.
Critical incident technique: a user's guide for nurse researchers.
Schluter, Jessica; Seaton, Philippa; Chaboyer, Wendy
2008-01-01
This paper is a description of the development and processes of the critical incident technique and its applicability to nursing research, using a recently-conducted study of the Australian nursing workforce as an exemplar. Issues are raised for consideration prior to the technique being put into practice. Since 1954, the critical incident technique has been used to study people's activities in a variety of professions. This five-step technique can be modified for specific settings and research questions. The fruitfulness of a study using the technique relies on gaining three important pieces of information. First, participants' complete and rich descriptions of the situation or event to be explored; secondly, the specific actions of the person/s involved in the event to aid understanding of why certain decisions were made; thirdly, the outcome of the event, to ascertain the effectiveness of the behaviour. As in other qualitative methodologies, an inductive analysis process can be used with the critical incident technique. Rich contextual information can be obtained using this technique. It generates information and uncovers tacit knowledge through assisting participants to describe their thought processes and actions during the event. Use of probing questions that determine how participants take part in certain events, or act in the ways they do, greatly enhances the outcome. A full interpretation of the event can only occur when all its aspects are provided. The critical incident technique is a practical method that allows researchers to understand complexities of the nursing role and function, and the interactions between nurses and other clinicians.
Methodology for Designing Operational Banking Risks Monitoring System
NASA Astrophysics Data System (ADS)
Kostjunina, T. N.
2018-05-01
The research looks at principles of designing an information system for monitoring operational banking risks. A proposed design methodology enables one to automate processes of collecting data on information security incidents in the banking network, serving as the basis for an integrated approach to the creation of an operational risk management system. The system can operate remotely ensuring tracking and forecasting of various operational events in the bank network. A structure of a content management system is described.
Automated flood extent identification using WorldView imagery for the insurance industry
NASA Astrophysics Data System (ADS)
Geller, Christina
2017-10-01
Flooding is the most common and costly natural disaster around the world, causing the loss of human life and billions in economic and insured losses each year. In 2016, pluvial and fluvial floods caused an estimated 5.69 billion USD in losses worldwide with the most severe events occurring in Germany, France, China, and the United States. While catastrophe modeling has begun to help bridge the knowledge gap about the risk of fluvial flooding, understanding the extent of a flood - pluvial and fluvial - in near real-time allows insurance companies around the world to quantify the loss of property that their clients face during a flooding event and proactively respond. To develop this real-time, global analysis of flooded areas and the associated losses, a new methodology utilizing optical multi-spectral imagery from DigitalGlobe (DGI) WorldView satellite suite is proposed for the extraction of pluvial and fluvial flood extents. This methodology involves identifying flooded areas visible to the sensor, filling in the gaps left by the built environment (i.e. buildings, trees) with a nearest neighbor calculation, and comparing the footprint against an Industry Exposure Database (IE) to calculate a loss estimate. Full-automation of the methodology allows production of flood extents and associated losses anywhere around the world as required. The methodology has been tested and proven effective for the 2016 flood in Louisiana, USA.
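A common way to flag visibly flooded pixels in multispectral imagery is a water index threshold, followed by a nearest-neighbour fill for pixels hidden by buildings and trees, as the abstract describes. The sketch below, assuming an NDWI formulation from green and near-infrared bands, a threshold of 0.1, and SciPy's distance transform for the gap fill, is an illustration of the idea rather than DigitalGlobe's or the author's production workflow.

```python
import numpy as np
from scipy import ndimage

def flood_extent(green, nir, obstructed, threshold=0.1):
    """Sketch of flood-extent extraction from multispectral imagery.

    green, nir : 2-D reflectance arrays (band choice and threshold are assumptions)
    obstructed : boolean mask of pixels hidden by buildings or trees
    Water is flagged with an NDWI threshold; obstructed pixels inherit the class
    of their nearest visible neighbour.
    """
    ndwi = (green - nir) / (green + nir + 1e-12)
    water = ndwi > threshold
    # index of the nearest non-obstructed pixel for every location
    _, (ri, ci) = ndimage.distance_transform_edt(obstructed, return_indices=True)
    filled = water[ri, ci]                       # nearest-neighbour gap fill
    return np.where(obstructed, filled, water)

# toy 4x4 scene with one obstructed pixel surrounded by water
g = np.full((4, 4), 0.3)
n = np.full((4, 4), 0.1)
n[0, 0] = 0.5                                    # one dry pixel
obstructed = np.zeros((4, 4), dtype=bool)
obstructed[1, 1] = True                          # hidden by a building
print(flood_extent(g, n, obstructed).astype(int))
```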
Martínez-Espronceda, Miguel; Trigo, Jesús D; Led, Santiago; Barrón-González, H Gilberto; Redondo, Javier; Baquero, Alfonso; Serrano, Luis
2014-11-01
Experiences applying standards in personal health devices (PHDs) show an inherent trade-off between interoperability and costs (in terms of processing load and development time). Therefore, reducing hardware and software costs as well as time-to-market is crucial for standards adoption. The ISO/IEEE11073 PHD family of standards (also referred to as X73PHD) provides interoperable communication between PHDs and aggregators. Nevertheless, the responsibility of achieving inexpensive implementations of X73PHD in limited resource microcontrollers falls directly on the developer. Hence, the authors previously presented a methodology based on patterns to implement X73-compliant PHDs into devices with low-voltage low-power constraints. That version was based on multitasking, which required additional features and resources. This paper therefore presents an event-driven evolution of the patterns-based methodology for cost-effective development of standardized PHDs. The results of comparing between the two versions showed that the mean values of decrease in memory consumption and cycles of latency are 11.59% and 45.95%, respectively. In addition, several enhancements in terms of cost-effectiveness and development time can be derived from the new version of the methodology. Therefore, the new approach could help in producing cost-effective X73-compliant PHDs, which in turn could foster the adoption of standards. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Disease management research using event graphs.
Allore, H G; Schruben, L W
2000-08-01
Event Graphs, conditional representations of stochastic relationships between discrete events, simulate disease dynamics. In this paper, we demonstrate how Event Graphs, at an appropriate abstraction level, also extend and organize scientific knowledge about diseases. They can identify promising treatment strategies and directions for further research and provide enough detail for testing combinations of new medicines and interventions. Event Graphs can be enriched to incorporate and validate data and test new theories to reflect an expanding dynamic scientific knowledge base and establish performance criteria for the economic viability of new treatments. To illustrate, an Event Graph is developed for mastitis, a costly dairy cattle disease, for which extensive scientific literature exists. With only a modest amount of imagination, the methodology presented here can be seen to apply modeling to any disease, human, plant, or animal. The Event Graph simulation presented here is currently being used in research and in a new veterinary epidemiology course. Copyright 2000 Academic Press.
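The core idea of an Event Graph, events that conditionally schedule other events on a simulation clock, can be sketched with a priority queue. The toy model below (hypothetical infection and recovery rates for a small herd, not the mastitis parameters from the literature) shows how an INFECT event schedules a RECOVER event and the next INFECT, with a state-dependent condition on the scheduling edge.

```python
import heapq
import itertools
import random

def simulate(n_cows=100, infect_rate=0.02, recover_time=10.0, horizon=365.0, seed=1):
    """Minimal Event-Graph-style simulation: an INFECT event schedules a RECOVER
    event, and recovery returns the cow to the susceptible pool (all rates are
    hypothetical)."""
    random.seed(seed)
    tick = itertools.count()                      # tie-breaker for simultaneous events
    clock, infected, cases = 0.0, set(), 0
    fel = [(random.expovariate(infect_rate * n_cows), next(tick), "INFECT", None)]
    while fel:                                    # future event list
        clock, _, kind, cow = heapq.heappop(fel)
        if clock > horizon:
            break
        if kind == "INFECT":
            susceptible = [c for c in range(n_cows) if c not in infected]
            if susceptible:                       # conditional edge: fire only if state allows
                cow = random.choice(susceptible)
                infected.add(cow)
                cases += 1
                heapq.heappush(fel, (clock + recover_time, next(tick), "RECOVER", cow))
            heapq.heappush(fel, (clock + random.expovariate(infect_rate * n_cows),
                                 next(tick), "INFECT", None))
        else:                                     # RECOVER
            infected.discard(cow)
    return cases

print("simulated new cases in one year:", simulate())
```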
Adams, Philippe; Abela, John R Z; Auerbach, Randy; Skitch, Steven
2009-11-01
S. J. Blatt and D. C. Zuroff's 1992 theory of personality predispositions to depression posits that individuals who possess high levels of self-criticism and/or dependency are vulnerable to developing depression following negative events. The current study used experience sampling methodology to test this theory in a sample of 49 children ages 7 to 14. Children completed measures of dependency, self-criticism, and depressive symptoms. Subsequently, children were given a handheld computer that signaled them to complete measures of depressive symptoms and negative events at randomly selected times over 2 months. Results of hierarchical linear modeling analyses indicated that higher levels of both self-criticism and dependency were associated with greater elevations in depressive symptoms following negative events. Furthermore, each personality predisposition remained a significant predictor of such elevations after controlling for the interaction between the other personality predisposition and negative events. The results suggest that dependency and self-criticism represent distinct vulnerability factors to depression in youth.
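Experience sampling data of this kind are typically analysed with a mixed (hierarchical linear) model in which repeated observations are nested within children and the diathesis-stress prediction appears as a trait-by-event interaction. The sketch below, using statsmodels on synthetic data with hypothetical variable names, illustrates that model structure; it is not the authors' analysis or dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# synthetic experience-sampling data: repeated observations nested within children
rng = np.random.default_rng(42)
n_children, n_obs = 49, 20
child = np.repeat(np.arange(n_children), n_obs)
self_crit = np.repeat(rng.normal(size=n_children), n_obs)        # trait, one value per child
dependency = np.repeat(rng.normal(size=n_children), n_obs)
neg_event = rng.integers(0, 2, size=n_children * n_obs)          # negative event at this signal?
depress = (0.3 * neg_event + 0.4 * neg_event * self_crit
           + 0.3 * neg_event * dependency + rng.normal(0, 1, n_children * n_obs))
df = pd.DataFrame(dict(child=child, self_crit=self_crit,
                       dependency=dependency, neg_event=neg_event, depress=depress))

# random intercept per child; the trait-by-event interactions carry the
# diathesis-stress prediction
model = smf.mixedlm("depress ~ neg_event * self_crit + neg_event * dependency",
                    df, groups=df["child"]).fit()
print(model.summary())
```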
Cloud-Assisted UAV Data Collection for Multiple Emerging Events in Distributed WSNs
Cao, Huiru; Liu, Yongxin; Yue, Xuejun; Zhu, Wenjian
2017-01-01
In recent years, UAVs (Unmanned Aerial Vehicles) have been widely applied for data collection and image capture. Specifically, UAVs have been integrated with wireless sensor networks (WSNs) to create data collection platforms with high flexibility. However, most studies in this domain focus on system architecture and UAVs’ flight trajectory planning while event-related factors and other important issues are neglected. To address these challenges, we propose a cloud-assisted data gathering strategy for UAV-based WSN in the light of emerging events. We also provide a cloud-assisted approach for deriving UAV’s optimal flying and data acquisition sequence of a WSN cluster. We validate our approach through simulations and experiments. It has been proved that our methodology outperforms conventional approaches in terms of flying time, energy consumption, and integrity of data acquisition. We also conducted a real-world experiment using a UAV to collect data wirelessly from multiple clusters of sensor nodes, deployed on a farm, for monitoring an emerging event. Compared against the traditional method, this proposed approach requires less than half the flying time and achieves almost perfect data integrity. PMID:28783100
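The abstract does not spell out how the flying and data-acquisition sequence is derived, so the snippet below should be read only as a stand-in: a greedy nearest-neighbour ordering of cluster heads from a launch point, one simple way to produce a visiting sequence against which a cloud-assisted optimiser could be compared.

```python
import numpy as np

def greedy_sequence(base, clusters):
    """Greedy nearest-neighbour visiting order over WSN cluster heads.

    base     : (x, y) launch point of the UAV
    clusters : array of cluster-head coordinates, shape (n, 2)
    Returns the visiting order and the total straight-line flight distance.
    This is an illustrative stand-in, not the paper's cloud-assisted sequencing.
    """
    remaining = list(range(len(clusters)))
    pos, order, dist = np.asarray(base, float), [], 0.0
    while remaining:
        d = [np.linalg.norm(clusters[i] - pos) for i in remaining]
        j = remaining.pop(int(np.argmin(d)))
        dist += min(d)
        order.append(j)
        pos = clusters[j]
    dist += np.linalg.norm(np.asarray(base, float) - pos)    # return to base
    return order, dist

heads = np.array([[120, 40], [300, 220], [80, 310], [260, 60]], float)
print(greedy_sequence((0, 0), heads))
```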
Motor learning characterization in people with autism spectrum disorder: A systematic review
de Moraes, Íbis Ariana Peña; Massetti, Thais; Crocetta, Tânia Brusque; da Silva, Talita Dias; de Menezes, Lilian Del Ciello; Monteiro, Carlos Bandeira de Mello; Magalhães, Fernando Henrique
2017-01-01
Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder primarily characterized by deficits in social interaction, communication and implicit skill learning. OBJECTIVE: To analyse the results of research on "motor learning" and the means used for measuring "autistic disorder". METHODS: A systematic literature search was done using Medline/PubMed, Web of Science, BVS (virtual health library), and PsycINFO. We included articles that contained the keywords "autism" and "motor learning". The variables considered were the methodological aspects, results presented, and the methodological quality of the studies. RESULTS: A total of 42 studies were identified; 33 articles were excluded because they did not meet the inclusion criteria. Data were extracted from nine eligible studies and summarized. CONCLUSION: We concluded that although individuals with ASD showed performance difficulties in different memory and motor learning tasks, acquisition of skills still takes place in this population; however, this skill acquisition is related to heterogeneous events, occurring without the awareness of the individual. PMID:29213525
NASA Astrophysics Data System (ADS)
McKinney, D. C.; Cuellar, A. D.
2015-12-01
Climate change has accelerated glacial retreat in high altitude glaciated regions of Nepal leading to the growth and formation of glacier lakes. Glacial lake outburst floods (GLOF) are sudden events triggered by an earthquake, moraine failure or other shock that causes a sudden outflow of water. These floods are catastrophic because of their sudden onset, the difficulty predicting them, and the enormous quantity of water and debris rapidly flooding downstream areas. Imja Lake in the Himalaya of Nepal has experienced accelerated growth since it first appeared in the 1960s. Communities threatened by a flood from Imja Lake have advocated for projects to adapt to the increasing threat of a GLOF. Nonetheless, discussions surrounding projects for Imja have not included a rigorous analysis of the potential consequences of a flood, probability of an event, or costs of mitigation projects in part because this information is unknown or uncertain. This work presents a demonstration of a decision making methodology developed to rationally analyze the risks posed by Imja Lake and the various adaptation projects proposed using available information. In this work the authors use decision analysis, data envelopment analysis (DEA), and sensitivity analysis to assess proposed adaptation measures that would mitigate damage in downstream communities from a GLOF. We use an existing hydrodynamic model of the at-risk area to determine how adaptation projects will affect downstream flooding and estimate fatalities using an empirical method developed for dam failures. The DEA methodology allows us to estimate the value of a statistical life implied by each project given the cost of the project and number of lives saved to determine which project is the most efficient. In contrast the decision analysis methodology requires fatalities to be assigned a cost but allows the inclusion of uncertainty in the decision making process. We compare the output of these two methodologies and determine the sensitivity of the conclusions to changes in uncertain input parameters including project cost, value of a statistical life, and time to a GLOF event.
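The efficiency comparison described above reduces, in its simplest reading, to the cost per expected life saved, i.e. the value of a statistical life implied by funding each project. The sketch below illustrates that arithmetic with entirely hypothetical project costs and fatality reductions; it is not the study's DEA model or its data.

```python
def implied_vsl(project_cost, expected_lives_saved):
    """Cost per expected life saved, i.e. the value of a statistical life implied
    by funding the project (all figures below are placeholders, not study data)."""
    return project_cost / expected_lives_saved

projects = {                                   # hypothetical GLOF adaptation options
    "lower lake level by 3 m": (3.0e6, 12.0),  # (cost in USD, expected lives saved)
    "siren-based early warning": (0.8e6, 6.0),
}
# rank projects from most to least efficient (lowest implied VSL first)
for name, (cost, lives) in sorted(projects.items(), key=lambda kv: implied_vsl(*kv[1])):
    print(f"{name:26s} implied VSL = ${implied_vsl(cost, lives):,.0f} per life saved")
```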
EPRI/NRC-RES fire human reliability analysis guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan
2010-03-01
During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities' which addressed fire risk for at power operations. NUREG/CR-6850 developed high level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire generated conditions, building upon existing human reliability analysis (HRA) methods. This document provides a methodology and guidance for conducting a fire HRA. This process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. This document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification. Scoping is intended to provide less conservative HEPs than screening, but requires fewer resources than a detailed HRA analysis. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.
Quantitative Methods in Psychology: Inevitable and Useless
Toomela, Aaro
2010-01-01
Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. Research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic that is based on Aristotelian thinking, and associative-quantitative that is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199
Sandhu, Amrita; Ives, Jonathan; Birchwood, Max; Upthegrove, Rachel
2013-07-01
Depression following first episode psychosis (FEP) is a frequent occurrence, with profound impact on recovery and outcome. Whilst many theories exist about the causes of depression here, research to date has been based on nosology imported wholesale from affective disorder, with little primary research on the subjective experience. This study aimed to explore the subjective experience and phenomenological features of post-psychotic depression in FEP. A qualitative methodology, photo-elicitation, together with unstructured interviews, was used to characterise aspects of depression following FEP and analysed using contemporary framework analysis. Depression was reported by participants as linked to the experience of and recovery from psychosis. The psychotic episode was a traumatic event followed by subjective doubt, shame and embarrassment. Loss and social isolation were central. Core biological symptoms did not feature. Despite the relatively small sample size, this study was able to generate in-depth data that provides useful and novel insight. Whilst generalisability is incompatible with qualitative methodology, further research using the same methodology would generate a wider range of experiences and perspectives. Understanding this dimension of psychosis in and of itself has the potential to improve and aid development of more effective and appropriately targeted interventions and associated outcomes. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive materials detection. Compared with the commonly adopted detection methods incorporating statistical theory, the sequential Bayesian approach offers the advantage of shorter verification times during the analysis of spectra that contain low total counts, especially in the case of complex radionuclide components. In this paper, a simulation experiment platform implementing the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator using Monte Carlo sampling theory, in order to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, respectively represented by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve an optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
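The event-by-event character of the approach can be sketched with a simplified two-hypothesis update: after each photon arrival, the posterior probability that a source is present is updated from the inter-arrival time. The snippet below assumes exponential inter-arrival likelihoods and made-up background and source rates; it is a stand-in for, not a reproduction of, the Candy-style processor used in the paper.

```python
import numpy as np

def sequential_posterior(interarrivals, bkg_rate, src_rate, prior=0.5):
    """Sequential Bayesian update of P(source present) from photon inter-arrival times.

    H1: arrivals are Poisson with rate bkg_rate + src_rate
    H0: arrivals are Poisson with rate bkg_rate
    """
    post = prior
    trace = []
    for dt in interarrivals:                                  # event-by-event update
        l1 = (bkg_rate + src_rate) * np.exp(-(bkg_rate + src_rate) * dt)
        l0 = bkg_rate * np.exp(-bkg_rate * dt)
        post = post * l1 / (post * l1 + (1 - post) * l0)
        trace.append(post)
    return np.array(trace)

# simulate a weak source on top of background (rates in counts per second)
rng = np.random.default_rng(3)
dts = rng.exponential(1.0 / 12.0, size=200)                   # true arrival rate 12 cps
trace = sequential_posterior(dts, bkg_rate=8.0, src_rate=4.0)
print(f"P(source present) after {len(dts)} events: {trace[-1]:.3f}")
```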
An integrated logit model for contamination event detection in water distribution systems.
Housh, Mashor; Ostfeld, Avi
2015-05-15
The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts for event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with other components of the event detection system framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which receives little attention in many existing event detection system models, is confirmed to be a crucial part of the system and one that can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in detecting events with higher probabilities, compared with previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
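A minimal way to picture the fusion step is a logistic (logit) model whose inputs are the per-indicator event probabilities and whose output is a single fused event probability, with weights estimated by maximum likelihood. The sketch below uses scikit-learn on synthetic data; the three indicators, the training scheme and the plain logistic regression are illustrative assumptions and omit the paper's joint genetic-algorithm calibration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# synthetic training data: per-sample event probabilities from three single-indicator
# detectors (e.g. chlorine, turbidity, conductivity) and the true event label
rng = np.random.default_rng(7)
n = 2000
truth = rng.integers(0, 2, n)
p_single = np.clip(truth[:, None] * 0.5 + rng.normal(0.25, 0.2, (n, 3)), 0, 1)

# logit fusion model: maximum-likelihood weights for combining the single alarms
fusion = LogisticRegression().fit(p_single, truth)

new_sample = np.array([[0.9, 0.4, 0.7]])          # probabilities from the three indicators
print("fused event probability:", fusion.predict_proba(new_sample)[0, 1])
```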
Hamann, Cara J; Peek-Asa, Corinne
2017-05-01
Among roadway users, bicyclists are considered vulnerable due to their high risk for injury when involved in a crash. Little is known about the circumstances leading to near crashes, crashes, and related injuries or how these vary by age and gender. The purpose of this study was to examine the rates and characteristics of safety-relevant events (crashes, near crashes, errors, and traffic violations) among adult and child bicyclists. Bicyclist trips were captured using Pedal Portal, a data acquisition and coding system which includes a GPS-enabled video camera and graphical user interface. A total of 179 safety-relevant events were manually coded from trip videos. Overall, child errors and traffic violations occurred at a rate of 1.9 per 100 min of riding, compared to 6.3 for adults. However, children rode on the sidewalk 56.4% of the time, compared with 12.7% for adults. For both adults and children, the highest safety-relevant event rates occurred on paved roadways with no bicycle facilities present (Adults=8.6 and Children=7.2, per 100 min of riding). Our study, the first naturalistic study to compare safety-relevant events among adults and children, indicates large variation in riding behavior and exposure between child and adult bicyclists. The majority of identified events were traffic violations and we were not able to code all risk-relevant data (e.g., subtle avoidance behaviors, failure to check for traffic, probability of collision). Future naturalistic cycling studies would benefit from enhanced instrumentation (e.g., additional camera views) and coding protocols able to fill these gaps. Copyright © 2017 Elsevier Ltd. All rights reserved.
Lew, Daniel; Yoon, Soon Man; Yan, Xiaofei; Robbins, Lori; Haritunians, Talin; Liu, Zhenqiu; Li, Dalin; McGovern, Dermot Pb
2017-10-28
To study the type and frequency of adverse events associated with anti-tumor necrosis factor (TNF) therapy and evaluate for any serologic and genetic associations. This study was a retrospective review of patients attending the inflammatory bowel disease (IBD) centers at Cedars-Sinai IBD Center from 2005-2016. Adverse events were identified via chart review. IBD serologies were measured by ELISA. DNA samples were genotyped at Cedars-Sinai using the Illumina Infinium Immunochip v1 array per the manufacturer's protocol. SNPs underwent methodological review and were evaluated using several SNP statistic parameters to ensure optimal allele-calling. Standard and rigorous QC criteria were applied to the genetic data, which were generated using Immunochip. Genetic association was assessed by logistic regression after correcting for population structure. Altogether we identified 1258 IBD subjects exposed to anti-TNF agents in whom Immunochip data were available. 269/1258 patients (21%) were found to have adverse events to an anti-TNF-α agent that required the therapy to be discontinued. 25% of women compared to 17% of men experienced an adverse event. All adverse events resolved after discontinuing the anti-TNF agent. In total: n = 66 (5%) infusion reactions; n = 49 (4%) allergic/serum sickness reactions; n = 19 (1.5%) lupus-like reactions, n = 52 (4%) rash, n = 18 (1.4%) infections. In Crohn's disease, IgA ASCA (P = 0.04) and IgG ASCA (P = 0.02) levels were also lower in patients with any adverse events, and anti-I2 level in ulcerative colitis was significantly associated with infusion reactions (P = 0.008). The logistic regression/human annotation and network analyses performed on the Immunochip data implicated the following five signaling pathways: JAK-STAT (Janus kinase-signal transducer and activator of transcription), measles, IBD, cytokine-cytokine receptor interaction, and toxoplasmosis for any adverse event. Our study shows 1 in 5 IBD patients experience an adverse event to anti-TNF therapy with novel serologic, genetic, and pathway associations.
Sudell, Maria; Kolamunnage-Dona, Ruwanthi; Tudur-Smith, Catrin
2016-12-05
Joint models for longitudinal and time-to-event data are commonly used to simultaneously analyse correlated data in single study cases. Synthesis of evidence from multiple studies using meta-analysis is a natural next step but its feasibility depends heavily on the standard of reporting of joint models in the medical literature. During this review we aim to assess the current standard of reporting of joint models applied in the literature, and to determine whether current reporting standards would allow or hinder future aggregate data meta-analyses of model results. We undertook a literature review of non-methodological studies that involved joint modelling of longitudinal and time-to-event medical data. Study characteristics were extracted and an assessment of whether separate meta-analyses for longitudinal, time-to-event and association parameters were possible was made. The 65 studies identified used a wide range of joint modelling methods in a selection of software. Identified studies concerned a variety of disease areas. The majority of studies reported adequate information to conduct a meta-analysis (67.7% for longitudinal parameter aggregate data meta-analysis, 69.2% for time-to-event parameter aggregate data meta-analysis, 76.9% for association parameter aggregate data meta-analysis). In some cases model structure was difficult to ascertain from the published reports. Whilst extraction of sufficient information to permit meta-analyses was possible in a majority of cases, the standard of reporting of joint models should be maintained and improved. Recommendations for future practice include clear statement of model structure, of values of estimated parameters, of software used and of statistical methods applied.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balkey, K.; Witt, F.J.; Bishop, B.A.
1995-06-01
Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various different codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.
NASA Astrophysics Data System (ADS)
Hdeib, Rouya; Abdallah, Chadi; Moussa, Roger; Colin, Francois
2017-04-01
Developing flood inundation maps of defined exceedance probabilities is required to provide information on the flood hazard and the associated risk. A methodology has been developed to model flood inundation in poorly gauged basins, where reliable information on the hydrological characteristics of floods is uncertain and only partially captured by traditional rain-gauge networks. Flood inundation is modelled by coupling a hydrological rainfall-runoff (RR) model (HEC-HMS) with a hydraulic model (HEC-RAS). The RR model is calibrated against the January 2013 flood event in the Awali River basin, Lebanon (300 km2), whose flood peak discharge was estimated by post-event measurements. The resulting flows of the RR model are defined as boundary conditions of the hydraulic model, which is run to generate the corresponding water surface profiles and calibrated against 20 post-event surveyed cross sections after the January 2013 flood event. An uncertainty analysis is performed to assess the results of the models. Consequently, the coupled flood inundation model is simulated with design storms, and flood inundation maps of defined exceedance probabilities are generated. The peak discharges estimated by the simulated RR model were in close agreement with the results from different empirical and statistical methods. This methodology can be extended to other poorly gauged basins facing common stage-gauge failure or characterized by floods with a stage exceeding the gauge measurement level, or higher than that defined by the rating curve.
Simulation modeling of route guidance concept
DOT National Transportation Integrated Search
1997-01-01
The methodology of a simulation model developed at the University of New South Wales, Australia, for the evaluation of performance of Dynamic Route Guidance Systems (DRGS) is described. The microscopic simulation model adopts the event update simulat...
Nielsen, Tore A; Kuiken, Don; Alain, Geneviève; Stenstrom, Philippe; Powell, Russell A
2004-12-01
The incorporation of memories into dreams is characterized by two types of temporal effects: the day-residue effect, involving immediate incorporations of events from the preceding day, and the dream-lag effect, involving incorporations delayed by about a week. This study was designed to replicate these two effects while controlling several prior methodological problems and to provide preliminary information about potential functions of delayed event incorporations. Introductory Psychology students were asked to recall dreams at home for 1 week. Subsequently, they were instructed to select a single dream and to retrieve past events related to it that arose from one of seven randomly determined days prior to the dream (days 1-7). They then rated both their confidence in recall of events and the extent of correspondence between events and dreams. Judges evaluated qualities of the reported events using scales derived from theories about the function of delayed incorporations. Average ratings of correspondences between dreams and events were high for predream days 1 and 2, low for days 3 and 4 and high again for days 5-7, but only for participants who rated their confidence in recall of events as high and only for females. Delayed incorporations were more likely than immediate incorporations to refer to events characterized by interpersonal interactions, spatial locations, resolved problems and positive emotions. The findings are consistent with the possibility that processes with circaseptan (about 7 days) morphology underlie dream incorporation and that these processes subserve the functions of socio-emotional adaptation and memory consolidation.
Vranckx, Pascal; McFadden, Eugene; Cutlip, Donald E; Mehran, Roxana; Swart, Michael; Kint, P P; Zijlstra, Felix; Silber, Sigmund; Windecker, Stephan; Serruys, Patrick W C J
2013-01-01
Globalisation in coronary stent research calls for harmonization of clinical endpoint definitions and event adjudication. Little has been published about the various processes used for event adjudication or their impact on outcome reporting. We performed a validation of the clinical event committee (CEC) adjudication process on 100 suspected events in the RESOLUTE All-comers trial (Resolute-AC). Two experienced Clinical Research Organisations (CROs), both of which already had extensive internal validation processes in place, participated in the study. After initial adjudication by the primary CEC, events were cross-adjudicated by an external CEC using the same definitions. Major discrepancies affecting the primary end point of target-lesion failure (TLF), a composite of cardiac death, target vessel myocardial infarction (TV-MI), or clinically indicated target-lesion revascularization (CI-TLR), were analysed by an independent oversight committee, who provided recommendations for harmonization. Discordant adjudications were reconsidered by the primary CEC. Subsequently, the Resolute-AC database was interrogated for cases that, based on these recommendations, merited re-adjudication, and these cases were also re-adjudicated by the primary CEC. Final discrepancies in adjudication of individual components of TLF occurred in 7 out of 100 events in 5 patients. Discrepancies for the (hierarchical) primary endpoint occurred in 5 events (2 cardiac deaths and 3 TV-MI). After application of harmonization recommendations to the overall Resolute-AC population (n=2292), the primary CEC adjudicated 3 additional clinical TLRs and considered 1 TV-MI as no event. A harmonization process provided a high level of concordance for event adjudication and improved accuracy for final event reporting. These findings suggest it is feasible to pool clinical event outcome data across clinical trials even when different CECs are responsible for event adjudication. Copyright © 2012 Elsevier Inc. All rights reserved.
The mission events graphic generator software: A small tool with big results
NASA Technical Reports Server (NTRS)
Lupisella, Mark; Leibee, Jack; Scaffidi, Charles
1993-01-01
Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal-computer-based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.
Best Practices in Public Outreach Events
NASA Astrophysics Data System (ADS)
Cobb, Whitney; Buxner, Sanlyn; Shipp, Stephanie
2015-11-01
Introduction: Each year the National Aeronautics and Space Administration (NASA) sponsors public outreach events designed to increase student, educator, and general public engagement in its missions and goals. NASA SMD Education’s review of large-scale events, “Best Practices in Outreach Events,” highlighted planning and implementation best practices, which were used by the Dawn mission to strategize and implement its Ceres arrival celebration event, i C Ceres. Background: The literature review focused on identifying best practices arising from evaluations of large-scale public outreach events. The following criteria guided the study: public, science-related events open to adults and children; events that occurred during the last 5 years; evaluations that included information on data collected from visitors and/or volunteers; and evaluations that specified the type of data collected, methodology, and associated results. Best Practices in Planning and Implementation: The literature review revealed key considerations for planning and implementing large-scale events. The best practices identified can be pertinent for all event organizers and evaluators regardless of event size. A summary of related best practices is presented below. 1) Advertise the event. 2) Use and advertise access to scientists: attendees who reported an interaction with a science professional were 15% to 19% more likely to report positive learning impacts (SFA, 2012, p. 24). 3) Recruit scientists using findings such as: high percentages of scientists (85% to 96%) from most events were interested in participating again (SFA, 2012). 4) Ensure that the event is group and, particularly, child friendly. 5) Target specific event outcomes. Best Practices Informing Real-world Planning, Implementation and Evaluation: The Dawn mission’s collaborative design of a series of events, i C Ceres, including in-person, interactive events geared to families and live presentations, will be shared, with focus on the family event and the evidence that scientist participation was a particular driver of the event’s impact and success. Science Festival Alliance (SFA). (2012). Get inspired: A first look at science festivals. Retrieved from http://sciencefestivals.org/news_item/get-inspired
Hostile attribution biases for relationally provocative situations and event-related potentials.
Godleski, Stephanie A; Ostrov, Jamie M; Houston, Rebecca J; Schlienz, Nicolas J
2010-04-01
This exploratory study investigates how hostile attribution biases for relationally provocative situations may be related to neurocognitive processing using the P300 event-related potential. Participants were 112 (45 women) emerging adults enrolled in a large, public university in upstate New York. Participants completed self-report measures on relational aggression and hostile attribution biases and performed an auditory perseveration task to elicit the P300. It was found that hostile attribution biases for relational provocation situations were associated with a larger P300 amplitude above and beyond the role of hostile attribution biases for instrumental situations, relational aggression, and gender. Larger P300 amplitude is interpreted to reflect greater allocation of cognitive resources or enhanced "attending" to salient stimuli. Implications for methodological approaches to studying aggression and hostile attribution biases and for theory are discussed, as well as implications for the fields of developmental psychology and psychopathology. Copyright 2010 Elsevier B.V. All rights reserved.
Validation of ground-motion simulations for historical events using SDoF systems
Galasso, C.; Zareian, F.; Iervolino, I.; Graves, R.W.
2012-01-01
The study presented in this paper is among the first in a series of studies toward the engineering validation of the hybrid broadband ground‐motion simulation methodology by Graves and Pitarka (2010). This paper provides a statistical comparison between seismic demands of single degree of freedom (SDoF) systems subjected to past events using simulations and actual recordings. A number of SDoF systems are selected considering the following: (1) 16 oscillation periods between 0.1 and 6 s; (2) elastic case and four nonlinearity levels, from mildly inelastic to severely inelastic systems; and (3) two hysteretic behaviors, in particular, nondegrading–nonevolutionary and degrading–evolutionary. Demand spectra are derived in terms of peak and cyclic response, as well as their statistics for four historical earthquakes: 1979 Mw 6.5 Imperial Valley, 1989 Mw 6.8 Loma Prieta, 1992 Mw 7.2 Landers, and 1994 Mw 6.7 Northridge.
Vieira, Silvana Lima; da Silva, Gilberto Tadeu Reis; Fernandes, Josicelia Dumêt; e Silva, Ana Cláudia de Azevêdo Bião; Santana, Monique Santos; Santos, Thadeu Borges Souza
2014-01-01
This documentary, retrospective, quali-quantitative study examined the scientific production presented in the abstracts of the 12th and 13th National Seminars on Guidelines for Education in Nursing that addressed medium-level technical professional education in nursing. The objectives were to quantify and discuss this production with respect to the geographical distribution of authors and the objectives and methodology of the studies. As an inclusion criterion, abstracts had to contain at least one of the following descriptors in the title and/or objectives: education in nursing, technical education in nursing, vocational education. Of the 637 abstracts reviewed, 23 met the inclusion criteria. Production was concentrated in the Southeast region of Brazil, and the objectives featured mainly the teaching practices of nurses. Vocational education had little expressiveness at the events, signaling the need for more studies, considering the relevance of technical-level professionals for the nursing profession and the health system.
Multi-year microbial source tracking study characterizing fecal contamination in an urban watershed
Bushon, Rebecca N.; Brady, Amie M. G.; Christensen, Eric D.; Stelzer, Erin A.
2017-01-01
Microbiological and hydrological data were used to rank tributary stream contributions of bacteria to the Little Blue River in Independence, Missouri. Concentrations, loadings, and yields of E. coli and microbial source tracking (MST) markers were characterized during base flow and storm events in five subbasins within Independence, as well as sources entering and leaving the city through the river. The E. coli water quality threshold was exceeded in 29% of base-flow and 89% of storm-event samples. The total contribution of E. coli and MST markers from tributaries within Independence to the Little Blue River, regardless of streamflow, did not significantly increase the median concentrations leaving the city. Daily loads and yields of E. coli and MST markers were used to rank the subbasins according to their contribution of each constituent to the river. The ranking methodology used in this study may prove useful in prioritizing remediation in the different subbasins.
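As a rough illustration of the load and yield arithmetic used to rank the subbasins, the sketch below converts a sample concentration and a daily mean streamflow into a daily load and an area-normalized yield. The numbers and unit choices are hypothetical, not values from the study.

```python
# Sketch (illustrative values): daily load = concentration x daily streamflow
# volume; yield = load normalized by drainage area.
def daily_load(conc_per_100ml, flow_m3_s):
    """E. coli daily load (organisms/day) from concentration (per 100 mL)
    and mean daily streamflow (m^3/s)."""
    ml_per_day = flow_m3_s * 86400 * 1e6          # m^3/s -> mL/day
    return conc_per_100ml * ml_per_day / 100.0

def daily_yield(load_per_day, drainage_area_km2):
    """Load normalized by drainage area (organisms/day/km^2)."""
    return load_per_day / drainage_area_km2

load = daily_load(conc_per_100ml=410, flow_m3_s=2.3)   # hypothetical storm sample
print(f"load = {load:.3e} org/day, yield = {daily_yield(load, 35.0):.3e} org/day/km^2")
```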
Sheltering the self from the storm: self-construal abstractness and the stability of self-esteem.
Updegraff, John A; Emanuel, Amber S; Suh, Eunkook M; Gallagher, Kristel M
2010-01-01
Self-construal abstractness (SCA) refers to the degree to which people construe important bases of self-esteem in a broad, flexible, and abstract rather than a concrete and specific manner. This article hypothesized that SCA would be a unique predictor of self-esteem stability, capturing the degree to which people's most important bases of self-worth are resistant to disconfirmation. Two studies using a daily diary methodology examined relationships between SCA, daily self-esteem, and daily emotions and/or events. In Study 1, individual differences in SCA emerged as the most consistent and unique predictor of self-esteem stability. Furthermore, SCA contributed to self-esteem stability by buffering the influence of daily negative emotions on self-esteem. Study 2 manipulated SCA via a daily self-construal task and found an abstract versus concrete self-focus to buffer the influence of daily negative events on self-esteem. Implications of these findings for the study of the self and well-being are discussed.
Ruberg, Joshua L; Helm, C William; Felleman, Benjamin I; Helm, Jane E; Studts, Jamie L
2017-02-01
Many studies have examined the relationship between worry and cancer screening. Due to methodological inconsistencies, results of these studies have varied and few conclusions can be made when generalizing across studies. The purpose of the current study was to better understand the worry-cancer screening relationship using a prospective research design. A total of 180 women enrolled in an annual ovarian cancer (OC) screening clinic completed surveys at three time points (pre-screening, day of screening, and post-screening) using three measures of cancer-specific worry. OC worry was highest in the weeks prior to screening, and mere presentation at a screening clinic was associated with a significant worry decline. Observed elevations in worry following abnormal screening were not universal and varied by the instrument used to measure worry. In contrast to our hypotheses, it appears that mere presentation at a cancer screening clinic may be a worry-reducing event. Receipt of abnormal results was not necessarily associated with increased worry. Published by Elsevier Inc.
Bayesian truthing as experimental verification of C4ISR sensors
NASA Astrophysics Data System (ADS)
Jannson, Tomasz; Forrester, Thomas; Romanov, Volodymyr; Wang, Wenjian; Nielsen, Thomas; Kostrzewski, Andrew
2015-05-01
In this paper, the general methodology for experimental verification/validation of the performance of C4ISR and other sensors is presented, based on Bayesian inference in general and binary sensors in particular. This methodology, called Bayesian Truthing, defines performance metrics for binary sensors in physics, optics, electronics, medicine, law enforcement, C3ISR, QC, ATR (automatic target recognition), terrorism-related events, and many other areas. For Bayesian Truthing, the sensing medium itself is not what is truly important; it is how the decision process is affected.
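The core of the Bayesian Truthing idea for a binary sensor can be illustrated with Bayes' rule: given sensitivity, specificity, and the prior probability (prevalence) of the target event, the posterior probabilities attached to alarms and non-alarms follow directly. The sketch below uses illustrative numbers, not values from the paper.

```python
# Sketch: posterior detection probabilities for a binary sensor via Bayes' rule.
def binary_sensor_posteriors(sensitivity, specificity, prevalence):
    p_alarm = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    ppv = sensitivity * prevalence / p_alarm                 # P(target | alarm)
    npv = specificity * (1 - prevalence) / (1 - p_alarm)     # P(no target | no alarm)
    return ppv, npv

ppv, npv = binary_sensor_posteriors(sensitivity=0.95, specificity=0.90, prevalence=0.01)
print(f"PPV = {ppv:.3f}, NPV = {npv:.5f}")
```

Even a sensitive, specific sensor yields a modest positive predictive value when the target event is rare, which is exactly the kind of trade-off such performance metrics are meant to expose.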
Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985
NASA Technical Reports Server (NTRS)
1986-01-01
The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.
A Systematic Review of Herbal Medicine for Chemotherapy Induced Peripheral Neuropathy
Noh, Hyeonseok
2018-01-01
Background Chemotherapy-induced peripheral neuropathy (CIPN) is a common adverse effect in cancer patients. The aim of this review was to assess the effectiveness of herbal medicine in preventing and treating CIPN. Methods Randomised controlled trials were included in this review. Two authors searched 13 databases and extracted and assessed the data independently. Results Twenty-eight trials involving 2174 patients met the inclusion criteria. Although there were some exceptions, the methodological quality was typically low. Seventeen trials reported the incidence rate of CIPN assessed by various tools, and 14 showed a significant difference between the two groups regarding the decrease of the incidence rate. Twelve trials reported clinical improvement using various tools, and 10 showed a significant difference between the two groups. Two cases of adverse events occurred in one trial; the other nine trials reported no adverse events. Conclusions We found that herbal medicines, in combination with other therapies or alone, potentially have preventive or therapeutic effects on CIPN. However, conclusions cannot be drawn because of the generally low quality of the methodology, the clinical heterogeneity, and the small sample size for each single herbal medicine. Trials that are more rigorous and report sufficient methodological data are needed. PMID:29636782
NASA Technical Reports Server (NTRS)
Maddox, Anthony B.; Smith-Maddox, Renee P.; Penick, Benson E.
1989-01-01
The MassPEP/NASA Graduate Research Development Program (GRDP) whose objective is to encourage Black Americans, Mexican Americans, American Indians, Puerto Ricans, and Pacific Islanders to pursue graduate degrees in science and engineering is described. The GRDP employs a top-down or goal driven methodology through five modules which focus on research, graduate school climate, technical writing, standardized examinations, and electronic networking. These modules are designed to develop and reinforce some of the skills necessary to seriously consider the goal of completing a graduate education. The GRDP is a community-based program which seeks to recruit twenty participants from a pool of Boston-area undergraduates enrolled in engineering and science curriculums and recent graduates with engineering and science degrees. The program emphasizes that with sufficient information, its participants can overcome most of the barriers perceived as preventing them from obtaining graduate science and engineering degrees. Experience has shown that the top-down modules may be complemented by a more bottom-up or event-driven methodology. This approach considers events in the academic and professional experiences of participants in order to develop the personal and leadership skills necessary for graduate school and similar endeavors.
Somatization in survivors of catastrophic trauma: a methodological review.
North, Carol S
2002-01-01
The literature on mental health effects of catastrophic trauma such as community disasters focuses on posttraumatic stress disorder. Somatization disorder is not listed among the classic responses to disaster, nor have other somatoform disorders been described in this literature. Nondiagnostic "somatization," "somatization symptoms," and "somatic symptoms" form the basis of most information about somatization in the literature. However, these concepts have not been validated, and therefore this work suffers from multiple methodological problems of ascertainment and interpretation. Future research is encouraged to consider many methodological issues in obtaining adequate data to address questions about the association of somatization with traumatic events, including a) appropriate comparison groups, b) satisfactory definition and measurement of somatization, c) exclusion of medical explanations for the symptoms, d) recognition of somatizers' spurious attribution of symptoms to medical causes, e) collection of data from additional sources beyond single-subject interviews, f) validation of diagnosis-unrelated symptom reporting or reconsideration of symptoms within diagnostic frameworks, g) separation of somatization after an event into new (incident) and preexisting categories, h) development of research models that include sufficient variables to examine the broader scope of potential relationships, and i) novel consideration of alternative causal directionalities. PMID:12194899
Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent
2018-06-01
Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations consider the application of ICT a crucial key to enhancing health-care management. The purpose of this paper is to provide a methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated HIS performance evaluation approach combining formal modeling using Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and a Discrete-Event Simulation (DES) approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to draw conclusions about the impact of the HIS on the patient pathway. We demonstrate that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of service is higher (measured by the amount of information accessible during the consultation to formulate the diagnosis). The method also allows the most cost-effective ICT elements to be identified to improve care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.
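As a rough illustration of the discrete-event-simulation component, the sketch below models a consultation clinic with a shared pool of oncologists and compares mean waiting time when the consultation is lengthened (as described for the high-level HIS). All rates, durations, and resource counts are hypothetical, and this SimPy model is far simpler than the study's ARIS-based process model.

```python
# Sketch (hypothetical parameters): patients queue for a shared pool of
# oncologists; lengthening the consultation changes waiting times.
import random
import simpy

def patient(env, oncologists, consult_time, waits):
    arrival = env.now
    with oncologists.request() as req:
        yield req
        waits.append(env.now - arrival)
        yield env.timeout(random.expovariate(1.0 / consult_time))

def run(consult_time_min, n_patients=200, n_oncologists=3, interarrival=10.0):
    random.seed(1)
    env = simpy.Environment()
    oncologists = simpy.Resource(env, capacity=n_oncologists)
    waits = []
    def arrivals(env):
        for _ in range(n_patients):
            env.process(patient(env, oncologists, consult_time_min, waits))
            yield env.timeout(random.expovariate(1.0 / interarrival))
    env.process(arrivals(env))
    env.run()
    return sum(waits) / len(waits)

print("mean wait, baseline consult:", run(consult_time_min=20.0))
print("mean wait, longer HIS consult:", run(consult_time_min=24.0))
```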
Monitoring the fracture behavior of metal matrix composites by combined NDE methodologies
NASA Astrophysics Data System (ADS)
Kordatos, E. Z.; Exarchos, D. A.; Mpalaskas, A. C.; Matikas, T. E.
2015-03-01
The current work deals with the non-destructive evaluation (NDE) of the fatigue behavior of metal matrix composite (MMC) materials using infrared thermography (IRT) and acoustic emission (AE). AE monitoring was employed to record a wide spectrum of cracking events, enabling the characterization of the severity of fracture in relation to the applied load. IR thermography, as a non-destructive, real-time, and non-contact technique, allows the detection of heat waves generated by the thermo-mechanical coupling during mechanical loading of the sample. In this study, an IR methodology based on monitoring of the intrinsically dissipated energy was applied to determine the fatigue limit of A359/SiCp composites. The thermographic monitoring agrees with the AE results, enabling reliable monitoring of the MMCs' fatigue behavior.
NASA Astrophysics Data System (ADS)
Tene, Yair; Tene, Noam; Tene, G.
1993-08-01
An interactive data fusion methodology combining video, audio, and nonlinear structural dynamic analysis, with potential application in forensic engineering, is presented. The methodology was developed and successfully demonstrated in the analysis of a heavy transportable bridge collapse during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks, and rupture of high-performance structural materials. A videotape recording by a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology resulted in extracting relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.
VEEP - Vehicle Economy, Emissions, and Performance program
NASA Technical Reports Server (NTRS)
Heimburger, D. A.; Metcalfe, M. A.
1977-01-01
VEEP is a general-purpose discrete event simulation program being developed to study the performance, fuel economy, and exhaust emissions of a vehicle modeled as a collection of its separate components. It is written in SIMSCRIPT II.5. The purpose of this paper is to present the design methodology, describe the simulation model and its components, and summarize the preliminary results. Topics include chief programmer team concepts, the SDDL design language, program portability, user-oriented design, the program's user command syntax, the simulation procedure, and model validation.
Ferretting out the facts behind the H5N1 controversy.
Sleator, Roy D
2012-01-01
Recent recommendations by the National Science Advisory Board for Biosecurity (NSABB) to redact key methodological details of two studies involving mammal-to-mammal transmission of the H5N1 (H5) subtype influenza viruses have led to a temporary moratorium on all research involving live H5N1 or H5 HA reassortant viruses shown to be transmissible in ferrets. Herein, I review the events which led to this impasse and comment on their impact.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zarkesh, Ryan A.; Foster, Michael E.; Ichimura, Andrew S.
The ability to tune the steric envelope through redox events post-synthetically or in tandem with other chemical processes is a powerful tool that could assist in enabling new catalytic methodologies and understanding potential pitfalls in ligand design. The α-diimine ligand, dmp-BIAN, exhibits the peculiar and previously unreported feature of varying steric profiles depending on oxidation state when paired with a main group element. A study of the factors that give rise to this behaviour as well as its impact on the incorporation of other ligands is performed.
2016-07-01
and gap propagation engineering methodology implemented within the software (CI-Wall) makes use of a hydraulic fracturing criterion, as discussed in...moist unit weight). Soil unit weights: Because of the presence of the upper moist (i.e., non-saturated) region R01 clay layer that is immediately...from two series of complete soil-structure interaction (SSI) non-linear finite element studies for I-Walls at New Orleans and other locations
1980-04-01
a detailed account of the methodology used for information elicitation and organization. Chapter 3 describes the study results in terms of clusters...European Setting). The scenario (see Appendix A-2) contained the following elements: (1) an account of the events leading up to the present tactical...As I read these statements I'd like you to think about the application of these principles to graphic portrayal. For example, the first two
An information-theoretic approach to surrogate-marker evaluation with failure time endpoints.
Pryseley, Assam; Tilahun, Abel; Alonso, Ariel; Molenberghs, Geert
2011-04-01
Over the last decades, the evaluation of potential surrogate endpoints in clinical trials has steadily been growing in importance, not only thanks to the availability of ever more potential markers and surrogate endpoints, but also because more methodological development has become available. While early work has been devoted, to a large extent, to Gaussian, binary, and longitudinal endpoints, the case of time-to-event endpoints is in need of careful scrutiny as well, owing to the strong presence of such endpoints in oncology and beyond. While work had been done in the past, it was often cumbersome to use such tools in practice, because of the need for fitting copula or frailty models that were further embedded in a hierarchical or two-stage modeling approach. In this paper, we present a methodologically elegant and easy-to-use approach based on information theory. We resolve essential issues, including the quantification of "surrogacy" based on such an approach. Our results are put to the test in a simulation study and are applied to data from clinical trials in oncology. The methodology has been implemented in R.
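A worked sketch of the quantity such a framework typically reports may help fix ideas. Assuming the measure follows the usual information-theoretic construction (an assumption on our part; the abstract does not give the formula), the mutual information between the true endpoint T and the surrogate S is mapped onto a unit scale:

```latex
R^2_h \;=\; 1 - e^{-2\, I(T,S)}, \qquad 0 \le R^2_h \le 1 ,
```

so that R^2_h approaches 1 as the surrogate carries essentially all the information about the true endpoint, and equals 0 when T and S are independent.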
A Probabilistic Analysis of Surface Water Flood Risk in London.
Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris
2018-06-01
Flooding in urban areas during heavy rainfall, often characterized by short duration and high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high spatial resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million per year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively. © 2017 Society for Risk Analysis.
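An expected-annual-damage figure of the kind quoted above is typically obtained by integrating damage against annual exceedance probability over a set of modelled return periods; the sketch below illustrates that integration with purely illustrative damage values.

```python
# Sketch: expected annual damage (EAD) as the integral of damage over annual
# exceedance probability, using a trapezoidal rule. Damage values are made up.
import numpy as np

def expected_annual_damage(return_periods_yr, damages):
    """Trapezoidal integration of damage over annual exceedance probability."""
    p_exceed = 1.0 / np.asarray(return_periods_yr, float)
    d = np.asarray(damages, float)
    order = np.argsort(p_exceed)              # integrate from low to high probability
    return np.trapz(d[order], p_exceed[order])

rp = [10, 30, 100, 200, 1000]                 # hypothetical return periods (years)
dmg = [20e6, 120e6, 450e6, 700e6, 1500e6]     # hypothetical damage (pounds)
print(f"EAD ~ {expected_annual_damage(rp, dmg) / 1e6:.0f} million pounds per year")
```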
NASA Astrophysics Data System (ADS)
Marcella, M. P.; CHEN, C.; Senarath, S. U.
2013-12-01
Much work has been completed in analyzing Southeast Asia's tropical cyclone climatology and the associated flooding throughout the region. Although an active and strong monsoon season also brings major flooding across the Philippines, resulting in the loss of lives and significant economic impacts, only a limited amount of research has been conducted to investigate the frequency and flood loss estimates of these non-tropical-cyclone (TC) storms. In this study, using the TRMM 3-hourly rainfall product, tropical cyclone rainfall is removed to construct a non-TC rainfall climatology across the region. Given these data, stochastically generated rainfall that is both spatially and temporally correlated across the country is created to produce a longer historically based record of non-TC precipitation. After defining the rainfall criteria that constitute a flood event based on observed floods and TRMM data, this event definition is applied to the stochastic catalog of rainfall to determine flood events. Subsequently, a thorough analysis of non-TC flood extremes, frequency, and distribution is completed for the Philippines. As a result, the above methodology and datasets provide a unique opportunity to further study flood occurrences and their extremes across most of Southeast Asia.
NASA Astrophysics Data System (ADS)
Bakal, Jeffrey A.; Ezekowitz, Justin A.; Westerhout, Cynthia M.; Boersma, Eric; Armstrong, Paul W.
2013-05-01
The aim of this study was to develop a method for the identification of global weather parameters and patient characteristics associated with a type of heart attack in which there is a sudden partial blockage of a coronary artery. This type of heart attack does not demonstrate an elevation of the ST segment on an electrocardiogram and is defined as a non-ST elevation acute coronary syndrome (NSTE-ACS). Data from the Global Summary of the Day database were linked with the enrollment and baseline data for a phase III international clinical trial in NSTE-ACS over four 48-h time periods covering the week prior to the clinical event that prompted enrollment in the study. Meteorological events were determined by standardizing the weather data from enrollment dates against an empirical distribution from the month prior. These meteorological events were then linked to the patients' geographic region, demographics, and comorbidities to identify potentially susceptible populations. After standardization, changes in temperature and humidity demonstrated an association with the enrollment event. Additionally, there appeared to be an association with gender, region, and a history of stroke. This methodology may provide a useful global insight into assessing the biometeorological component of diseases from international data.
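The standardization step described above can be illustrated as follows: a weather value on the enrollment date is located within the empirical distribution of the preceding month, and a "meteorological event" is flagged when that percentile is extreme. The thresholds and data in the sketch are illustrative, not those used in the study.

```python
# Sketch (illustrative thresholds/data): standardize a daily weather value
# against the empirical distribution of the prior month and flag extremes.
import numpy as np

def empirical_percentile(value, prior_month_values):
    prior = np.asarray(prior_month_values, float)
    return np.mean(prior <= value)                # empirical CDF evaluated at value

def is_meteorological_event(value, prior_month_values, low=0.05, high=0.95):
    p = empirical_percentile(value, prior_month_values)
    return p <= low or p >= high

rng = np.random.default_rng(3)
prior_temps = rng.normal(12.0, 4.0, 30)           # hypothetical prior-month daily temps (deg C)
print(is_meteorological_event(22.5, prior_temps)) # unusually warm enrollment day -> True
```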
NASA Astrophysics Data System (ADS)
Shimizu, Y.; Ishizuka, T.; Osanai, N.; Okazumi, T.
2014-12-01
In this study, the sediment-related disaster prediction method currently practiced in Japan, which is based on ground-gauged rainfall data, was coupled with satellite rainfall data and applied to domestic large-scale sediment-related disasters, and the feasibility of this integrated method was confirmed. In Asia, large-scale sediment-related disasters that can sweep away an entire settlement occur frequently. Leyte Island suffered a huge landslide in 2004, and Typhoon Morakot in 2009 caused huge landslides in Taiwan. In the event of such sediment-related disasters, immediate responses by central and local governments are crucial for crisis management. In general, there are not enough rainfall gauge stations in developing countries, so national and local governments have little information with which to determine the risk level of water-induced disasters in their service areas. In the Japanese methodology, a criterion is set by combining two indices: a short-term rainfall index and a long-term rainfall index. The short-term rainfall index is defined as the 60-minute total rainfall; the long-term rainfall index is the soil-water index, an estimate of how much fallen rainfall is retained in the soil. In July 2009, a high-density sediment-related disaster, a debris flow, occurred in Hofu City of Yamaguchi Prefecture in the western region of Japan. This event was calculated by the Japanese standard methodology and then analyzed for its feasibility. Hourly satellite-based rainfall is underestimated compared with ground-based rainfall data, whereas the long-term indices correlate well with each other. Therefore, this study confirmed that it is possible to deliver information on the risk level of sediment-related disasters such as shallow landslides and debris flows. The prediction method tested in this study is expected to assist timely emergency responses to rainfall-induced natural disasters in sparsely gauged areas. As the Global Precipitation Measurement (GPM) plan progresses, the spatial resolution, time resolution, and accuracy of rainfall data should be further improved and will become more effective in practical use.
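The two-index criterion can be illustrated as a simple critical-line check in the plane spanned by the soil-water index and the 60-minute rainfall; the line coefficients below are hypothetical, not the operational Japanese values.

```python
# Sketch (hypothetical critical-line coefficients): raise an alarm when the
# (soil-water index, 60-minute rainfall) pair crosses a critical line.
def exceeds_critical_line(soil_water_index_mm, rain_60min_mm,
                          intercept=80.0, slope=-0.5):
    """True if the short-term/long-term rainfall pair lies above the critical line."""
    critical_rain = intercept + slope * soil_water_index_mm
    return rain_60min_mm >= max(critical_rain, 0.0)

print(exceeds_critical_line(soil_water_index_mm=120.0, rain_60min_mm=35.0))  # wet soil -> True
print(exceeds_critical_line(soil_water_index_mm=40.0, rain_60min_mm=35.0))   # dry soil -> False
```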
John-Baptiste, Ava A.; Wu, Wei; Rochon, Paula; Anderson, Geoffrey M.; Bell, Chaim M.
2013-01-01
Background A key priority in developing policies for providing affordable cancer care is measuring the value for money of new therapies using cost-effectiveness analyses (CEAs). For CEA to be useful it should focus on relevant outcomes and include thorough investigation of uncertainty. Randomized controlled trials (RCTs) of five years of aromatase inhibitors (AI) versus five years of tamoxifen in the treatment of post-menopausal women with early stage breast cancer, show benefit of AI in terms of disease free survival (DFS) but not overall survival (OS) and indicate higher risk of fracture with AI. Policy-relevant CEA of AI versus tamoxifen should focus on OS and include analysis of uncertainty over key assumptions. Methods We conducted a systematic review of published CEAs comparing an AI to tamoxifen. We searched Ovid MEDLINE, EMBASE, PsychINFO, and the Cochrane Database of Systematic Reviews without language restrictions. We selected CEAs with outcomes expressed as cost per life year or cost per quality adjusted life year (QALY). We assessed quality using the Neumann checklist. Using structured forms two abstractors collected descriptive information, sources of data, baseline assumptions on effectiveness and adverse events, and recorded approaches to assessing parameter uncertainty, methodological uncertainty, and structural uncertainty. Results We identified 1,622 citations and 18 studies met inclusion criteria. All CE estimates assumed a survival benefit for aromatase inhibitors. Twelve studies performed sensitivity analysis on the risk of adverse events and 7 assumed no additional mortality risk with any adverse event. Sub-group analysis was limited; 6 studies examined older women, 2 examined women with low recurrence risk, and 1 examined women with multiple comorbidities. Conclusion Published CEAs comparing AIs to tamoxifen assumed an OS benefit though none has been shown in RCTs, leading to an overestimate of the cost-effectiveness of AIs. Results of these CEA analyses may be suboptimal for guiding policy. PMID:23671612
Towards a monitoring system of temperature extremes in Europe
NASA Astrophysics Data System (ADS)
Lavaysse, Christophe; Cammalleri, Carmelo; Dosio, Alessandro; van der Schrier, Gerard; Toreti, Andrea; Vogt, Jürgen
2018-01-01
Extreme-temperature anomalies such as heat and cold waves may have strong impacts on human activities and health. The heat waves in western Europe in 2003 and in Russia in 2010, or the cold wave in southeastern Europe in 2012, generated a considerable amount of economic loss and resulted in the deaths of several thousand people. Providing an operational system to monitor extreme-temperature anomalies in Europe is thus of prime importance to help decision makers and emergency services be responsive to an unfolding extreme event. In this study, the development and validation of a monitoring system for extreme-temperature anomalies are presented. The first part of the study describes the methodology, which is based on the persistence of events exceeding a percentile threshold. The method is applied to three different observational datasets in order to assess the robustness and highlight uncertainties in the observations. The climatology of extreme events from the last 21 years is then analysed to highlight the spatial and temporal variability of the hazard, and discrepancies amongst the observational datasets are discussed. In the last part of the study, the products derived from this study are presented and discussed with respect to previous studies. The results highlight the accuracy of the developed index and the statistical robustness of the distribution used to calculate the return periods.
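The detection principle, persistence of days exceeding a percentile threshold, can be illustrated with a short sketch that flags runs of at least a minimum number of consecutive hot days. The threshold choice, run length, and data below are illustrative only.

```python
# Sketch (illustrative parameters): a heat-wave event is a run of at least
# `min_days` consecutive days above a percentile threshold.
import numpy as np

def detect_heat_waves(daily_tmax, threshold, min_days=3):
    """Return (start_index, length) for each run of days exceeding threshold."""
    hot = np.asarray(daily_tmax) > threshold
    events, start = [], None
    for i, flag in enumerate(np.append(hot, False)):  # sentinel closes a trailing run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_days:
                events.append((start, i - start))
            start = None
    return events

rng = np.random.default_rng(7)
tmax = rng.normal(25, 3, 90)
tmax[40:46] += 10                                   # implanted warm spell
thr = np.percentile(tmax, 90)                       # 90th-percentile threshold
print(detect_heat_waves(tmax, thr))
```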
Andrade, Cristiane Ps; Souza, Cláudio J; Camerini, Eduardo Sn; Alves, Isabela S; Vital, Hélio C; Healy, Matthew Jf; Ramos De Andrade, Edson
2018-06-01
A radiological dispersive device (RDD) spreads radioactive material, complicates the treatment of physical injuries, raises cancer risk, and induces disproportionate fear. Simulating such an event enables more effective and efficient utilization of the triage and treatment resources of staff, facilities, and space. Fast simulation can give detail on events in progress or future events. The resources for triage and treatment of contaminated trauma victims can differ for pure exposure individuals, while discouraging the "worried well" from presenting in the crisis phase by media announcement would relieve pressure on hospital facilities. The proposed methodology integrates capabilities from different platforms in a convergent way composed of three phases: (a) scenario simulation, (b) data generation, and (c) risk assessment for triage focused on follow-up epidemiological assessment. Simulations typically indicate that most of the affected population does not require immediate medical assistance. Medical triage for the few severely injured and the radiological triage to diminish the contamination with radioactivity will always be the priority. For this study, however, higher priorities should be given to individuals from radiological "warm" and "hot" zones as required by risk criteria. The proposed methodology could thus help to (a) filter and reduce the number of individuals to be attended, (b) optimize the prioritization of medical care, (c) reduce or prepare for future costs, (d) effectively locate the operational triage site to avoid possible contamination on the main facility, and (e) provide the scientific data needed to develop an adequate approach to risk and its proper communication.
NASA Technical Reports Server (NTRS)
Sarani, Siamak
2010-01-01
This paper describes a methodology for accurate and flight-calibrated determination of the on-times of the Cassini spacecraft Reaction Control System (RCS) thrusters during reaction wheel biases, without any form of dynamic simulation. The hydrazine usage and the delta-V vector in the body frame are also computed from the respective thruster on-times. The Cassini spacecraft, the largest and most complex interplanetary spacecraft ever built, continues to undertake ambitious and unique scientific observations of planet Saturn, Titan, Enceladus, and other moons of Saturn. In order to maintain a stable attitude during the course of its mission, this three-axis stabilized spacecraft uses two different control systems: the RCS and the reaction wheel assembly control system. The RCS is used to execute commanded spacecraft slews, to maintain three-axis attitude control, to control the spacecraft's attitude while performing science observations with coarse pointing requirements (e.g., during targeted low-altitude Titan and Enceladus flybys), to bias the momentum of the reaction wheels, and to perform RCS-based orbit trim maneuvers. The use of the RCS often imparts an undesired delta-V on the spacecraft. The Cassini navigation team requires accurate predictions of the delta-V in spacecraft coordinates and the inertial frame resulting from slews using RCS thrusters and, more importantly, from reaction wheel bias events. It is crucial for the Cassini spacecraft attitude control and navigation teams to be able to predict, quickly but accurately, the hydrazine usage and delta-V for various reaction wheel bias events without actually having to spend time and resources simulating the event in flight software-based dynamic simulation or hardware-in-the-loop simulation environments. The methodology described in this paper, and the ground software developed from it, are designed to provide just that. This methodology assumes a priori knowledge of the thrust magnitudes and thruster pulse rise and tail-off time constants for the eight individual attitude control thrusters, the spacecraft's wet mass and its center of mass location, and a few other key parameters.
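Once per-thruster on-times are known, the propellant and delta-V bookkeeping the abstract describes reduces to sums over thrusters; the sketch below illustrates that arithmetic with hypothetical thrust levels, directions, mass flow rates, and spacecraft mass, none of which are Cassini values.

```python
# Sketch (hypothetical numbers): body-frame delta-V and hydrazine usage from
# per-thruster on-times, under a small-impulse approximation.
import numpy as np

def rcs_budget(on_times_s, thrusts_N, directions, mass_flow_kg_s, sc_mass_kg):
    """Return (delta_v_body_mps, hydrazine_used_kg) from per-thruster on-times."""
    t = np.asarray(on_times_s, float)
    F = np.asarray(thrusts_N, float)
    u = np.asarray(directions, float)              # (n, 3) unit thrust directions, body frame
    mdot = np.asarray(mass_flow_kg_s, float)
    impulse = (F * t)[:, None] * u                  # per-thruster impulse vectors (N*s)
    delta_v = impulse.sum(axis=0) / sc_mass_kg      # small-impulse approximation
    hydrazine = float(np.sum(mdot * t))
    return delta_v, hydrazine

dv, hyd = rcs_budget(on_times_s=[12.0, 8.5, 0.0, 3.2],
                     thrusts_N=[0.9, 0.9, 0.9, 0.9],
                     directions=[[0, 0, 1], [0, 0, 1], [0, 1, 0], [0, 1, 0]],
                     mass_flow_kg_s=[4.0e-4] * 4,
                     sc_mass_kg=2100.0)
print("delta-V (m/s):", dv, "hydrazine (kg):", hyd)
```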
NASA Astrophysics Data System (ADS)
Goteti, G.; Kaheil, Y. H.; Katz, B. G.; Li, S.; Lohmann, D.
2011-12-01
In the United States, government agencies as well as the National Flood Insurance Program (NFIP) use flood inundation maps associated with the 100-year return period (base flood elevation, BFE), produced by the Federal Emergency Management Agency (FEMA), as the basis for flood insurance. A credibility check of the flood risk hydraulic models, often employed by insurance companies, is their ability to reasonably reproduce FEMA's BFE maps. We present results from the implementation of a flood modeling methodology aimed towards reproducing FEMA's BFE maps at a very fine spatial resolution using a computationally parsimonious, yet robust, hydraulic model. The hydraulic model used in this study has two components: one for simulating flooding of the river channel and adjacent floodplain, and the other for simulating flooding in the remainder of the catchment. The first component is based on a 1-D wave propagation model, while the second component is based on a 2-D diffusive wave model. The 1-D component captures the flooding from large-scale river transport (including upstream effects), while the 2-D component captures the flooding from local rainfall. The study domain consists of the contiguous United States, hydrologically subdivided into catchments averaging about 500 km2 in area, at a spatial resolution of 30 meters. Using historical daily precipitation data from the Climate Prediction Center (CPC), the precipitation associated with the 100-year return period event was computed for each catchment and was input to the hydraulic model. Flood extent from the FEMA BFE maps is reasonably replicated by the 1-D component of the model (riverine flooding). FEMA's BFE maps only represent the riverine flooding component and are unavailable for many regions of the USA. However, this modeling methodology (1-D and 2-D components together) covers the entire contiguous USA. This study is part of a larger modeling effort from Risk Management Solutions (RMS) to estimate flood risk associated with extreme precipitation events in the USA. Towards this greater objective, state-of-the-art models of flood hazard and stochastic precipitation are being implemented over the contiguous United States. Results from the successful implementation of the modeling methodology will be presented.
Soil moisture retrieval from Sentinel-1 and MODIS synergy
NASA Astrophysics Data System (ADS)
Gao, Qi; Zribi, Mehrez; Escorihuela, Maria Jose; Baghdadi, Nicolas
2017-04-01
This study presents two methodologies for retrieving soil moisture from SAR remote sensing data. The study is based on Sentinel-1 data in the VV polarization over a site in Urgell, Catalunya (Spain). In both methodologies, which use change-detection techniques, preprocessed radar data are combined with normalized difference vegetation index (NDVI) auxiliary data to estimate the mean soil moisture at a resolution of 1 km. By modeling the relationship between the backscatter difference and NDVI, the soil moisture at a specific NDVI value is retrieved. The first algorithm was originally developed over West Africa (Zribi et al., 2014) using ERS scatterometer data to estimate soil water status; in this study, it is adapted to Sentinel-1 data and takes into account the high repetitiveness of the data to optimize the inversion approach. A second, new method is based on the backscatter difference between two adjacent Sentinel-1 acquisition dates with respect to NDVI; with smaller vegetation change between acquisitions, the backscatter difference is more sensitive to soil moisture. The proposed methodologies have been validated against ground measurements in two demonstration fields, with an RMS error of about 0.05 (in volumetric moisture), and coherence between soil moisture variations and rainfall events is observed. Soil moisture maps at 1-km resolution are generated for the study area. The results demonstrate the potential of Sentinel-1 data for the retrieval of soil moisture at 1-km or even better resolution.
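The change-detection idea can be illustrated by regressing the backscatter difference between two dates on NDVI and then inverting a vegetation-corrected backscatter change into a soil-moisture change. The coefficients, sensitivity, and synthetic data below are illustrative and are not the calibrated values of either algorithm.

```python
# Sketch (synthetic data, hypothetical sensitivity): relate backscatter change
# to NDVI, then invert a vegetation-corrected change to a moisture change.
import numpy as np

def fit_backscatter_ndvi(dsigma_db, ndvi):
    """Linear model dsigma = a + b * NDVI fitted over 1-km cells."""
    b, a = np.polyfit(ndvi, dsigma_db, 1)
    return a, b

def moisture_change(dsigma_db, ndvi, a, b, sensitivity_db_per_m3m3=9.0):
    """Soil-moisture change implied by the vegetation-corrected backscatter change."""
    corrected = dsigma_db - (a + b * ndvi)
    return corrected / sensitivity_db_per_m3m3

rng = np.random.default_rng(11)
ndvi = rng.uniform(0.2, 0.7, 200)
dsigma = 1.5 - 2.0 * ndvi + rng.normal(0, 0.3, 200)   # synthetic training cells
a, b = fit_backscatter_ndvi(dsigma, ndvi)
print(moisture_change(dsigma_db=2.0, ndvi=0.35, a=a, b=b))
```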
SociAL Sensor Analytics: Measuring Phenomenology at Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corley, Courtney D.; Dowling, Chase P.; Rose, Stuart J.
The objective of this paper is to present a system for interrogating immense social media streams through analytical methodologies that characterize topics and events critical to tactical and strategic planning. First, we propose a conceptual framework for interpreting social media as a sensor network. Time-series models and topic clustering algorithms are used to implement this concept into a functioning analytical system. Next, we address two scientific challenges: 1) to understand, quantify, and baseline phenomenology of social media at scale, and 2) to develop analytical methodologies to detect and investigate events of interest. This paper then documents computational methods and reports experimental findings that address these challenges. Ultimately, the ability to process billions of social media posts per week over a period of years enables the identification of patterns and predictors of tactical and strategic concerns at an unprecedented rate through SociAL Sensor Analytics (SALSA).
Tsunami and shelf resonance on the northern Chile coast
NASA Astrophysics Data System (ADS)
Cortés, Pablo; Catalán, Patricio A.; Aránguiz, Rafael; Bellotti, Giorgio
2017-09-01
This work presents the analysis of long-wave resonance in two of the main cities along the northern coast of Chile, Arica and Iquique, where a large tsunamigenic potential remains despite recent earthquakes. By combining a modal analysis, solving the equation of free-surface oscillations, with the analysis of background spectra derived from in situ measurements, the spatial and temporal structures of the modes are recovered. Comparison with spectra from three tsunamis of different characteristics shows that the modes found have been excited by past events. Moreover, the two locations show different response patterns: Arica is more sensitive to the characteristics of the tsunami source, whereas Iquique shows a smaller dependency and a similar response for different tsunami events. Results are further compared with other methodologies with good agreement. These findings are relevant for characterizing the tsunami hazard in the area, and the methodology can be extended to other regions along the Chilean coast.
NASA Astrophysics Data System (ADS)
Forte, F.; Strobl, R. O.; Pennetta, L.
2006-07-01
The impact of calamitous meteorological events and their interaction with the geological and geomorphological environment represents a current problem of the Supersano-Ruffano-Nociglia Graben in southern Italy. Indeed, severe floods take place on a frequent basis not only in autumn and winter but also in summer. These calamities are not only triggered by exceptional events but are also amplified by peculiar geological and morpho-structural characteristics of the Graben. Flooding often affects vast agricultural areas and, consequently, the pumping stations cannot remove the rainwater. These events cause warnings and emergency states, involving people as well as socio-economic goods. This study represents an application of an advanced technique for loss estimation and flood vulnerability analysis, integrating a geographic information system (GIS) with aerial photos and remote sensing methods. The analysis results clearly show that the Graben area is potentially at the greatest flood vulnerability, while along the Horsts the flood vulnerability is lower.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Espinosa-Paredes, Gilberto; Prieto-Guerrero, Alfonso; Nunez-Carrera, Alejandro
This paper introduces a wavelet-based method to analyze instability events in a boiling water reactor (BWR) during transient phenomena. The methodology to analyze BWR signals includes the following: (a) the short-time Fourier transform (STFT) analysis, (b) decomposition using the continuous wavelet transform (CWT), and (c) application of multiresolution analysis (MRA) using discrete wavelet transform (DWT). STFT analysis permits the study, in time, of the spectral content of analyzed signals. The CWT provides information about ruptures, discontinuities, and fractal behavior. To detect these important features in the signal, a mother wavelet has to be chosen and applied at several scales to obtain optimum results. MRA allows fast implementation of the DWT. Features like important frequencies, discontinuities, and transients can be detected with analysis at different levels of detail coefficients. The STFT was used to provide a comparison between a classic method and the wavelet-based method. The damping ratio, which is an important stability parameter, was calculated as a function of time. The transient behavior can be detected by analyzing the maximum contained in detail coefficients at different levels in the signal decomposition. This method allows analysis of both stationary signals and highly nonstationary signals in the timescale plane. This methodology has been tested with the benchmark power instability event of Laguna Verde nuclear power plant (NPP) Unit 1, which is a BWR-5 NPP.
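The signal-analysis chain described above, an STFT view plus a discrete-wavelet multiresolution decomposition whose detail coefficients flag transients, can be illustrated on a synthetic nonstationary signal as below; the sampling rate, wavelet choice, and implanted transient are all illustrative, not plant data.

```python
# Sketch (synthetic signal, illustrative parameters): STFT for the dominant
# frequency plus a DWT multiresolution decomposition whose finest detail
# coefficients localize a transient.
import numpy as np
import pywt
from scipy.signal import stft

fs = 25.0                                        # sampling rate (Hz), hypothetical
t = np.arange(0, 120, 1 / fs)
signal = np.sin(2 * np.pi * 0.5 * t)             # ~0.5 Hz oscillation as a stand-in
signal[1500:1520] += 2.0                         # implanted transient

f, tt, Z = stft(signal, fs=fs, nperseg=256)      # time-frequency content
dominant = f[np.argmax(np.abs(Z).mean(axis=1))]
print(f"dominant frequency ~ {dominant:.2f} Hz")

coeffs = pywt.wavedec(signal, "db4", level=5)    # MRA: [cA5, cD5, ..., cD1]
d1 = coeffs[-1]                                  # finest detail coefficients
print("transient near sample:", 2 * int(np.argmax(np.abs(d1))))  # rough location
```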
Psychic trauma as cause of death.
Terranova, C; Snenghi, R; Thiene, G; Ferrara, S D
2011-01-01
Psychic trauma is described as the action of 'an emotionally overwhelming factor' capable of causing neurovegetative alterations leading to transitory or persisting bodily changes. The medico-legal concept of psychic trauma and its definition as a cause in penal cases is debated. The authors present three cases of death after psychic trauma and discuss the definition, within the penal ambit, of identified 'emotionally overwhelming factors' as a cause. The methodological approach to ascertainment and criterion-based assessment in each case involved the following phases: (1) examination of circumstantial evidence, clinical records and documentation; (2) autopsy; (3) ascertainment of the cause of death; and (4) ascertainment of psychic trauma and its relationship with the cause of death. The results and assessment of each of the three cases are discussed from the viewpoint of the causal connotation of psychic trauma. In the cases presented, psychic trauma caused death, as deduced from assessment of the type of externally caused emotional insult, the subjects' personal characteristics and the circumstances of the event causing death. In cases of death due to psychic trauma, careful methodological ascertainment is essential, with the double aim of defining 'emotionally overwhelming factors' as a significant cause of death from the penal point of view, and of identifying the responsibility of third parties involved in the death event and the associated dynamics of homicide.
Cannabinoids for nausea and vomiting related to chemotherapy: Overview of systematic reviews.
Schussel, Victor; Kenzo, Lucas; Santos, Andreia; Bueno, Júlia; Yoshimura, Ellen; de Oliveira Cruz Latorraca, Carolina; Pachito, Daniela Vianna; Riera, Rachel
2018-04-01
Nausea and vomiting are common and distressing adverse events of chemotherapy. This review focuses on the findings and quality of systematic reviews (SRs) of cannabinoids for chemotherapy-induced nausea and vomiting (CINV). For this overview of SRs, a systematic literature search was conducted in several electronic databases, and SRs evaluating cannabinoids for CINV in cancer patients were included. Methodological quality and quality of reporting were evaluated by AMSTAR and PRISMA, respectively. The initial search retrieved 2,206 records, and 5 SRs were included. On the basis of the findings of the sole SR judged to be of high methodological quality, cannabinoids seem to be more effective than placebo and equal to prochlorperazine for reducing CINV, and to be preferred by patients. The response to different combinations of antiemetic agents seems to be equal to that of 1 antiemetic alone. The average AMSTAR score was 5, and the average PRISMA score was 13.2. Cannabinoids represent a valuable option for treating CINV, despite the adverse events related to treatment, such as drowsiness and cognitive impairment. There is no good-quality evidence to recommend for or against the use of cannabinoids for CINV. More studies are still needed to evaluate the effectiveness of cannabinoids compared with modern antiemetics. Copyright © 2017 John Wiley & Sons, Ltd.
Wessels, Francois
2010-01-01
This project was based on the FIELD trial. It is a localisation of the study by Carrington and Stewart. The aim of the original study was to determine the impact of fenofibrate therapy on healthcare costs of middle-aged patients with type 2 diabetes at high risk of future cardiovascular events. The methodology used in the Carrington article was adopted for this study. The clinical foundation for the analysis was derived from the findings of the FIELD study. All costs were sourced from electronic databases obtained from private-sector South African funders of healthcare. Event costs for the cardiovascular events were determined and added to the treatment costs for the individual treatment arms. The cost saving was determined as the difference between the event costs saved and the additional treatment costs associated with fenofibrate treatment. All costs were reported as 2008 ZAR and a discount rate of 10% was used. The study adopted a South African private-sector funder perspective. If the same approach is followed as in the Carrington and Stewart study, a cost saving of 18% results. This is the difference between the total costs associated with the placebo and fenofibrate arms, respectively (R3 480 471 compared to R2 858 598 per 1 000 patient years for the placebo and fenofibrate arms, respectively). The total costs were determined as the sum of associated event costs and treatment costs for each of the comparators. Based on this exploratory analysis, it seems that Lipanthyl treatment in middle-aged patients resulted in a cost saving due to the prevention of cardiovascular events when it was used in the treatment of type 2 diabetics, as in the FIELD study. It should therefore be considered to be cost effective, even when just the cardiovascular risk reduction effect is considered.
NASA Astrophysics Data System (ADS)
Dhakal, N.; Jain, S.
2013-12-01
Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is usually used to statistically describe extreme rainfall events. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century and, as a result, the parameters of the GEV distribution have changed over time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters and, consequently, on the level of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated based on past rainfall records) for civil infrastructure? To answer these questions, we performed a sensitivity analysis of the GEV distribution parameters and the return periods with respect to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter and a substantial decrease in the corresponding return period. This is a key consideration for time-varying engineering design. These isolated extreme weather events should be considered alongside traditional statistical methodology for extreme events when designing civil infrastructure (such as dams, bridges, and culverts). Such analysis is also useful in understanding the statistical uncertainty of projecting extreme events into the future.
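The sensitivity experiment described above can be mimicked in a few lines: fit a GEV to a series of annual maxima, append one outlier, refit, and compare the shape parameter and a return level. The sketch below uses SciPy's genextreme (whose shape parameter sign convention differs from the climatological xi) on synthetic data; the numbers are placeholders, not the Maine USHCN records.

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# 30 years of synthetic annual-maximum daily rainfall [mm]
ann_max = genextreme.rvs(c=-0.1, loc=60, scale=15, size=30, random_state=rng)

def fit_and_return_level(data, T=100.0):
    c, loc, scale = genextreme.fit(data)
    return genextreme.isf(1.0 / T, c, loc, scale), c    # (return level, shape)

base_rl, base_c = fit_and_return_level(ann_max)
out_rl, out_c = fit_and_return_level(np.append(ann_max, 180.0))  # add one outlier event

print(f"100-yr return level: {base_rl:.1f} mm -> {out_rl:.1f} mm")
print(f"shape parameter:     {base_c:.3f} -> {out_c:.3f}")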
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposure based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analyses were also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk, taking into account its various components. The same methodology can be applied to various types of risk at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models, ranging from global asset exposure to global flood hazard models, were also recently developed to improve the resolution of the risk analysis and applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (such as soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructure). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno
2006-03-31
In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the construction of a methodology for the identification of major accident hazards (MIMAH), which is carried out through the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called a "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility.
Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles
2007-01-01
Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203
Parkes, Olga; Lettieri, Paola; Bogle, I David L
2016-02-01
This paper presents a novel quantitative methodology for the evaluation and optimisation of the environmental impacts of the whole life cycle of a mega-event project: construction and staging of the event, and post-event site redevelopment and operation. Within the proposed framework, a mathematical model has been developed that takes into account greenhouse gas (GHG) emissions resulting from the use of transportation fuel, energy, water and construction materials at all stages of the mega-event project. The model is applied to a case study - the London Olympic Park. Three potential post-event site design scenarios of the Park have been developed: Business as Usual (BAU), Commercial World (CW) and High Rise High Density (HRHD). A quantitative summary of results demonstrates that the highest GHG emissions associated with the actual event are almost negligible compared to those associated with the legacy phase. The highest share of emissions in the legacy phase is attributed to embodied emissions from construction materials (almost 50% for the BAU and HRHD scenarios) and emissions resulting from the transportation of residents, visitors and employees to/from the site (almost 60% for the CW scenario). The BAU scenario is the one with the lowest GHG emissions compared to the other scenarios. The results also demonstrate how post-event site design scenarios can be optimised to minimise the GHG emissions. The overall outcomes illustrate how the proposed framework can be used to support the decision-making process for mega-event project planning. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Li, Chunxiao; Khoo, Selina; Adnan, Athirah
2017-03-01
The aim of this review is to synthesize the evidence on the effects of aquatic exercise interventions on physical function and fitness among people with spinal cord injury. Six major databases were searched from inception until June 2015: MEDLINE, CINAHL, EMBASE, PsycINFO, SPORTDiscus, and the Cochrane Central Register of Controlled Trials. Two reviewers independently rated methodological quality using the modified Downs and Black Scale and extracted and synthesized key findings (i.e., participant characteristics, study design, physical function and fitness outcomes, and adverse events). Eight of 276 studies met the inclusion criteria, of which none showed high research quality. Four studies assessed physical function outcomes and 4 studies evaluated aerobic fitness as outcome measures. Significant improvements in these 2 outcomes were generally found. Other physical or fitness outcomes, including body composition, muscular strength, and balance, were rarely reported. There is weak evidence supporting aquatic exercise training to improve physical function and aerobic fitness among adults with spinal cord injury. Suggestions for future research include reporting details of exercise interventions, evaluating other physical or fitness outcomes, and improving methodological quality.
Gene environment interaction studies in depression and suicidal behavior: An update.
Mandelli, Laura; Serretti, Alessandro
2013-12-01
Increasing evidence supports the involvement of both heritable and environmental risk factors in major depression (MD) and suicidal behavior (SB). Studies investigating gene-environment interaction (G × E) may be useful for elucidating the role of biological mechanisms in the risk for mental disorders. In the present paper, we review the literature regarding the interaction between genes modulating brain functions and stressful life events in the etiology of MD and SB and discuss their potential added benefit compared to genetic studies only. Within the context of G × E investigation, thus far, only a few reliable results have been obtained, although some genes have consistently shown interactive effects with environmental risk in MD and, to a lesser extent, in SB. Further investigation is required to disentangle the direct and mediated effects that are common or specific to MD and SB. Since traditional G × E studies overall suffer from important methodological limitations, further effort is required to develop novel methodological strategies with an interdisciplinary approach. Copyright © 2013 Elsevier Ltd. All rights reserved.
Evolvement of Uniformity and Volatility in the Stressed Global Financial Village
Kenett, Dror Y.; Raddant, Matthias; Lux, Thomas; Ben-Jacob, Eshel
2012-01-01
Background In the current era of strong worldwide market couplings, the global financial village has become highly prone to systemic collapses, events that can rapidly sweep throughout the entire village. Methodology/Principal Findings We present a new methodology to assess and quantify inter-market relations. The approach is based on the correlations between the market index, the index volatility, the market Index Cohesive Force and the meta-correlations (correlations between the intra-correlations). We investigated the relations between six important world markets—U.S., U.K., Germany, Japan, China and India—from January 2000 until December 2010. We found that while the developed “western” markets (U.S., U.K., Germany) are highly correlated, the interdependencies between these markets and the developing “eastern” markets (India and China) are volatile, with noticeable maxima at times of global world events. The Japanese market switches “identity”—it switches between periods of high meta-correlations with the “western” markets and periods when it behaves more similarly to the “eastern” markets. Conclusions/Significance The methodological framework presented here provides a way to quantify the evolvement of interdependencies in the global market, evaluate a world financial network and quantify changes in the world inter-market relations. Such changes can be used as precursors to the agitation of the global financial village. Hence, the new approach can help to develop a sensitive “financial seismograph” to detect early signs of global financial crises so they can be treated before they develop into worldwide events. PMID:22347444
Rogers, Paul; Qualter, Pamela; Wood, Dave
2016-11-01
Two studies examine the impact that event vividness, event severity, and prior paranormal belief have on causal attributions for a depicted remarkable coincidence experience. In Study 1, respondents (n = 179) read a hypothetical vignette in which a fictional character accurately predicts a plane crash 1 day before it occurs. The crash was described in either vivid or pallid terms, with the final outcome being either severe (fatal) or non-severe (non-fatal). Respondents completed 29 causal attribution items, one attribution confidence item, nine scenario perception items, a popular paranormal belief scale, and a standard demographics questionnaire. Principal axis factoring reduced the 29 attribution items to four attribution factors, which were then subjected to a 2 (event vividness) × 2 (event severity) × 2 (paranormal belief) MANCOVA controlling for respondent gender. As expected, paranormal believers attributed the accurate crash prediction less to coincidence and more to both paranormal and transcendental knowing than did paranormal sceptics. Furthermore, paranormal (psychokinesis) believers deemed the prediction more reflective of paranormal knowing in response to both (1) a vivid/non-fatal and (2) a pallid/fatal crash depiction. Vividness, severity, and paranormal belief had no impact on attribution confidence. In Study 2, respondents (also n = 179) generated data that were a moderately good fit to the previous factor structure and replicated several differences across attributional pairings, albeit for paranormal non-believers only. Corresponding effects for event severity and paranormal belief were not replicated. Findings are discussed in terms of their support for the paranormal misattribution hypothesis and the impact of availability biases in the form of both vividness and severity effects. Methodological issues and future research ideas are also discussed. © 2016 The British Psychological Society.
2011-01-01
Background Endothelial function has been shown to be a highly sensitive marker for the overall cardiovascular risk of an individual. Furthermore, there is evidence of important sex differences in endothelial function that may underlie the differential presentation of cardiovascular disease (CVD) in women relative to men. As such, measuring endothelial function may have sex-specific prognostic value for the prediction of CVD events, thus improving risk stratification for the overall prediction of CVD in both men and women. The primary objective of this study is to assess the clinical utility of the forearm hyperaemic reactivity (FHR) test (a proxy measure of endothelial function) for the prediction of CVD events in men vs. women using a novel, noninvasive nuclear medicine-based approach. It is hypothesised that: 1) endothelial dysfunction will be a significant predictor of 5-year CVD events independent of baseline stress test results, clinical, demographic, and psychological variables in both men and women; and 2) endothelial dysfunction will be a better predictor of 5-year CVD events in women compared to men. Methods/Design A total of 1972 patients (812 men and 1160 women) undergoing dipyridamole stress testing were recruited. Medical history, CVD risk factors, health behaviours, psychological status, and gender identity were assessed via structured interview or self-report questionnaires at baseline. In addition, FHR was assessed, as well as levels of sex hormones via blood draw. Patients will be followed for 5 years to assess major CVD events (cardiac mortality, non-fatal MI, revascularization procedures, and cerebrovascular events). Discussion This is the first study to determine the extent and nature of any sex differences in the ability of endothelial function to predict CVD events. We believe the results of this study will provide data that will better inform the choice of diagnostic tests in men and women and bring the quality of risk stratification in women on par with that of men. PMID:21831309
Assessing EEG sleep spindle propagation. Part 1: theory and proposed methodology.
O'Reilly, Christian; Nielsen, Tore
2014-01-15
A convergence of studies has revealed sleep spindles to be associated with sleep-related cognitive processing and even with fundamental waking state capacities such as intelligence. However, some spindle characteristics, such as propagation direction and delay, may play a decisive role but are only infrequently investigated because of technical complexities. A new methodology for assessing sleep spindle propagation over the human scalp using noninvasive electroencephalography (EEG) is described. This approach is based on the alignment of time-frequency representations of spindle activity across recording channels. This first of a two-part series concentrates on framing theoretical considerations related to EEG spindle propagation and on detailing the methodology. A short example application is provided that illustrates the repeatability of results obtained with the new propagation measure in a sample of 32 night recordings. A more comprehensive experimental investigation is presented in part two of the series. Compared to existing methods, this approach is particularly well adapted for studying the propagation of sleep spindles because it estimates time delays rather than phase synchrony and it computes propagation properties for every individual spindle with windows adjusted to the specific spindle duration. The proposed methodology is effective in tracking the propagation of spindles across the scalp and may thus help in elucidating the temporal aspects of sleep spindle dynamics, as well as other transient EEG and MEG events. A software implementation (the Spyndle Python package) is provided as open source software. Copyright © 2013 Elsevier B.V. All rights reserved.
2012-08-01
radioactive material compares to Chernobyl. In looking at atmospheric releases of Cs-137, there seems to be agreement that Fukushima releases were about 10...to 20% of those produced by the Chernobyl event. However, the Fukushima event has resulted in significant releases of contaminated water to the...ocean. Also, the Chernobyl releases occurred over about 10 days, while releases from Fukushima continued over a longer period of time. Mr. Tupin
Extreme risk assessment based on normalized historic loss data
NASA Astrophysics Data System (ADS)
Eichner, Jan
2017-04-01
Natural hazard risk assessment and risk management focuses on the expected loss magnitudes of rare and extreme events. Such large-scale loss events typically comprise all aspects of compound events and accumulate losses from multiple sectors (including knock-on effects). Utilizing Munich Re's NatCatSERVICE direct economic loss data, we briefly recap a novel methodology of peril-specific loss data normalization which improves the stationarity properties of highly non-stationary historic loss data (due to socio-economic growth of assets prone to destructive forces), and perform extreme value analysis (peaks-over-threshold method) to come up with return level estimates of e.g. 100-yr loss event scenarios for various types of perils, globally or per continent, and discuss uncertainty in the results.
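A bare-bones version of the peaks-over-threshold step mentioned above can be written as follows: pick a threshold, fit a Generalized Pareto distribution to the exceedances, and convert the fit into a return-level estimate. The loss series, threshold choice and return period below are illustrative assumptions, not NatCatSERVICE data.

import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
years = 40
losses = rng.lognormal(mean=2.0, sigma=1.0, size=years)   # synthetic normalized annual losses

u = np.quantile(losses, 0.75)            # threshold (a modelling choice)
exc = losses[losses > u] - u
xi, _, sigma = genpareto.fit(exc, floc=0)

lam = exc.size / years                   # mean number of exceedances per year
T = 100.0                                # target return period [years]
if abs(xi) > 1e-6:
    x_T = u + sigma / xi * ((lam * T) ** xi - 1.0)   # standard POT return-level formula
else:
    x_T = u + sigma * np.log(lam * T)
print(f"estimated {T:.0f}-yr loss level: {x_T:.2f} (threshold u = {u:.2f})")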
Dynamic Event Tree advancements and control logic improvements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego
The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for Probabilistic Risk Assessment, uncertainty quantification, data mining analysis and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hyper Cube, Stratified, Grid Sampler, Factorials, etc.), Adaptive Samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.) and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities that have been done in order to: start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide a control logic capability to all MOOSE-based applications, an initial migration activity was initiated this fiscal year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework. This document briefly explains what has been done. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named “Hybrid Dynamic Event Tree” (HDET) and its adaptive variant “Adaptive Hybrid Dynamic Event Tree” (AHDET). As other authors have already reported, among the different types of uncertainties it is possible to discern two principal types: aleatory and epistemic uncertainties. The classical Dynamic Event Tree treats the first class (aleatory uncertainties); the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees). The Monte Carlo employs a pre-sampling of the input space characterized by epistemic uncertainties. The consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed, not limiting the exploration of the epistemic space to a Monte Carlo method but using all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hyper Cube, Grid, Stratified and Monte Carlo sampling in order to explore the epistemic space, without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its aleatory space exploration. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy. The DET is used to drive a Limit Surface search. The methodology developed by the authors last year performs a Limit Surface search in the aleatory space only. This report documents how this approach has been extended in order to consider the epistemic space interacting with the Hybrid Dynamic Event Tree methodology.
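The hybrid idea (a forward pre-sampling of the epistemic space, with an event-tree exploration of the aleatory space for each epistemic sample) can be illustrated with a deliberately tiny toy model. The sketch below is not the RAVEN HDET implementation; the Latin Hypercube bounds, branch probabilities and consequence logic are invented for illustration.

import itertools
import numpy as np
from scipy.stats import qmc

# epistemic uncertainty: demand-failure probability of a safety system
sampler = qmc.LatinHypercube(d=1, seed=42)
p_fail_samples = qmc.scale(sampler.random(n=20), l_bounds=[0.01], u_bounds=[0.10]).ravel()

per_sample_cdf = []
for p_fail in p_fail_samples:                       # outer, epistemic loop (LHS)
    cdf = 0.0
    # inner, aleatory event tree: two branch points (system works? operator acts?)
    for sys_ok, op_ok in itertools.product([True, False], repeat=2):
        p_branch = (1 - p_fail if sys_ok else p_fail) * (0.9 if op_ok else 0.1)
        core_damage = (not sys_ok) and (not op_ok)  # consequence of this sequence
        if core_damage:
            cdf += p_branch
    per_sample_cdf.append(cdf)

print(f"core-damage probability: mean {np.mean(per_sample_cdf):.4f}, "
      f"5th-95th pct [{np.percentile(per_sample_cdf, 5):.4f}, {np.percentile(per_sample_cdf, 95):.4f}]")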
NASA Astrophysics Data System (ADS)
Guler Yigitoglu, Askin
In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension, as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data, which means that aging effects are not reflected in a realistic manner for the true state of a specific plant. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: the development of a methodology for the incorporation of aging models of passive SSCs into a reactor simulation environment, to provide a framework for evaluation of their risk contribution in both dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe welds and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a multi-state physics-based model is selected to represent the aging process. The model is modified via a sojourn-time approach to reflect the operational and maintenance history dependence of the transition rates. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment, and uncertainties associated with both parameters and models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of: i) defining a process for selecting critical passive components and related aging mechanisms, ii) aging model selection, iii) calculating the probability that aging would cause the component to fail, iv) uncertainty/sensitivity analyses, v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures, and vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
Post-Structural Methodology at the Quilting Point: Intercultural Encounters.
Gillett, Grant
Lacan's quilting point connects a network of signifiers with the lived world as a place of voices, memory, and adaptation "seen in" the mirror of language. Crossing cultures can obscure the ways we make sense of the world. Some planes of signification, in aiming to be universal in their knowledge (such as the natural sciences), try to track objects and events independent of our thoughts about them and the ways that signifiers may slide past each other. However, cross-structural comparison and the analysis of cross cultural encounters cannot treat its objects of interest that way. Thus we need a theory and methodology that effectively connects the multilayered discourses of subjectivities from diverse cultures and allows triangulation between them in relation to points of shared experience. At such points we need a critical attitude to our own framework and an openness to the uneasy reflective equilibrium that uncovers assumptions and modes of thinking that will hamper us. Quilting points are such points where different discourses converge on a single event or set of events so as to mark "vertical" connections allowing tentative alignments between ways of meaning so that we can begin to build real cross-cultural understanding.
Sanami, T.; Iwamoto, Y.; Kajimoto, T.; ...
2011-12-06
Our methodology for the time-of-flight measurement of the neutron energy spectrum for a high-energy proton-beam-induced reaction was established at the Fermilab Test Beam Facility of the Fermi National Accelerator Laboratory. The 120-GeV proton beam with 3 × 10^5 protons/spill was prepared for event-by-event counting of incident protons and emitted neutrons for time-of-flight energy determination. An NE213 organic liquid scintillator (12.7 cm in diameter by 12.7 cm in length) was employed with a veto plastic scintillator and a pulse-shape discrimination technique to identify neutrons. Raw waveforms of the NE213, veto and beam detectors were recorded to discriminate the effects of multi-proton beam events by considering different time windows. The neutron energy spectrum ranging from 10 to 800 MeV was obtained for a 60-cm-long copper target at 90° with respect to the beam axis. Finally, the obtained spectrum was consistent with that deduced employing the conventional unfolding technique as well as with that obtained in a 40-GeV/c thin-target experiment.
Nuclear Forensics Attributing the Source of Spent Fuel Used in an RDD Event
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Mark Robert
2005-05-01
An RDD attack against the U.S. is something America needs to prepare against. If such an event occurs, the ability to quickly identify the source of the radiological material used in an RDD would aid investigators in identifying the perpetrators. Spent fuel is one of the most dangerous possible radiological sources for an RDD. In this work, a forensics methodology was developed and implemented to attribute spent fuel to a source reactor. The specific attributes determined are the spent fuel burnup, age from discharge, reactor type, and initial fuel enrichment. It is shown that by analyzing the post-event material, these attributes can be determined with enough accuracy to be useful for investigators. The burnup can be found to within 5% accuracy, enrichment to within 2% accuracy, and age to within 10% accuracy. Reactor type can be determined if specific nuclides are measured. The methodology developed was implemented in a code called NEMASYS. NEMASYS is easy to use and it takes a minimum amount of time to learn its basic functions. It will process data within a few minutes and provide detailed information about the results and conclusions.
Computational Electrocardiography: Revisiting Holter ECG Monitoring.
Deserno, Thomas M; Marx, Nikolaus
2016-08-05
Since 1942, when Goldberger introduced the 12-lead electrocardiography (ECG), this diagnostic method has not been changed. After 70 years of technologic developments, we revisit Holter ECG from recording to understanding. A fundamental change is foreseen towards "computational ECG" (CECG), where continuous monitoring produces big data volumes that are impossible to inspect conventionally but require efficient computational methods. We draw parallels between CECG and computational biology, in particular with respect to computed tomography, computed radiology, and computed photography. From that, we identify the technology and methodology needed for CECG. Real-time transfer of raw data into meaningful parameters that are tracked over time will allow prediction of serious events, such as sudden cardiac death. Evolved from Holter's technology, portable smartphones with Bluetooth-connected textile-embedded sensors will capture noisy raw data (recording), process meaningful parameters over time (analysis), and transfer them to cloud services for sharing (handling), predicting serious events, and alarming (understanding). To make this happen, the following fields need more research: (i) signal processing, (ii) cycle decomposition, (iii) cycle normalization, (iv) cycle modeling, (v) clinical parameter computation, (vi) physiological modeling, and (vii) event prediction. We shall start immediately developing methodology for CECG analysis and understanding.
NASA Astrophysics Data System (ADS)
Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick
2016-06-01
Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so that the ensuing spring PMF is a reasonable estimate. This is of particular importance in times of climate change (CC), since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology, with precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (the so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are of the same order of magnitude as those obtained with the traditional method and observed data, but are also found to depend strongly on the climate projection used and to show spatial variability.
Comparative effectiveness research methodology using secondary data: A starting user's guide.
Sun, Maxine; Lipsitz, Stuart R
2018-04-01
The use of secondary data, such as claims or administrative data, in comparative effectiveness research has grown tremendously in recent years. We believe that the current review can help investigators relying on secondary data to (1) gain insight into both the methodologies and statistical methods, (2) better understand the necessity of rigorous planning before initiating a comparative effectiveness investigation, and (3) optimize the quality of their investigations. Specifically, we review concepts of adjusted analyses and confounders, methods of propensity score analyses and instrumental variable analyses, risk prediction models (logistic and time-to-event), decision-curve analysis, as well as the interpretation of the P value and hypothesis testing. Overall, we hope that the current review article can help research investigators relying on secondary data to perform comparative effectiveness research better understand the necessity of rigorous planning before study start and gain better insight into the choice of statistical methods, so as to optimize the quality of the research study. Copyright © 2017 Elsevier Inc. All rights reserved.
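As one concrete illustration of the reviewed techniques, the sketch below estimates a propensity score by logistic regression and applies inverse-probability-of-treatment weighting (IPTW) to a simulated claims-like dataset. All column names, coefficients and data are hypothetical; the point is only the workflow of modelling treatment assignment and reweighting the outcome comparison.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({"age": rng.normal(65, 10, n),
                   "comorbidity": rng.integers(0, 5, n)})
# treatment assignment depends on the confounders, as in observational data
logit_t = -4 + 0.05 * df["age"] + 0.3 * df["comorbidity"]
df["treated"] = (rng.random(n) < 1 / (1 + np.exp(-logit_t))).astype(int)
logit_y = -3 + 0.04 * df["age"] + 0.4 * df["comorbidity"] - 0.5 * df["treated"]
df["outcome"] = (rng.random(n) < 1 / (1 + np.exp(-logit_y))).astype(int)

ps_model = LogisticRegression(max_iter=1000).fit(df[["age", "comorbidity"]], df["treated"])
ps = ps_model.predict_proba(df[["age", "comorbidity"]])[:, 1]
w = np.where(df["treated"] == 1, 1 / ps, 1 / (1 - ps))     # IPTW weights

is_t = df["treated"] == 1
risk_t = np.average(df.loc[is_t, "outcome"], weights=w[is_t])
risk_c = np.average(df.loc[~is_t, "outcome"], weights=w[~is_t])
print(f"IPTW-adjusted risk difference: {risk_t - risk_c:.3f}")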
Lee, Wang Wei; Kukreja, Sunil L.; Thakor, Nitish V.
2017-01-01
This paper presents a neuromorphic tactile encoding methodology that utilizes a temporally precise event-based representation of sensory signals. We introduce a novel concept where touch signals are characterized as patterns of millisecond precise binary events to denote pressure changes. This approach is amenable to a sparse signal representation and enables the extraction of relevant features from thousands of sensing elements with sub-millisecond temporal precision. We also proposed measures adopted from computational neuroscience to study the information content within the spiking representations of artificial tactile signals. Implemented on a state-of-the-art 4096 element tactile sensor array with 5.2 kHz sampling frequency, we demonstrate the classification of transient impact events while utilizing 20 times less communication bandwidth compared to frame based representations. Spiking sensor responses to a large library of contact conditions were also synthesized using finite element simulations, illustrating an 8-fold improvement in information content and a 4-fold reduction in classification latency when millisecond-precise temporal structures are available. Our research represents a significant advance, demonstrating that a neuromorphic spatiotemporal representation of touch is well suited to rapid identification of critical contact events, making it suitable for dynamic tactile sensing in robotic and prosthetic applications. PMID:28197065
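The core encoding step described above can be sketched as a simple delta (level-crossing) scheme: each taxel emits a timestamped ON or OFF event whenever its pressure has changed by more than a threshold since its last event. The sampling rate and threshold below are assumptions for illustration, not the parameters of the 4096-element array in the paper.

import numpy as np

def encode_events(pressure, fs=5200.0, threshold=0.05):
    # pressure: array of shape (n_samples, n_taxels)
    # returns a list of (time_ms, taxel_index, polarity) events
    n_samples, _ = pressure.shape
    last = pressure[0].copy()                 # reference value per taxel
    events = []
    for i in range(1, n_samples):
        delta = pressure[i] - last
        for taxel in np.flatnonzero(np.abs(delta) > threshold):
            events.append((1000.0 * i / fs, int(taxel), 1 if delta[taxel] > 0 else -1))
            last[taxel] = pressure[i, taxel]  # reset the reference after an event
    return events

# usage example: a brief press-and-release on taxel 3 of an 8-taxel patch
sig = np.zeros((50, 8))
sig[10:15, 3] = np.linspace(0.0, 1.0, 5)      # rising contact pressure
sig[15:20, 3] = np.linspace(1.0, 0.0, 5)      # release
print(encode_events(sig)[:5])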
Rueger, Sandra Yu; George, Rachel
2017-04-01
Research on adolescent depression has overwhelmingly focused on risk factors, such as stressful negative events and cognitive vulnerabilities, but much important information can be gained by focusing on protective factors. Thus, the current study aimed to broaden understanding on adolescent depression by considering the role of two positive elements as protective factors, attributional style for positive events and self-esteem, in a model of depression. The sample included 491 middle school students (52 % female; n = 249) with an age range from 12 to 15 years (M = 13.2, SD = .70). The sample was ethnically/racially diverse, with 55 % White, 22 % Hispanic, 10 % Asian American, 3 % African American, and 10 % Biracial/Other. Correlational analyses indicated significant cross-sectional and longitudinal associations between an enhancing attributional style (internal, stable, global attributions for positive events), self-esteem and depressive symptoms. Further, prospective analyses using bootstrapping methodology demonstrated significant indirect effects of an enhancing attributional style on decreases in depressive symptoms through its effects on self-esteem. These findings highlight the importance of considering attributional style for positive events as a protective factor in the developmental course of depressive symptoms during early adolescence.
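For readers unfamiliar with the bootstrapping approach mentioned above, the sketch below resamples a dataset, estimates the two paths of a simple mediation model (attributional style to self-esteem, and self-esteem to depressive symptoms controlling for attributional style), and forms a percentile confidence interval for the indirect effect. The variables and effect sizes are simulated placeholders, not the study's data.

import numpy as np

rng = np.random.default_rng(0)
n = 491
attrib = rng.normal(size=n)                      # enhancing attributional style
esteem = 0.4 * attrib + rng.normal(size=n)       # self-esteem (mediator)
depress = -0.5 * esteem + rng.normal(size=n)     # depressive symptoms (outcome)

def indirect_effect(idx):
    x, m, y = attrib[idx], esteem[idx], depress[idx]
    a = np.polyfit(x, m, 1)[0]                   # a-path: slope of m on x
    X = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]  # b-path: slope of y on m, adjusting for x
    return a * b

boot = np.array([indirect_effect(rng.integers(0, n, n)) for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect: {indirect_effect(np.arange(n)):.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")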
An unjustified benefit: immortal time bias in the analysis of time-dependent events.
Gleiss, Andreas; Oberbauer, Rainer; Heinze, Georg
2018-02-01
Immortal time bias is a problem arising from methodologically wrong analyses of time-dependent events in survival analyses. We illustrate the problem by analysis of a kidney transplantation study. Following patients from transplantation to death, groups defined by the occurrence or nonoccurrence of graft failure during follow-up seemingly had equal overall mortality. Such naive analysis assumes that patients were assigned to the two groups at time of transplantation, which actually are a consequence of occurrence of a time-dependent event later during follow-up. We introduce landmark analysis as the method of choice to avoid immortal time bias. Landmark analysis splits the follow-up time at a common, prespecified time point, the so-called landmark. Groups are then defined by time-dependent events having occurred before the landmark, and outcome events are only considered if occurring after the landmark. Landmark analysis can be easily implemented with common statistical software. In our kidney transplantation example, landmark analyses with landmarks set at 30 and 60 months clearly identified graft failure as a risk factor for overall mortality. We give further typical examples from transplantation research and discuss strengths and limitations of landmark analysis and other methods to address immortal time bias such as Cox regression with time-dependent covariables. © 2017 Steunstichting ESOT.
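A landmark analysis of this kind is straightforward to code. The hedged sketch below assumes a pandas DataFrame `df` with one row per patient, hypothetical columns for follow-up time in months, a death indicator and the time of graft failure (NaN if it never occurred), and uses the lifelines package for the Cox model; it illustrates the landmark logic only, not the study's actual analysis.

import pandas as pd
from lifelines import CoxPHFitter

def landmark_cox(df, landmark=30.0):
    # keep only patients still under follow-up (and alive) at the landmark time
    at_risk = df[df["followup_months"] > landmark].copy()
    # group membership is defined by the time-dependent event occurring BEFORE the landmark
    at_risk["graft_failed_by_lm"] = (at_risk["graft_failure_months"] <= landmark).astype(int)
    # the clock restarts at the landmark; only outcome events after it are counted
    at_risk["time_since_lm"] = at_risk["followup_months"] - landmark
    cph = CoxPHFitter()
    cph.fit(at_risk[["time_since_lm", "death", "graft_failed_by_lm"]],
            duration_col="time_since_lm", event_col="death")
    return cph

# usage (with a suitable DataFrame): landmark_cox(df, landmark=30.0).print_summary()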
Daly, Ella J; Trivedi, Madhukar H; Fava, Maurizio; Shelton, Richard; Wisniewski, Stephen R; Morris, David W; Stegman, Diane; Preskorn, Sheldon H; Rush, A John
2011-02-01
Little is known about the association between antidepressant treatment-emergent adverse events and symptom nonremission in major depressive disorder. The objective of the current analysis was to determine whether particular baseline symptoms or treatment-emergent symptoms (adverse events) during the first 2 weeks are associated with nonremission after 8 weeks of treatment with a selective serotonin reuptake inhibitor (SSRI). Outpatients clinically diagnosed with nonpsychotic major depressive disorder were recruited from 6 primary and 9 psychiatric care sites. Participants (n = 206) were treated with an SSRI antidepressant (citalopram [20-40 mg/d], escitalopram [10-20 mg/d], fluoxetine [20-40 mg/d], paroxetine [20-40 mg/d], paroxetine CR [25-37.5 mg/d], or sertraline [50-150 mg/d]) for 8 weeks. Remission was defined as having a score of 5 or less on the 16-item Quick Inventory of Depressive Symptomatology-Clinician-Rated at week 8, or using last observation carried forward. Adverse events were identified using the 55-item Systematic Assessment for Treatment Emergent Events-Systematic Inquiry completed by participants at baseline and week 2. Findings indicated that the emergence of the adverse events of weakness/fatigue, strange feeling, and trouble catching breath/hyperventilation at week 2 was independently associated with lack of remission even after controlling for the potential confounders of baseline depressive severity, anxious symptoms, antidepressant medication, chronic depression, race, burden of general medical comorbidity, and time in study. Hearing/seeing things appeared to have a protective effect. In conclusion, during SSRI treatment, the adverse events of weakness/fatigue, feeling strange, and trouble catching breath/hyperventilation are associated with nonremission, possibly due to lower adherence, early attrition, difficulty increasing the dose, and reduced efficacy.
Estarellas Martin, Carolina; Seira Castan, Constantí; Luque Garriga, F Javier; Bidon-Chanal Badia, Axel
2015-10-01
Residue conformational changes and internal cavity migration processes play a key role in regulating the kinetics of ligand migration and binding events in globins. Molecular dynamics simulations have demonstrated their value in the study of these processes in different haemoglobins, but derivation of kinetic data demands the use of more complex techniques like enhanced sampling molecular dynamics methods. This review discusses the different methodologies that are currently applied to study the ligand migration process in globins and highlights those specially developed to derive kinetic data. Copyright © 2015 Elsevier Ltd. All rights reserved.
[Influence of cumulated sexual trauma on sexual life and relationship of a patient].
Sobański, Jerzy A; Klasa, Katarzyna; Cyranka, Katarzyna; Müldner-Nieckowski, Lukasz; Dembińska, Edyta; Rutkowski, Krzysztof; Smiatek-Mazgaj, Bogna; Mielimaka, Michal
2014-01-01
The aim was to assess the links between accumulated traumatic events of a sexual nature, recollected from the past, and the patients' current functioning in the area of sexual life and partner relationships. The co-occurrence of memories of traumatic sexual events from childhood and puberty with features of the patients' current partner relationships and sexual life was analyzed on the basis of the Life Inventory completed by 2,582 women and 1,347 men before treatment in a day hospital (1980-2002). The accumulation was evaluated for combinations of two or three selected events. The presence of relatively numerous traumatic events in the field of sexuality was indicated: early or enforced sexual initiation, incest or attempted incest, sub-optimal sexual education, and punishment for masturbation. In some patients, these events occurred simultaneously. Especially in women, the presence in the same person of two or three such aggravating life circumstances was associated with a higher risk of, among others, fleeting casual sexual contacts, marriage under the pressure of the environment, and reluctance towards the partner. Increased accumulation - the presence in the same patient of more than one adverse circumstance associated with sexual development - leads to a higher incidence of disturbances in the relationship with a partner, including elements of sexual dysfunction. The obtained results are generally consistent with clinical observations and the literature, despite the different, simplified methodology of the study, based on the analysis of single variables from questionnaire interviews. The finding of fewer links in the group of men can be explained by their much lower number in the study group and by their less frequent burdening with certain traumatic events or different ways of experiencing them.
NASA Astrophysics Data System (ADS)
Prosdocimi, Massimo; Calligaro, Simone; Sofia, Giulia; Tarolli, Paolo
2015-04-01
Throughout the world, agricultural landscapes are of great importance, especially for supplying food and a livelihood. Among land degradation phenomena, erosion processes caused by water are those that may most affect the benefits provided by agricultural lands and endanger the people who work and live there. In particular, erosion processes that affect the banks of agricultural channels may cause bank failure and represent, in this way, a severe threat to floodplain inhabitants and agricultural crops. Similarly, rills and gullies are critical soil erosion processes as well, because they bear upon the productivity of a farm and represent a cost that growers have to deal with. To estimate soil losses due to bank erosion and rill processes quantitatively, area-based measurements of surface changes are necessary but, sometimes, they may be difficult to realize. In fact, surface changes due to short-term events have to be represented at fine resolution, and their monitoring may require too much money and time. The main objective of this work is to show the effectiveness of a user-friendly and low-cost technique, which may even rely on smartphones, for the post-event analysis of (i) bank erosion affecting agricultural channels and (ii) rill processes occurring on an agricultural plot. Two case studies were selected, located in the Veneto floodplain (northeast Italy) and the Marche countryside (central Italy), respectively. The work is based on high-resolution topographic data obtained by the emerging, low-cost photogrammetric method named Structure-from-Motion (SfM). Extensive photosets of the case studies were obtained using both standalone reflex digital cameras and smartphone built-in cameras. Digital Terrain Models (DTMs) derived from SfM proved effective for quantitative estimation of erosion volumes and, in the case of the eroded bank, of the deposited materials as well. SfM applied to pictures taken by smartphones is useful for the analysis of topography and Earth surface processes at very low cost. This methodology should be of great help for farmers and/or technicians who work at Land Reclamation Consortia or Civil Protection agencies in carrying out suitable post-event field surveys in support of flood risk and soil management.
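Once pre- and post-event DTMs are available from SfM, the volume computation itself is a simple grid difference. The sketch below assumes two co-registered elevation arrays with a common cell size and a level of detection used to suppress photogrammetric noise; the arrays and numbers are synthetic placeholders, not the Veneto or Marche surveys.

import numpy as np

def erosion_deposition_volumes(dtm_pre, dtm_post, cell_size, lod=0.02):
    # cell_size in metres; lod = level of detection [m] below which changes are ignored
    dz = dtm_post - dtm_pre                     # negative values = surface lowering
    dz = np.where(np.abs(dz) < lod, 0.0, dz)
    cell_area = cell_size ** 2
    eroded = float(-dz[dz < 0].sum() * cell_area)      # m^3 removed
    deposited = float(dz[dz > 0].sum() * cell_area)    # m^3 accumulated
    return eroded, deposited

# usage example on synthetic 0.05 m resolution grids with a small bank scar
rng = np.random.default_rng(0)
pre = rng.normal(0.0, 0.005, (200, 200))
post = pre.copy()
post[80:120, 90:110] -= 0.15
print(erosion_deposition_volumes(pre, post, cell_size=0.05))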
Fragility Analysis of Concrete Gravity Dams
NASA Astrophysics Data System (ADS)
Tekie, Paulos B.; Ellingwood, Bruce R.
2002-09-01
Concrete gravity dams are an important part of the nation's infrastructure. Many dams have been in service for over 50 years, during which time important advances in the methodologies for evaluation of natural phenomena hazards have caused the design-basis events to be revised upwards, in some cases significantly. Many existing dams fail to meet these revised safety criteria, and structural rehabilitation to meet newly revised criteria may be costly and difficult. A probabilistic safety analysis (PSA) provides a rational safety assessment and decision-making tool for managing the various sources of uncertainty that may impact dam performance. Fragility analysis, which depicts the uncertainty in the safety margin above specified hazard levels, is a fundamental tool in a PSA. This study presents a methodology for developing fragilities of concrete gravity dams to assess their performance against hydrologic and seismic hazards. Models of varying degrees of complexity and sophistication were considered and compared. The methodology is illustrated using the Bluestone Dam on the New River in West Virginia, which was designed in the late 1930s. The hydrologic fragilities showed that the Bluestone Dam is unlikely to become unstable at the revised probable maximum flood (PMF), but it is likely that there will be significant cracking at the heel of the dam. On the other hand, the seismic fragility analysis indicated that sliding is likely if the dam were to be subjected to a maximum credible earthquake (MCE). Moreover, there will likely be tensile cracking at the neck of the dam at this level of seismic excitation. Probabilities of relatively severe limit states appear to be only marginally affected by extremely rare events (e.g. the PMF and MCE). Moreover, the risks posed by extreme floods and earthquakes were not balanced for the Bluestone Dam, with the seismic hazard posing a relatively higher risk.
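Fragility results of the kind discussed above are commonly summarized as the conditional probability of reaching a limit state versus hazard intensity, often modelled as a lognormal curve with a median capacity and a logarithmic dispersion. The short sketch below evaluates such a curve; the median capacity and dispersion are placeholders, not values derived for the Bluestone Dam.

import numpy as np
from scipy.stats import norm

def fragility(intensity, median_capacity, beta):
    # P(limit state | intensity) under a lognormal fragility model
    return norm.cdf(np.log(intensity / median_capacity) / beta)

# e.g. probability of a sliding limit state versus peak ground acceleration [g]
pga = np.linspace(0.05, 1.0, 20)
p_slide = fragility(pga, median_capacity=0.45, beta=0.5)
for a, p in zip(pga[::5], p_slide[::5]):
    print(f"PGA = {a:.2f} g -> P(sliding) = {p:.2f}")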
Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon
NASA Astrophysics Data System (ADS)
Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.
2015-12-01
Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources, including recent landslide inventories, LIDAR and photogrammetric topographic data, a geologic map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
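The rigid sliding-block step mentioned above follows Newmark's idea: the block accumulates displacement whenever ground acceleration exceeds the critical (yield) acceleration of the slope, and the relative velocity is integrated until it returns to zero. The simplified sketch below uses a synthetic ground motion and an assumed critical acceleration; it illustrates the integration scheme only, not the probabilistic pixel-by-pixel implementation of the study.

import numpy as np

def newmark_displacement(acc, dt, a_crit):
    # acc and a_crit in m/s^2, dt in s; returns permanent downslope displacement [m]
    vel, disp = 0.0, 0.0
    for a in acc:
        rel = a - a_crit if (a > a_crit or vel > 0.0) else 0.0  # sliding phase only
        vel = max(vel + rel * dt, 0.0)          # the block cannot slide upslope
        disp += vel * dt
    return disp

# synthetic, decaying pulse-like ground motion at a 0.01 s time step
dt = 0.01
t = np.arange(0.0, 10.0, dt)
acc = 3.0 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)      # m/s^2
print(f"Newmark displacement: {newmark_displacement(acc, dt, a_crit=0.8):.3f} m")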
Using soft systems methodology to develop a simulation of out-patient services.
Lehaney, B; Paul, R J
1994-10-01
Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems that are difficult to address using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the out-patient department at a local hospital. The long-term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.
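For readers unfamiliar with discrete event simulation, the following sketch models a generic out-patient clinic as a queue for consultation rooms using the simpy library; the arrival rate, consultation time, and capacity are illustrative assumptions, not parameters of the hospital studied.

import random
import simpy

random.seed(1)

def patient(env, clinic, waits):
    arrival = env.now
    with clinic.request() as req:                        # join the queue for a consultation room
        yield req
        waits.append(env.now - arrival)                  # record waiting time
        yield env.timeout(random.expovariate(1 / 15.0))  # consultation lasts ~15 min on average

def arrivals(env, clinic, waits):
    while True:
        yield env.timeout(random.expovariate(1 / 10.0))  # a new patient every ~10 min on average
        env.process(patient(env, clinic, waits))

env = simpy.Environment()
clinic = simpy.Resource(env, capacity=2)                 # two consultation rooms
waits = []
env.process(arrivals(env, clinic, waits))
env.run(until=8 * 60)                                    # one 8-hour clinic session, in minutes
print(f"{len(waits)} patients seen, mean wait {sum(waits) / len(waits):.1f} min")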
NASA Astrophysics Data System (ADS)
Alzbutas, Robertas
2015-04-01
In general, Emergency Planning Zones (EPZ) are defined, and plant site and arrangement structures are designed, to minimize the potential for natural and man-made hazards external to the plant to affect the plant's safety-related functions, which can in turn affect the nearby population and environment. This may include consideration of extreme winds, fires, flooding, aircraft crash, seismic activity, etc. Thus the design basis for the plant and site is deeply related to the effects of any postulated external events and to the limitation of the plant's capability to cope with accidents, i.e. to perform safety functions. It has been observed that the Probabilistic Safety Assessment (PSA) methodologies for dealing with the EPZ and extreme external events have not reached the same level of maturity as those for severe internal events. As a prime example of an advanced reactor and new Nuclear Power Plant (NPP) with enhanced safety, the International Reactor Innovative and Secure (IRIS) and the site selection for a new NPP in Lithuania were considered in this work. In the Safety-by-Design™ approach used, the PSA obviously played a key role; therefore a preliminary IRIS PSA was developed along with the design. For the design and pre-licensing process of IRIS, the external events analysis included both qualitative evaluation and quantitative assessment. As a result of the preliminary qualitative analyses, the external events chosen for more detailed quantitative scoping evaluation were high winds and tornadoes, aircraft crash, and seismic events. For the site selection in Lithuania, a detailed site evaluation process was performed and related to the EPZ and risk zoning considerations. In general, applying the quantitative assessment, bounding site characteristics could be used in order to optimize a potential redefinition of, or future restrictions on, plant siting and risk zoning. It must be noted that the use of existing regulations and installations as the basis for this redefinition will not in any way impact the high degree of conservatism inherent in current regulations. Moreover, the remapping process makes this methodology partially independent from the uncertainties still affecting probabilistic techniques. Notwithstanding these considerations, it is still expected that applying this methodology to advanced plant designs with improved safety features will allow significant changes in the emergency planning requirements, and specifically in the size of the EPZ. In particular, in the case of IRIS it is expected that taking full credit for the Safety-by-Design™ approach of the IRIS reactor will allow dramatic changes in the EPZ, while still maintaining a level of protection of the public fully consistent with existing regulations.
Takada, Mitsutaka; Fujimoto, Mai; Motomura, Haruka; Hosomi, Kouichi
2016-01-01
Voltage-gated sodium channels (VGSCs) are drug targets for the treatment of epilepsy. Recently, a decreased risk of cancer associated with sodium channel-blocking antiepileptic drugs (AEDs) has become a research focus of interest. The purpose of this study was to test the hypothesis that the use of sodium channel-blocking AEDs is inversely associated with cancer, using different methodologies, algorithms, and databases. A total of 65,146,507 drug-reaction pairs from the first quarter of 2004 through the end of 2013 were downloaded from the US Food and Drug Administration Adverse Event Reporting System (FAERS). The reporting odds ratio (ROR) and information component (IC) were used to detect an inverse association between AEDs and cancer. Upper limits of the 95% confidence interval (CI) of < 1 and < 0 for the ROR and IC, respectively, signified inverse associations. Furthermore, using a claims database that contains 3 million insured persons, an event sequence symmetry analysis (ESSA) was performed to identify an inverse association between AEDs and cancer over the period of January 2005 to May 2014. An upper limit of the 95% CI of the adjusted sequence ratio (ASR) < 1 signified an inverse association. In the FAERS database analyses, significant inverse associations were found between sodium channel-blocking AEDs and individual cancers. In the claims database analyses, sodium channel-blocking AED use was inversely associated with diagnoses of colorectal cancer, lung cancer, gastric cancer, and hematological malignancies, with ASRs of 0.72 (95% CI: 0.60 - 0.86), 0.65 (0.51 - 0.81), 0.80 (0.65 - 0.98), and 0.50 (0.37 - 0.66), respectively. Positive associations between sodium channel-blocking AEDs and cancer were not found in the study. Multi-methodological approaches using different methodologies, algorithms, and databases suggest that sodium channel-blocking AED use is inversely associated with colorectal cancer, lung cancer, gastric cancer, and hematological malignancies.
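For orientation, the sketch below shows how the reporting odds ratio and a simple point estimate of the information component can be computed from a 2x2 table of report counts; the counts are hypothetical, and the study's IC additionally relies on a Bayesian credibility interval that is not reproduced here.

import math

# Hypothetical 2x2 counts from a spontaneous-report database:
# a: target drug & target event, b: target drug & other events,
# c: other drugs & target event, d: other drugs & other events
a, b, c, d = 40, 9600, 1200, 150000

ror = (a / b) / (c / d)
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low, ci_high = (math.exp(math.log(ror) + z * se_log) for z in (-1.96, 1.96))

n = a + b + c + d
ic = math.log2(a * n / ((a + b) * (a + c)))  # point estimate only; in practice the IC
                                             # is computed with Bayesian shrinkage

print(f"ROR = {ror:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f}), IC = {ic:.2f}")
# An inverse association is signalled when the upper CI limit of the ROR is < 1
# (or the upper credibility limit of the IC is < 0).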
Power, Robert A; Cohen-Woods, Sarah; Ng, Mandy Y; Butler, Amy W; Craddock, Nick; Korszun, Ania; Jones, Lisa; Jones, Ian; Gill, Michael; Rice, John P; Maier, Wolfgang; Zobel, Astrid; Mors, Ole; Placentino, Anna; Rietschel, Marcella; Aitchison, Katherine J; Tozzi, Federica; Muglia, Pierandrea; Breen, Gerome; Farmer, Anne E; McGuffin, Peter; Lewis, Cathryn M; Uher, Rudolf
2013-09-01
Stressful life events are an established trigger for depression and may contribute to the heterogeneity within genome-wide association analyses. With depression cases showing an excess of exposure to stressful events compared to controls, there is difficulty in distinguishing between "true" cases and a "normal" response to a stressful environment. This potential contamination of cases, and contamination from genetically at-risk controls that have not yet experienced environmental triggers for onset, may reduce the power of studies to detect causal variants. In the RADIANT sample of 3,690 European individuals, we used propensity score matching to pair cases and controls on exposure to stressful life events. In 805 case-control pairs matched on stressful life events, we tested the influence of 457,670 common genetic variants on the propensity to depression under comparable levels of adversity with a sign test. While this analysis produced no significant findings after genome-wide correction for multiple testing, we outline a novel methodology and perspective for providing environmental context in genetic studies. We recommend contextualizing depression by incorporating environmental exposure into genome-wide analyses as a complementary approach to testing gene-environment interactions. Possible explanations for negative findings include a lack of statistical power due to small sample size and conditional effects, resulting from the low rate of adequate matching. Our findings underscore the importance of collecting information on environmental risk factors in studies of depression and other complex phenotypes, so that sufficient sample sizes are available to investigate their effect in genome-wide association analysis. Copyright © 2013 Wiley Periodicals, Inc.
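The sketch below illustrates, on simulated data, the general shape of the analysis described above: a propensity score for case status is estimated from exposure to stressful events, cases are matched to the nearest-propensity controls, and a sign test is applied to a single variant within the matched pairs. The simulated exposure, genotypes, and matching details are assumptions for illustration only, not the RADIANT analysis pipeline.

import numpy as np
from scipy.stats import binomtest
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 2000
events = rng.poisson(2, n)                                 # stressful life event counts (simulated)
case = rng.binomial(1, 1 / (1 + np.exp(-(events - 2))))    # cases are over-exposed by construction

# Propensity of being a case given exposure, then nearest-neighbour matching on the score
ps = LogisticRegression().fit(events.reshape(-1, 1), case).predict_proba(events.reshape(-1, 1))[:, 1]
cases, controls = np.where(case == 1)[0], np.where(case == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
_, idx = nn.kneighbors(ps[cases].reshape(-1, 1))
pairs = list(zip(cases, controls[idx[:, 0]]))

# Sign test for one variant: does the case carry more risk alleles than its matched control?
genotype = rng.integers(0, 3, n)                           # 0/1/2 risk-allele counts (simulated)
diff = [genotype[i] - genotype[j] for i, j in pairs if genotype[i] != genotype[j]]
wins = sum(d > 0 for d in diff)
print(binomtest(wins, len(diff), 0.5).pvalue)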
Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)
NASA Astrophysics Data System (ADS)
Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.
2013-12-01
We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
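As a minimal illustration of the tail-dependence idea, the following sketch computes the empirical measure chi(u), the probability that one series exceeds its u-quantile given that the other does, on simulated daily precipitation proxies; the data and threshold choices are illustrative assumptions, not NARCCAP output.

import numpy as np

rng = np.random.default_rng(42)
n = 10000
common = rng.gumbel(size=n)
obs   = common + 0.5 * rng.gumbel(size=n)   # "observed" daily precipitation proxy
model = common + 0.5 * rng.gumbel(size=n)   # "simulated" daily precipitation proxy

def chi(x, y, u):
    """Empirical chi(u) = P(Y above its u-quantile | X above its u-quantile).
    Values tending to 0 as u -> 1 suggest asymptotic independence of the extremes;
    values near 1 suggest strong tail dependence."""
    qx, qy = np.quantile(x, u), np.quantile(y, u)
    above_x = x > qx
    return np.mean(y[above_x] > qy)

for u in (0.90, 0.95, 0.99):
    print(u, round(chi(obs, model, u), 3))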
Pernot-Marino, Elodie; Danion, Jean-Marie; Hedelin, Guy
2004-08-01
Conscious recollection of an autobiographical memory is the subjective experience of mentally reliving a personal event. Its frequency is strongly influenced by the emotion experienced at the time of the event. We addressed the issue of whether conscious recollection of autobiographical memories is also influenced by the emotion experienced at the time of retrieval. We used lorazepam, a benzodiazepine, as a pharmacological tool to modulate this emotional experience. Autobiographical memories were recorded in eight healthy volunteers using a diary study methodology. Each day, four entries were made by each subject: two true events, one altered event and one false event. For each event, the subjects were asked to rate emotional variables at encoding and at retrieval. Two months later, there were two sessions of recognition tests during which the subjects received an acute oral administration of either lorazepam (0.038 mg/kg) or placebo in a cross-over design. Subjective states of awareness were assessed using the Remember/Know/Guess procedure. Compared to placebo, lorazepam increased levels of conscious recollection, as assessed by Remember responses, for both true and false memories and induced an overestimation of the personal significance and emotional intensity of past events. Structural equation modelling showed that this overestimation was causal in the increased frequency of conscious recollection. Our results provide experimental evidence that the frequency of conscious recollection of both true and false autobiographical memories is influenced by the emotion experienced at the time of retrieval.
NASA Astrophysics Data System (ADS)
Zhu, Wei; Udalski, A.; Calchi Novati, S.; Chung, S.-J.; Jung, Y. K.; Ryu, Y.-H.; Shin, I.-G.; Gould, A.; Lee, C.-U.; Albrow, M. D.; Yee, J. C.; Han, C.; Hwang, K.-H.; Cha, S.-M.; Kim, D.-J.; Kim, H.-W.; Kim, S.-L.; Kim, Y.-H.; Lee, Y.; Park, B.-G.; Pogge, R. W.; KMTNet Collaboration; Poleski, R.; Mróz, P.; Pietrukowicz, P.; Skowron, J.; Szymański, M. K.; Kozłowski, S.; Ulaczyk, K.; Pawlak, M.; OGLE Collaboration; Beichman, C.; Bryden, G.; Carey, S.; Fausnaugh, M.; Gaudi, B. S.; Henderson, C. B.; Shvartzvald, Y.; Wibking, B.; Spitzer Team
2017-11-01
We analyze an ensemble of microlensing events from the 2015 Spitzer microlensing campaign, all of which were densely monitored by ground-based high-cadence survey teams. The simultaneous observations from Spitzer and the ground yield measurements of the microlensing parallax vector π_E, from which compact constraints on the microlens properties are derived, including ≲25% uncertainties on the lens mass and distance. With the current sample, we demonstrate that the majority of microlenses are indeed in the mass range of M dwarfs. The planet sensitivities of all 41 events in the sample are calculated, from which we provide constraints on the planet distribution function. In particular, assuming a planet distribution function that is uniform in log q, where q is the planet-to-star mass ratio, we find a 95% upper limit on the fraction of stars that host typical microlensing planets of 49%, which is consistent with previous studies. Based on this planet-free sample, we develop the methodology to statistically study the Galactic distribution of planets using microlensing parallax measurements. Under the assumption that the planet distributions are the same in the bulge as in the disk, we predict that ∼1/3 of all planet detections from the microlensing campaigns with Spitzer should be in the bulge. This prediction will be tested with a much larger sample, and deviations from it can be used to constrain the abundance of planets in the bulge relative to the disk.
NASA Astrophysics Data System (ADS)
Bedrina, T.; Parodi, A.; Quarati, A.; Clematis, A.; Rebora, N.; Laiosa, D.
2012-04-01
One of the critical issues in Hydro-Meteorological Research (HMR) is a better exploitation of data archives from a multidisciplinary perspective. Different Earth science databases offer a huge amount of observational data, which often need to be assembled, processed, and combined according to HM scientists' needs. Cooperation between scientists active in HMR and in Information and Communication Technologies (ICT) is essential for the development of innovative tools and applications for manipulating, aggregating and re-arranging heterogeneous information in a flexible way. This paper describes an application devoted to the collection and integration of HM datasets, originated by public or private sources and freely exposed via Web service APIs. The application uses mashup technology concepts, which have recently become very popular in many fields (Chow S.-W., 2007). Such a methodology means combining data and/or programs published by external online sources into an integrated experience. Mashups seem to be a promising methodology for responding to the multiple data-related activities in which HM researchers are involved daily (e.g. finding and retrieving high-volume data; learning formats and developing readers; extracting parameters; performing filtering and masking; developing analysis and visualization tools). The specific case study of the recent extreme rainfall event that occurred over Genoa, Italy, on 4 November 2011 is presented through the integration of semi-professional weather observational networks, as a freely available data source, in addition to official weather networks.
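To make the mashup idea concrete, the sketch below merges two hypothetical rainfall feeds (an official gauge and a semi-professional station) on a common time axis with pandas; the station names, values, and tolerance are invented for illustration and do not correspond to the Genoa dataset.

import pandas as pd

# Hypothetical feeds: an official gauge and a semi-professional station, each exposing
# timestamped rainfall totals (in practice retrieved through a Web service API).
hours = pd.to_datetime("2011-11-04 00:00") + pd.to_timedelta(range(6), unit="h")
official = pd.DataFrame({"time": hours,
                         "rain_mm_official": [0.2, 1.5, 12.0, 35.0, 20.0, 4.0]})
amateur = pd.DataFrame({"time": hours + pd.Timedelta(minutes=20),
                        "rain_mm_amateur": [0.0, 2.1, 10.5, 40.2, 18.7, 3.5]})

# Align the two networks on the nearest timestamp (within 30 minutes) and compare hourly totals
merged = pd.merge_asof(official.sort_values("time"), amateur.sort_values("time"),
                       on="time", direction="nearest", tolerance=pd.Timedelta("30min"))
print(merged)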
Abstracting event-based control models for high autonomy systems
NASA Technical Reports Server (NTRS)
Luh, Cheng-Jye; Zeigler, Bernard P.
1993-01-01
A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.
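As a toy illustration of the kind of model targeted here, the following sketch defines a minimal DEVS-style atomic model in Python, with a time-advance function, internal and external transitions, and an output function, driven by a simple event-based loop. It is a generic sketch under those assumptions, not the multifaceted modeling tooling described in the paper.

class AtomicThreshold:
    """Minimal DEVS-style atomic model: watches a sensor value and emits an
    event when it crosses a threshold (hypothetical example)."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.phase = "passive"

    def time_advance(self):
        # Time until the next internal event: immediate when armed, never when passive.
        return 0.0 if self.phase == "armed" else float("inf")

    def external_transition(self, value):
        # React to an input event (a new sensor reading).
        if value > self.threshold:
            self.phase = "armed"

    def output(self):
        return "threshold_crossed"

    def internal_transition(self):
        self.phase = "passive"

# Tiny event-based control loop driving the model with a stream of readings
model = AtomicThreshold(threshold=10.0)
for t, reading in [(0.0, 3.2), (1.0, 7.9), (2.0, 12.4), (3.0, 5.0)]:
    model.external_transition(reading)
    if model.time_advance() == 0.0:
        print(f"t={t}: {model.output()}")
        model.internal_transition()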
Sen, Novonil; Kundu, Tribikram
2018-07-01
Estimating the location of an acoustic source in a structure is an important step towards passive structural health monitoring. Techniques for localizing an acoustic source in isotropic structures are well developed in the literature. Development of similar techniques for anisotropic structures, however, has gained attention only in recent years and has scope for further improvement. Most of the existing techniques for anisotropic structures either assume a straight-line wave propagation path between the source and an ultrasonic sensor or require the material properties to be known. This study considers different shapes of the wave front generated during an acoustic event and develops a methodology to localize the acoustic source in an anisotropic plate from those wave front shapes. An elliptical wave front shape-based technique was developed first, followed by the development of a parametric curve-based technique for non-elliptical wave front shapes. The source coordinates are obtained by minimizing an objective function. The proposed methodology does not assume a straight-line wave propagation path and can predict the source location without any knowledge of the elastic properties of the material. A numerical study presented here illustrates how the proposed methodology can accurately estimate the source coordinates. Copyright © 2018 Elsevier B.V. All rights reserved.
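As a rough illustration of the optimization step mentioned above, the sketch below localizes a source on a plate by minimizing a least-squares objective built from arrival-time differences, assuming an elliptical wave front with different speeds along two principal axes. The sensor layout, wave speeds, and objective function are illustrative assumptions, not the formulation used by Sen and Kundu.

import numpy as np
from scipy.optimize import minimize

# Hypothetical anisotropic plate: wave speed vx along the fibres, vy across them,
# so the wave front is an ellipse. Four sensors at known positions (m).
vx, vy = 6000.0, 4000.0
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_source = np.array([0.3, 0.7])

def travel_time(src, sensor):
    dx, dy = sensor - src
    return np.hypot(dx / vx, dy / vy)   # elliptical front: direction-scaled metric

arrivals = np.array([travel_time(true_source, s) for s in sensors])

def objective(src):
    # Arrival-time differences relative to the first sensor, so the unknown
    # emission time cancels out.
    t = np.array([travel_time(src, s) for s in sensors])
    return np.sum(((t - t[0]) - (arrivals - arrivals[0])) ** 2)

result = minimize(objective, x0=[0.5, 0.5], method="Nelder-Mead")
print(result.x)   # should be close to (0.3, 0.7)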
Economic evaluation in patient safety: a literature review of methods.
de Rezende, Bruna Alves; Or, Zeynep; Com-Ruelle, Laure; Michel, Philippe
2012-06-01
Patient safety practices, targeting organisational changes for improving patient safety, are implemented worldwide but their costs are rarely evaluated. This paper provides a review of the methods used in the economic evaluation of such practices. International medical and economics databases were searched for peer-reviewed publications on economic evaluations of patient safety between 2000 and 2010 in English and French. This was complemented by a manual search of the reference lists of relevant papers. Grey literature was excluded. Studies were described using a standardised template and assessed independently by two researchers according to six quality criteria. 33 articles were reviewed that were representative of different patient safety domains, data types and evaluation methods. 18 estimated the economic burden of adverse events, 3 measured the costs of patient safety practices and 12 provided complete economic evaluations. Healthcare-associated infections were the most common subject of evaluation, followed by medication-related errors and all types of adverse events. Of these, 10 studies that adequately fulfilled one or several key quality criteria were selected for illustration. This review shows that full cost-benefit/utility evaluations are rarely completed, as they are resource intensive and often require unavailable data; some overcome these difficulties by performing stochastic modelling and by using secondary sources. Low methodological transparency can be a problem for building evidence from available economic evaluations. Investing in the economic design and reporting of studies, with more emphasis on defining study perspectives, data collection and methodological choices, could be helpful for strengthening our knowledge base on practices for improving patient safety.
Penedones, Ana; Mendes, Diogo; Alves, Carlos; Batel Marques, Francisco
2014-11-01
The present study evaluates the safety of the biologics approved for the treatment of ocular diseases. The European Medicines Agency website was searched to identify biologics with approved ophthalmologic therapeutic indications. A systematic search was performed using MEDLINE, the Cochrane Central Register of Controlled Trials (CENTRAL) and the International Clinical Trials Registry Platform up to December 2013. Pre-marketing phase III randomized controlled trials (RCTs), post-marketing clinical trials, observational longitudinal studies, and case reports involving adverse events (AEs) were included. Methodological quality was assessed with the Downs & Black checklist. All European spontaneous reports of AEs included in EudraVigilance up to December 2013 were also considered. AEs were classified as ocular (related and not related to the injection procedure) and non-ocular (related or not related to vascular endothelial growth factor inhibition). Incidences of all reported AEs were estimated. Pegaptanib, ranibizumab, and aflibercept were identified as ophthalmic biologics. Fourteen pre-marketing RCTs, 7 post-marketing clinical trials, and 31 observational studies, along with 31 case reports and 7,720 spontaneous reports, were identified and included in this study. In both the pre- and post-marketing settings, ocular AEs were more frequent than non-ocular AEs. Pre-marketing safety data identify the most common AEs. Post-marketing studies suggest an increased number of events such as retinal pigmented epithelium tears (0.6%-24%), thromboembolic events (0.8%-5%), and mortality (2.8%-4%). This study highlights the need to properly evaluate the risk of rare, serious, and long-term AEs, such as thromboembolic events, since they can lead to imbalances in the benefit-risk ratio of biologics in ophthalmology.
NASA Astrophysics Data System (ADS)
Papagiannaki, K.; Lagouvardos, K.; Kotroni, V.; Papagiannakis, G.
2014-01-01
The objective of this study is to analyze damaging frost events in agriculture by examining the relationship between the daily minimum temperature in the lower atmosphere (at the pressure level of 850 hPa) and crop production losses. Furthermore, the study suggests a methodological approach for estimating agricultural risk due to frost events, with the aim of estimating the short-term probability and magnitude of frost-related financial losses for different levels of 850 hPa temperature. Compared with near-surface temperature forecasts, the temperature forecast at the 850 hPa level is less influenced by varying weather conditions and by local topographical features; thus, it constitutes a more consistent indicator of the forthcoming weather conditions. The analysis of the daily monetary compensations for insured crop losses caused by weather events in Greece, during the period 1999-2011, shows that frost is the major meteorological phenomenon with adverse effects on crop productivity in the largest part of the country. Two regions of different geographical latitude are further examined, to account for the differences in the temperature ranges developed within their ecological environment. Using a series of linear and logistic regressions, we found that minimum temperature (at the 850 hPa level), grouped into three categories according to its magnitude, and seasonality are significant variables when trying to explain crop damage costs, as well as to predict and quantify the likelihood and magnitude of damaging frost events.
NASA Astrophysics Data System (ADS)
Papagiannaki, K.; Lagouvardos, K.; Kotroni, V.; Papagiannakis, G.
2014-09-01
The objective of this study is the analysis of damaging frost events in agriculture, by examining the relationship between the daily minimum temperature in the lower atmosphere (at an isobaric level of 850 hPa) and crop production losses. Furthermore, the study suggests a methodological approach for estimating agriculture risk due to frost events, with the aim of estimating the short-term probability and magnitude of frost-related financial losses for different levels of 850 hPa temperature. Compared with near-surface temperature forecasts, temperature forecasts at the level of 850 hPa are less influenced by varying weather conditions or by local topographical features; thus, they constitute a more consistent indicator of the forthcoming weather conditions. The analysis of the daily monetary compensations for insured crop losses caused by weather events in Greece shows that, during the period 1999-2011, frost caused more damage to crop production than any other meteorological phenomenon. Two regions of different geographical latitudes are examined further, to account for the differences in the temperature ranges developed within their ecological environment. Using a series of linear and logistic regressions, we found that minimum temperature (at an 850 hPa level), grouped into three categories according to its magnitude, and seasonality, are significant variables when trying to explain crop damage costs, as well as to predict and quantify the likelihood and magnitude of damaging frost events.
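The regression step described in these two studies can be illustrated schematically as below: a logistic model relates the occurrence of frost damage to a categorized 850 hPa minimum temperature and to season. The simulated data, category boundaries, and coefficients are assumptions for illustration, not the Greek insurance records.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 1500
t850 = rng.uniform(-15, 10, n)                          # daily minimum T at 850 hPa (deg C), simulated
season = rng.choice(["winter", "spring", "autumn"], n)
# Simulated damage occurrence: colder 850 hPa air and spring (blossom period) raise the odds
logit = -2.0 - 0.25 * t850 + np.where(season == "spring", 1.0, 0.0)
damage = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"damage": damage, "season": season,
                   "t_cat": pd.cut(t850, [-20, -8, -2, 10],
                                   labels=["very_cold", "cold", "mild"])})

model = smf.logit("damage ~ C(t_cat) + C(season)", data=df).fit(disp=False)
print(model.summary())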
Green, Andrew; Liles, Clive; Rushton, Alison; Kyte, Derek G
2014-12-01
This systematic review investigated the measurement properties of disease-specific patient-reported outcome measures used in Patellofemoral Pain Syndrome. Two independent reviewers conducted a systematic search of key databases (MEDLINE, EMBASE, AMED, CINAHL+ and the Cochrane Library from inception to August 2013) to identify relevant studies. A third reviewer mediated in the event of disagreement. Methodological quality was evaluated using the validated COSMIN (Consensus-based Standards for the Selection of Health Measurement Instruments) tool. Data synthesis across studies determined the level of evidence for each patient-reported outcome measure. The search strategy returned 2177 citations. Following the eligibility review phase, seven studies, evaluating twelve different patient-reported outcome measures, met the inclusion criteria. A 'moderate' level of evidence supported the structural validity of several measures: the Flandry Questionnaire, Anterior Knee Pain Scale, Functional Index Questionnaire, Eng and Pierrynowski Questionnaire and Visual Analogue Scales for 'usual' and 'worst' pain. In addition, there was a 'limited' level of evidence supporting the test-retest reliability and validity (cross-cultural, hypothesis testing) of the Persian version of the Anterior Knee Pain Scale. Other measurement properties were evaluated with poor methodological quality, and many properties were not evaluated in any of the included papers. Current disease-specific outcome measures for Patellofemoral Pain Syndrome require further investigation. Future studies should evaluate all important measurement properties, utilising an appropriate framework such as COSMIN to guide study design, to facilitate optimal methodological quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
Prediction of road accidents: A Bayesian hierarchical approach.
Deublein, Markus; Schubert, Matthias; Adey, Bryan T; Köhler, Jochen; Faber, Michael H
2013-03-01
In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users, (2) hierarchical multivariate Poisson-lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data, e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks in order to represent non-linearity between risk-indicating and model response variables, as well as the different types of uncertainty which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis of the observed frequencies of the model response variables, e.g. the occurrence of an accident, and observed values of the risk-indicating variables, e.g. the degree of road curvature. Subsequently, parameter learning is done using updating algorithms to determine the posterior predictive probability distributions of the model response variables, conditional on the values of the risk-indicating variables. The methodology is illustrated through a case study using data from the Austrian rural motorway network. In the case study, the methodology is used on randomly selected road segments to produce a model to predict the expected number of accidents in which an injury has occurred and the expected number of lightly, severely and fatally injured road users. Additionally, the methodology is used for geo-referenced identification of road sections with increased occurrence probabilities of injury accident events on a road link between two Austrian cities. It is shown that the proposed methodology can be used to develop models to estimate the occurrence of road accidents for any road network, provided that the required data are available. Copyright © 2012 Elsevier Ltd. All rights reserved.
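The first ingredient, gamma-updating of occurrence rates, can be sketched as conjugate gamma-Poisson updating; the prior parameters and observation counts below are illustrative, not values from the Austrian network.

from scipy.stats import gamma

# Prior belief about the annual injury-accident rate on a road segment,
# expressed as a Gamma(alpha, rate=beta) distribution (hypothetical values).
alpha_prior, beta_prior = 2.0, 1.0          # prior mean = alpha/beta = 2 accidents/year

# Observations on that segment: 7 injury accidents over 3 years
accidents, years = 7, 3

# Conjugate update for a Poisson likelihood with exposure `years`
alpha_post = alpha_prior + accidents
beta_post = beta_prior + years

posterior = gamma(a=alpha_post, scale=1.0 / beta_post)
print(f"posterior mean rate = {posterior.mean():.2f} accidents/year, "
      f"95% interval = {posterior.ppf(0.025):.2f}-{posterior.ppf(0.975):.2f}")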
Personality Strengths as Resilience: A One-Year Multiwave Study.
Goodman, Fallon R; Disabato, David J; Kashdan, Todd B; Machell, Kyla A
2017-06-01
We examined how personality strengths prospectively predict reactions to negative life events. Participants were 797 community adults from 42 countries. At five points over the course of 1 year, participants completed a series of questionnaires measuring seven personality strengths (hope, grit, meaning in life, curiosity, gratitude, control beliefs, and use of strengths), subjective well-being, and frequency and severity of negative life events. Using hierarchical linear modeling with assessment periods nested within participants, results from lagged analyses found that only hope emerged as a resilience factor. To illustrate the importance of using appropriate lagged analyses in resilience research, we ran nonlagged analyses; these results suggest that all seven personality strengths moderated the effect of negative life events on subjective well-being, with greater strengths associated with healthier outcomes. To provide evidence that personality strengths confer resilience, a prospective examination is needed with the inclusion of events and responses to them. The use of concurrent methodologies and analyses, which is the norm in psychology, often leads to erroneous conclusions. Hope, the ability to generate routes to reach goals and the motivation to use those routes, was shown to be particularly important in promoting resilience. © 2016 Wiley Periodicals, Inc.
Osis, Sean T; Hettinga, Blayne A; Ferber, Reed
2016-05-01
An ongoing challenge in the application of gait analysis to clinical settings is the standardized detection of temporal events, with unobtrusive and cost-effective equipment, for a wide range of gait types. The purpose of the current study was to investigate a targeted machine learning approach for the prediction of timing for foot strike (or initial contact) and toe-off, using only kinematics, for walking, forefoot running, and heel-toe running. Data were categorized by gait type and split into a training set (∼30%) and a validation set (∼70%). A principal component analysis was performed, and separate linear models were trained and validated for foot strike and toe-off, using ground reaction force data as a gold standard for event timing. Results indicate the model predicted both foot strike and toe-off timing to within 20 ms of the gold standard for more than 95% of cases in walking and running gaits. The machine learning approach continues to provide robust timing predictions for clinical use, and may offer a flexible methodology to handle new events and gait types. Copyright © 2016 Elsevier B.V. All rights reserved.
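A schematic of this targeted machine learning approach, reduced to its core steps (PCA on kinematic features followed by a linear model per event, validated against a gold-standard timing), is sketched below on simulated data; the feature construction and data are assumptions, not the study's marker-based kinematics.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_strides, n_features = 400, 60                # e.g. flattened joint-angle waveforms per stride

# Simulated low-dimensional movement patterns driving both the kinematics and the event timing
latent = rng.normal(size=(n_strides, 10))
X = latent @ rng.normal(size=(10, n_features)) + 0.1 * rng.normal(size=(n_strides, n_features))
t_foot_strike = latent @ rng.normal(size=10) * 10.0 + rng.normal(scale=5.0, size=n_strides)  # ms offsets

# ~30% of strides for training and the rest for validation, mirroring the study's split
X_train, X_val, y_train, y_val = train_test_split(X, t_foot_strike, train_size=0.3, random_state=0)

pca = PCA(n_components=10).fit(X_train)
model = LinearRegression().fit(pca.transform(X_train), y_train)
pred = model.predict(pca.transform(X_val))

within_20ms = np.mean(np.abs(pred - y_val) <= 20.0)
print(f"{within_20ms:.1%} of validation strides predicted within 20 ms of the reference timing")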
Developing authentic problems through lived experiences in nature
NASA Astrophysics Data System (ADS)
Gürel, Zeynep
2017-02-01
This study's main objective is to develop a theoretical and ontological basis for experimentation in contact with real life, oriented towards physics education. Physics is built upon the observation of nature, and this experience provides an opportunity to engage with science in a natural environment for learners who have a background in the very basics and essentials of physics. The Physics in Nature course includes visiting and camping experiences situated in nature, and the organization of camps with educational purposes. The course integrates indoor and outdoor settings interactively, and authentic problems taken from outdoor settings have been brought into the class without a well-defined structure (ill-structured problems). Over a period of ten years, there was a plethora of events and problems that would provide sufficient material for many researchers, because every problem is an event and has a story. The philosophical event concept of Deleuze and Guattari has been used for the events of the Physics in Nature courses. A post-qualitative research methodology has been used in order to show how the relation between physics and nature is constructed and becomes the central problem of physics in nature; this has been the basis of the course and of our academic research.
Calibrated Methodology for Assessing Adaptation Costs for Urban Drainage Systems
Changes in precipitation patterns associated with climate change may pose significant challenges for storm water management systems across much of the U.S. In particular, adapting these systems to more intense rainfall events will require significant investment. The assessment ...
Pagán, Josué; Risco-Martín, José L; Moya, José M; Ayala, José L
2016-08-01
Prediction of symptomatic crises in chronic diseases allows decisions to be taken before the symptoms occur, such as the intake of drugs to avoid the symptoms or the activation of medical alarms. The prediction horizon is in this case an important parameter, as it must accommodate the pharmacokinetics of medications or the response time of medical services. This paper presents a study of the prediction limits for a chronic disease with symptomatic crises: migraine. For that purpose, this work develops a methodology to build predictive migraine models and to improve these predictions beyond the limits of the initial models. The maximum prediction horizon is analyzed, and its dependency on the selected features is studied. A strategy for model selection is proposed to tackle the trade-off between conservative but robust predictive models and less accurate predictions with longer horizons. The obtained results show a prediction horizon close to 40 min, which is in the time range of the drug pharmacokinetics. Experiments have been performed in a realistic scenario where input data were acquired in an ambulatory clinical study through the deployment of a non-intrusive Wireless Body Sensor Network. Our results provide an effective methodology for the selection of the future horizon in the development of prediction algorithms for diseases experiencing symptomatic crises. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Lajeunesse, E.; Delacourt, C.; Allemand, P.; Limare, A.; Dessert, C.; Ammann, J.; Grandjean, P.
2010-12-01
A series of recent works has underlined that the flux of material exported from a watershed is dramatically increased during extreme climatic events, such as storms, tropical cyclones and hurricanes [Dadson et al., 2003 and 2004; Hilton et al., 2008]. Indeed, the exceptionally high rainfall rates reached during these events trigger runoff and landsliding, which destabilize slopes and deliver a significant amount of sediment to flooded rivers. This observation raises the question of the control that extreme climatic events might exert on the denudation rate and the morphology of watersheds. Addressing this question requires measuring sediment transport in flooded rivers. However, most conventional sediment monitoring techniques rely on manually operated measurements, which cannot be performed during extreme climatic events. Monitoring riverine sediment transport during extreme climatic events therefore remains a challenging issue because of the lack of instruments and methodologies adapted to such extreme conditions. In this paper, we present a new methodology aimed at estimating the impact of extreme events on sediment transport in rivers. Our approach relies on the development of two instruments. The first is an in-situ optical instrument, based on a LISST-25X sensor, capable of measuring both the water level and the concentration of suspended matter in rivers, with a time step ranging from one measurement every hour at low flow to one measurement every 2 minutes during a flood. The second instrument is a remotely controlled drone helicopter used to acquire high-resolution stereophotogrammetric images of river beds, which are used to compute DEMs and to estimate how flash floods affect the granulometry and morphology of the river. These two instruments were developed and tested during a 1.5-year field survey performed from June 2007 to January 2009 on the Capesterre river located on Basse-Terre island (Guadeloupe archipelago, Lesser Antilles Arc).
NASA Astrophysics Data System (ADS)
Guillod, B. P.; Massey, N.; Otto, F. E. L.; Allen, M. R.; Jones, R.; Hall, J. W.
2016-12-01
Extreme events being rare by definition, accurately quantifying the probabilities associated with a given event is difficult. This is particularly true for droughts, for which only a few events are available in the observational record owing to their long-lasting characteristics. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying present and future risks associated with droughts in the UK. To do so, a large number of modelled weather time series for "synthetic" drought events are being fed into hydrological and impact models to assess their impacts on various sectors (social sciences, economy, industry, agriculture, and ecosystems). Here, we present and analyse the hydro-meteorological drought event sets that have been produced with a new version of weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (tens of thousands) of Global Climate Model simulations, downscaled to 25 km over Europe by a nested Regional Climate Model. Simulations cover the past 100 years as well as two future time slices (2030s and 2080s), and provide a large number of sequences of spatio-temporally coherent weather, consistent with boundary forcings such as the ocean, greenhouse gases and solar forcing. Besides presenting the methodology and validation of the event sets, we provide insights into drought risk in the UK and the drivers of drought. In particular, we examine their sensitivity to sea surface temperature and sea ice patterns, both in the recent past and for future projections. How drought risk in the UK can be expected to change in the future will also be discussed. Finally, we assess the applicability of this methodology to other regions. Reference: [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.
Nelson, Leif D; Simmons, Joseph; Simonsohn, Uri
2018-01-04
In 2010-2012, a few largely coincidental events led experimental psychologists to realize that their approach to collecting, analyzing, and reporting data made it too easy to publish false-positive findings. This sparked a period of methodological reflection that we review here and call Psychology's Renaissance. We begin by describing how psychologists' concerns with publication bias shifted from worrying about file-drawered studies to worrying about p-hacked analyses. We then review the methodological changes that psychologists have proposed and, in some cases, embraced. In describing how the renaissance has unfolded, we attempt to describe different points of view fairly but not neutrally, so as to identify the most promising paths forward. In so doing, we champion disclosure and preregistration, express skepticism about most statistical solutions to publication bias, take positions on the analysis and interpretation of replication failures, and contend that meta-analytical thinking increases the prevalence of false positives. Our general thesis is that the scientific practices of experimental psychologists have improved dramatically.
Lannert, Brittany K
2015-07-01
Vicarious traumatization of nonvictim members of communities targeted by bias crimes has been suggested by previous qualitative studies and often dominates public discussion following bias events, but proximal and distal responses of community members have yet to be comprehensively modeled, and quantitative research on vicarious responses is scarce. This comprehensive review integrates theoretical and empirical literatures in social, clinical, and physiological psychology in the development of a model of affective, cognitive, and physiological responses of lesbian, gay, and bisexual individuals upon exposure to information about bias crimes. Extant qualitative research in vicarious response to bias crimes is reviewed in light of theoretical implications and methodological limitations. Potential pathways to mental health outcomes are outlined, including accumulative effects of anticipatory defensive responding, multiplicative effects of minority stress, and putative traumatogenic physiological and cognitive processes of threat. Methodological considerations, future research directions, and clinical implications are also discussed. © The Author(s) 2014.
Rancière, Fanny; Bougas, Nicolas; Viola, Malika; Momas, Isabelle
2017-04-01
The relation between traffic-related air pollution (TRAP) exposure and the incidence of asthma/allergy in preschool children has been widely studied, but results remain heterogeneous, possibly due to differences in methodology and susceptibility to TRAP. We aimed to study the relation of early TRAP exposure with the development of respiratory/allergic symptoms and asthma during preschool years, and to investigate parental allergy, "stressful" family events, and sex as possible effect modifiers. We examined data from 2,015 children of the PARIS birth cohort followed up with repeated questionnaires completed by parents until age 4 years. TRAP exposure in each child's first year of life was estimated by nitrogen oxides (NOx) air dispersion modeling, taking into account both home and day care locations. The association between TRAP exposure and patterns of wheezing, dry night cough, and rhinitis symptoms was studied using multinomial logistic regression models adjusted for potential confounders. Effect modification by parental history of allergy, stressful family events, and sex was investigated. An interquartile range (26 μg/m³) increase in NOx levels was associated with an increased odds ratio (OR) of persistent wheezing at 4 years (adjusted OR = 1.27; 95% confidence interval: 1.09, 1.47). TRAP exposure was positively associated with persistent wheeze, dry cough, and rhinitis symptoms among children with a parental allergy, those experiencing stressful family events, and boys, but not in children whose parents did not have allergies or experience stressful events, or in girls (all interaction p-values < 0.2). This study supports the hypothesis that not all preschool children are equal regarding TRAP health effects. Parental history of allergy, stressful family events, and male sex may increase their susceptibility to adverse respiratory effects of early TRAP exposure.
Early Detection of Risk Taking in Groups and Individuals
2016-03-25
The project applies such methods to Twitter messages from a range of social events, some where disruption is present and others where it is not. The data come from real-time social media in the form of Twitter messages created during differing social events.