Sample records for event processing framework

  1. Study of a Fine Grained Threaded Framework Design

    NASA Astrophysics Data System (ADS)

    Jones, C. D.

    2012-12-01

    Traditionally, HEP experiments exploit the multiple cores in a CPU by having each core process one event. However, future PC designs are expected to use CPUs which double the number of processing cores at the same rate as the cost of memory falls by a factor of two, which effectively means the amount of memory per processing core will remain constant. This is a major challenge for LHC processing frameworks, since the LHC is expected to deliver more complex events (e.g. greater pileup) in the coming years while the LHC experiments' frameworks are already memory constrained. Therefore, in the not so distant future we may need to use multiple cores efficiently to process one event. In this presentation we will discuss a design for an HEP processing framework which allows very fine-grained parallelization within one event as well as processing multiple events simultaneously, while minimizing the memory footprint of the job. The design is built around the libdispatch framework created by Apple Inc. (a port for Linux is available), whose central concept is the use of task queues. The design also accommodates the reality that not all code will be thread safe, and therefore allows one to easily mark modules or subparts of modules as being thread unsafe. In addition, the design efficiently handles the requirement that events in one run must all be processed before starting to process events from a different run. After explaining the design we will provide measurements from simulating different processing scenarios, where the processing times used for the simulation are drawn from processing times measured in actual CMS event processing.
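    The task-queue approach described above can be illustrated with a short, hedged sketch (not CMS framework code) that uses libdispatch directly: thread-unsafe modules are bound to a serial queue so they never run concurrently, while thread-safe modules run on a concurrent queue. The module functions, queue labels, and event type are illustrative assumptions.

    ```cpp
    // Hedged sketch: serializing thread-unsafe modules with libdispatch task queues.
    #include <dispatch/dispatch.h>
    #include <cstdio>

    struct Event { int id; };

    static void runLegacyModule(void* ctx) {            // not thread safe: always
        auto* ev = static_cast<Event*>(ctx);             // executed on a serial queue
        std::printf("legacy module, event %d\n", ev->id);
    }

    static void runThreadSafeModule(void* ctx) {         // safe to run concurrently
        auto* ev = static_cast<Event*>(ctx);
        std::printf("thread-safe module, event %d\n", ev->id);
    }

    int main() {
        // One serial queue per thread-unsafe module; a concurrent queue for the rest.
        dispatch_queue_t legacyQ = dispatch_queue_create("legacy.module", DISPATCH_QUEUE_SERIAL);
        dispatch_queue_t workQ   = dispatch_queue_create("safe.modules", DISPATCH_QUEUE_CONCURRENT);
        dispatch_group_t group   = dispatch_group_create();

        static Event events[4] = {{0}, {1}, {2}, {3}};
        for (auto& ev : events) {
            dispatch_group_async_f(group, workQ,   &ev, runThreadSafeModule);
            dispatch_group_async_f(group, legacyQ, &ev, runLegacyModule);
        }
        dispatch_group_wait(group, DISPATCH_TIME_FOREVER);  // block until all tasks finish
        return 0;
    }
    ```

    The same program should also build against the Linux port of libdispatch mentioned in the abstract.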

  2. CMS event processing multi-core efficiency status

    NASA Astrophysics Data System (ADS)

    Jones, C. D.; CMS Collaboration

    2017-10-01

    In 2015, CMS was the first LHC experiment to begin using a multi-threaded framework for doing event processing. This new framework utilizes Intel's Threading Building Blocks (TBB) library to manage concurrency via a task-based processing model. During the 2015 LHC run period, CMS only ran reconstruction jobs using multiple threads, because only those jobs were sufficiently thread-efficient. Recent work now allows simulation and digitization to be thread-efficient as well. In addition, during 2015 the multi-threaded framework could run events in parallel but could only use one thread per event. Work done in 2016 now allows multiple threads to be used while processing one event. In this presentation we will show how these recent changes have improved CMS's overall threading and memory efficiency, and we will discuss work to be done to further increase those efficiencies.
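    As a hedged illustration of the task-based model described above (a sketch under assumed names, not CMSSW code), the snippet below uses TBB both to run several events concurrently and to use multiple threads within one event.

    ```cpp
    // Hedged sketch: inter-event and intra-event parallelism with TBB tasks.
    #include <tbb/task_group.h>
    #include <tbb/parallel_for.h>
    #include <cstdio>
    #include <vector>

    struct Event { int id; };

    void processModule(const Event& ev, int module) {
        std::printf("event %d, module %d\n", ev.id, module);
    }

    void processEvent(const Event& ev) {
        // Intra-event parallelism: independent modules of one event run in parallel.
        tbb::parallel_for(0, 8, [&](int module) { processModule(ev, module); });
    }

    int main() {
        std::vector<Event> events{{0}, {1}, {2}, {3}};
        tbb::task_group streams;                 // inter-event parallelism
        for (const auto& ev : events)
            streams.run([ev] { processEvent(ev); });
        streams.wait();
        return 0;
    }
    ```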

  3. A generalized framework for nucleosynthesis calculations

    NASA Astrophysics Data System (ADS)

    Sprouse, Trevor; Mumpower, Matthew; Aprahamian, Ani

    2014-09-01

    Simulating astrophysical events is a difficult process, requiring a detailed pairing of knowledge from both astrophysics and nuclear physics. Astrophysics guides the thermodynamic evolution of an astrophysical event. We present a nucleosynthesis framework written in Fortran that combines a thermodynamic evolution and nuclear data as inputs to evolve the abundances of nuclear species in time. Through our coding practices, we have emphasized the applicability of our framework to any astrophysical event, including those involving nuclear fission. Because these calculations are often very complicated, our framework dynamically optimizes itself based on the conditions at each time step in order to greatly reduce total computation time. To highlight the power of this new approach, we demonstrate the use of our framework to simulate both Big Bang nucleosynthesis and r-process nucleosynthesis at speeds competitive with current solutions dedicated to either process alone.

  4. Multi-threaded Event Processing with DANA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Lawrence; Elliott Wolin

    2007-05-14

    The C++ data analysis framework DANA has been written to support the next generation of Nuclear Physics experiments at Jefferson Lab commensurate with the anticipated 12 GeV upgrade. The DANA framework was designed to allow multi-threaded event processing with a minimal impact on developers of reconstruction software. This document describes how DANA implements multi-threaded event processing and compares it to simply running multiple instances of a program. Also presented are relative reconstruction rates for Pentium 4, Xeon, and Opteron-based machines.

  5. An algebra of discrete event processes

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Meyer, George

    1991-01-01

    This report deals with an algebraic framework for modeling and control of discrete event processes. The report consists of two parts. The first part is introductory, and consists of a tutorial survey of the theory of concurrency in the spirit of Hoare's CSP, and an examination of the suitability of such an algebraic framework for dealing with various aspects of discrete event control. To this end a new concurrency operator is introduced and it is shown how the resulting framework can be applied. It is further shown that a suitable theory that deals with the new concurrency operator must be developed. In the second part of the report the formal algebra of discrete event control is developed. At the present time the second part of the report is still an incomplete and occasionally tentative working paper.

  6. Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP)

    NASA Astrophysics Data System (ADS)

    Calafiura, Paolo; Leggett, Charles; Seuster, Rolf; Tsulaia, Vakhtang; Van Gemmeren, Peter

    2015-12-01

    AthenaMP is a multi-process version of the ATLAS reconstruction, simulation and data analysis framework Athena. By leveraging Linux fork and copy-on-write mechanisms, it allows for sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted to optimize the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service) which allows the running of AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of AthenaMP in the diversity of ATLAS event processing workloads on various computing resources: Grid, opportunistic resources and HPC.
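    A minimal POSIX sketch of the fork/copy-on-write pattern and the Shared Event Queue strategy described above (illustrative only, not AthenaMP code): workers are forked after initialization so that initialized pages remain shared copy-on-write, and event numbers are handed out through a pipe that plays the role of the shared queue.

    ```cpp
    // Hedged sketch: fork workers after initialization, feed events via a pipe.
    #include <unistd.h>
    #include <sys/wait.h>
    #include <cstdio>

    int main() {
        int queue[2];
        pipe(queue);                         // [0] = read end, [1] = write end

        // ... expensive initialization happens here, before the fork ...

        const int nWorkers = 4;
        for (int w = 0; w < nWorkers; ++w) {
            if (fork() == 0) {               // child: event-processing worker
                close(queue[1]);
                int event;
                while (read(queue[0], &event, sizeof event) == sizeof event)
                    std::printf("worker %d processing event %d\n", w, event);
                _exit(0);
            }
        }
        close(queue[0]);
        for (int event = 0; event < 100; ++event)    // parent: shared event queue
            write(queue[1], &event, sizeof event);
        close(queue[1]);                     // workers exit when the queue drains
        while (wait(nullptr) > 0) {}
        return 0;
    }
    ```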

  7. AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading

    NASA Astrophysics Data System (ADS)

    Leggett, Charles; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; van Gemmeren, Peter; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; ATLAS Collaboration

    2017-10-01

    ATLAS’s current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single-threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run 2. After concluding a rigorous requirements phase, in which many design components were examined in detail, ATLAS has begun the migration to a new data-flow-driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread-unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread-safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying handling of features such as event- and time-dependent data, asynchronous callbacks, metadata, integration with the online High Level Trigger for partial processing in certain regions of interest, and concurrent I/O, as well as ensuring thread safety of core services. We also report on upgrading the framework to handle Algorithms that are fully re-entrant.
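    The distinction between a thread-unsafe legacy Algorithm and a fully re-entrant one can be sketched as follows (a hedged illustration with invented class names, not Gaudi/Athena code): the legacy class caches per-event state in data members, while the re-entrant class keeps all state local and takes the event context as an argument to a const execute method.

    ```cpp
    // Hedged illustration of legacy (stateful) versus re-entrant (stateless) algorithms.
    #include <cstdio>

    struct EventContext { int slot; int eventNumber; };

    class LegacyAlg {                     // thread-unsafe: must run as a singleton
        double cachedEnergy_ = 0.0;       // per-event state mutated in execute()
    public:
        void execute(int eventNumber) {
            cachedEnergy_ = eventNumber * 0.1;
            std::printf("legacy: event %d, energy %.1f\n", eventNumber, cachedEnergy_);
        }
    };

    class ReentrantAlg {                  // safe to run concurrently on many events
    public:
        void execute(const EventContext& ctx) const {
            double energy = ctx.eventNumber * 0.1;   // all state is local
            std::printf("re-entrant: slot %d, event %d, energy %.1f\n",
                        ctx.slot, ctx.eventNumber, energy);
        }
    };

    int main() {
        LegacyAlg legacy;
        legacy.execute(1);
        ReentrantAlg modern;
        modern.execute({0, 2});
        modern.execute({1, 3});           // may safely run in another thread
        return 0;
    }
    ```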

  8. Human-assisted sound event recognition for home service robots.

    PubMed

    Do, Ha Manh; Sheng, Weihua; Liu, Meiqin

    This paper proposes and implements an open framework of active auditory learning for a home service robot to serve the elderly living alone at home. The framework was developed to realize various auditory perception capabilities while enabling a remote human operator to be involved in the sound event recognition process for elderly care. The home service robot is able to estimate the sound source position and collaborate with the human operator in sound event recognition while protecting the privacy of the elderly. Our experimental results validated the proposed framework and evaluated its auditory perception capabilities and human-robot collaboration in sound event recognition.

  9. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
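    The outlier-detection step described above can be sketched as follows (a simplified illustration, not the authors' implementation): pairwise similarities between subsequence models are collected into an affinity matrix, its dominant eigenvector is computed (here by power iteration), and subsequences with small eigenvector components are ranked as outlier candidates. The toy affinity values and the 0.3 threshold are assumptions for the example.

    ```cpp
    // Hedged sketch: rank subsequences as outliers via the dominant eigenvector
    // of an affinity matrix built from pairwise model similarities.
    #include <vector>
    #include <cmath>
    #include <cstdio>

    using Matrix = std::vector<std::vector<double>>;

    std::vector<double> dominantEigenvector(const Matrix& A, int iters = 100) {
        std::vector<double> v(A.size(), 1.0);
        for (int it = 0; it < iters; ++it) {       // power iteration
            std::vector<double> w(A.size(), 0.0);
            for (size_t i = 0; i < A.size(); ++i)
                for (size_t j = 0; j < A.size(); ++j) w[i] += A[i][j] * v[j];
            double norm = 0.0;
            for (double x : w) norm += x * x;
            norm = std::sqrt(norm);
            for (size_t i = 0; i < w.size(); ++i) v[i] = w[i] / norm;
        }
        return v;
    }

    int main() {
        // Toy affinity matrix for 4 subsequences; the last one is dissimilar to the rest.
        Matrix affinity = {{1.0, 0.9, 0.8, 0.1},
                           {0.9, 1.0, 0.9, 0.2},
                           {0.8, 0.9, 1.0, 0.1},
                           {0.1, 0.2, 0.1, 1.0}};
        auto v = dominantEigenvector(affinity);
        for (size_t i = 0; i < v.size(); ++i)
            std::printf("subsequence %zu: eigenvector weight %.3f%s\n",
                        i, v[i], v[i] < 0.3 ? "  <- outlier candidate" : "");
        return 0;
    }
    ```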

  10. Optimizing SIEM Throughput on the Cloud Using Parallelization.

    PubMed

    Alam, Masoom; Ihsan, Asif; Khan, Muazzam A; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, Muhammad Khurram; Farooq, Sajid

    2016-01-01

    Processing large amounts of data in real time to identify security issues poses several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently so that attacks can be identified quickly and the necessary response initiated. This paper evaluates the performance of a security framework, OSTROM, built on the Esper complex event processing (CEP) engine under parallel and non-parallel computational frameworks. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of incoming events rather than by the queries being processed. The architecture in which one quarter of the total events is submitted to each instance and all the queries are processed by all the units shows the best results in terms of throughput, memory and CPU usage.
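    A hedged sketch of the best-performing configuration described above (not Esper or OSTROM code): events are split round-robin so each of four engine instances receives one quarter of the stream, while every instance evaluates the full query set. The event type, query names and instance count are illustrative.

    ```cpp
    // Hedged sketch: partition the event stream across instances, run all queries everywhere.
    #include <vector>
    #include <string>
    #include <cstdio>

    struct SecurityEvent { int id; std::string source; };

    class CepInstance {
    public:
        explicit CepInstance(int id) : id_(id) {}
        void process(const SecurityEvent& ev) {
            // Every instance evaluates the full query set on its share of events.
            for (const char* query : {"failed-login burst", "port-scan pattern"})
                std::printf("instance %d: query '%s' on event %d\n", id_, query, ev.id);
        }
    private:
        int id_;
    };

    int main() {
        std::vector<CepInstance> instances{CepInstance(0), CepInstance(1),
                                           CepInstance(2), CepInstance(3)};
        for (int i = 0; i < 8; ++i) {                       // incoming event stream
            SecurityEvent ev{i, "firewall"};
            instances[i % instances.size()].process(ev);    // 1/4 of events per instance
        }
        return 0;
    }
    ```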

  11. A research framework for pharmacovigilance in health social media: Identification and evaluation of patient adverse drug event reports.

    PubMed

    Liu, Xiao; Chen, Hsinchun

    2015-12-01

    Social media offer insights into patients' medical problems, such as drug side effects and treatment failures. Patient reports of adverse drug events from social media have great potential to improve current practice of pharmacovigilance. However, extracting patient adverse drug event reports from social media continues to be an important challenge for health informatics research. In this study, we develop a research framework with advanced natural language processing techniques for integrated and high-performance extraction of patient-reported adverse drug events. The framework consists of medical entity extraction for recognizing patient discussions of drugs and events, adverse drug event extraction with a shortest-dependency-path kernel-based statistical learning method and semantic filtering with information from medical knowledge bases, and report source classification to tease out noise. To evaluate the proposed framework, a series of experiments were conducted on a test bed encompassing about postings from major diabetes and heart disease forums in the United States. The results reveal that each component of the framework significantly contributes to its overall effectiveness. Our framework significantly outperforms prior work. Published by Elsevier Inc.

  12. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival-time picking, social-media-based event detection, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival-time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced-sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.

  13. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event-sequence relationships among generated events. This available knowledge at both the time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing, and results are discussed.

  14. [On-line processing mechanisms in text comprehension: a theoretical review on constructing situation models].

    PubMed

    Iseki, Ryuta

    2004-12-01

    This article reviews research on the construction of situation models during reading. To position the variety of research within the overall process appropriately, a unitary framework was devised in terms of three theories of on-line processing: the resonance process, the event-indexing model, and constructionist theory. The resonance process was treated as a basic activation mechanism in the framework. The event-indexing model was regarded as a screening system which selects and encodes activated information into situation models along situational dimensions. Constructionist theory was considered to have a supervisory role based on coherence and explanation. From the view of this unitary framework, some problems concerning each theory were examined and possible interpretations were given. Finally, it was pointed out that there has been little theoretical argument on associative processing at the global level and on encoding text and inference information into long-term memory.

  15. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis makes it possible to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
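    The basic counting step of event coincidence analysis can be sketched as below (a simplified illustration under assumed parameter names, not the authors' code): for each event in the response series, check whether at least one event in the trigger series falls within a window of width deltaT ending a lag tau before it, and report the fraction of such coincidences.

    ```cpp
    // Hedged sketch: precursor coincidence rate between two event time series.
    #include <vector>
    #include <algorithm>
    #include <cstdio>

    double coincidenceRate(const std::vector<double>& a,   // times of trigger events
                           const std::vector<double>& b,   // times of response events
                           double tau, double deltaT) {
        int coincident = 0;
        for (double tb : b) {
            // Does any event in A fall inside [tb - tau - deltaT, tb - tau]?
            bool hit = std::any_of(a.begin(), a.end(), [&](double ta) {
                return ta >= tb - tau - deltaT && ta <= tb - tau;
            });
            if (hit) ++coincident;
        }
        return b.empty() ? 0.0 : static_cast<double>(coincident) / b.size();
    }

    int main() {
        std::vector<double> floods{1.0, 4.5, 7.2};
        std::vector<double> outbreaks{1.4, 6.0, 7.5};
        std::printf("precursor coincidence rate: %.2f\n",
                    coincidenceRate(floods, outbreaks, 0.0, 0.5));
        return 0;
    }
    ```

    Significance testing would then compare this observed rate with rates expected under a null model such as independent Poisson processes.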

  16. Optimizing SIEM Throughput on the Cloud Using Parallelization

    PubMed Central

    Alam, Masoom; Ihsan, Asif; Javaid, Qaisar; Khan, Abid; Manzoor, Jawad; Akhundzada, Adnan; Khan, M Khurram; Farooq, Sajid

    2016-01-01

    Processing large amounts of data in real time for identifying security issues pose several performance challenges, especially when hardware infrastructure is limited. Managed Security Service Providers (MSSP), mostly hosting their applications on the Cloud, receive events at a very high rate that varies from a few hundred to a couple of thousand events per second (EPS). It is critical to process this data efficiently, so that attacks could be identified quickly and necessary response could be initiated. This paper evaluates the performance of a security framework OSTROM built on the Esper complex event processing (CEP) engine under a parallel and non-parallel computational framework. We explain three architectures under which Esper can be used to process events. We investigated the effect on throughput, memory and CPU usage in each configuration setting. The results indicate that the performance of the engine is limited by the number of events coming in rather than the queries being processed. The architecture where 1/4th of the total events are submitted to each instance and all the queries are processed by all the units shows best results in terms of throughput, memory and CPU usage. PMID:27851762

  17. Standardizing the classification of abortion incidents: the Procedural Abortion Incident Reporting and Surveillance (PAIRS) Framework.

    PubMed

    Taylor, Diana; Upadhyay, Ushma D; Fjerstad, Mary; Battistelli, Molly F; Weitz, Tracy A; Paul, Maureen E

    2017-07-01

    To develop and validate standardized criteria for assessing abortion-related incidents (adverse events, morbidities, near misses) for first-trimester aspiration abortion procedures and to demonstrate the utility of a standardized framework [the Procedural Abortion Incident Reporting & Surveillance (PAIRS) Framework] for estimating serious abortion-related adverse events. As part of a California-based study of early aspiration abortion provision conducted between 2007 and 2013, we developed and validated a standardized framework for defining and monitoring first-trimester (≤14 weeks) aspiration abortion morbidity and adverse events using multiple methods: a literature review, framework criteria testing with empirical data, repeated expert reviews and data-based revisions to the framework. The final framework distinguishes incidents resulting from procedural abortion care (adverse events) from morbidity related to pregnancy, the abortion process and other nonabortion-related conditions. It further classifies incidents by diagnosis (confirmatory data, etiology, risk factors), management (treatment type and location), timing (immediate or delayed), seriousness (minor or major) and outcome. Empirical validation of the framework using data from 19,673 women receiving aspiration abortions revealed almost an equal proportion of total adverse events (n=205, 1.04%) and total abortion- or pregnancy-related morbidity (n=194, 0.99%). The majority of adverse events were due to retained products of conception (0.37%), failed attempted abortion (0.15%) and postabortion infection (0.17%). Serious or major adverse events were rare (n=11, 0.06%). Distinguishing morbidity diagnoses from adverse events using a standardized, empirically tested framework confirms the very low frequency of serious adverse events related to clinic-based abortion care. The PAIRS Framework provides a useful set of tools to systematically classify and monitor abortion-related incidents for first-trimester aspiration abortion procedures. Standardization will assist healthcare providers, researchers and policymakers to anticipate morbidity and prevent abortion adverse events, improve care metrics and enhance abortion quality. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. A review of event processing frameworks used in HEP

    DOE PAGES

    Sexton-Kennedy, E.

    2015-12-23

    Today there are many different experimental event processing frameworks in use by experiments that are running or about to run. This talk will discuss the different components of these frameworks. In the past there have been attempts at shared framework projects, for example the collaborations on the BaBar framework (between BaBar, CDF, and CLEO), on the Gaudi framework (between LHCb and ATLAS), on AliROOT/FairROOT (between ALICE and GSI/FAIR), and in some ways on art (Fermilab-based experiments) and CMS’ framework. However, for reasons that will be discussed, these collaborations did not result in common frameworks shared among the intended experiments. Importantly, though, two of the resulting projects have succeeded in providing frameworks that are shared among many customer experiments: Fermilab's art framework and GSI/FAIR's FairROOT. Interestingly, several projects are considering re-merging their frameworks after many years apart. I'll report on an investigation and analysis of these realities. In addition, with the advent of the need for multi-threaded frameworks and the scarce available manpower, it is important to collaborate in the future; however, it is also important to understand why previous attempts at multi-experiment frameworks either worked or didn't work.

  19. [Assessing program sustainability in public health organizations: a tool-kit application in Haiti].

    PubMed

    Ridde, V; Pluye, P; Queuille, L

    2006-10-01

    Public health stakeholders are concerned about program sustainability. However, they usually conceive of sustainability according to financial criteria, for at least one reason: no simple framework is operationally and theoretically sound enough to evaluate program sustainability globally. The present paper aims to describe an application of one framework assessment tool used to evaluate the sustainability level and process of a Nutritional Care Unit managed by a Swiss humanitarian agency to fight severe child malnutrition in a Haitian area. The managing agency is committed to putting this Unit back into the structure of a local public hospital. The evaluation was performed within the sustainability framework proposed in a former article. Data were collected with a combination of tools: semi-structured interviews (n=33, medical and support staff from the agency and the hospital), participatory observation and document review. Data concerned the four characteristics of organizational routines (memory, adaptation, values and rules) enabling assessment of the level of sustainability. In addition, data were related to three types of events distinguishing routinization processes from implementation processes: specific events of routinization, joint routinization-implementation events, and specific events of implementation. Data analysis was thematic and results were validated by actors through a feedback session and written comments. The current level of sustainability of the Nutritional Care Unit within the Hospital is weak: weak memory, high adaptation, weak sharing of values and rules. This may be explained by the sustainability process and the absence of specific routinization events. The relevance of such processes is reasonable, though it has been strongly challenged in the troubled Haitian context. Riots have been widespread over the last years, creating difficulties for the Hospital. This experience suggests the proposed framework and sustainability assessment tools are useful when the context permits scrutiny of program sustainability.

  20. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768

  1. CLARA: CLAS12 Reconstruction and Analysis Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and a service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.

  2. Adverse Outcome Pathways – Organizing Toxicological ...

    EPA Pesticide Factsheets

    The number of chemicals for which environmental regulatory decisions are required far exceeds the current capacity for toxicity testing. High throughput screening (HTS), commonly used for drug discovery, has the potential to increase this capacity. The adverse outcome pathway (AOP) concept has emerged as a natural framework for connecting high throughput toxicity testing (HTT) results to potential impacts on humans and wildlife populations. An AOP consists of two main components that describe the biological mechanisms driving toxicity. Key events (KEs) represent biological processes essential for causing the adverse outcome that are also measurable experimentally. Key event relationships (KERs) capture the biological processes connecting the key events. Evidence documented for each KER, based on measurements of the KEs, can provide the confidence needed for extrapolating HTT from early key events to overt toxicity represented by later key events based on the AOP. The IPCS mode of action (MOA) framework incorporates information required for making a chemical-specific toxicity determination. Given the close relationship between the AOP and MOA frameworks, it is possible to assemble an MOA by incorporating HTT results, chemical properties including absorption, distribution, metabolism, and excretion (ADME), and an AOP describing the biological basis of toxicity, thereby streamlining the process. While current applications focus on the assessment of risk for environmental chemicals,

  3. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, such as medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  4. The Co-Evolution of Knowledge and Event Memory

    ERIC Educational Resources Information Center

    Nelson, Angela B.; Shiffrin, Richard M.

    2013-01-01

    We present a theoretical framework and a simplified simulation model for the co-evolution of knowledge and event memory, both termed SARKAE (Storing and Retrieving Knowledge and Events). Knowledge is formed through the accrual of individual events, a process that operates in tandem with the storage of individual event memories. In 2 studies, new…

  5. Earth Science Data Fusion with Event Building Approach

    NASA Technical Reports Server (NTRS)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve the efficiency of Earth Science data fusion, big data processing and analytics. The key components of the NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized I/O), and on-line data processing control and analytics services. The NAIADS project is leveraging the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  6. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.

    PubMed

    Elgendi, Mohamed

    2016-11-02

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages ("TERMA") involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
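    A hedged sketch of the core TERMA idea (simplified, not the published implementation): a short moving average with window W1 tracks the expected event width, a longer one with window W2 tracks the expected cycle, and samples where the short average exceeds the long one mark candidate blocks of interest. The windows in the example respect the inequality quoted above; the offset/threshold refinements of the full method are omitted.

    ```cpp
    // Hedged sketch: two event-related moving averages marking blocks of interest.
    #include <vector>
    #include <cstdio>

    std::vector<double> movingAverage(const std::vector<double>& x, int w) {
        std::vector<double> out(x.size(), 0.0);
        for (size_t i = 0; i < x.size(); ++i) {
            double sum = 0.0; int n = 0;
            for (int j = int(i) - w / 2; j <= int(i) + w / 2; ++j)
                if (j >= 0 && j < int(x.size())) { sum += x[j]; ++n; }
            out[i] = sum / n;                       // centered average, edge-truncated
        }
        return out;
    }

    std::vector<bool> blocksOfInterest(const std::vector<double>& signal, int w1, int w2) {
        auto maEvent = movingAverage(signal, w1);   // W1: expected event width
        auto maCycle = movingAverage(signal, w2);   // W2: expected cycle width,
        std::vector<bool> block(signal.size());     // with (8*W1) >= W2 >= (2*W1)
        for (size_t i = 0; i < signal.size(); ++i)
            block[i] = maEvent[i] > maCycle[i];
        return block;
    }

    int main() {
        std::vector<double> ecgLike{0,0,0.1,0.2,1.0,0.3,0.1,0,0,0,0.1,0.2,0.9,0.2,0,0};
        auto blocks = blocksOfInterest(ecgLike, 3, 7);
        for (size_t i = 0; i < blocks.size(); ++i)
            if (blocks[i]) std::printf("sample %zu inside a block of interest\n", i);
        return 0;
    }
    ```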

  7. ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan

    2015-01-01

    The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and it tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data-generating system or process.

  8. A person-centered integrated care quality framework, based on a qualitative study of patients' evaluation of care in light of chronic care ideals.

    PubMed

    Berntsen, Gro; Høyem, Audhild; Lettrem, Idar; Ruland, Cornelia; Rumpsfeld, Markus; Gammon, Deede

    2018-06-20

    Person-Centered Integrated Care (PC-IC) is believed to improve outcomes and experience for persons with multiple long-term and complex conditions. No broad consensus exists regarding how to capture the patient-experienced quality of PC-IC. Most PC-IC evaluation tools focus on care events or care in general. Building on others' and our previous work, we outlined a 4-stage goal-oriented PC-IC process ideal: 1) personalized goal setting, 2) care planning aligned with goals, 3) care delivery according to plan, and 4) evaluation of goal attainment. We aimed to explore, apply, refine and operationalize this quality of care framework. This paper is a qualitative evaluative review of the individual Patient Pathway (iPP) experiences of 19 strategically chosen persons with multimorbidity in light of ideals for chronic care. The iPP includes all care events addressing the person's collected health issues, organized by time. We constructed iPPs based on the electronic health record (from general practice, nursing services, and hospital) together with patient follow-up interviews. The application of the framework and its refinement were parallel processes; both were based on analysis of salient themes in the empirical material in light of the PC-IC process ideal and progressively more informed applications of themes and questions. The informants consistently reviewed care quality by how care supported or threatened their long-term goals. Personal goals were either implicit or identified by "What matters to you?" Informants expected care to address their long-term goals and placed responsibility for care quality and delivery at the system level. The PC-IC process framework exposed system failure in identifying long-term goals, provision of shared long-term multimorbidity care plans, monitoring of care delivery and goal evaluation. The PC-IC framework includes descriptions of ideal care, key questions and literature references for each stage of the PC-IC process. This first version of a PC-IC process framework needs further validation in other settings. Gaps in care that are invisible with event-based quality of care frameworks become apparent when evaluated by a long-term goal-driven PC-IC process framework. The framework appears meaningful to persons with multimorbidity.

  9. Exploring Evolving Media Discourse Through Event Cueing.

    PubMed

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

    Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests whether the level of framing differs before and after a given date. If the model indicates that the periods before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa.

  10. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

    The traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. Therefore, design efficiency and quality vary widely according to the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design, using this model, we decide event occurrence conditions so that the events synchronize with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issuing process and estimate its effect.

  11. Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts

    DTIC Science & Technology

    1981-05-01

    program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of...of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, is presented and discussed within the theoretical framework.

  12. Event-Driven Technology to Generate Relevant Collections of Near-Realtime Data

    NASA Astrophysics Data System (ADS)

    Graves, S. J.; Keiser, K.; Nair, U. S.; Beck, J. M.; Ebersole, S.

    2017-12-01

    Getting the right data when it is needed continues to be a challenge for researchers and decision makers. Event-Driven Data Delivery (ED3), funded by the NASA Applied Science program, is a technology that allows researchers and decision makers to pre-plan what data, information and processes they need to have collected or executed in response to future events. The Information Technology and Systems Center at the University of Alabama in Huntsville (UAH) has developed the ED3 framework in collaboration with atmospheric scientists at UAH, scientists at the Geological Survey of Alabama, and other federal, state and local stakeholders to meet the data preparedness needs for research, decisions and situational awareness. The ED3 framework provides an API that supports the addition of loosely coupled, distributed event handlers and data processes. This approach allows the easy addition of new events and data processes, so the system can scale to support virtually any type of event or data process. Using ED3's underlying services, applications have been developed that monitor for alerts of registered event types and automatically trigger subscriptions that match new events, providing users with a living "album" of results that can continue to be curated as more information for an event becomes available. This capability can improve users' capacity for the collection, creation and use of data and real-time processes (data access, model execution, product generation, sensor tasking, social media filtering, etc.) in response to disaster (and other) events by preparing in advance for the data and information needs of future events. This presentation will provide an update on the ED3 developments and deployments, and further explain the applicability of near-realtime data in hazards research, response and situational awareness.
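    The pre-planned subscription idea described above can be sketched as a small publish-subscribe registry (illustrative only, not the ED3 API): data processes are registered per event type in advance, and an incoming alert triggers every matching process. The event fields, handler actions and class names are assumptions for the example.

    ```cpp
    // Hedged sketch: pre-registered data processes triggered by incoming event alerts.
    #include <functional>
    #include <map>
    #include <string>
    #include <vector>
    #include <cstdio>

    struct EventAlert { std::string type; std::string region; };
    using DataProcess = std::function<void(const EventAlert&)>;

    class Subscriptions {
    public:
        void preRegister(const std::string& eventType, DataProcess p) {
            handlers_[eventType].push_back(std::move(p));
        }
        void onAlert(const EventAlert& alert) {        // called when a new event arrives
            for (auto& p : handlers_[alert.type]) p(alert);
        }
    private:
        std::map<std::string, std::vector<DataProcess>> handlers_;
    };

    int main() {
        Subscriptions subs;
        subs.preRegister("tornado", [](const EventAlert& a) {
            std::printf("collect radar data for %s\n", a.region.c_str());
        });
        subs.preRegister("tornado", [](const EventAlert& a) {
            std::printf("filter social media posts near %s\n", a.region.c_str());
        });
        subs.onAlert({"tornado", "Alabama"});          // results accumulate into the event "album"
        return 0;
    }
    ```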

  13. TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach

    PubMed Central

    Elgendi, Mohamed

    2016-01-01

    Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages (“TERMA”) involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8×W1)≥W2≥(2×W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions. PMID:27827852

  14. Evolution of the ATLAS Software Framework towards Concurrency

    NASA Astrophysics Data System (ADS)

    Jones, R. W. L.; Stewart, G. A.; Leggett, C.; Wynne, B. M.

    2015-05-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000, and the software and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather requirements for an updated framework, going back to first principles of how event processing occurs. In this paper we report on both these aspects of our work. For the hive-based demonstrators, we discuss what changes were necessary in order to allow the serially designed ATLAS code to run, both to the framework and to the tools and algorithms used. We report on what general lessons were learned about the code patterns that had been employed in the software and which patterns were identified as particularly problematic for multi-threading. These lessons were fed into our considerations of a new framework, and we present preliminary conclusions on this work. In particular we identify areas where the framework can be simplified in order to aid the implementation of a concurrent event processing scheme. Finally, we discuss the practical difficulties involved in migrating a large established code base to a multi-threaded framework and how this can be achieved for LHC Run 3.

  15. Real Time Global Tests of the ALICE High Level Trigger Data Transport Framework

    NASA Astrophysics Data System (ADS)

    Becker, B.; Chattopadhyay, S.; Cicalo, C.; Cleymans, J.; de Vaux, G.; Fearick, R. W.; Lindenstruth, V.; Richter, M.; Rohrich, D.; Staley, F.; Steinbeck, T. M.; Szostak, A.; Tilsner, H.; Weis, R.; Vilakazi, Z. Z.

    2008-04-01

    The High Level Trigger (HLT) system of the ALICE experiment is an online event filter and trigger system designed for input bandwidths of up to 25 GB/s at event rates of up to 1 kHz. The system is designed as a scalable PC cluster, implementing several hundred nodes. The transport of data in the system is handled by an object-oriented data flow framework operating on the basis of the publisher-subscriber principle, designed to be fully pipelined with the lowest possible processing overhead and communication latency in the cluster. In this paper, we report the latest measurements in which this framework has been operated at five different sites over a global north-south link extending more than 10,000 km, processing a "real-time" data flow.

  16. Time, rate, and conditioning.

    PubMed

    Gallistel, C R; Gibbon, J

    2000-04-01

    The authors draw together and develop previous timing models for a broad range of conditioning phenomena to reveal their common conceptual foundations: First, conditioning depends on the learning of the temporal intervals between events and the reciprocals of these intervals, the rates of event occurrence. Second, remembered intervals and rates translate into observed behavior through decision processes whose structure is adapted to noise in the decision variables. The noise and the uncertainties consequent on it have both subjective and objective origins. A third feature of these models is their timescale invariance, which the authors argue is a very important property evident in the available experimental data. This conceptual framework is similar to the psychophysical conceptual framework in which contemporary models of sensory processing are rooted. The authors contrast it with the associative conceptual framework.

  17. Generalised synthesis of space-time variability in flood response: Dynamics of flood event types

    NASA Astrophysics Data System (ADS)

    Viglione, Alberto; Battista Chirico, Giovanni; Komma, Jürgen; Woods, Ross; Borga, Marco; Blöschl, Günter

    2010-05-01

    An analytical framework is used to characterise five flood events of different type in the Kamp area in Austria: one long-rain event, two short-rain events, one rain-on-snow event and one snowmelt event. Specifically, the framework quantifies the contributions of the space-time variability of rainfall/snowmelt, runoff coefficient, hillslope and channel routing to the flood runoff volume and the delay and spread of the resulting hydrograph. The results indicate that the components obtained by the framework clearly reflect the individual processes which characterise the event types. For the short-rain events, temporal, spatial and movement components can all be important in runoff generation and routing, which would be expected because of their local nature in time and, particularly, in space. For the long-rain event, the temporal components tend to be more important for runoff generation, because of the more uniform spatial coverage of rainfall, while for routing the spatial distribution of the produced runoff, which is not uniform, is also important. For the rain-on-snow and snowmelt events, the spatio-temporal variability terms typically do not play much role in runoff generation and the spread of the hydrograph is mainly due to the duration of the event. As an outcome of the framework, a dimensionless response number is proposed that represents the joint effect of runoff coefficient and hydrograph peakedness and captures the absolute magnitudes of the observed flood peaks.

  18. Quantifying space-time dynamics of flood event types

    NASA Astrophysics Data System (ADS)

    Viglione, Alberto; Chirico, Giovanni Battista; Komma, Jürgen; Woods, Ross; Borga, Marco; Blöschl, Günter

    2010-11-01

    A generalised framework of space-time variability in flood response is used to characterise five flood events of different type in the Kamp area in Austria: one long-rain event, two short-rain events, one rain-on-snow event and one snowmelt event. Specifically, the framework quantifies the contributions of the space-time variability of rainfall/snowmelt, runoff coefficient, hillslope and channel routing to the flood runoff volume and the delay and spread of the resulting hydrograph. The results indicate that the components obtained by the framework clearly reflect the individual processes which characterise the event types. For the short-rain events, temporal, spatial and movement components can all be important in runoff generation and routing, which would be expected because of their local nature in time and, particularly, in space. For the long-rain event, the temporal components tend to be more important for runoff generation, because of the more uniform spatial coverage of rainfall, while for routing the spatial distribution of the produced runoff, which is not uniform, is also important. For the rain-on-snow and snowmelt events, the spatio-temporal variability terms typically do not play much role in runoff generation and the spread of the hydrograph is mainly due to the duration of the event. As an outcome of the framework, a dimensionless response number is proposed that represents the joint effect of runoff coefficient and hydrograph peakedness and captures the absolute magnitudes of the observed flood peaks.

  19. Object-oriented models of cognitive processing.

    PubMed

    Mather, G

    2001-05-01

    Information-processing models of vision and cognition are inspired by procedural programming languages. Models that emphasize object-based representations are closely related to object-oriented programming languages. The concepts underlying object-oriented languages provide a theoretical framework for cognitive processing that differs markedly from that offered by procedural languages. This framework is well-suited to a system designed to deal flexibly with discrete objects and unpredictable events in the world.

  20. Status of the calibration and alignment framework at the Belle II experiment

    NASA Astrophysics Data System (ADS)

    Dossett, D.; Sevior, M.; Ritter, M.; Kuhr, T.; Bilka, T.; Yaschenko, S.; Belle Software Group, II

    2017-10-01

    The Belle II detector at the SuperKEKB e+e- collider plans to take first collision data in 2018. The monetary and CPU-time costs associated with storing and processing the data mean that it is crucial for the detector components at Belle II to be calibrated quickly and accurately. A fast and accurate calibration system would allow the high level trigger to increase the efficiency of event selection, and can give users analysis-quality reconstruction promptly. A flexible framework to automate the fast production of calibration constants is being developed in the Belle II Analysis Software Framework (basf2). Detector experts only need to create two components from C++ base classes in order to use the automation system. The first collects data from Belle II event data files and outputs much smaller files to pass to the second component, which runs the main calibration algorithm to produce calibration constants ready for upload into the conditions database. A Python framework coordinates the input files, the order of processing, and the submission of jobs. Splitting the operation into collection and algorithm-processing stages allows the framework to optionally parallelize the collection stage on a batch system.
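    A hedged sketch of the two-component pattern described above (the class names, methods and calibration constant are illustrative assumptions, not the actual basf2 calibration API): a collector reduces event data to a small summary, and an algorithm turns that summary into constants ready for the conditions database.

    ```cpp
    // Hedged sketch: collector + algorithm stages of an automated calibration.
    #include <map>
    #include <string>
    #include <cstdio>

    // First component: collects reduced data from Belle II event data files.
    class MyCollector /* : hypothetical CollectorBase */ {
    public:
        void collect(double residual) { sum_ += residual; ++n_; }   // called per event
        double mean() const { return n_ ? sum_ / n_ : 0.0; }
    private:
        double sum_ = 0.0;
        long   n_   = 0;
    };

    // Second component: turns the collected summary into calibration constants
    // ready for upload into the conditions database.
    class MyAlgorithm /* : hypothetical AlgorithmBase */ {
    public:
        std::map<std::string, double> calibrate(const MyCollector& c) {
            return {{"alignment_offset", -c.mean()}};   // illustrative constant
        }
    };

    int main() {
        MyCollector collector;
        for (double r : {0.12, 0.10, 0.14}) collector.collect(r);   // collection stage
        MyAlgorithm algo;
        for (auto& [name, value] : algo.calibrate(collector))       // algorithm stage
            std::printf("%s = %f\n", name.c_str(), value);
        return 0;
    }
    ```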

  1. Measuring adverse events in helicopter emergency medical services: establishing content validity.

    PubMed

    Patterson, P Daniel; Lave, Judith R; Martin-Gill, Christian; Weaver, Matthew D; Wadas, Richard J; Arnold, Robert M; Roth, Ronald N; Mosesso, Vincent N; Guyette, Francis X; Rittenberger, Jon C; Yealy, Donald M

    2014-01-01

    We sought to create a valid framework for detecting adverse events (AEs) in the high-risk setting of helicopter emergency medical services (HEMS). We assembled a panel of 10 expert clinicians (n = 6 emergency medicine physicians and n = 4 prehospital nurses and flight paramedics) affiliated with a large multistate HEMS organization in the Northeast US. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the content validity index (CVI), to quantify the validity of the framework's content. The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: (1) a trigger tool, (2) a method for rating proximal cause, and (3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. We demonstrate a standardized process for the development of a content-valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS.

  2. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    NASA Astrophysics Data System (ADS)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    2017-10-01

    ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware and involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves unique speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. Connecting ParaView to the Fermilab art framework will be described and the capabilities it brings discussed.

  3. The knowledge-learning-instruction framework: bridging the science-practice chasm to enhance robust student learning.

    PubMed

    Koedinger, Kenneth R; Corbett, Albert T; Perfetti, Charles

    2012-07-01

    Despite the accumulation of substantial cognitive science research relevant to education, there remains confusion and controversy in the application of research to educational practice. In support of a more systematic approach, we describe the Knowledge-Learning-Instruction (KLI) framework. KLI promotes the emergence of instructional principles of high potential for generality, while explicitly identifying constraints of and opportunities for detailed analysis of the knowledge students may acquire in courses. Drawing on research across domains of science, math, and language learning, we illustrate the analyses of knowledge, learning, and instructional events that the KLI framework affords. We present a set of three coordinated taxonomies of knowledge, learning, and instruction. For example, we identify three broad classes of learning events (LEs): (a) memory and fluency processes, (b) induction and refinement processes, and (c) understanding and sense-making processes, and we show how these can lead to different knowledge changes and constraints on optimal instructional choices. Copyright © 2012 Cognitive Science Society, Inc.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehlen, Mark Andrew; Vugrin, Eric D.; Warren, Drake E.

    In recent years, the nation has recognized that critical infrastructure protection should consider not only the prevention of disruptive events, but also the processes that infrastructure systems undergo to maintain functionality following disruptions. This more comprehensive approach has been termed critical infrastructure resilience (CIR). Given the occurrence of a particular disruptive event, the resilience of a system to that event is the system's ability to efficiently reduce both the magnitude and duration of the deviation from targeted system performance levels. Sandia National Laboratories (Sandia) has developed a comprehensive resilience assessment framework for evaluating the resilience of infrastructure and economic systems. The framework includes a quantitative methodology that measures resilience costs that result from a disruption to infrastructure function. The framework also includes a qualitative analysis methodology that assesses system characteristics that affect resilience in order to provide insight and direction for potential improvements to resilience. This paper describes the resilience assessment framework. This paper further demonstrates the utility of the assessment framework through application to a hypothetical scenario involving the disruption of a petrochemical supply chain by a hurricane.
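
    The quantitative part of such an assessment can be illustrated with a toy calculation (hypothetical numbers, not Sandia's published methodology): systemic impact is taken as the cumulative shortfall between targeted and actual performance over the disruption, converted to a monetary loss and added to the recovery effort to give a total resilience cost.

    import numpy as np

    # Hypothetical performance trajectories (fraction of targeted petrochemical throughput delivered).
    target = np.full(10, 1.0)                                    # targeted performance over 10 days
    actual = np.array([1.0, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.0])
    recovery_spend = np.array([0, 5, 5, 4, 3, 2, 1, 1, 0, 0])    # daily recovery effort ($M)
    loss_per_unit_day = 10.0                                     # assumed $M lost per unit-day of shortfall

    systemic_impact = loss_per_unit_day * np.trapz(target - actual)   # area between the curves, in $M
    total_recovery_effort = recovery_spend.sum()
    resilience_cost = systemic_impact + total_recovery_effort         # lower cost indicates a more resilient system

    print(systemic_impact, total_recovery_effort, resilience_cost)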

  5. A computational framework for prime implicants identification in noncoherent dynamic systems.

    PubMed

    Di Maio, Francesco; Baronchelli, Samuele; Zio, Enrico

    2015-01-01

    Dynamic reliability methods aim at complementing the capability of traditional static approaches (e.g., event trees [ETs] and fault trees [FTs]) by accounting for the system dynamic behavior and its interactions with the system state transition process. For this, the system dynamics is here described by a time-dependent model that includes the dependencies with the stochastic transition events. In this article, we present a novel computational framework for dynamic reliability analysis whose objectives are i) accounting for discrete stochastic transition events and ii) identifying the prime implicants (PIs) of the dynamic system. The framework entails adopting a multiple-valued logic (MVL) to consider stochastic transitions at discretized times. Then, PIs are originally identified by a differential evolution (DE) algorithm that looks for the optimal MVL solution of a covering problem formulated for MVL accident scenarios. For testing the feasibility of the framework, a dynamic noncoherent system composed of five components that can fail at discretized times has been analyzed, showing the applicability of the framework to practical cases. © 2014 Society for Risk Analysis.
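
    A small, hedged illustration of the multiple-valued-logic bookkeeping (toy encoding, not the published algorithm): accident scenarios are vectors of component states at discretized times, an implicant fixes some of those variables, and a covering check tests whether the implicant subsumes a scenario.

    # Toy MVL encoding: each variable is "state of component c in a time slice",
    # taking values 0 (working), 1 (failed early), 2 (failed late).  An implicant
    # assigns values to a subset of variables; '-' means "don't care".
    def covers(implicant, scenario):
        """True if every literal fixed by the implicant matches the scenario."""
        return all(v == "-" or v == s for v, s in zip(implicant, scenario))

    accident_scenarios = [
        (1, 0, 2, 0, 1),
        (1, 0, 2, 1, 1),
        (0, 2, 2, 0, 1),
    ]
    candidate = (1, "-", 2, "-", 1)   # a candidate implicant found, e.g., by differential evolution

    covered = [s for s in accident_scenarios if covers(candidate, s)]
    print(covered)   # the candidate covers the first two scenarios but not the third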

  6. Understanding HIV disclosure: A review and application of the Disclosure Processes Model

    PubMed Central

    Chaudoir, Stephenie R.; Fisher, Jeffrey D.; Simoni, Jane M.

    2014-01-01

    HIV disclosure is a critical component of HIV/AIDS prevention and treatment efforts, yet the field lacks a comprehensive theoretical framework with which to study how HIV-positive individuals make decisions about disclosing their serostatus and how these decisions affect them. Recent theorizing in the context of the Disclosure Processes Model has suggested that the disclosure process consists of antecedent goals, the disclosure event itself, mediating processes and outcomes, and a feedback loop. In this paper, we apply this new theoretical framework to HIV disclosure in order to review the current state of the literature, identify gaps in existing research, and highlight the implications of the framework for future work in this area. PMID:21514708

  7. Development of a coupled hydrological - hydrodynamic model for probabilistic catchment flood inundation modelling

    NASA Astrophysics Data System (ADS)

    Quinn, Niall; Freer, Jim; Coxon, Gemma; Dunne, Toby; Neal, Jeff; Bates, Paul; Sampson, Chris; Smith, Andy; Parkin, Geoff

    2017-04-01

    Computationally efficient flood inundation modelling systems capable of representing important hydrological and hydrodynamic flood generating processes over relatively large regions are vital for those interested in flood preparation, response, and real time forecasting. However, such systems are currently not readily available. This can be particularly important where flood predictions from intense rainfall are considered, as the processes leading to flooding often involve localised, non-linear spatially connected hillslope-catchment responses. Therefore, this research introduces a novel hydrological-hydraulic modelling framework for the provision of probabilistic flood inundation predictions across catchment to regional scales that explicitly account for spatial variability in rainfall-runoff and routing processes. Approaches have been developed to automate the provision of required input datasets and estimate essential catchment characteristics from freely available, national datasets. This is an essential component of the framework, as when making predictions over multiple catchments or at relatively large scales, and where data is often scarce, obtaining local information and manually incorporating it into the model quickly becomes infeasible. An extreme flooding event in the town of Morpeth, NE England, in 2008 was used as a first case study evaluation of the modelling framework introduced. The results demonstrated a high degree of prediction accuracy when comparing modelled and reconstructed event characteristics for the event, while the efficiency of the modelling approach used enabled the generation of relatively large ensembles of realisations from which uncertainty within the prediction may be represented. This research supports previous literature highlighting the importance of probabilistic forecasting, particularly during extreme events, which can often be poorly characterised or even missed by deterministic predictions due to the inherent uncertainty in any model application. Future research will aim to further evaluate the robustness of the approaches introduced by applying the modelling framework to a variety of historical flood events across UK catchments. Furthermore, the flexibility and efficiency of the framework is ideally suited to the examination of the propagation of errors through the model which will help gain a better understanding of the dominant sources of uncertainty currently impacting flood inundation predictions.

  8. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward with the "Theory of Event Coding (TEC)" being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that processes involved in voluntary event coding included predominantly stimulus categorization, feature unbinding and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a system's neurophysiological level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but not the other processes. The perceptual categorization stage appears central for voluntary event-file coding. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Automated event generation for loop-induced processes

    DOE PAGES

    Hirschi, Valentin; Mattelaer, Olivier

    2015-10-22

    We present the first fully automated implementation of cross-section computation and event generation for loop-induced processes. This work is integrated in the MadGraph5_aMC@NLO framework. We describe the optimisations implemented at the level of the matrix element evaluation, phase space integration and event generation allowing for the simulation of large multiplicity loop-induced processes. Along with some selected differential observables, we illustrate our results with a table showing inclusive cross-sections for all loop-induced hadronic scattering processes with up to three final states in the SM as well as for some relevant 2 → 4 processes. Furthermore, many of these are computed here for the first time.

  10. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware and involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView achieves unique speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. Connecting ParaView to the Fermilab art framework will be described and the capabilities it brings discussed.

  11. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
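
    A minimal discrete-event skeleton in Python (illustrative only, not the HCCM notation) shows the separation the framework argues for: entities and activities on one side, and an explicit control/dispatching policy deciding which queued request is served next on the other.

    import heapq

    events = []          # time-ordered event list: (time, sequence, description)
    queue = []           # waiting requests, each a (priority, name) pair
    seq = 0

    def schedule(time, description):
        global seq
        heapq.heappush(events, (time, seq, description))
        seq += 1

    def dispatch():
        """Control policy, kept separate from the entities: serve the highest-priority request."""
        return min(queue) if queue else None

    schedule(0.0, "arrival:A")
    schedule(1.0, "arrival:B")
    schedule(2.0, "server_free")
    while events:
        time, _, what = heapq.heappop(events)
        if what.startswith("arrival"):
            queue.append((1 if what.endswith("B") else 2, what.split(":")[1]))
        elif what == "server_free":
            chosen = dispatch()
            if chosen:
                queue.remove(chosen)
                print(f"t={time}: start service of {chosen[1]}")
                schedule(time + 1.5, "server_free")   # service completes, server freed again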

  12. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a models’ system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  13. Research and Evaluations of the Health Aspects of Disasters, Part IX: Risk-Reduction Framework.

    PubMed

    Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Loretti, Alessandro

    2016-06-01

    A disaster is a failure of resilience to an event. Mitigating the risks that a hazard will progress into a destructive event, or increasing the resilience of a society-at-risk, requires careful analysis, planning, and execution. The Disaster Logic Model (DLM) is used to define the value (effects, costs, and outcome(s)), impacts, and benefits of interventions directed at risk reduction. A Risk-Reduction Framework, based on the DLM, details the processes involved in hazard mitigation and/or capacity-building interventions to augment the resilience of a community or to decrease the risk that a secondary event will develop. This Framework provides the structure to systematically undertake and evaluate risk-reduction interventions. It applies to all interventions aimed at hazard mitigation and/or increasing the absorbing, buffering, or response capacities of a community-at-risk for a primary or secondary event that could result in a disaster. The Framework utilizes the structure provided by the DLM and consists of 14 steps: (1) hazards and risks identification; (2) historical perspectives and predictions; (3) selection of hazard(s) to address; (4) selection of appropriate indicators; (5) identification of current resilience standards and benchmarks; (6) assessment of the current resilience status; (7) identification of resilience needs; (8) strategic planning; (9) selection of an appropriate intervention; (10) operational planning; (11) implementation; (12) assessments of outputs; (13) synthesis; and (14) feedback. Each of these steps is a transformation process that is described in detail. Emphasis is placed on the role of Coordination and Control during planning, implementation of risk-reduction/capacity building interventions, and evaluation. Birnbaum ML , Daily EK , O'Rourke AP , Loretti A . Research and evaluations of the health aspects of disasters, part IX: Risk-Reduction Framework. Prehosp Disaster Med. 2016;31(3):309-325.

  14. Predicting Shear Transformation Events in Metallic Glasses

    NASA Astrophysics Data System (ADS)

    Xu, Bin; Falk, Michael L.; Li, J. F.; Kong, L. T.

    2018-03-01

    Shear transformation is the elementary process for plastic deformation of metallic glasses, the prediction of the occurrence of the shear transformation events is therefore of vital importance to understand the mechanical behavior of metallic glasses. In this Letter, from the view of the potential energy landscape, we find that the protocol-dependent behavior of shear transformation is governed by the stress gradient along its minimum energy path and we propose a framework as well as an atomistic approach to predict the triggering strains, locations, and structural transformations of the shear transformation events under different shear protocols in metallic glasses. Verification with a model Cu64 Zr36 metallic glass reveals that the prediction agrees well with athermal quasistatic shear simulations. The proposed framework is believed to provide an important tool for developing a quantitative understanding of the deformation processes that control mechanical behavior of metallic glasses.

  15. Predicting Shear Transformation Events in Metallic Glasses.

    PubMed

    Xu, Bin; Falk, Michael L; Li, J F; Kong, L T

    2018-03-23

    Shear transformation is the elementary process for plastic deformation of metallic glasses, the prediction of the occurrence of the shear transformation events is therefore of vital importance to understand the mechanical behavior of metallic glasses. In this Letter, from the view of the potential energy landscape, we find that the protocol-dependent behavior of shear transformation is governed by the stress gradient along its minimum energy path and we propose a framework as well as an atomistic approach to predict the triggering strains, locations, and structural transformations of the shear transformation events under different shear protocols in metallic glasses. Verification with a model Cu_{64}Zr_{36} metallic glass reveals that the prediction agrees well with athermal quasistatic shear simulations. The proposed framework is believed to provide an important tool for developing a quantitative understanding of the deformation processes that control mechanical behavior of metallic glasses.

  16. ATLAS Metadata Infrastructure Evolution for Run 2 and Beyond

    NASA Astrophysics Data System (ADS)

    van Gemmeren, P.; Cranshaw, J.; Malon, D.; Vaniachine, A.

    2015-12-01

    ATLAS developed and employed for Run 1 of the Large Hadron Collider a sophisticated infrastructure for metadata handling in event processing jobs. This infrastructure profits from a rich feature set provided by the ATLAS execution control framework, including standardized interfaces and invocation mechanisms for tools and services, segregation of transient data stores with concomitant object lifetime management, and mechanisms for handling occurrences asynchronous to the control framework's state machine transitions. This metadata infrastructure is evolving and being extended for Run 2 to allow its use and reuse in downstream physics analyses, analyses that may or may not utilize the ATLAS control framework. At the same time, multiprocessing versions of the control framework and the requirements of future multithreaded frameworks are leading to redesign of components that use an incident-handling approach to asynchrony. The increased use of scatter-gather architectures, both local and distributed, requires further enhancement of metadata infrastructure in order to ensure semantic coherence and robust bookkeeping. This paper describes the evolution of ATLAS metadata infrastructure for Run 2 and beyond, including the transition to dual-use tools—tools that can operate inside or outside the ATLAS control framework—and the implications thereof. It further examines how the design of this infrastructure is changing to accommodate the requirements of future frameworks and emerging event processing architectures.

  17. Measuring Adverse Events in Helicopter Emergency Medical Services: Establishing Content Validity

    PubMed Central

    Patterson, P. Daniel; Lave, Judith R.; Martin-Gill, Christian; Weaver, Matthew D.; Wadas, Richard J.; Arnold, Robert M.; Roth, Ronald N.; Mosesso, Vincent N.; Guyette, Francis X.; Rittenberger, Jon C.; Yealy, Donald M.

    2015-01-01

    Introduction We sought to create a valid framework for detecting Adverse Events (AEs) in the high-risk setting of Helicopter Emergency Medical Services (HEMS). Methods We assembled a panel of 10 expert clinicians (n=6 emergency medicine physicians and n=4 prehospital nurses and flight paramedics) affiliated with a large multi-state HEMS organization in the Northeast U.S. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the Content Validity Index (CVI), to quantify the validity of the framework’s content. Results The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: 1) a trigger tool, 2) a method for rating proximal cause, and 3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. Conclusions We demonstrate a standardized process for the development of a content valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS. PMID:24003951

  18. Constraints on the Observation of Partial Match Costs: Implications for Transfer-Appropriate Processing Approaches to Immediate Priming

    ERIC Educational Resources Information Center

    Leboe, Jason P.; Leboe, Launa C.; Milliken, Bruce

    2010-01-01

    According to a transfer-appropriate processing framework, immediate priming costs arise from a match between a prime and probe event on 1 dimension and a difference between those 2 events on some other dimension (i.e., a partial match). In Experiment 1, the authors used a Stroop priming procedure to generate 6 variants of partial match, yet only 1…

  19. A Decentralized Compositional Framework for Dependable Decision Process in Self-Managed Cyber Physical Systems

    PubMed Central

    Hou, Kun-Mean; Zhang, Zhan

    2017-01-01

    Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare the reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against the complexity of decision and random failure. This paper also shows a way to simplify the evaluation for dynamic system by improving the composability and compositionality of the subsystem. PMID:29120357

  20. A Decentralized Compositional Framework for Dependable Decision Process in Self-Managed Cyber Physical Systems.

    PubMed

    Zhou, Peng; Zuo, Decheng; Hou, Kun-Mean; Zhang, Zhan

    2017-11-09

    Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare the reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against the complexity of decision and random failure. This paper also shows a way to simplify the evaluation for dynamic system by improving the composability and compositionality of the subsystem.

  1. An Enhanced Text-Mining Framework for Extracting Disaster Relevant Data through Social Media and Remote Sensing Data Fusion

    NASA Astrophysics Data System (ADS)

    Scheele, C. J.; Huang, Q.

    2016-12-01

    In the past decade, the rise in social media has led to the development of a vast number of social media services and applications. Disaster management represents one of such applications leveraging massive data generated for event detection, response, and recovery. In order to find disaster relevant social media data, current approaches utilize natural language processing (NLP) methods based on keywords, or machine learning algorithms relying on text only. However, these approaches cannot be perfectly accurate due to the variability and uncertainty in language used on social media. To improve current methods, the enhanced text-mining framework is proposed to incorporate location information from social media and authoritative remote sensing datasets for detecting disaster relevant social media posts, which are determined by assessing the textual content using common text mining methods and how the post relates spatiotemporally to the disaster event. To assess the framework, geo-tagged Tweets were collected for three different spatial and temporal disaster events: hurricane, flood, and tornado. Remote sensing data and products for each event were then collected using RealEarthTM. Both Naive Bayes and Logistic Regression classifiers were used to compare the accuracy within the enhanced text-mining framework. Finally, the accuracies from the enhanced text-mining framework were compared to the current text-only methods for each of the case study disaster events. The results from this study address the need for more authoritative data when using social media in disaster management applications.
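
    The text-only baseline that the enhanced framework extends can be reproduced in a few lines with scikit-learn (toy tweets and labels for illustration; the spatiotemporal remote-sensing check described above would be applied on top of this).

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Toy geo-tagged tweets labelled as disaster relevant (1) or not (0).
    tweets = ["flood water rising on main street", "great day at the beach",
              "tornado just touched down near the school", "watching a movie tonight"]
    labels = [1, 0, 1, 0]

    for clf in (MultinomialNB(), LogisticRegression(max_iter=1000)):
        model = make_pipeline(TfidfVectorizer(), clf)
        model.fit(tweets, labels)
        print(type(clf).__name__, model.predict(["street flooded after the storm"]))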

  2. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor for decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  3. Cognitive-Behavioral Conceptualization and Treatment of Anger

    ERIC Educational Resources Information Center

    Deffenbacher, Jerry L.

    2011-01-01

    Anger is conceptualized within a broad cognitive-behavioral (CBT) framework emphasizing triggering events; the person's pre-anger state, including temporary conditions and more enduring cognitive and familial/cultural processes; primary and secondary appraisal processes; the anger experience/response (cognitive, emotional, and physiological…

  4. Leveraging Semantic Labels for Multi-level Abstraction in Medical Process Mining and Trace Comparison.

    PubMed

    Leonardi, Giorgio; Striani, Manuel; Quaglini, Silvana; Cavallini, Anna; Montani, Stefania

    2018-05-21

    Many medical information systems record data about the executed process instances in the form of an event log. In this paper, we present a framework, able to convert actions in the event log into higher level concepts, at different levels of abstraction, on the basis of domain knowledge. Abstracted traces are then provided as an input to trace comparison and semantic process discovery. Our abstraction mechanism is able to manage non-trivial situations, such as interleaved actions or delays between two actions that abstract to the same concept. Trace comparison resorts to a similarity metric able to take into account abstraction phase penalties, and to deal with quantitative and qualitative temporal constraints in abstracted traces. As for process discovery, we rely on classical algorithms embedded in the framework ProM, made semantic by the capability of abstracting the actions on the basis of their conceptual meaning. The approach has been tested in stroke care, where we adopted abstraction and trace comparison to cluster event logs of different stroke units, to highlight (in)correct behavior, abstracting from details. We also provide process discovery results, showing how the abstraction mechanism makes it possible to obtain stroke process models that are more easily interpretable by neurologists. Copyright © 2018. Published by Elsevier Inc.
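
    The core abstraction step can be sketched as follows (hypothetical ontology mapping and toy trace, not the authors' implementation): logged actions are mapped to higher-level concepts, and successive actions that abstract to the same concept are merged into one abstracted interval, even across a short delay between them.

    # Hypothetical mapping from logged actions to higher-level clinical concepts.
    ontology = {"CT scan": "Diagnostics", "blood test": "Diagnostics",
                "aspirin given": "Antithrombotic therapy", "heparin given": "Antithrombotic therapy"}

    trace = [("CT scan", 0, 1), ("blood test", 1, 2), ("aspirin given", 3, 4), ("heparin given", 5, 6)]

    abstracted = []
    for action, start, end in trace:
        concept = ontology.get(action, action)
        # Merge with the previous interval when the same concept continues (absorbing the delay).
        if abstracted and abstracted[-1][0] == concept:
            abstracted[-1] = (concept, abstracted[-1][1], end)
        else:
            abstracted.append((concept, start, end))

    print(abstracted)   # [('Diagnostics', 0, 2), ('Antithrombotic therapy', 3, 6)]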

  5. A framework for modeling scenario-based barrier island storm impacts

    USGS Publications Warehouse

    Mickey, Rangley; Long, Joseph W.; Dalyander, P. Soupy; Plant, Nathaniel G.; Thompson, David M.

    2018-01-01

    Methods for investigating the vulnerability of existing or proposed coastal features to storm impacts often rely on simplified parametric models or one-dimensional process-based modeling studies that focus on changes to a profile across a dune or barrier island. These simple studies tend to neglect the impacts to curvilinear or alongshore varying island planforms, influence of non-uniform nearshore hydrodynamics and sediment transport, irregular morphology of the offshore bathymetry, and impacts from low magnitude wave events (e.g. cold fronts). Presented here is a framework for simulating regionally specific, low and high magnitude scenario-based storm impacts to assess the alongshore variable vulnerabilities of a coastal feature. Storm scenarios based on historic hydrodynamic conditions were derived and simulated using the process-based morphologic evolution model XBeach. Model results show that the scenarios predicted similar patterns of erosion and overwash when compared to observed qualitative morphologic changes from recent storm events that were not included in the dataset used to build the scenarios. The framework model simulations were capable of predicting specific areas of vulnerability in the existing feature and the results illustrate how this storm vulnerability simulation framework could be used as a tool to help inform the decision-making process for scientists, engineers, and stakeholders involved in coastal zone management or restoration projects.

  6. An Advanced Simulation Framework for Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.

    1994-01-01

    Discrete-event simulation (DEVS) users have long been faced with a three-way trade-off of balancing execution time, model fidelity, and number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than the desired performance in one or more of these areas.

  7. Simple stochastic model for El Niño with westerly wind bursts

    PubMed Central

    Thual, Sulian; Majda, Andrew J.; Chen, Nan; Stechmann, Samuel N.

    2016-01-01

    Atmospheric wind bursts in the tropics play a key role in the dynamics of the El Niño Southern Oscillation (ENSO). A simple modeling framework is proposed that summarizes this relationship and captures major features of the observational record while remaining physically consistent and amenable to detailed analysis. Within this simple framework, wind burst activity evolves according to a stochastic two-state Markov switching–diffusion process that depends on the strength of the western Pacific warm pool, and is coupled to simple ocean–atmosphere processes that are otherwise deterministic, stable, and linear. A simple model with this parameterization and no additional nonlinearities reproduces a realistic ENSO cycle with intermittent El Niño and La Niña events of varying intensity and strength as well as realistic buildup and shutdown of wind burst activity in the western Pacific. The wind burst activity has a direct causal effect on the ENSO variability: in particular, it intermittently triggers regular El Niño or La Niña events, super El Niño events, or no events at all, which enables the model to capture observed ENSO statistics such as the probability density function and power spectrum of eastern Pacific sea surface temperatures. The present framework provides further theoretical and practical insight on the relationship between wind burst activity and the ENSO. PMID:27573821
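
    A schematic simulation of the two-state idea (illustrative parameter values, not those of the published model): the burst state switches on and off with probabilities modulated by warm-pool strength, and while active the wind burst amplitude follows a simple mean-reverting diffusion.

    import numpy as np

    rng = np.random.default_rng(0)
    dt, n = 0.1, 2000
    state, a = 0, 0.0                 # 0 = quiescent, 1 = active wind burst state
    amplitude = np.zeros(n)

    for k in range(n):
        warm_pool = 0.5 + 0.5 * np.sin(2 * np.pi * k * dt / 40.0)     # proxy for warm-pool strength
        p_on, p_off = 0.02 * (1 + warm_pool) * dt, 0.05 * dt          # switching probabilities per step
        if state == 0 and rng.random() < p_on:
            state = 1
        elif state == 1 and rng.random() < p_off:
            state, a = 0, 0.0
        if state == 1:                # mean-reverting (Ornstein-Uhlenbeck-like) diffusion when active
            a += -0.5 * a * dt + 1.0 * np.sqrt(dt) * rng.standard_normal()
        amplitude[k] = a

    print(amplitude.max(), amplitude.min())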

  8. Causal learning and inference as a rational process: the new synthesis.

    PubMed

    Holyoak, Keith J; Cheng, Patricia W

    2011-01-01

    Over the past decade, an active line of research within the field of human causal learning and inference has converged on a general representational framework: causal models integrated with bayesian probabilistic inference. We describe this new synthesis, which views causal learning and inference as a fundamentally rational process, and review a sample of the empirical findings that support the causal framework over associative alternatives. Causal events, like all events in the distal world as opposed to our proximal perceptual input, are inherently unobservable. A central assumption of the causal approach is that humans (and potentially nonhuman animals) have been designed in such a way as to infer the most invariant causal relations for achieving their goals based on observed events. In contrast, the associative approach assumes that learners only acquire associations among important observed events, omitting the representation of the distal relations. By incorporating bayesian inference over distributions of causal strength and causal structures, along with noisy-logical (i.e., causal) functions for integrating the influences of multiple causes on a single effect, human judgments about causal strength and structure can be predicted accurately for relatively simple causal structures. Dynamic models of learning based on the causal framework can explain patterns of acquisition observed with serial presentation of contingency data and are consistent with available neuroimaging data. The approach has been extended to a diverse range of inductive tasks, including category-based and analogical inferences.
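
    One concrete instance of the noisy-logical functions mentioned above is the noisy-OR rule for generative causes; a worked example with hypothetical causal strengths is sketched below.

    def noisy_or(background, strengths, causes):
        """P(effect | causes) when each present cause can independently produce the effect."""
        p_fail = 1.0 - background                  # the background cause fails to produce the effect
        for w, present in zip(strengths, causes):
            if present:
                p_fail *= (1.0 - w)                # each present cause independently fails with prob 1 - w
        return 1.0 - p_fail

    # Hypothetical strengths: background 0.1, cause 1 has strength 0.6, cause 2 has strength 0.3.
    print(noisy_or(0.1, [0.6, 0.3], [1, 0]))   # only cause 1 present -> 0.64
    print(noisy_or(0.1, [0.6, 0.3], [1, 1]))   # both causes present  -> 0.748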

  9. Framework for Detection and Localization of Extreme Climate Event with Pixel Recursive Super Resolution

    NASA Astrophysics Data System (ADS)

    Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.

    2017-12-01

    Deep learning techniques have been successfully applied to solve many problems in climate and geoscience using massive observed and modeled datasets. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain superior performance that overshadows all previous handcrafted, expert-based methods. The remaining issue, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two deep neural network models: (1) Convolutional Neural Networks (CNNs) to detect and localize extreme climate events, and (2) a pixel recursive super resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we present two CNNs in our framework for different purposes, detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel recursive super resolution model enhances the resolution of the input to the localization CNN. We present the best-performing network using the pixel recursive super resolution model, which synthesizes details of tropical cyclones in ground truth data while enhancing their resolution. Therefore, this approach not only dramatically reduces the human effort, but also suggests the possibility of reducing the computing cost required for downscaling to increase the resolution of the data.
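
    A minimal detection network of the kind described, written with PyTorch and hypothetical layer sizes (not the authors' architecture), mapping a coarse two-variable reanalysis patch to an event / no-event score:

    import torch
    import torch.nn as nn

    class EventDetector(nn.Module):
        """Tiny CNN that classifies a coarse climate patch (e.g. 2 variables on a 32x32 grid)."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 8 * 8, 2)   # scores for "no event" / "extreme event"

        def forward(self, x):
            x = self.features(x)
            return self.classifier(x.flatten(1))

    patch = torch.randn(1, 2, 32, 32)        # one 2-channel 32x32 coarse patch
    print(EventDetector()(patch).shape)      # torch.Size([1, 2])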

  10. Virtual reality for spherical images

    NASA Astrophysics Data System (ADS)

    Pilarczyk, Rafal; Skarbek, Władysław

    2017-08-01

    This paper presents a virtual reality application framework and application concept for mobile devices. The framework uses the Google Cardboard library for the Android operating system and allows a virtual reality 360 video player to be created using standard OpenGL ES rendering methods. It provides network methods for connecting to a web server acting as the application resource provider; resources are delivered as JSON responses to HTTP requests, and the web server also uses the Socket.IO library for synchronous communication between application and server. The framework implements methods to create an event-driven process for rendering additional content based on the video timestamp and the virtual reality head point of view.

  11. Piloting a mass gathering conceptual framework at an Adelaide schoolies festival.

    PubMed

    Hutton, Alison; Munt, Rebecca; Zeitz, Kathryn; Cusack, Lynette; Kako, Mayumi; Arbon, Paul

    2010-01-01

    During the summer months in Australia, school leavers celebrate their end of school life at schoolies festivals around the nation. These events are typically described as mass gatherings, as they are organised events taking place within a defined space, attended by a large number of people. A project was undertaken to analyse the usefulness of Arbon's (2004) conceptual model of mass gatherings in order to develop a process to better understand the Adelaide Schoolies Festival. Arbon's conceptual framework describes the inter-relationship between the psychosocial, environmental and bio-medical domains of a mass gathering. Each domain has set characteristics which help to understand the impact on the mass gathering event. The characteristics within the three domains were collected using fieldwork and bio-medical data to examine the relationship between injury and illness rates. Using the conceptual framework to evaluate this schoolies event helped create an understanding of the physiology, environment and behaviour contributing to patient presentations. Results showed that the schoolies crowd was active and energetic, and the main crowd behaviour observed was dancing and socialising with friends. The environmental domain was characterised by a grassy outdoor venue that was bounded and dry. Due to the overall health of the crowd, activities undertaken and the supportive environment, the majority of injuries to schoolies were minor (68%). However, twenty-four percent of schoolies who presented with alcohol-related illness were found to have consumed alcohol at risky levels; half of this cohort was transported to hospital. The conceptual framework successfully guided a higher level of examination of the mass gathering event. In particular, the framework facilitated a greater understanding of the inter-relationships of the various characteristics of a mass gathering event, in this case the Adelaide Schoolies Festival.

  12. Foundations for Streaming Model Transformations by Complex Event Processing.

    PubMed

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow; and one in the context of on-the-fly gesture recognition.
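
    The pattern-recognition part can be illustrated with a small, generic sketch (plain Python with hypothetical event types, not the VIATRA DSL): model-change events arrive on a stream, and a complex-event rule fires a transformation when a given sequence occurs within a time window.

    from collections import deque

    WINDOW = 5.0                               # seconds; hypothetical time window for the complex event
    PATTERN = ("node_created", "edge_added")   # sequence of model-change event types to detect

    recent = deque()

    def transform(node, edge):
        print(f"transformation triggered for {node} and {edge}")

    def on_event(kind, timestamp, payload):
        """Feed one model-change event into the complex event processor."""
        recent.append((kind, timestamp, payload))
        while recent and timestamp - recent[0][1] > WINDOW:
            recent.popleft()                   # drop events that fell outside the window
        kinds = [k for k, _, _ in recent]
        for i in range(len(kinds) - len(PATTERN) + 1):
            if tuple(kinds[i:i + len(PATTERN)]) == PATTERN:
                transform(recent[i][2], recent[i + 1][2])   # reactive rule: run the transformation

    on_event("node_created", 0.0, "n1")
    on_event("edge_added", 1.0, "n1->n2")      # matches the pattern, so the transformation fires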

  13. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    PubMed

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents-food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework-the Key Events Dose-Response Framework (KEDRF)-for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  14. Biasing spatial attention with semantic information: an event coding approach.

    PubMed

    Amer, Tarek; Gozli, Davood G; Pratt, Jay

    2017-04-21

    We investigated the influence of conceptual processing on visual attention from the standpoint of Theory of Event Coding (TEC). The theory makes two predictions: first, an important factor in determining the influence of event 1 on processing event 2 is whether features of event 1 are bound into a unified representation (i.e., selection or retrieval of event 1). Second, whether processing the two events facilitates or interferes with each other should depend on the extent to which their constituent features overlap. In two experiments, participants performed a visual-attention cueing task, in which the visual target (event 2) was preceded by a relevant or irrelevant explicit (e.g., "UP") or implicit (e.g., "HAPPY") spatial-conceptual cue (event 1). Consistent with TEC, we found relevant explicit cues (which featurally overlap to a greater extent with the target) and implicit cues (which featurally overlap to a lesser extent), respectively, facilitated and interfered with target processing at compatible locations. Irrelevant explicit and implicit cues, on the other hand, both facilitated target processing, presumably because they were less likely selected or retrieved as an integrated and unified event file. We argue that such effects, often described as "attentional cueing", are better accounted for within the event coding framework.

  15. A collaborative computing framework of cloud network and WBSN applied to fall detection and 3-D motion reconstruction.

    PubMed

    Lai, Chin-Feng; Chen, Min; Pan, Jeng-Shyang; Youn, Chan-Hyun; Chao, Han-Chieh

    2014-03-01

    As cloud computing and wireless body sensor network technologies gradually mature, ubiquitous healthcare services can prevent accidents instantly and effectively, as well as provide relevant information to reduce the related processing time and cost. This study proposes a co-processing intermediary framework integrating cloud and wireless body sensor networks, applied mainly to fall detection and 3-D motion reconstruction. The main focuses include distributed computing and the allocation of resources for processing sensing data over the computing architecture, network conditions, and performance evaluation. Through this framework, the transmission and computing time of sensing data are reduced to enhance overall performance of the fall event detection and 3-D motion reconstruction services.

  16. New Decision Support for Landslide and Other Disaster Events

    NASA Astrophysics Data System (ADS)

    Nair, U. S.; Keiser, K.; Wu, Y.; Kaulfus, A.; Srinivasan, K.; Anderson, E. R.; McEniry, M.

    2013-12-01

    An Event-Driven Data delivery (ED3) framework has been created that provides reusable services and configurations to support better data preparedness for decision support of disasters and other events by rapidly providing pre-planned access to data, special processing, modeling and other capabilities, all executed in response to criteria-based events. ED3 helps decision makers plan, in advance of disasters and other types of events, for the data necessary for decisions and response activities. A layer of services provided in the ED3 framework allows systems to support user definition of subscriptions for data plans that will be triggered when events matching specified criteria occur. Pre-planning for data in response to events lessens the burden on decision makers in the aftermath of an event and allows planners to think through the desired processing for specialized data products. Additionally, the ED3 framework provides support for listening for event alerts and for multiple workflow managers that provide data and processing functionality in response to events. Landslides are often costly and, at times, deadly disaster events. Whereas intense and/or sustained rainfall is often the primary trigger for landslides, soil type and slope are also important factors in determining the location and timing of slope failure. Accounting for the substantial spatial variability of these factors is one of the major difficulties when predicting the timing and location of slope failures. A wireless sensor network (WSN), developed by NASA SERVIR and USRA, with peer-to-peer communication capability and low power consumption, is ideal for spatially dense in situ monitoring in remote locations. In collaboration with the University of Alabama in Huntsville, a WSN equipped with accelerometer, rainfall, and soil moisture sensors is being integrated into an end-to-end landslide warning system. The WSN is being tested to ascertain communication capabilities and the density of nodes required depending upon the nature of terrain and land cover. The performance of a water table model, to be utilized in the end-to-end system, is being evaluated by comparing against landslides that occurred during the 6th and 7th of May, 2003 and the 20th and 21st of April, 2011. The model provides a deterministic assessment of slope stability by evaluating horizontal and vertical transport of underground water and associated weight-bearing capacity. In the proposed end-to-end system, the model will be coupled to the WSN, and the in situ data collected will be used to drive the model. The output from the model could be communicated back to the WSN, providing the capability of issuing warnings of possible events to the ED3 framework, which can trigger additional data retrieval or the processing of additional models based on decision makers' ED3 preparedness plans. NASA's Applied Science Program has funded a feasibility study of the ED3 technology and, as a result, the capability is on track to be integrated into existing decision support systems, with an initial reference implementation hosted at the Global Hydrology Resource Center, a NASA distributed active archive center (DAAC).
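
    The subscription idea at the heart of ED3 can be sketched generically (hypothetical criteria and actions, not the actual ED3 services): a data plan registers the event criteria it cares about and the processing it wants run, and incoming alerts are matched against all registered plans.

    # Hypothetical event-driven data plan registry in the spirit of ED3.
    plans = []

    def register_plan(criteria, actions):
        """criteria: predicate over an alert; actions: data retrievals / model runs to launch."""
        plans.append((criteria, actions))

    def on_alert(alert):
        for criteria, actions in plans:
            if criteria(alert):
                for action in actions:
                    print(f"triggering '{action}' for alert {alert}")

    register_plan(
        criteria=lambda a: a["type"] == "landslide_warning" and a["rain_24h_mm"] > 100,
        actions=["retrieve satellite imagery", "run water table model", "notify responders"],
    )

    on_alert({"type": "landslide_warning", "rain_24h_mm": 140, "site": "hillslope-3"})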

  17. A framework for analysis of sentinel events in medical student education.

    PubMed

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  18. Information Processing of Trauma.

    ERIC Educational Resources Information Center

    Hartman, Carol R.; Burgess, Ann W.

    1993-01-01

    This paper presents a neuropsychosocial model of information processing to explain a victimization experience, specifically child sexual abuse. It surveys the relation of sensation, perception, and cognition as a systematic way to provide a framework for studying human behavior and describing human response to traumatic events. (Author/JDD)

  19. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
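
    The dependency-tree idea can be reproduced in miniature with scikit-learn (toy data and hypothetical metric names): a regression tree is fitted with the KPI as the target and lower-level process/QoS metrics as features, and the learned splits show which factors the KPI depends on most.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor, export_text

    rng = np.random.default_rng(1)
    n = 200
    service_time = rng.uniform(1, 10, n)        # QoS metric: back-end service response time (s)
    queue_length = rng.integers(0, 20, n)       # process metric: orders waiting for approval
    kpi = 2 * service_time + 0.5 * queue_length + rng.normal(0, 1, n)   # KPI: order lead time

    X = np.column_stack([service_time, queue_length])
    tree = DecisionTreeRegressor(max_depth=2).fit(X, kpi)
    print(export_text(tree, feature_names=["service_time", "queue_length"]))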

  20. A software framework for real-time multi-modal detection of microsleeps.

    PubMed

    Knopp, Simon J; Bones, Philip J; Weddell, Stephen J; Jones, Richard D

    2017-09-01

    A software framework is described which was designed to process EEG, video of one eye, and head movement in real time, towards achieving early detection of microsleeps for prevention of fatal accidents, particularly in transport sectors. The framework is based around a pipeline structure with user-replaceable signal processing modules. This structure can encapsulate a wide variety of feature extraction and classification techniques and can be applied to detecting a variety of aspects of cognitive state. Users of the framework can implement signal processing plugins in C++ or Python. The framework also provides a graphical user interface and the ability to save and load data to and from arbitrary file formats. Two small studies are reported which demonstrate the capabilities of the framework in typical applications: monitoring eye closure and detecting simulated microsleeps. While specifically designed for microsleep detection/prediction, the software framework can be just as appropriately applied to (i) other measures of cognitive state and (ii) development of biomedical instruments for multi-modal real-time physiological monitoring and event detection in intensive care, anaesthesiology, cardiology, neurosurgery, etc. The software framework has been made freely available for researchers to use and modify under an open source licence.
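
    The pipeline-of-replaceable-modules structure described above can be sketched as follows; the class and stage names are illustrative assumptions and are not the framework's actual C++/Python plugin API.

        # Sketch of a pipeline with user-replaceable processing modules, illustrating
        # the structure described above; not the framework's actual plugin API.
        from typing import Callable, List, Sequence

        Module = Callable[[Sequence[float]], Sequence[float]]   # one processing stage

        class Pipeline:
            def __init__(self, modules: List[Module]) -> None:
                self.modules = modules

            def process(self, samples: Sequence[float]) -> Sequence[float]:
                for module in self.modules:          # each stage feeds the next
                    samples = module(samples)
                return samples

        def remove_mean(x: Sequence[float]) -> Sequence[float]:
            m = sum(x) / len(x)
            return [v - m for v in x]

        def rectify(x: Sequence[float]) -> Sequence[float]:
            return [abs(v) for v in x]

        # Swapping a module (e.g. a different feature extractor or classifier)
        # only requires replacing one entry in the list.
        pipeline = Pipeline([remove_mean, rectify])
        print(pipeline.process([1.0, 2.0, 3.0, 4.0]))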

  1. Knowledge Discovery, Integration and Communication for Extreme Weather and Flood Resilience Using Artificial Intelligence: Flood AI Alpha

    NASA Astrophysics Data System (ADS)

    Demir, I.; Sermet, M. Y.

    2016-12-01

    Nobody is immune from extreme events or natural hazards that can lead to large-scale consequences for the nation and public. One of the solutions to reduce the impacts of extreme events is to invest in improving resilience with the ability to better prepare, plan, recover, and adapt to disasters. The National Research Council (NRC) report discusses how to increase resilience to extreme events through a vision of a resilient nation in the year 2030. The report highlights the importance of data as well as the information gaps and knowledge challenges that need to be addressed, and suggests that every individual have access to risk and vulnerability information to make their communities more resilient. This abstract presents our project on developing a resilience framework for flooding to improve societal preparedness, with the following objectives: (a) develop a generalized ontology for extreme events with a primary focus on flooding; (b) develop a knowledge engine with voice recognition, artificial intelligence, natural language processing, and an inference engine. The knowledge engine will utilize the flood ontology and concepts to connect user input to relevant knowledge discovery outputs on flooding; (c) develop a data acquisition and processing framework from existing environmental observations, forecast models, and social networks. The system will utilize the framework, capabilities and user base of the Iowa Flood Information System (IFIS) to populate and test the system; (d) develop a communication framework to support user interaction and delivery of information to users. The interaction and delivery channels will include voice and text input via a web-based system (e.g. IFIS), agent-based bots (e.g. Microsoft Skype, Facebook Messenger), smartphone and augmented reality applications (e.g. smart assistant), and automated web workflows (e.g. IFTTT, CloudWork) to open knowledge discovery for flooding to thousands of community-extensible web workflows.

  2. Identity in agent-based models : modeling dynamic multiscale social processes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozik, J.; Sallach, D. L.; Macal, C. M.

    Identity-related issues play central roles in many current events, including those involving factional politics, sectarianism, and tribal conflicts. Two popular models from the computational-social-science (CSS) literature - the Threat Anticipation Program and SharedID models - incorporate notions of identity (individual and collective) and processes of identity formation. A multiscale conceptual framework that extends some ideas presented in these models and draws other capabilities from the broader CSS literature is useful in modeling the formation of political identities. The dynamic, multiscale processes that constitute and transform social identities can be mapped to expressive structures of the framework.

  3. Multilevel analysis of sports video sequences

    NASA Astrophysics Data System (ADS)

    Han, Jungong; Farin, Dirk; de With, Peter H. N.

    2006-01-01

    We propose a fully automatic and flexible framework for analysis and summarization of tennis broadcast video sequences, using visual features and specific game-context knowledge. Our framework can analyze a tennis video sequence at three levels, which provides a broad range of different analysis results. The proposed framework includes novel pixel-level and object-level tennis video processing algorithms, such as a moving-player detection algorithm that takes both color and court (playing-field) information into account, and a player-position tracking algorithm based on a 3-D camera model. Additionally, we employ scene-level models for detecting events, like service, base-line rally and net-approach, based on a number of real-world visual features. The system can summarize three forms of information: (1) all court-view playing frames in a game, (2) the moving trajectory and real speed of each player, as well as the relative position between the player and the court, (3) the semantic event segments in a game. The proposed framework is flexible in choosing the level of analysis that is desired. It is effective because the framework makes use of several visual cues obtained from the real-world domain to model important events like service, thereby increasing the accuracy of the scene-level analysis. The paper presents attractive experimental results highlighting the system efficiency and analysis capabilities.

  4. Experimental Learning Enhancing Improvisation Skills

    ERIC Educational Resources Information Center

    Pereira Christopoulos, Tania; Wilner, Adriana; Trindade Bestetti, Maria Luisa

    2016-01-01

    Purpose: This study aims to present improvisation training and experimentation as an alternative method to deal with unexpected events in which structured processes do not seem to work. Design/Methodology/Approach: Based on the literature of sensemaking and improvisation, the study designs a framework and process model of experimental learning…

  5. Generation of Department of Defense Architecture Framework (DODAF) Models Using the Monterey Phoenix Behavior Modeling Approach

    DTIC Science & Technology

    2015-09-01

    [Entry excerpt garbled in extraction; recoverable fragments: Figure 30, an order processing state diagram (after Fowler and Scott 1997), shows an OV-6b for order processing states; the order processing state transition starts at checking order and ends at order delivered.]

  6. A simulation framework for mapping risks in clinical processes: the case of in-patient transfers.

    PubMed

    Dunn, Adam G; Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne

    2011-05-01

    To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital using an agent-based simulation framework. An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. The simulation allowed us to model the risk of adverse events in a clinical process, by generating the variety of possible work subject to violations, a novel prospective risk analysis method. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols.
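
    A toy Monte Carlo in the spirit of the violation-cascade simulation above is sketched below; the step names, violation probabilities, and the rule for when risk materialises are invented for illustration and are not the study's calibrated model.

        # Toy Monte Carlo of a violation cascade in a clinical transfer process.
        # Step names and probabilities are invented, not the calibrated values.
        import random

        STEPS = ["check_id", "handover_notes", "update_chart", "hand_hygiene"]
        VIOLATION_PROB = {"check_id": 0.10, "handover_notes": 0.20,
                          "update_chart": 0.15, "hand_hygiene": 0.25}

        def simulate_transfer() -> bool:
            """Return True if the simulated transfer ends in an end-of-chain risk."""
            violations = [step for step in STEPS if random.random() < VIOLATION_PROB[step]]
            # Toy rule: risk materialises only when identification and hygiene
            # violations co-occur in the same transfer.
            return "check_id" in violations and "hand_hygiene" in violations

        random.seed(1)
        runs = 100_000
        risk_rate = sum(simulate_transfer() for _ in range(runs)) / runs
        print(f"estimated end-of-chain risk: {risk_rate:.1%}")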

  7. Process evaluation of discharge planning implementation in healthcare using normalization process theory.

    PubMed

    Nordmark, Sofi; Zingmark, Karin; Lindberg, Inger

    2016-04-27

    Discharge planning is a care process that aims to secure the transfer of care for the patient at the transition from home to hospital and back home. Information exchange and collaboration between care providers are essential, but deficits are common. A wide range of initiatives to improve the discharge planning process have been developed and implemented over the past three decades. However, there are still high rates of reported medical errors and adverse events related to failures in discharge planning. Using theoretical frameworks such as Normalization Process Theory (NPT) can support evaluations of complex interventions and processes in healthcare. The aim of this study was to explore the embedding and integration of the discharge planning process (DPP) from the perspective of registered nurses, district nurses and homecare organizers. The study design was explorative, using the NPT as a framework to explore the embedding and integration of the DPP. Data consisted of written documentation from workshops with staff, registered adverse events and system failures, a web-based survey, and individual interviews with staff. Using the NPT as a framework to explore the embedding and integration of discharge planning after 10 years in use showed that the staff had reached a consensus of opinion of what the process was (coherence) and how they evaluated the process (reflexive monitoring). However, they had not reached a consensus of opinion of who performed the process (cognitive participation) and how it was performed (collective action). This could be interpreted as meaning that the process had not become normalized in daily practice. The results show the necessity of examining how old practices are embedded in order to better understand the needs of new ones before developing and implementing new practices or supportive tools within healthcare, so that development aims are reached and implementation is sustainable. The NPT offers a generalizable framework for analysis, which can explain and shape the implementation process of old practices, before further development of new practices or supportive tools.

  8. A World Health Organization field trial assessing a proposed ICD-11 framework for classifying patient safety events.

    PubMed

    Forster, Alan J; Bernard, Burnand; Drösler, Saskia E; Gurevich, Yana; Harrison, James; Januel, Jean-Marie; Romano, Patrick S; Southern, Danielle A; Sundararajan, Vijaya; Quan, Hude; Vanderloo, Saskia E; Pincus, Harold A; Ghali, William A

    2017-08-01

    To assess the utility of the proposed World Health Organization (WHO)'s International Classification of Disease (ICD) framework for classifying patient safety events. Independent classification of 45 clinical vignettes using a web-based platform. The WHO's multi-disciplinary Quality and Safety Topic Advisory Group. The framework consists of three concepts: harm, cause and mode. We defined a concept as 'classifiable' if more than half of the raters could assign an ICD-11 code for the case. We evaluated reasons why cases were nonclassifiable using a qualitative approach. Harm was classifiable in 31 of 45 cases (69%). Of these, only 20 could be classified according to cause and mode. Classifiable cases were those in which a clear cause and effect relationship existed (e.g. medication administration error). Nonclassifiable cases were those without clear causal attribution (e.g. pressure ulcer). Of the 14 cases in which harm was not evident (31%), only 5 could be classified according to cause and mode and represented potential adverse events. Overall, nine cases (20%) were nonclassifiable using the three-part patient safety framework and contained significant ambiguity in the relationship between healthcare outcome and putative cause. The proposed framework enabled classification of the majority of patient safety events. Cases in which potentially harmful events did not cause harm were not classifiable; additional code categories within the ICD-11 are one proposal to address this concern. Cases with ambiguity in cause and effect relationship between healthcare processes and outcomes remain difficult to classify. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  9. Suppression of Story Character Goals during Reading

    ERIC Educational Resources Information Center

    Linderholm, Tracy; Gernsbacher, Morton Ann; van den Broek, Paul; Neninde, Lana; Robertson, Rachel R. W.; Sundermier, Brian

    2004-01-01

    The objective of this study was to determine how readers process narrative texts when the main character has multiple, and changing, goals. Readers must keep track of such goals to understand the causal relations between text events, an important process for comprehension. The structure building framework theory of reading proposes that readers…

  10. Item Memory, Context Memory and the Hippocampus: fMRI Evidence

    ERIC Educational Resources Information Center

    Rugg, Michael D.; Vilberg, Kaia L.; Mattson, Julia T.; Yu, Sarah S.; Johnson, Jeffrey D.; Suzuki, Maki

    2012-01-01

    Dual-process models of recognition memory distinguish between the retrieval of qualitative information about a prior event (recollection), and judgments of prior occurrence based on an acontextual sense of familiarity. fMRI studies investigating the neural correlates of memory encoding and retrieval conducted within the dual-process framework have…

  11. Defining a quantitative framework for evaluation and optimisation of the environmental impacts of mega-event projects.

    PubMed

    Parkes, Olga; Lettieri, Paola; Bogle, I David L

    2016-02-01

    This paper presents a novel quantitative methodology for the evaluation and optimisation of the environmental impacts of the whole life cycle of a mega-event project: construction and staging the event and post-event site redevelopment and operation. Within the proposed framework, a mathematical model has been developed that takes into account greenhouse gas (GHG) emissions resulting from use of transportation fuel, energy, water and construction materials used at all stages of the mega-event project. The model is applied to a case study - the London Olympic Park. Three potential post-event site design scenarios of the Park have been developed: Business as Usual (BAU), Commercial World (CW) and High Rise High Density (HRHD). A quantitative summary of results demonstrates that the GHG emissions associated with the actual event are almost negligible compared to those associated with the legacy phase. The highest share of emissions in the legacy phase is attributed to embodied emissions from construction materials (almost 50% for the BAU and HRHD scenarios) and emissions resulting from the transportation of residents, visitors and employees to/from the site (almost 60% for the CW scenario). The BAU scenario is the one with the lowest GHG emissions compared to the other scenarios. The results also demonstrate how post-event site design scenarios can be optimised to minimise the GHG emissions. The overall outcomes illustrate how the proposed framework can be used to support the decision-making process for mega-event project planning. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
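
    The aggregation the framework performs can be illustrated as a simple sum of emissions over phases and source categories; the phase names mirror the abstract, but all figures below are invented placeholders rather than results from the study.

        # Illustrative aggregation of GHG emissions over the phases of a mega-event
        # project (transport fuel, energy, water, construction materials).
        # All figures are invented placeholders, not the study's results.
        emissions_kt_co2e = {
            "construction": {"materials": 900.0, "energy": 120.0, "transport": 60.0, "water": 5.0},
            "event":        {"energy": 40.0, "transport": 80.0, "water": 2.0},
            "legacy":       {"materials": 1500.0, "energy": 600.0, "transport": 1100.0, "water": 20.0},
        }

        totals = {phase: sum(sources.values()) for phase, sources in emissions_kt_co2e.items()}
        grand_total = sum(totals.values())
        for phase, value in totals.items():
            print(f"{phase:>12}: {value:8.1f} ktCO2e ({value / grand_total:5.1%})")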

  12. Unsupervised Spatio-Temporal Data Mining Framework for Burned Area Mapping

    NASA Technical Reports Server (NTRS)

    Kumar, Vipin (Inventor); Boriah, Shyam (Inventor); Mithal, Varun (Inventor); Khandelwal, Ankush (Inventor)

    2016-01-01

    A method reduces processing time required to identify locations burned by fire by receiving a feature value for each pixel in an image, each pixel representing a sub-area of a location. Pixels are then grouped based on similarities of the feature values to form candidate burn events. For each candidate burn event, a probability that the candidate burn event is a true burn event is determined based on at least one further feature value for each pixel in the candidate burn event. Candidate burn events that have a probability below a threshold are removed from further consideration as burn events to produce a set of remaining candidate burn events.
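
    The two-stage idea described above can be sketched as follows; the connected-component grouping, the stand-in probability model, and the threshold are illustrative choices, not the patented method's actual algorithm.

        # Sketch: group pixels with similar feature values into candidate burn
        # events, then drop candidates whose estimated burn probability falls
        # below a threshold. Clustering method, probability model, and threshold
        # are illustrative choices only.
        import numpy as np
        from scipy import ndimage

        feature = np.random.default_rng(0).random((100, 100))      # per-pixel feature value
        burn_like = feature > 0.8                                   # crude similarity grouping
        labels, n_candidates = ndimage.label(burn_like)             # connected components = candidates

        second_feature = np.random.default_rng(1).random((100, 100))  # e.g. a further per-pixel feature
        threshold = 0.6
        confirmed = []
        for cand in range(1, n_candidates + 1):
            mask = labels == cand
            prob = second_feature[mask].mean()   # stand-in probability of a true burn event
            if prob >= threshold:
                confirmed.append(cand)
        print(f"{n_candidates} candidates, {len(confirmed)} kept after probability screening")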

  13. A Behavioral Framework for Managing Massive Airline Flight Disruptions through Crisis Management, Organization Development, and Organization Learning

    NASA Astrophysics Data System (ADS)

    Larsen, Tulinda Deegan

    In this study the researcher provides a behavioral framework for managing massive airline flight disruptions (MAFD) in the United States. Under conditions of MAFD, multiple flights are disrupted throughout the airline's route network, customer service is negatively affected, additional costs are created for airlines, and governments intervene. This study is different from other studies relating to MAFD that have focused on the operational, technical, economic, financial, and customer service impacts. The researcher argues that airlines could improve the management of events that led to MAFD by applying the principles of crisis management where the entire organization is mobilized, rather than one department, adapting organization development (OD) interventions to implement change and organization learning (OL) processes to create culture of innovation, resulting in sustainable improvement in customer service, cost reductions, and mitigation of government intervention. At the intersection of crisis management, OD, and OL, the researcher has developed a new conceptual framework that enhances the resiliency of individuals and organizations in responding to unexpected-yet-recurring crises (e.g., MAFD) that impact operations. The researcher has adapted and augmented Lalonde's framework for managing crises through OD interventions by including OL processes. The OD interventions, coupled with OL, provide a framework for airline leaders to manage more effectively events that result in MAFD with the goal of improving passenger satisfaction, reducing costs, and preventing further government intervention. Further research is warranted to apply this conceptual framework to unexpected-yet-recurring crises that affect operations in other industries.

  14. An integrative process model of leadership: examining loci, mechanisms, and event cycles.

    PubMed

    Eberly, Marion B; Johnson, Michael D; Hernandez, Morela; Avolio, Bruce J

    2013-09-01

    Utilizing the locus (source) and mechanism (transmission) of leadership framework (Hernandez, Eberly, Avolio, & Johnson, 2011), we propose and examine the application of an integrative process model of leadership to help determine the psychological interactive processes that constitute leadership. In particular, we identify the various dynamics involved in generating leadership processes by modeling how the loci and mechanisms interact through a series of leadership event cycles. We discuss the major implications of this model for advancing an integrative understanding of what constitutes leadership and its current and future impact on the field of psychological theory, research, and practice. © 2013 APA, all rights reserved.

  15. Transient Volcano Deformation Event Detection over Variable Spatial Scales in Alaska

    NASA Astrophysics Data System (ADS)

    Li, J. D.; Rude, C. M.; Gowanlock, M.; Herring, T.; Pankratius, V.

    2016-12-01

    Transient deformation events driven by volcanic activity can be monitored using increasingly dense networks of continuous Global Positioning System (GPS) ground stations. The wide spatial extent of GPS networks, the large number of GPS stations, and the spatially and temporally varying scale of deformation events result in the mixing of signals from multiple sources. Typical analysis then necessitates manual identification of times and regions of volcanic activity for further study and the careful tuning of algorithmic parameters to extract possible transient events. Here we present a computer-aided discovery system that facilitates the discovery of potential transient deformation events at volcanoes by providing a framework for selecting varying spatial regions of interest and for tuning the analysis parameters. This site specification step in the framework reduces the spatial mixing of signals from different volcanic sources before applying filters to remove interfering signals originating from other geophysical processes. We analyze GPS data recorded by the Plate Boundary Observatory network and volcanic activity logs from the Alaska Volcano Observatory to search for and characterize transient inflation events in Alaska. We find 3 transient inflation events between 2008 and 2015 at the Akutan, Westdahl, and Shishaldin volcanoes in the Aleutian Islands. The inflation event detected in the first half of 2008 at Akutan is validated by other studies, while the inflation events observed in early 2011 at Westdahl and in early 2013 at Shishaldin are previously unreported. Our analysis framework also incorporates modelling of the transient inflation events and enables a comparison of different magma chamber inversion models. Here, we also estimate the magma sources that best describe the deformation observed by the GPS stations at Akutan, Westdahl, and Shishaldin. We acknowledge support from NASA AIST-NNX15AG84G (PI: V. Pankratius).

  16. General overview of the disaster management framework in Cameroon.

    PubMed

    Bang, Henry Ngenyam

    2014-07-01

    Efficient and effective disaster management will prevent many hazardous events from becoming disasters. This paper constitutes the most comprehensive document on the natural disaster management framework of Cameroon. It reviews critically disaster management in Cameroon, examining the various legislative, institutional, and administrative frameworks that help to facilitate the process. Furthermore, it illuminates the vital role that disaster managers at the national, regional, and local level play to ease the process. Using empirical data, the study analyses the efficiency and effectiveness of the actions of disaster managers. Its findings reveal inadequate disaster management policies, poor coordination between disaster management institutions at the national level, the lack of trained disaster managers, a skewed disaster management system, and a top-down hierarchical structure within Cameroon's disaster management framework. By scrutinising the disaster management framework of the country, policy recommendations based on the research findings are made on the institutional and administrative frameworks. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.

  17. Toxicogenomics and cancer risk assessment: a framework for key event analysis and dose-response assessment for nongenotoxic carcinogens.

    PubMed

    Bercu, Joel P; Jolly, Robert A; Flagella, Kelly M; Baker, Thomas K; Romero, Pedro; Stevens, James L

    2010-12-01

    In order to determine a threshold for nongenotoxic carcinogens, the traditional risk assessment approach has been to identify a mode of action (MOA) with a nonlinear dose-response. The dose-response for one or more key event(s) linked to the MOA for carcinogenicity allows a point of departure (POD) to be selected from the most sensitive effect dose or no-effect dose. However, this can be challenging because multiple MOAs and key events may exist for carcinogenicity and oftentimes extensive research is required to elucidate the MOA. In the present study, a microarray analysis was conducted to determine if a POD could be identified following short-term oral rat exposure with two nongenotoxic rodent carcinogens, fenofibrate and methapyrilene, using a benchmark dose analysis of genes aggregated in Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways and Gene Ontology (GO) biological processes, which likely encompass key event(s) for carcinogenicity. The gene expression response for fenofibrate given to rats for 2 days was consistent with its MOA and known key events linked to PPARα activation. The temporal response from daily dosing with methapyrilene demonstrated biological complexity with waves of pathways/biological processes occurring over 1, 3, and 7 days; nonetheless, the benchmark dose values were consistent over time. When comparing the dose-response of toxicogenomic data to tumorigenesis or precursor events, the toxicogenomics POD was slightly below any effect level. Our results suggest that toxicogenomic analysis using short-term studies can be used to identify a threshold for nongenotoxic carcinogens based on evaluation of potential key event(s) which then can be used within a risk assessment framework. Copyright © 2010 Elsevier Inc. All rights reserved.

  18. EEG Recording and Online Signal Processing on Android: A Multiapp Framework for Brain-Computer Interfaces on Smartphone

    PubMed Central

    Debener, Stefan; Emkes, Reiner; Volkening, Nils; Fudickar, Sebastian; Bleichner, Martin G.

    2017-01-01

    Objective: Our aim was the development and validation of a modular signal processing and classification application enabling online electroencephalography (EEG) signal processing on off-the-shelf mobile Android devices. The software application SCALA (Signal ProCessing and CLassification on Android) supports a standardized communication interface to exchange information with external software and hardware. Approach: In order to implement a closed-loop brain-computer interface (BCI) on the smartphone, we used a multiapp framework, which integrates applications for stimulus presentation, data acquisition, data processing, classification, and delivery of feedback to the user. Main Results: We have implemented the open source signal processing application SCALA. We present timing test results supporting sufficient temporal precision of audio events. We also validate SCALA with a well-established auditory selective attention paradigm and report above chance level classification results for all participants. Regarding the 24-channel EEG signal quality, evaluation results confirm typical sound onset auditory evoked potentials as well as cognitive event-related potentials that differentiate between correct and incorrect task performance feedback. Significance: We present a fully smartphone-operated, modular closed-loop BCI system that can be combined with different EEG amplifiers and can easily implement other paradigms. PMID:29349070

  19. EEG Recording and Online Signal Processing on Android: A Multiapp Framework for Brain-Computer Interfaces on Smartphone.

    PubMed

    Blum, Sarah; Debener, Stefan; Emkes, Reiner; Volkening, Nils; Fudickar, Sebastian; Bleichner, Martin G

    2017-01-01

    Our aim was the development and validation of a modular signal processing and classification application enabling online electroencephalography (EEG) signal processing on off-the-shelf mobile Android devices. The software application SCALA (Signal ProCessing and CLassification on Android) supports a standardized communication interface to exchange information with external software and hardware. In order to implement a closed-loop brain-computer interface (BCI) on the smartphone, we used a multiapp framework, which integrates applications for stimulus presentation, data acquisition, data processing, classification, and delivery of feedback to the user. We have implemented the open source signal processing application SCALA. We present timing test results supporting sufficient temporal precision of audio events. We also validate SCALA with a well-established auditory selective attention paradigm and report above chance level classification results for all participants. Regarding the 24-channel EEG signal quality, evaluation results confirm typical sound onset auditory evoked potentials as well as cognitive event-related potentials that differentiate between correct and incorrect task performance feedback. We present a fully smartphone-operated, modular closed-loop BCI system that can be combined with different EEG amplifiers and can easily implement other paradigms.

  20. A Neurally Plausible Parallel Distributed Processing Model of Event-Related Potential Word Reading Data

    ERIC Educational Resources Information Center

    Laszlo, Sarah; Plaut, David C.

    2012-01-01

    The Parallel Distributed Processing (PDP) framework has significant potential for producing models of cognitive tasks that approximate how the brain performs the same tasks. To date, however, there has been relatively little contact between PDP modeling and data from cognitive neuroscience. In an attempt to advance the relationship between…

  1. Pattern-Directed Processing of Knowledge from Texts.

    ERIC Educational Resources Information Center

    Thorndyke, Perry W.

    A framework for viewing human text comprehension, memory, and recall is presented that assumes patterns of abstract conceptual relations are used to guide processing. These patterns consist of clusters of knowledge that encode prototypical co-occurrences of situations and events in narrative texts. The patterns are assumed to be a part of a…

  2. Reformulated pavement remaining service life framework.

    DOT National Transportation Integrated Search

    2013-11-01

    "Many important decisions are necessary in order to effectively provide and manage a pavement network. At the heart : of this process is the prediction of needed future construction events. One approach to providing a single numeric on : the conditio...

  3. Framework for assessing the capacity of a health ministry to conduct health policy processes--a case study from Tajikistan.

    PubMed

    Mirzoev, Tolib N; Green, Andrew; Van Kalliecharan, Ricky

    2015-01-01

    An adequate capacity of ministries of health (MOH) to develop and implement policies is essential. However, no frameworks were found assessing MOH capacity to conduct health policy processes within developing countries. This paper presents a conceptual framework for assessing MOH capacity to conduct policy processes based on a study from Tajikistan, a former Soviet republic where independence highlighted capacity challenges. The data collection for this qualitative study included in-depth interviews, document reviews and observations of policy events. The framework approach was used for analysis. The conceptual framework was informed by existing literature, guided the data collection and analysis, and was subsequently refined following insights from the study. The Tajik MOH capacity, while gradually improving, remains weak. There is poor recognition of wider contextual influences, ineffective leadership and governance as reflected in centralised decision-making, limited use of evidence, inadequate actors' participation and ineffective use of resources to conduct policy processes. However, the question is whether this is a reflection of a lack of MOH ability, evidence of a constraining environment, or both. The conceptual framework identifies five determinants of robust policy processes, each with specific capacity needs: policy context, MOH leadership and governance, involvement of policy actors, the role of evidence and effective resource use for policy processes. Three underlying considerations are important for applying the capacity to policy processes: the need for a clear focus, recognition of capacity levels and elements, and both ability and an enabling environment. The proposed framework can be used in assessing and strengthening the capacity of different policy actors. Copyright © 2013 John Wiley & Sons, Ltd.

  4. A multiscale, hierarchical model of pulse dynamics in arid-land ecosystems

    USGS Publications Warehouse

    Collins, Scott L.; Belnap, Jayne; Grimm, N. B.; Rudgers, J. A.; Dahm, Clifford N.; D'Odorico, P.; Litvak, M.; Natvig, D. O.; Peters, Douglas C.; Pockman, W. T.; Sinsabaugh, R. L.; Wolf, B. O.

    2014-01-01

    Ecological processes in arid lands are often described by the pulse-reserve paradigm, in which rain events drive biological activity until moisture is depleted, leaving a reserve. This paradigm is frequently applied to processes stimulated by one or a few precipitation events within a growing season. Here we expand the original framework in time and space and include other pulses that interact with rainfall. This new hierarchical pulse-dynamics framework integrates space and time through pulse-driven exchanges, interactions, transitions, and transfers that occur across individual to multiple pulses extending from micro to watershed scales. Climate change will likely alter the size, frequency, and intensity of precipitation pulses in the future, and arid-land ecosystems are known to be highly sensitive to climate variability. Thus, a more comprehensive understanding of arid-land pulse dynamics is needed to determine how these ecosystems will respond to, and be shaped by, increased climate variability.

  5. Tumor evolutionary directed graphs and the history of chronic lymphocytic leukemia.

    PubMed

    Wang, Jiguang; Khiabanian, Hossein; Rossi, Davide; Fabbri, Giulia; Gattei, Valter; Forconi, Francesco; Laurenti, Luca; Marasca, Roberto; Del Poeta, Giovanni; Foà, Robin; Pasqualucci, Laura; Gaidano, Gianluca; Rabadan, Raul

    2014-12-11

    Cancer is a clonal evolutionary process, caused by successive accumulation of genetic alterations providing milestones of tumor initiation, progression, dissemination, and/or resistance to certain therapeutic regimes. To unravel these milestones we propose a framework, tumor evolutionary directed graphs (TEDG), which is able to characterize the history of genetic alterations by integrating longitudinal and cross-sectional genomic data. We applied TEDG to a chronic lymphocytic leukemia (CLL) cohort of 70 patients spanning 12 years and show that: (a) the evolution of CLL follows a time-ordered process represented as a global flow in TEDG that proceeds from initiating events to late events; (b) there are two distinct and mutually exclusive evolutionary paths of CLL evolution; (c) higher fitness clones are present in later stages of the disease, indicating a progressive clonal replacement with more aggressive clones. Our results suggest that TEDG may constitute an effective framework to recapitulate the evolutionary history of tumors.

  6. Integration of the EventIndex with other ATLAS systems

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Cárdenas Zárate, S. E.; Gallas, E. J.; Prokoshin, F.

    2015-12-01

    The ATLAS EventIndex System, developed for use in LHC Run 2, is designed to index every processed event in ATLAS, replacing the TAG System used in Run 1. Its storage infrastructure, based on the Hadoop open-source software framework, necessitates revamping how information in this system relates to other ATLAS systems. It will store more indexes since the fundamental mechanisms for retrieving these indexes will be better integrated into all stages of data processing, allowing more events from later stages of processing to be indexed than was possible with the previous system. Connections with other systems (conditions database, monitoring) are fundamentally critical to assess dataset completeness, identify data duplication, and check data integrity, and also enhance access to information in the EventIndex through user and system interfaces. This paper gives an overview of the ATLAS systems involved, the relevant metadata, and describes the technologies we are deploying to complete these connections.

  7. Muon g-2 Reconstruction and Analysis Framework for the Muon Anomalous Precession Frequency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaw, Kim Siang

    The Muon g-2 experiment at Fermilab, with the aim to measure the muon anomalous magnetic moment to an unprecedented level of 140 ppb, has started beam and detector commissioning in Summer 2017. To deal with incoming data projected to be around tens of petabytes, a robust data reconstruction and analysis chain based on Fermilab's art event-processing framework is developed. Herein, I report the current status of the framework, together with its novel features such as multi-threaded algorithms for online data quality monitor (DQM) and fast-turnaround operation (nearline). Performance of the framework during the commissioning run is also discussed.

  8. Improvements of the ALICE HLT data transport framework for LHC Run 2

    NASA Astrophysics Data System (ADS)

    Rohr, David; Krzwicki, Mikolaj; Engel, Heiko; Lehrbach, Johannes; Lindenstruth, Volker; ALICE Collaboration

    2017-10-01

    The ALICE HLT uses a data transport framework based on the publisher-subscriber message principle, which transparently handles the communication between processing components over the network and between processing components on the same node via shared memory with a zero copy approach. We present an analysis of the performance in terms of maximum achievable data rates and event rates as well as processing capabilities during Run 1 and Run 2. Based on this analysis, we present new optimizations we have developed for ALICE in Run 2. These include support for asynchronous transport via ZeroMQ which enables loops in the reconstruction chain graph and which is used to ship QA histograms to DQM. We have added asynchronous processing capabilities in order to support long-running tasks besides the event-synchronous reconstruction tasks in normal HLT operation. These asynchronous components run in an isolated process such that the HLT as a whole is resilient even to fatal errors in these asynchronous components. In this way, we can ensure that new developments cannot break data taking. On top of that, we have tuned the processing chain to cope with the higher event and data rates expected from the new TPC readout electronics (RCU2) and we have improved the configuration procedure and the startup time in order to increase the time where ALICE can take physics data. We analyze the maximum achievable data processing rates taking into account processing capabilities of CPUs and GPUs, buffer sizes, network bandwidth, the incoming links from the detectors, and the outgoing links to data acquisition.
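
    A minimal publisher-subscriber illustration using pyzmq is sketched below to show the message pattern referred to above; it is not the ALICE HLT transport code, which is a C++ framework with shared-memory zero-copy transfers on the same node, and the endpoint and topic names are arbitrary.

        # Minimal PUB/SUB illustration with pyzmq; not the ALICE HLT implementation.
        import threading
        import time
        import zmq

        def publisher(ctx: zmq.Context) -> None:
            sock = ctx.socket(zmq.PUB)
            sock.bind("tcp://127.0.0.1:5556")
            time.sleep(0.2)                       # allow the subscriber to connect
            for i in range(3):
                sock.send_multipart([b"event", f"payload {i}".encode()])
            sock.close()

        def subscriber(ctx: zmq.Context) -> None:
            sock = ctx.socket(zmq.SUB)
            sock.connect("tcp://127.0.0.1:5556")
            sock.setsockopt(zmq.SUBSCRIBE, b"event")
            for _ in range(3):
                topic, payload = sock.recv_multipart()
                print(topic.decode(), payload.decode())
            sock.close()

        ctx = zmq.Context()
        t = threading.Thread(target=subscriber, args=(ctx,))
        t.start()
        publisher(ctx)
        t.join()
        ctx.term()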

  9. Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.

    PubMed

    Gomez, Christophe; Hartung, Niklas

    2018-01-01

    Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
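
    A minimal sketch of the Poisson process formalism mentioned above is shown below: emission times are drawn as a homogeneous Poisson process by summing exponential inter-event times. The emission rate and observation window are arbitrary illustrative values, not parameters from the chapter.

        # Emission times from a homogeneous Poisson process (illustrative values).
        import numpy as np

        rng = np.random.default_rng(42)
        rate_per_day = 0.05          # hypothetical emission intensity
        horizon_days = 365.0

        # Successive inter-event times of a Poisson process are exponential(rate).
        times, t = [], 0.0
        while True:
            t += rng.exponential(1.0 / rate_per_day)
            if t > horizon_days:
                break
            times.append(t)

        print(f"{len(times)} emission events in {horizon_days:.0f} days")
        print("first few emission times (days):", np.round(times[:5], 1))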

  10. Consequence Prioritization Process for Potential High Consequence Events (HCE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, Sarah G.

    2016-10-31

    This document describes the process for Consequence Prioritization, the first phase of the Consequence-Driven Cyber-Informed Engineering (CCE) framework. The primary goal of Consequence Prioritization is to identify potential disruptive events that would significantly inhibit an organization's ability to provide the critical services and functions deemed fundamental to its business mission. These disruptive events, defined as High Consequence Events (HCE), include both events that have occurred and events that could be realized through an attack on critical infrastructure owner assets. While other efforts have been initiated to identify and mitigate disruptive events at the national security level, such as Presidential Policy Directive 41 (PPD-41), this process is intended to be used by individual organizations to evaluate events that fall below the threshold of national security concern. Described another way, Consequence Prioritization considers threats greater than those addressable by standard cyber-hygiene and includes the consideration of events that go beyond a traditional continuity of operations (COOP) perspective. Finally, Consequence Prioritization is most successful when organizations adopt a multi-disciplinary approach, engaging both cyber security and engineering expertise, as in-depth engineering perspectives are required to recognize, characterize, and mitigate HCEs. Figure 1 provides a high-level overview of the prioritization process.

  11. On-line early fault detection and diagnosis of municipal solid waste incinerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao Jinsong; Huang Jianchao; Sun Wei

    A fault detection and diagnosis framework is proposed in this paper for early fault detection and diagnosis (FDD) of municipal solid waste incinerators (MSWIs) in order to improve the safety and continuity of production. In this framework, principal component analysis (PCA), a multivariate statistical technique, is used for detecting abnormal events, while rule-based reasoning performs the fault diagnosis and consequence prediction, and also generates recommendations for fault mitigation once an abnormal event is detected. A software package, SWIFT, is developed based on the proposed framework, and has been applied in an actual industrial MSWI. The application shows that automated real-time abnormal situation management (ASM) of the MSWI can be achieved using SWIFT, with an industrially acceptable low rate of misdiagnosis, leading to improved process continuity and environmental performance of the MSWI.
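
    The PCA-based detection step can be sketched as below using Hotelling's T-squared statistic on retained principal components; the training data, number of components, and alarm threshold are illustrative assumptions, not the SWIFT implementation.

        # PCA-based abnormal-event detection sketch (not the SWIFT implementation).
        import numpy as np

        rng = np.random.default_rng(0)
        normal = rng.normal(size=(1000, 6))                 # "normal operation" training data
        mean, std = normal.mean(axis=0), normal.std(axis=0)
        Z = (normal - mean) / std

        # Principal components from the covariance of the scaled training data.
        cov = np.cov(Z, rowvar=False)
        eigval, eigvec = np.linalg.eigh(cov)
        order = np.argsort(eigval)[::-1]
        k = 3
        P, lam = eigvec[:, order[:k]], eigval[order[:k]]

        def hotelling_t2(x: np.ndarray) -> float:
            scores = ((x - mean) / std) @ P
            return float(np.sum(scores**2 / lam))

        # 99th percentile of T2 on training data as a simple alarm limit.
        limit = np.percentile([hotelling_t2(x) for x in normal], 99)
        faulty_sample = normal[0] + np.array([0, 0, 5, 0, 0, 0])   # injected sensor shift
        print("alarm limit:", round(limit, 2), " faulty T2:", round(hotelling_t2(faulty_sample), 2))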

  12. A Coupled Earthquake-Tsunami Simulation Framework Applied to the Sumatra 2004 Event

    NASA Astrophysics Data System (ADS)

    Vater, Stefan; Bader, Michael; Behrens, Jörn; van Dinther, Ylona; Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Uphoff, Carsten; Wollherr, Stephanie; van Zelst, Iris

    2017-04-01

    Large earthquakes along subduction zone interfaces have generated destructive tsunamis near Chile in 1960, Sumatra in 2004, and northeast Japan in 2011. In order to better understand these extreme events, we have developed tools for physics-based, coupled earthquake-tsunami simulations. This simulation framework is applied to the 2004 Indian Ocean M 9.1-9.3 earthquake and tsunami, a devastating event that resulted in the loss of more than 230,000 lives. The earthquake rupture simulation is performed using an ADER discontinuous Galerkin discretization on an unstructured tetrahedral mesh with the software SeisSol. Advantages of this approach include accurate representation of complex fault and sea floor geometries and a parallelized and efficient workflow in high-performance computing environments. Accurate and efficient representation of the tsunami evolution and inundation at the coast is achieved with an adaptive mesh discretizing the shallow water equations with a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme. With the application of the framework to this historic event, we aim to better understand the involved mechanisms between the dynamic earthquake within the earth's crust, the resulting tsunami wave within the ocean, and the final coastal inundation process. Earthquake model results are constrained by GPS surface displacements and tsunami model results are compared with buoy and inundation data. This research is part of the ASCETE Project, "Advanced Simulation of Coupled Earthquake and Tsunami Events", funded by the Volkswagen Foundation.

  13. The Application of SNiPER to the JUNO Simulation

    NASA Astrophysics Data System (ADS)

    Lin, Tao; Zou, Jiaheng; Li, Weidong; Deng, Ziyan; Fang, Xiao; Cao, Guofu; Huang, Xingtao; You, Zhengyun; JUNO Collaboration

    2017-10-01

    The JUNO (Jiangmen Underground Neutrino Observatory) is a multipurpose neutrino experiment which is designed to determine neutrino mass hierarchy and precisely measure oscillation parameters. As one of the important systems, the JUNO offline software is being developed using the SNiPER software. In this proceeding, we focus on the requirements of JUNO simulation and present the working solution based on the SNiPER. The JUNO simulation framework is in charge of managing event data, detector geometries and materials, physics processes, simulation truth information etc. It glues physics generator, detector simulation and electronics simulation modules together to achieve a full simulation chain. In the implementation of the framework, many attractive characteristics of the SNiPER have been used, such as dynamic loading, flexible flow control, multiple event management and Python binding. Furthermore, additional efforts have been made to make both detector and electronics simulation flexible enough to accommodate and optimize different detector designs. For the Geant4-based detector simulation, each sub-detector component is implemented as a SNiPER tool which is a dynamically loadable and configurable plugin. So it is possible to select the detector configuration at runtime. The framework provides the event loop to drive the detector simulation and interacts with the Geant4 which is implemented as a passive service. All levels of user actions are wrapped into different customizable tools, so that user functions can be easily extended by just adding new tools. The electronics simulation has been implemented by following an event driven scheme. The SNiPER task component is used to simulate data processing steps in the electronics modules. The electronics and trigger are synchronized by triggered events containing possible physics signals. The JUNO simulation software has been released and is being used by the JUNO collaboration to do detector design optimization, event reconstruction algorithm development and physics sensitivity studies.

  14. The commodification process of extreme sports: the diffusion of the X-Games by ESPN

    Treesearch

    Chang Huh; Byoung Kwan Lee; Euidong Yoo

    2002-01-01

    The purpose of this study was to explore the commodification process of extreme sports. Specifically, this study investigates how the X-Games, as a sport event, have been spread among teenagers by ESPN in order to use extreme sports commercially. Diffusion theory was utilized as a theoretical framework to explain this process because the diffusion theory is a...

  15. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
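
    For reference, the categorical scores named above are commonly computed from a 2x2 contingency table of predicted versus observed events, as in the generic sketch below; the counts are invented and this is not the CAMEL code (Prediction Efficiency and RMSE, which apply to continuous time series, are not shown).

        # Categorical skill scores from a 2x2 contingency table (generic sketch).
        def skill_scores(hits: int, misses: int, false_alarms: int, correct_negatives: int) -> dict:
            n = hits + misses + false_alarms + correct_negatives
            pod = hits / (hits + misses)                               # probability of detection
            pofd = false_alarms / (false_alarms + correct_negatives)   # probability of false detection
            # Heidke Skill Score: accuracy relative to the number correct by chance.
            expected = ((hits + misses) * (hits + false_alarms)
                        + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
            hss = (hits + correct_negatives - expected) / (n - expected)
            return {"POD": pod, "POFD": pofd, "HSS": hss}

        print(skill_scores(hits=42, misses=8, false_alarms=12, correct_negatives=138))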

  16. A network of discrete events for the representation and analysis of diffusion dynamics.

    PubMed

    Pintus, Alberto M; Pazzona, Federico G; Demontis, Pierfranco; Suffritti, Giuseppe B

    2015-11-14

    We developed a coarse-grained description of the phenomenology of diffusive processes, in terms of a space of discrete events and its representation as a network. Once a proper classification of the discrete events underlying the diffusive process is carried out, their transition matrix is calculated on the basis of molecular dynamics data. This matrix can be represented as a directed, weighted network where nodes represent discrete events, and the weight of edges is given by the probability that one follows the other. The structure of this network reflects dynamical properties of the process of interest in such features as its modularity and the entropy rate of nodes. As an example of the applicability of this conceptual framework, we discuss here the physics of diffusion of small non-polar molecules in a microporous material, in terms of the structure of the corresponding network of events, and explain on this basis the diffusivity trends observed. A quantitative account of these trends is obtained by considering the contribution of the various events to the displacement autocorrelation function.
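
    The construction described above can be sketched as follows: estimate a transition matrix between discrete events from an observed event sequence and compute the entropy rate of each node. The event labels and sequence below are synthetic placeholders, not the events classified from molecular dynamics data.

        # Event transition matrix and per-node entropy from a synthetic sequence.
        import numpy as np

        events = ["hop", "rattle", "hop", "hop", "exit", "hop", "rattle", "rattle", "hop", "exit", "hop"]
        labels = sorted(set(events))
        index = {e: i for i, e in enumerate(labels)}

        counts = np.zeros((len(labels), len(labels)))
        for a, b in zip(events[:-1], events[1:]):      # count observed transitions a -> b
            counts[index[a], index[b]] += 1

        row_sums = counts.sum(axis=1, keepdims=True)
        P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

        # Entropy of each node's outgoing transition probabilities (bits).
        with np.errstate(divide="ignore", invalid="ignore"):
            node_entropy = -np.nansum(np.where(P > 0, P * np.log2(P), 0.0), axis=1)

        for label, h in zip(labels, node_entropy):
            print(f"{label:>7}: H = {h:.2f} bits")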

  17. Detecting spatial patterns of rivermouth processes using a geostatistical framework for near-real-time analysis

    USGS Publications Warehouse

    Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara

    2017-01-01

    This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real time analysis along with 2- and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
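
    The cross-validated error reported above can be illustrated with the sketch below, which estimates a 3-fold NRMSE for a simple inverse-distance-weighting interpolator standing in for kriging along the tow path; the data are synthetic and the interpolator is only a placeholder for the paper's method.

        # 3-fold cross-validated NRMSE for a stand-in spatial interpolator (IDW).
        import numpy as np

        rng = np.random.default_rng(3)
        xy = rng.uniform(0, 1000, size=(300, 2))                 # sample positions (m)
        z = np.sin(xy[:, 0] / 200) + 0.1 * rng.normal(size=300)  # e.g. a water-quality proxy

        def idw_predict(train_xy, train_z, query_xy, power=2.0):
            d = np.linalg.norm(train_xy[None, :, :] - query_xy[:, None, :], axis=2)
            w = 1.0 / np.maximum(d, 1e-9) ** power
            return (w * train_z).sum(axis=1) / w.sum(axis=1)

        folds = np.arange(len(z)) % 3
        errors = []
        for k in range(3):
            test = folds == k
            pred = idw_predict(xy[~test], z[~test], xy[test])
            errors.append(np.sqrt(np.mean((pred - z[test]) ** 2)))

        nrmse = np.mean(errors) / (z.max() - z.min())
        print(f"3-fold NRMSE: {nrmse:.1%}")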

  18. Eventogram: A Visual Representation of Main Events in Biomedical Signals.

    PubMed

    Elgendi, Mohamed

    2016-09-22

    Biomedical signals carry valuable physiological information and many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of main events within the processed signal. This paper attempts to provide an event-related framework to overcome the drawbacks of the traditional visualization methods and describe the main events within the biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to as the "eventogram" in this paper. The proposed method is based on two event-related moving averages that visualize the main time-domain events in the processed biomedical signals. The traditional visualization methods were unable to find dominant events in processed signals while the eventogram was able to visualize dominant events in signals in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting main events in different biomedical signals with a sensitivity and positive predictivity >95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.
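
    The two-moving-averages idea can be sketched as below: a short window tracks an event, a long window tracks the surrounding cycle, and samples where the short average exceeds the long one mark candidate events. The window lengths and the synthetic signal are illustrative assumptions, not the paper's tuned parameters.

        # Two event-related moving averages marking candidate events (illustrative).
        import numpy as np

        def moving_average(x: np.ndarray, window: int) -> np.ndarray:
            kernel = np.ones(window) / window
            return np.convolve(x, kernel, mode="same")

        fs = 250                                             # sampling rate (Hz)
        t = np.arange(0, 10, 1 / fs)
        signal = np.abs(np.sin(2 * np.pi * 1.2 * t)) ** 8    # crude quasi-periodic pulses
        signal += 0.05 * np.random.default_rng(0).normal(size=t.size)

        short = moving_average(signal, window=int(0.10 * fs))   # event-scale window (~100 ms)
        long_ = moving_average(signal, window=int(0.60 * fs))   # cycle-scale window (~600 ms)
        blocks = short > long_                                   # blocks of interest

        n_events = np.count_nonzero(np.diff(blocks.astype(int)) == 1)
        print(f"detected {n_events} candidate events in {t[-1]:.0f} s")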

  19. Developing a Three Processes Framework to Analyze Hydrologic Performance of Urban Stormwater Management in a Watershed Scale

    NASA Astrophysics Data System (ADS)

    Lyu, H.; Ni, G.; Sun, T.

    2016-12-01

    Urban stormwater management contributes to restoring the water cycle to a nearly natural state. Analyzing hydrologic performance at the watershed scale is a challenge, since the measures are of various sorts and scales and act in different processes. A three-process framework is developed to simplify the urban hydrologic process on the surface and to evaluate urban stormwater management. The three processes are source utilization, transfer regulation and terminal detention, by which stormwater is controlled in sequence or discharged. Methods for analyzing performance are based on the proportion of water controlled by each process, calculated using the USEPA Stormwater Management Model. A case study from Beijing is used to illustrate how performance varies under a set of design events of different return periods. This framework provides a method to assess urban stormwater management as a whole system, considering the interaction between measures, and to examine whether any process in an urban watershed is weak and should be improved. The results help in devising better solutions to urban water crises.

  20. SSEL-ADE: A semi-supervised ensemble learning framework for extracting adverse drug events from social media.

    PubMed

    Liu, Jing; Zhao, Songzheng; Wang, Gang

    2018-01-01

    With the development of Web 2.0 technology, social media websites have become lucrative but under-explored data sources for extracting adverse drug events (ADEs), which are a serious health problem. Besides ADE, other semantic relation types (e.g., drug indication and beneficial effect) could hold between the drug and adverse event mentions, making ADE relation extraction - distinguishing the ADE relationship from other relation types - necessary. However, conducting ADE relation extraction in the social media environment is not a trivial task because of the expertise-dependent, time-consuming and costly annotation process, and the high dimensionality of the feature space attributed to intrinsic characteristics of social media data. This study aims to develop a framework for ADE relation extraction using patient-generated content in social media with better performance than that delivered by previous efforts. To achieve the objective, a general semi-supervised ensemble learning framework, SSEL-ADE, was developed. The framework exploited various lexical, semantic, and syntactic features, and integrated ensemble learning and semi-supervised learning. A series of experiments were conducted to verify the effectiveness of the proposed framework. Empirical results demonstrate the effectiveness of each component of SSEL-ADE and reveal that our proposed framework outperforms most existing ADE relation extraction methods. SSEL-ADE can facilitate enhanced ADE relation extraction performance, thereby providing more reliable support for pharmacovigilance. Moreover, the proposed semi-supervised ensemble methods have the potential of being applied to effectively deal with other social media-based problems. Copyright © 2017 Elsevier B.V. All rights reserved.
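
    The general semi-supervised ensemble idea (an ensemble pseudo-labels unlabeled instances and confident agreements are absorbed into the training set) can be sketched as below; it uses synthetic numeric features rather than the lexical/semantic/syntactic features of SSEL-ADE, and the member models, rounds, and confidence threshold are arbitrary choices.

        # Generic semi-supervised ensemble (self-training) sketch, not SSEL-ADE itself.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
        labeled = np.zeros(len(y), dtype=bool)
        labeled[:100] = True                      # only 100 instances start out labeled

        members = [LogisticRegression(max_iter=1000), RandomForestClassifier(random_state=0)]
        for _ in range(3):                        # a few self-training rounds
            for m in members:
                m.fit(X[labeled], y[labeled])
            probs = np.mean([m.predict_proba(X[~labeled]) for m in members], axis=0)
            confident = probs.max(axis=1) > 0.95  # averaged ensemble confidence
            if not confident.any():
                break
            idx = np.where(~labeled)[0][confident]
            y[idx] = probs[confident].argmax(axis=1)   # pseudo-label and absorb into training set
            labeled[idx] = True

        print("labeled instances after self-training:", labeled.sum())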

  1. Attention - Control in the Frequentistic Processing of Multidimensional Event Streams.

    DTIC Science & Technology

    1980-07-01

    Human memory. Annual Review of Psychology, 1979, 30, 63-702. Craik , F. I. M., & Lockhart , R. S. Levels of processing : A framework for memory research...1979; Jacoby & Craik , 1979). Thus, the notions of memora- bility (or retrievability) and levels of processing are tied closely in the sense that the...differing levels and degrees of elaborateness (Jacoby & Craik , 1979). Decisions as to which attributes receive elaborated processing and how they are

  2. A generic multi-hazard and multi-risk framework and its application illustrated in a virtual city

    NASA Astrophysics Data System (ADS)

    Mignan, Arnaud; Euchner, Fabian; Wiemer, Stefan

    2013-04-01

    We present a generic framework to implement hazard correlations in multi-risk assessment strategies. We consider hazard interactions (process I), time-dependent vulnerability (process II) and time-dependent exposure (process III). Our approach is based on the Monte Carlo method to simulate a complex system, which is defined from assets exposed to a hazardous region. We generate 1-year time series, sampling from a stochastic set of events. Each time series corresponds to one risk scenario and the analysis of multiple time series allows for the probabilistic assessment of losses and for the recognition of more or less probable risk paths. Each sampled event is associated to a time of occurrence, a damage footprint and a loss footprint. The occurrence of an event depends on its rate, which is conditional on the occurrence of past events (process I, concept of correlation matrix). Damage depends on the hazard intensity and on the vulnerability of the asset, which is conditional on previous damage on that asset (process II). Losses are the product of damage and exposure value, this value being the original exposure minus previous losses (process III, no reconstruction considered). The Monte Carlo method allows for a straightforward implementation of uncertainties and for implementation of numerous interactions, which is otherwise challenging in an analytical multi-risk approach. We apply our framework to a synthetic data set, defined by a virtual city within a virtual region. This approach gives the opportunity to perform multi-risk analyses in a controlled environment while not requiring real data, which may be difficult to access or simply unavailable to the public. Based on the heuristic approach, we define a 100 by 100 km region where earthquakes, volcanic eruptions, fluvial floods, hurricanes and coastal floods can occur. All hazards are harmonized to a common format. We define a 20 by 20 km city, composed of 50,000 identical buildings with a fixed economic value. Vulnerability curves are defined in terms of mean damage ratio as a function of hazard intensity. All data are based on simple equations found in the literature and on other simplifications. We show the impact of earthquake-earthquake interaction and hurricane-storm surge coupling, as well as of time-dependent vulnerability and exposure, on aggregated loss curves. One main result is the emergence of low probability-high consequences (extreme) events when correlations are implemented. While the concept of a virtual city can suggest the theoretical benefits of multi-risk assessment for decision support, identifying its real-world practicality will require the study of real test sites.
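
    A toy version of one simulated 1-year time series, showing how a correlation (interaction) matrix, damage ratios, and shrinking exposure can be chained together; the rates, boosts, and damage ratios are invented for illustration, and the real framework samples full hazard and loss footprints rather than scalar events:

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_year(base_rates, interaction, vulnerability, exposure, steps=365):
            # base_rates: hazard -> expected events per year; interaction[a][b]: factor applied
            # to hazard b's rate after an event of hazard a (process I, correlation matrix).
            # vulnerability: hazard -> mean damage ratio per event (process II, simplified).
            # exposure: asset value; losses reduce it and are not rebuilt (process III).
            rates = dict(base_rates)
            losses = []
            for _ in range(steps):
                for hz in list(rates):
                    if rng.random() < rates[hz] / steps:          # one event of type hz
                        loss = vulnerability[hz] * exposure
                        exposure -= loss
                        losses.append((hz, loss))
                        for other, boost in interaction.get(hz, {}).items():
                            rates[other] *= boost                  # conditional rate update
            return losses, exposure

        losses, residual = simulate_year(
            base_rates={"quake": 0.2, "flood": 0.5},
            interaction={"quake": {"quake": 3.0}},                 # aftershock-like rate increase
            vulnerability={"quake": 0.05, "flood": 0.02},
            exposure=1.0e9)

    Repeating simulate_year many times and aggregating the sampled losses yields the kind of loss exceedance curves discussed in the abstract, with and without the interaction terms.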

  3. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.

  4. Consistent simulation of direct-photon production in hadron collisions including associated two-jet production

    NASA Astrophysics Data System (ADS)

    Odaka, Shigeru; Kurihara, Yoshimasa

    2016-05-01

    We have developed an event generator for direct-photon production in hadron collisions, including associated 2-jet production, in the framework of the GR@PPA event generator. The event generator consistently combines γ + 2-jet production processes with the lowest-order γ + jet and photon-radiation (fragmentation) processes from quantum chromodynamics (QCD) 2-jet production using a subtraction method. The generated events can be fed to general-purpose event generators to facilitate the addition of hadronization and decay simulations. Using the obtained event information, we can simulate photon isolation and hadron-jet reconstruction at the particle (hadron) level. The simulation reasonably reproduces measurement data obtained at the Large Hadron Collider (LHC) concerning not only the inclusive photon spectrum, but also the correlation between the photon and jet. The simulation implies that the contribution of the γ + 2-jet processes is very large, especially in low photon-pT ( ≲ 50 GeV) regions. Discrepancies observed at low pT, although marginal, may indicate the necessity for the consideration of further higher-order processes. An unambiguous particle-level definition of the photon-isolation condition for the signal events should be given explicitly in future measurements.

  5. Modeling veterans healthcare administration disclosure processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  6. Introducing concurrency in the Gaudi data processing framework

    NASA Astrophysics Data System (ADS)

    Clemencic, Marco; Hegner, Benedikt; Mato, Pere; Piparo, Danilo

    2014-06-01

    In the past, the increasing demands for HEP processing resources could be fulfilled by the ever increasing clock-frequencies and by distributing the work to more and more physical machines. Limitations in power consumption of both CPUs and entire data centres are bringing an end to this era of easy scalability. To get the most CPU performance per watt, future hardware will be characterised by less and less memory per processor, as well as thinner, more specialized and more numerous cores per die, and rather heterogeneous resources. To fully exploit the potential of the many cores, HEP data processing frameworks need to allow for parallel execution of reconstruction or simulation algorithms on several events simultaneously. We describe our experience in introducing concurrency related capabilities into Gaudi, a generic data processing software framework, which is currently being used by several HEP experiments, including the ATLAS and LHCb experiments at the LHC. After a description of the concurrent framework and the most relevant design choices driving its development, we describe the behaviour of the framework in a more realistic environment, using a subset of the real LHCb reconstruction workflow, and present our strategy and the tools used to validate the physics outcome of the parallel framework against the results of the present, purely sequential LHCb software. We then summarize the measurement of the code performance of the multithreaded application in terms of memory and CPU usage.
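
    As a schematic of inter-event parallelism only (not the Gaudi scheduler, whose task graph also parallelizes algorithms within one event), independent events can be handed to a task pool and processed concurrently; the reconstruct function and event layout below are placeholders:

        from concurrent.futures import ThreadPoolExecutor

        def reconstruct(event):
            # Placeholder for a chain of reconstruction algorithms run on one event.
            return {"id": event["id"], "tracks": len(event["hits"]) // 3}

        def process_events(events, max_workers=4):
            # Independent events are dispatched to a pool of worker threads; the pool size
            # bounds concurrency and hence the per-job memory footprint.
            with ThreadPoolExecutor(max_workers=max_workers) as pool:
                return list(pool.map(reconstruct, events))

        results = process_events([{"id": i, "hits": list(range(30))} for i in range(100)])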

  7. New agreement measures based on survival processes

    PubMed Central

    Guo, Ying; Li, Ruosha; Peng, Limin; Manatunga, Amita K.

    2013-01-01

    Summary: The need to assess agreement arises in many scenarios in biomedical sciences when measurements are taken by different methods on the same subjects. When the endpoints are survival outcomes, the study of agreement becomes more challenging given the special characteristics of time-to-event data. In this paper, we propose a new framework for assessing agreement based on survival processes that can be viewed as a natural representation of time-to-event outcomes. Our new agreement measure is formulated as the chance-corrected concordance between survival processes. It provides a new perspective for studying the relationship between correlated survival outcomes and offers an appealing interpretation as the agreement between survival times on the absolute distance scale. We provide a multivariate extension of the proposed agreement measure for multiple methods. Furthermore, the new framework enables a natural extension to evaluate time-dependent agreement structure. We develop nonparametric estimation of the proposed new agreement measures. Our estimators are shown to be strongly consistent and asymptotically normal. We evaluate the performance of the proposed estimators through simulation studies and then illustrate the methods using a prostate cancer data example. PMID:23844617

  8. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications towards the development of clinically-viable neuroprosthetics. PMID:27035820

  9. Evolving spatio-temporal data machines based on the NeuCube neuromorphic framework: Design methodology and selected applications.

    PubMed

    Kasabov, Nikola; Scott, Nathan Matthew; Tu, Enmei; Marks, Stefan; Sengupta, Neelava; Capecci, Elisa; Othman, Muhaini; Doborjeh, Maryam Gholami; Murli, Norhanifah; Hartono, Reggio; Espinosa-Ramos, Josafath Israel; Zhou, Lei; Alvi, Fahad Bashir; Wang, Grace; Taylor, Denise; Feigin, Valery; Gulyaev, Sergei; Mahmoud, Mahmoud; Hou, Zeng-Guang; Yang, Jie

    2016-06-01

    The paper describes a new type of evolving connectionist systems (ECOS) called evolving spatio-temporal data machines based on neuromorphic, brain-like information processing principles (eSTDM). These are multi-modular computer systems designed to deal with large and fast spatio/spectro temporal data using spiking neural networks (SNN) as major processing modules. ECOS and eSTDM in particular can learn incrementally from data streams, can include 'on the fly' new input variables, new output class labels or regression outputs, can continuously adapt their structure and functionality, and can be visualised and interpreted for new knowledge discovery and for a better understanding of the data and the processes that generated them. eSTDM can be used for early event prediction due to the ability of the SNN to spike early, before whole input vectors (they were trained on) are presented. A framework for building eSTDM, called NeuCube, is presented along with a design methodology for building eSTDM with it. The implementation of this framework in MATLAB, Java, and PyNN (Python) is presented. The latter facilitates the use of neuromorphic hardware platforms to run the eSTDM. Selected examples are given of eSTDM for pattern recognition and early event prediction on EEG data, fMRI data, multisensory seismic data, ecological data, climate data, and audio-visual data. Future directions are discussed, including extension of the NeuCube framework for building neurogenetic eSTDM and also new applications of eSTDM. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. The use of Ethics Decision-Making Frameworks by Canadian Ethics Consultants: A Qualitative Study.

    PubMed

    Kaposy, Chris; Brunger, Fern; Maddalena, Victor; Singleton, Richard

    2016-10-01

    In this study, Canadian healthcare ethics consultants describe their use of ethics decision-making frameworks. Our research finds that ethics consultants in Canada use multi-purpose ethics decision-making frameworks, as well as targeted frameworks that focus on reaching an ethical resolution to a particular healthcare issue, such as adverse event reporting, or difficult triage scenarios. Several interviewees mention the influence that the accreditation process in Canadian healthcare organizations has on the adoption and use of such frameworks. Some of the ethics consultants we interviewed also report on their reluctance to use these tools. Limited empirical work has been done previously on the use of ethics decision-making frameworks. This study begins to fill this gap in our understanding of the work of healthcare ethics consultants. © 2016 John Wiley & Sons Ltd.

  11. Defining a risk-informed framework for whole-of-government lessons learned: A Canadian perspective.

    PubMed

    Friesen, Shaye K; Kelsey, Shelley; Legere, J A Jim

    Lessons learned play an important role in emergency management (EM) and organizational agility. Virtually all aspects of EM can derive benefit from a lessons learned program. From major security events to exercises, exploiting and applying lessons learned and "best practices" is critical to organizational resilience and adaptiveness. A robust lessons learned process and methodology provides an evidence base with which to inform decisions, guide plans, strengthen mitigation strategies, and assist in developing tools for operations. The Canadian Safety and Security Program recently supported a project to define a comprehensive framework that would allow public safety and security partners to regularly share event response best practices, and prioritize recommendations originating from after action reviews. This framework consists of several inter-locking elements: a comprehensive literature review/environmental scan of international programs; a survey to collect data from end users and management; the development of a taxonomy for organizing and structuring information; a risk-informed methodology for selecting, prioritizing, and following through on recommendations; and standardized templates and tools for tracking recommendations and ensuring implementation. This article discusses the efforts of the project team, which provided "best practice" advice and analytical support to ensure that a systematic approach to lessons learned was taken by the federal community to improve prevention, preparedness, and response activities. It posits an approach by which one might design a systematic process for information sharing and event response coordination, an approach that will assist federal departments to institutionalize a cross-government lessons learned program.

  12. Managed traffic evacuation using distributed sensor processing

    NASA Astrophysics Data System (ADS)

    Ramuhalli, Pradeep; Biswas, Subir

    2005-05-01

    This paper presents an integrated sensor network and distributed event processing architecture for managed in-building traffic evacuation during natural and human-caused disasters, including earthquakes, fire and biological/chemical terrorist attacks. The proposed wireless sensor network protocols and distributed event processing mechanisms offer a new distributed paradigm for improving reliability in building evacuation and disaster management. The networking component of the system is constructed using distributed wireless sensors for measuring environmental parameters such as temperature, humidity, and detecting unusual events such as smoke, structural failures, vibration, biological/chemical or nuclear agents. Distributed event processing algorithms will be executed by these sensor nodes to detect the propagation pattern of the disaster and to measure the concentration and activity of human traffic in different parts of the building. Based on this information, dynamic evacuation decisions are taken for maximizing the evacuation speed and minimizing unwanted incidents such as human exposure to harmful agents and stampedes near exits. A set of audio-visual indicators and actuators are used for aiding the automated evacuation process. In this paper we develop integrated protocols, algorithms and their simulation models for the proposed sensor networking and the distributed event processing framework. Also, efficient harnessing of the individually low, but collectively massive, processing abilities of the sensor nodes is a powerful concept behind our proposed distributed event processing algorithms. Results obtained through simulation in this paper are used for a detailed characterization of the proposed evacuation management system and its associated algorithmic components.

  13. The Role of Interpersonal Relations in Healthcare Team Communication and Patient Safety: A Proposed Model of Interpersonal Process in Teamwork.

    PubMed

    Lee, Charlotte Tsz-Sum; Doran, Diane Marie

    2017-06-01

    Patient safety is compromised by medical errors and adverse events related to miscommunications among healthcare providers. Communication among healthcare providers is affected by human factors, such as interpersonal relations. Yet, discussions of interpersonal relations and communication are lacking in healthcare team literature. This paper proposes a theoretical framework that explains how interpersonal relations among healthcare team members affect communication and team performance, such as patient safety. We synthesized studies from health and social science disciplines to construct a theoretical framework that explicates the links among these constructs. From our synthesis, we identified two relevant theories: the framework on interpersonal processes based on the social relation model, and the theory of relational coordination. The former involves three steps: perception, evaluation, and feedback; and the latter captures relational communicative behavior. We propose that manifestations of provider relations are embedded in the third step of the framework on interpersonal processes: feedback. Thus, varying team-member relationships lead to varying collaborative behavior, which affects patient-safety outcomes via a change in team communication. The proposed framework offers new perspectives for understanding how workplace relations affect healthcare team performance. The framework can be used by nurses, administrators, and educators to improve patient safety, team communication, or to resolve conflicts.

  14. Automatic Classification of volcano-seismic events based on Deep Neural Networks.

    NASA Astrophysics Data System (ADS)

    Titos Luzón, M.; Bueno Rodriguez, A.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Seismic monitoring of active volcanoes is a popular remote sensing technique to detect seismic activity, often associated with energy exchanges between the volcano and the environment. As a result, seismographs register a wide range of volcano-seismic signals that reflect the nature and underlying physics of volcanic processes. Machine learning and signal processing techniques provide an appropriate framework to analyze such data. In this research, we propose a new classification framework for seismic events based on deep neural networks. Deep neural networks are composed of multiple processing layers and can discover intrinsic patterns from the data itself. Internal parameters can be initialized using a greedy unsupervised pre-training stage, leading to an efficient training of fully connected architectures. We aim to determine the robustness of these architectures as classifiers of seven different types of seismic events recorded at "Volcán de Fuego" (Colima, Mexico). Two deep neural networks with different pre-training strategies are studied: stacked denoising autoencoder and deep belief networks. Results are compared to existing machine learning algorithms (SVM, Random Forest, Multilayer Perceptron). We used 5 LPC coefficients over three non-overlapping segments as training features in order to characterize temporal evolution, avoid redundancy and encode the signal, regardless of its duration. Experimental results show that deep architectures can classify seismic events with higher accuracy than classical algorithms, attaining up to 92% recognition accuracy. Pre-training initialization helps these models to detect events that occur simultaneously in time (such as explosions and rockfalls), increases robustness against noisy inputs, and provides better generalization. These results demonstrate deep neural networks are robust classifiers, and can be deployed in real environments to monitor the seismicity of restless volcanoes.
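
    A sketch of the feature extraction step as described (5 LPC coefficients over three non-overlapping segments, giving a 15-dimensional vector per event); the LPC solver below uses the plain Yule-Walker equations and is illustrative, not the authors' code:

        import numpy as np

        def lpc_coefficients(x, order=5):
            # Autocorrelation-method LPC: solve the Yule-Walker equations for the
            # prediction coefficients of an order-p linear predictor.
            x = np.asarray(x, dtype=float)
            r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
            R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
            return np.linalg.solve(R, r[1:order + 1])

        def segment_features(signal, n_segments=3, order=5):
            # Split the event into non-overlapping segments and concatenate the per-segment
            # LPC coefficients into one fixed-length feature vector, regardless of duration.
            segs = np.array_split(np.asarray(signal, dtype=float), n_segments)
            return np.concatenate([lpc_coefficients(s, order) for s in segs])

    Vectors produced this way would then be fed to the deep networks or to one of the baseline classifiers (SVM, random forest, multilayer perceptron) for training and evaluation.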

  15. Integrated, systems metabolic picture of acetone-butanol-ethanol fermentation by Clostridium acetobutylicum.

    PubMed

    Liao, Chen; Seo, Seung-Oh; Celik, Venhar; Liu, Huaiwei; Kong, Wentao; Wang, Yi; Blaschek, Hans; Jin, Yong-Su; Lu, Ting

    2015-07-07

    Microbial metabolism involves complex, system-level processes implemented via the orchestration of metabolic reactions, gene regulation, and environmental cues. One canonical example of such processes is acetone-butanol-ethanol (ABE) fermentation by Clostridium acetobutylicum, during which cells convert carbon sources to organic acids that are later reassimilated to produce solvents as a strategy for cellular survival. The complexity and systems nature of the process have been largely underappreciated, rendering challenges in understanding and optimizing solvent production. Here, we present a system-level computational framework for ABE fermentation that combines metabolic reactions, gene regulation, and environmental cues. We developed the framework by decomposing the entire system into three modules, building each module separately, and then assembling them back into an integrated system. During the model construction, a bottom-up approach was used to link molecular events at the single-cell level into the events at the population level. The integrated model was able to successfully reproduce ABE fermentations of the WT C. acetobutylicum (ATCC 824), as well as its mutants, using data obtained from our own experiments and from literature. Furthermore, the model confers successful predictions of the fermentations with various network perturbations across metabolic, genetic, and environmental aspects. From foundation to applications, the framework advances our understanding of complex clostridial metabolism and physiology and also facilitates the development of systems engineering strategies for the production of advanced biofuels.

  16. Integrated, systems metabolic picture of acetone-butanol-ethanol fermentation by Clostridium acetobutylicum

    PubMed Central

    Liao, Chen; Seo, Seung-Oh; Celik, Venhar; Liu, Huaiwei; Kong, Wentao; Wang, Yi; Blaschek, Hans; Jin, Yong-Su; Lu, Ting

    2015-01-01

    Microbial metabolism involves complex, system-level processes implemented via the orchestration of metabolic reactions, gene regulation, and environmental cues. One canonical example of such processes is acetone-butanol-ethanol (ABE) fermentation by Clostridium acetobutylicum, during which cells convert carbon sources to organic acids that are later reassimilated to produce solvents as a strategy for cellular survival. The complexity and systems nature of the process have been largely underappreciated, rendering challenges in understanding and optimizing solvent production. Here, we present a system-level computational framework for ABE fermentation that combines metabolic reactions, gene regulation, and environmental cues. We developed the framework by decomposing the entire system into three modules, building each module separately, and then assembling them back into an integrated system. During the model construction, a bottom-up approach was used to link molecular events at the single-cell level into the events at the population level. The integrated model was able to successfully reproduce ABE fermentations of the WT C. acetobutylicum (ATCC 824), as well as its mutants, using data obtained from our own experiments and from literature. Furthermore, the model confers successful predictions of the fermentations with various network perturbations across metabolic, genetic, and environmental aspects. From foundation to applications, the framework advances our understanding of complex clostridial metabolism and physiology and also facilitates the development of systems engineering strategies for the production of advanced biofuels. PMID:26100881

  17. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    NASA Astrophysics Data System (ADS)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; it has a formalized framework that utilizes source information, in particular the spectral characteristics of events of interest; it is entirely model-based, i.e. it does not rely on a priori ground-truth information, which is particularly important for nuclear monitoring; and it does not rely on individualized signal detection thresholds - it is the network solution that matters.
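
    The record describes the network combination step only at a high level; a naive-Bayes style sketch of fusing per-station event probabilities into a network posterior (illustrative, assuming conditional independence across stations and a common prior, not the actual ProbDet formulation) is:

        import numpy as np

        def network_posterior(station_probs, prior=1e-3):
            # station_probs: per-station P(event | data_i), each computed against the same prior.
            # Stations reporting probabilities below the prior (non-detecting stations) pull the
            # network posterior down, so their information is not discarded.
            prior_logit = np.log(prior / (1 - prior))
            log_odds = prior_logit
            for p in station_probs:
                p = float(np.clip(p, 1e-6, 1 - 1e-6))
                log_odds += np.log(p / (1 - p)) - prior_logit
            return 1.0 / (1.0 + np.exp(-log_odds))

        # Three stations see the event clearly, two see nothing unusual.
        print(network_posterior([0.9, 0.8, 0.7, 0.001, 0.0005]))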

  18. Proteomics for Adverse Outcome Pathway Discovery using Human Kidney Cells?

    EPA Science Inventory

    An Adverse Outcome Pathway (AOP) is a conceptual framework that applies molecular-based data for use in risk assessment and regulatory decision support. AOP development is based on effects data of chemicals on biological processes (i.e., molecular initiating events, key intermedi...

  19. Multisensory Emplaced Learning: Resituating Situated Learning in a Moving World

    ERIC Educational Resources Information Center

    Fors, Vaike; Backstrom, Asa; Pink, Sarah

    2013-01-01

    This article outlines the implications of a theory of "sensory-emplaced learning" for understanding the interrelationships between the embodied and environmental in learning processes. Understanding learning as multisensory and contingent within everyday place-events, this framework analytically describes how people establish themselves as…

  20. Multisensory Emplaced Learning: Resituating Situated Learning in a Moving World

    ERIC Educational Resources Information Center

    Fors, Vaike; Backstrom, Asa; Pink, Sarah

    2013-01-01

    This article outlines the implications of a theory of "sensory-emplaced learning" for understanding the interrelationships between the embodied and environmental in learning processes. Understanding learning as multisensory and contingent within everyday place-events, this framework analytically describes how people establish themselves…

  1. Adapting Concepts from Systems Biology to Develop Systems Exposure Event Networks for Exposure Science Research

    EPA Science Inventory

    Systems exposure science has emerged from the traditional environmental exposure assessment framework and incorporates new concepts that link sources of human exposure to internal dose and metabolic processes. Because many human environmental studies are designed for retrospectiv...

  2. HOW CAN BIOLOGICALLY-BASED MODELING OF ARSENIC KINETICS AND DYNAMICS INFORM THE RISK ASSESSMENT PROCESS?

    EPA Science Inventory

    Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic met...

  3. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption relates to the likelihood values of input events, and the second concerns interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and, even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
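
    A minimal sketch of how imprecise likelihoods could be propagated through fault-tree gates using interval bounds; independence is assumed here, and the dependency-coefficient and fuzzy/evidence-theory machinery of the article is deliberately omitted:

        def and_gate(p_intervals):
            # AND gate: output probability is the product of the inputs, so the interval
            # bounds are the products of the corresponding lower and upper bounds.
            lo, hi = 1.0, 1.0
            for (a, b) in p_intervals:
                lo *= a
                hi *= b
            return (lo, hi)

        def or_gate(p_intervals):
            # OR gate: 1 - product of (1 - p); monotone in each input, so bounds follow
            # from plugging in all lower bounds and all upper bounds respectively.
            prod_lo, prod_hi = 1.0, 1.0
            for (a, b) in p_intervals:
                prod_lo *= (1 - a)
                prod_hi *= (1 - b)
            return (1 - prod_lo, 1 - prod_hi)

        # Two uncertain basic events feed an OR gate; its output and a third event feed an
        # AND gate. The interval width of the top event carries the input imprecision forward.
        top = and_gate([or_gate([(0.01, 0.03), (0.02, 0.05)]), (0.1, 0.2)])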

  4. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

    Geostationary Coastal and Air Pollution Events (GEO-CAPE) is a NASA decadal survey mission designed to provide surface reflectance at high spectral, spatial, and temporal resolutions from a geostationary orbit necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to the global and regional NOx emissions and improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from the GEO-CAPE OSSE framework for seven observation scenarios and three instrument systems.

  5. Constructing event trees for volcanic crises

    USGS Publications Warehouse

    Newhall, C.; Hoblitt, R.

    2002-01-01

    Event trees are useful frameworks for discussing probabilities of possible outcomes of volcanic unrest. Each branch of the tree leads from a necessary prior event to a more specific outcome, e.g., from an eruption to a pyroclastic flow. Where volcanic processes are poorly understood, probability estimates might be purely empirical - utilizing observations of past and current activity and an assumption that the future will mimic the past or follow a present trend. If processes are better understood, probabilities might be estimated from a theoretical model, either subjectively or by numerical simulations. Use of Bayes' theorem aids in the estimation of how fresh unrest raises (or lowers) the probabilities of eruptions. Use of event trees during volcanic crises can help volcanologists to critically review their analysis of hazard, and help officials and individuals to compare volcanic risks with more familiar risks. Trees also emphasize the inherently probabilistic nature of volcano forecasts, with multiple possible outcomes.
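
    The Bayes step mentioned above can be written out directly; the probabilities in the example are invented for illustration only:

        def updated_eruption_probability(prior, p_unrest_given_eruption, p_unrest_given_no_eruption):
            # Bayes' theorem: how observing fresh unrest raises (or lowers) the probability of
            # an eruption, given how likely that unrest is with and without an impending eruption.
            numer = p_unrest_given_eruption * prior
            denom = numer + p_unrest_given_no_eruption * (1 - prior)
            return numer / denom

        # Example: a 2% background probability rises to about 14% if the observed unrest is
        # eight times more likely before eruptions than otherwise.
        p = updated_eruption_probability(0.02, 0.8, 0.1)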

  6. A Biopsychosocial Formulation of Pain Communication

    ERIC Educational Resources Information Center

    Hadjistavropoulos, Thomas; Craig, Kenneth D.; Duck, Steve; Cano, Annmarie; Goubert, Liesbet; Jackson, Philip L.; Mogil, Jeffrey S.; Rainville, Pierre; Sullivan, Michael J. L.; de C. Williams, Amanda C.; Vervoort, Tine; Fitzgerald, Theresa Dever

    2011-01-01

    We present a detailed framework for understanding the numerous and complicated interactions among psychological and social determinants of pain through examination of the process of pain communication. The focus is on an improved understanding of immediate dyadic transactions during painful events in the context of broader social phenomena.…

  7. How Can Biologically-Based Modeling of Arsenic Kinetics and Dynamics Inform the Risk Assessment Process? -- ETD

    EPA Science Inventory

    Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic me...

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that dispatches different functionalities: (i) derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (ii) perform both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (iii) facilitate input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  9. Video-Based Analyses of Motivation and Interaction in Science Classrooms

    NASA Astrophysics Data System (ADS)

    Moeller Andersen, Hanne; Nielsen, Birgitte Lund

    2013-04-01

    An analytical framework for examining students' motivation was developed and used for analyses of video excerpts from science classrooms. The framework was developed in an iterative process involving theories on motivation and video excerpts from a 'motivational event' where students worked in groups. Subsequently, the framework was used for an analysis of students' motivation in the whole class situation. A cross-case analysis was carried out illustrating characteristics of students' motivation dependent on the context. This research showed that students' motivation to learn science is stimulated by a range of different factors, with autonomy, relatedness and belonging apparently being the main sources of motivation. The teacher's combined use of questions, uptake and high level evaluation was very important for students' learning processes and motivation, especially students' self-efficacy. By coding and analysing video excerpts from science classrooms, we were able to demonstrate that the analytical framework helped us gain new insights into the effect of teachers' communication and other elements on students' motivation.

  10. Modeling Real-Time Coordination of Distributed Expertise and Event Response in NASA Mission Control Center Operations

    NASA Astrophysics Data System (ADS)

    Onken, Jeffrey

    This dissertation introduces a multidisciplinary framework for the enabling of future research and analysis of alternatives for control centers for real-time operations of safety-critical systems. The multidisciplinary framework integrates functional and computational models that describe the dynamics in fundamental concepts of previously disparate engineering and psychology research disciplines, such as group performance and processes, supervisory control, situation awareness, events and delays, and expertise. The application in this dissertation is the real-time operations within the NASA Mission Control Center in Houston, TX. This dissertation operationalizes the framework into a model and simulation, which simulates the functional and computational models in the framework according to user-configured scenarios for a NASA human-spaceflight mission. The model and simulation generate data according to the effectiveness of the mission-control team in supporting the completion of mission objectives and detecting, isolating, and recovering from anomalies. Accompanying the multidisciplinary framework is a proof of concept, which demonstrates the feasibility of such a framework. The proof of concept demonstrates that variability occurs where expected based on the models. The proof of concept also demonstrates that the data generated from the model and simulation are useful for analyzing and comparing MCC configuration alternatives, because an investigator can give a diverse set of scenarios to the simulation and compare the outputs in detail to inform decisions about the effect of MCC configurations on mission operations performance.

  11. Software framework for the upcoming MMT Observatory primary mirror re-aluminization

    NASA Astrophysics Data System (ADS)

    Gibson, J. Duane; Clark, Dusty; Porter, Dallan

    2014-07-01

    Details of the software framework for the upcoming in-situ re-aluminization of the 6.5m MMT Observatory (MMTO) primary mirror are presented. This framework includes: 1) a centralized key-value store and data structure server for data exchange between software modules, 2) a newly developed hardware-software interface for faster data sampling and better hardware control, 3) automated control algorithms that are based upon empirical testing, modeling, and simulation of the aluminization process, 4) re-engineered graphical user interfaces (GUI's) that use state-of-the-art web technologies, and 5) redundant relational databases for data logging. Redesign of the software framework has several objectives: 1) automated process control to provide more consistent and uniform mirror coatings, 2) optional manual control of the aluminization process, 3) modular design to allow flexibility in process control and software implementation, 4) faster data sampling and logging rates to better characterize the approximately 100-second aluminization event, and 5) synchronized "real-time" web application GUI's to provide all users with exactly the same data. The framework has been implemented as four modules interconnected by a data store/server. The four modules are integrated into two Linux system services that start automatically at boot-time and remain running at all times. Performance of the software framework is assessed through extensive testing within 2.0 meter and smaller coating chambers at the Sunnyside Test Facility. The redesigned software framework helps ensure that a better performing and longer lasting coating will be achieved during the re-aluminization of the MMTO primary mirror.

  12. Temporal and Location Based RFID Event Data Management and Processing

    NASA Astrophysics Data System (ADS)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
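
    As a flavor of the declarative event-and-rule idea (the data model, rule, and names below are illustrative assumptions, not the authors' system), a temporal rule over timestamped, location-tagged readings could be expressed as:

        from dataclasses import dataclass
        from datetime import datetime, timedelta

        @dataclass
        class RfidReading:
            tag_id: str
            location: str
            timestamp: datetime

        def missing_object_rule(readings, tag_id, expected_location, within):
            # Raise a complex event if a tagged object has not been read at its expected
            # location within the given time window.
            cutoff = datetime.now() - within
            seen = any(r.tag_id == tag_id and r.location == expected_location
                       and r.timestamp >= cutoff for r in readings)
            return None if seen else {"event": "MISSING_OBJECT", "tag": tag_id,
                                      "location": expected_location}

        readings = [RfidReading("T42", "dock_door", datetime.now())]
        print(missing_object_rule(readings, "T42", "dock_door", timedelta(minutes=10)))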

  13. Automatic optical detection and classification of marine animals around MHK converters using machine vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunton, Steven

    Optical systems provide valuable information for evaluating interactions and associations between organisms and MHK energy converters and for capturing potentially rare encounters between marine organisms and MHK devices. The deluge of optical data from cabled monitoring packages makes expert review time-consuming and expensive. We propose algorithms and a processing framework to automatically extract events of interest from underwater video. The open-source software framework consists of background subtraction, filtering, feature extraction and hierarchical classification algorithms. This principal classification pipeline was validated on real-world data collected with an experimental underwater monitoring package. An event detection rate of 100% was achieved using robust principal components analysis (RPCA), Fourier feature extraction and a support vector machine (SVM) binary classifier. The detected events were then further classified into more complex classes – algae | invertebrate | vertebrate, one species | multiple species of fish, and interest rank. Greater than 80% accuracy was achieved using a combination of machine learning techniques.
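
    A compressed sketch of the detection stage only, substituting OpenCV's MOG2 background subtractor and a simple foreground-area trigger for the RPCA, Fourier-feature, and SVM stages of the published pipeline; the path and threshold values are placeholders:

        import cv2
        import numpy as np

        def detect_foreground_events(video_path, min_area=500):
            # Flag frames where the foreground mask is large enough to suggest an animal
            # has entered the field of view; returns (frame_index, foreground_area) pairs.
            cap = cv2.VideoCapture(video_path)
            subtractor = cv2.createBackgroundSubtractorMOG2()
            events, frame_idx = [], 0
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                mask = subtractor.apply(frame)
                area = int(np.count_nonzero(mask > 200))
                if area > min_area:
                    events.append((frame_idx, area))
                frame_idx += 1
            cap.release()
            return events

    In the published framework, features extracted from the flagged regions would then pass to a hierarchy of classifiers (event/no-event, then algae/invertebrate/vertebrate, and so on).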

  14. On Cognition, Structured Sequence Processing, and Adaptive Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Petersson, Karl Magnus

    2008-11-01

    Cognitive neuroscience approaches the brain as a cognitive system: a system that functionally is conceptualized in terms of information processing. We outline some aspects of this concept and consider a physical system to be an information processing device when a subclass of its physical states can be viewed as representational/cognitive and transitions between these can be conceptualized as a process operating on these states by implementing operations on the corresponding representational structures. We identify a generic and fundamental problem in cognition: sequentially organized structured processing. Structured sequence processing provides the brain, in an essential sense, with its processing logic. In an approach addressing this problem, we illustrate how to integrate levels of analysis within a framework of adaptive dynamical systems. We note that the dynamical system framework lends itself to a description of asynchronous event-driven devices, which is likely to be important in cognition because the brain appears to be an asynchronous processing system. We use the human language faculty and natural language processing as a concrete example throughout.

  15. A Walk down Memory Lane: On the Relationship between Autobiographical Memories and Outdoor Activities

    ERIC Educational Resources Information Center

    Gibson, Joe; Nicholas, Jude

    2018-01-01

    This article highlights a theoretical and practical framework for integrating the neuropsychological concept of autobiographical memory with the experiential learning that takes place in the outdoors. Autobiographical memories, our recollections of specific, personal events, are constructed through a personal narrative process; the way we choose…

  16. Effects of chlorpyrifos and TCP on human kidney cells using toxicity testing and proteomics

    EPA Science Inventory

    An Adverse Outcome Pathway (AOP) is a conceptual framework to apply molecular pathway-based data for use in risk assessment and regulatory decision support. The development of AOPs requires data on the effects of chemicals on biological processes (i.e., molecular initiating event...

  17. Disaster Metrics: A Comprehensive Framework for Disaster Evaluation Typologies.

    PubMed

    Wong, Diana F; Spencer, Caroline; Boyd, Lee; Burkle, Frederick M; Archer, Frank

    2017-10-01

    Introduction: The frequency of disasters is increasing around the world with more people being at risk. There is a moral imperative to improve the way in which disaster evaluations are undertaken and reported with the aim of reducing preventable mortality and morbidity in future events. Disasters are complex events and undertaking disaster evaluations is a specialized area of study at an international level. Hypothesis/Problem: While some frameworks have been developed to support consistent disaster research and evaluation, they lack validation, consistent terminology, and standards for reporting across the different phases of a disaster. There is yet to be an agreed, comprehensive framework to structure disaster evaluation typologies. The aim of this paper is to outline an evolving comprehensive framework for disaster evaluation typologies. It is anticipated that this new framework will facilitate an agreement on identifying, structuring, and relating the various evaluations found in the disaster setting with a view to better understand the process, outcomes, and impacts of the effectiveness and efficiency of interventions. Research was undertaken in two phases: (1) a scoping literature review (peer-reviewed and "grey literature") was undertaken to identify current evaluation frameworks and typologies used in the disaster setting; and (2) a structure was developed that included the range of typologies identified in Phase One and suggests possible relationships in the disaster setting. No core, unifying framework to structure disaster evaluation and research was identified in the literature. The authors propose a "Comprehensive Framework for Disaster Evaluation Typologies" that identifies, structures, and suggests relationships for the various typologies detected. The proposed Comprehensive Framework for Disaster Evaluation Typologies outlines the different typologies of disaster evaluations that were identified in this study and brings them together into a single framework. This unique, unifying framework has relevance at an international level and is expected to benefit the disaster, humanitarian, and development sectors. The next step is to undertake a validation process that will include international leaders with experience in evaluation, in general, and disasters specifically. This work promotes an environment for constructive dialogue on evaluations in the disaster setting to strengthen the evidence base for interventions across the disaster spectrum. It remains a work in progress. Wong DF, Spencer C, Boyd L, Burkle FM Jr., Archer F. Disaster metrics: a comprehensive framework for disaster evaluation typologies. Prehosp Disaster Med. 2017;32(5):501-514.

  18. Seismic Search Engine: A distributed database for mining large scale seismic data

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Vaidya, S.; Kuzma, H. A.

    2009-12-01

    The International Monitoring System (IMS) of the CTBTO collects terabytes worth of seismic measurements from many receiver stations situated around the earth with the goal of detecting underground nuclear testing events and distinguishing them from other benign, but more common events such as earthquakes and mine blasts. The International Data Center (IDC) processes and analyzes these measurements, as they are collected by the IMS, to summarize event detections in daily bulletins. Thereafter, the data measurements are archived into a large format database. Our proposed Seismic Search Engine (SSE) will facilitate a framework for data exploration of the seismic database as well as the development of seismic data mining algorithms. Analogous to GenBank, the annotated genetic sequence database maintained by NIH, through SSE, we intend to provide public access to seismic data and a set of processing and analysis tools, along with community-generated annotations and statistical models to help interpret the data. SSE will implement queries as user-defined functions composed from standard tools and models. Each query is compiled and executed over the database internally before reporting results back to the user. Since queries are expressed with standard tools and models, users can easily reproduce published results within this framework for peer-review and making metric comparisons. As an illustration, an example query is “what are the best receiver stations in East Asia for detecting events in the Middle East?” Evaluating this query involves listing all receiver stations in East Asia, characterizing known seismic events in that region, and constructing a profile for each receiver station to determine how effective its measurements are at predicting each event. The results of this query can be used to help prioritize how data is collected, identify defective instruments, and guide future sensor placements.

  19. WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, S; Kessler, M; Litzenberg, D

    2015-06-15

    Purpose: Quality assurance is an essential task in radiotherapy that often requires many manual tasks. We investigate the use of an event driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in house developed subscription-publication service, EventNet, was added to the Aria OIS to be a message broker for critical events occurring in the OIS and software agents. Software agents operate without user intervention and perform critical QA steps. The results of the QA are documented and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for plan revision QA and Plan 2nd check agents. The agents pulled the plan data, executed the prescribed QA, stored the results and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the Treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event driven framework has inverted the data chase in our radiotherapy QA process. Rather than have dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from Varian Medical Systems. Other quality projects involving her effort are funded by Blue Cross Blue Shield of Michigan, Breast Cancer Research Foundation, and the NIH.
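
    The essence of the event-driven pattern described here (a message broker plus subscribing agents) can be captured in a few lines; the EventBroker class and event names below are stand-ins for illustration and are not the EventNet API:

        from collections import defaultdict

        class EventBroker:
            # Minimal publish/subscribe broker: agents subscribe to event types and are
            # called whenever a matching event is published.
            def __init__(self):
                self.subscribers = defaultdict(list)

            def subscribe(self, event_type, handler):
                self.subscribers[event_type].append(handler)

            def publish(self, event_type, payload):
                for handler in self.subscribers[event_type]:
                    handler(payload)

        broker = EventBroker()

        def plan_second_check_agent(plan):
            # A QA agent triggered by plan approval; its result is published as a new event
            # that downstream subscribers (e.g., a physicist notification filter) consume.
            result = {"plan_id": plan["id"], "passed": plan.get("dose_ok", True)}
            broker.publish("PLAN_2ND_CHECK_DONE", result)

        broker.subscribe("PLAN_APPROVED", plan_second_check_agent)
        broker.subscribe("PLAN_2ND_CHECK_DONE", lambda r: print("notify physicist:", r))
        broker.publish("PLAN_APPROVED", {"id": "P123", "dose_ok": True})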

  20. Vision-Based Finger Detection, Tracking, and Event Identification Techniques for Multi-Touch Sensing and Display Systems

    PubMed Central

    Chen, Yen-Lin; Liang, Wen-Yew; Chiang, Chuan-Yen; Hsieh, Tung-Ju; Lee, Da-Cheng; Yuan, Shyan-Ming; Chang, Yang-Lang

    2011-01-01

    This study presents efficient vision-based finger detection, tracking, and event identification techniques and a low-cost hardware framework for multi-touch sensing and display applications. The proposed approach uses a fast bright-blob segmentation process based on automatic multilevel histogram thresholding to extract the pixels of touch blobs obtained from scattered infrared light captured by a video camera. The advantage of this automatic multilevel thresholding approach is its robustness and adaptability when dealing with various ambient lighting conditions and spurious infrared noise. To extract the connected components of these touch blobs, a connected-component analysis procedure is applied to the bright pixels acquired by the previous stage. After extracting the touch blobs from each of the captured image frames, a blob tracking and event recognition process analyzes the spatial and temporal information of these touch blobs from consecutive frames to determine the possible touch events and actions performed by users. This process also refines the detection results and corrects for errors and occlusions caused by noise during the blob extraction process. The proposed blob tracking and touch event recognition process includes two phases. First, the blob tracking phase establishes the motion correspondence of blobs in succeeding frames by analyzing their spatial and temporal features. The touch event recognition process can then identify meaningful touch events based on the motion information of touch blobs, such as finger moving, rotating, pressing, hovering, and clicking actions. Experimental results demonstrate that the proposed vision-based finger detection, tracking, and event identification system is feasible and effective for multi-touch sensing applications in various operational environments and conditions. PMID:22163990
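
    A simplified sketch of the bright-blob extraction stage, assuming NumPy and SciPy are available; the paper's automatic multilevel histogram thresholding is replaced here by a plain statistics-based threshold, so this only illustrates the segmentation-plus-connected-components idea on a synthetic frame.

      # Threshold a synthetic infrared frame, label connected components, and keep
      # blob centroids above a minimum area. Thresholding rule and sizes are
      # illustrative assumptions.
      import numpy as np
      from scipy import ndimage

      def extract_touch_blobs(frame, min_area=20):
          """Return (row, col, area) for each bright blob in the frame."""
          threshold = frame.mean() + 3 * frame.std()   # stand-in for multilevel thresholding
          labels, n = ndimage.label(frame > threshold)
          blobs = []
          for i in range(1, n + 1):
              mask = labels == i
              area = int(mask.sum())
              if area >= min_area:                     # drop spurious noise specks
                  r, c = ndimage.center_of_mass(mask)
                  blobs.append((round(r, 1), round(c, 1), area))
          return blobs

      rng = np.random.default_rng(0)
      frame = rng.normal(10, 2, size=(120, 160))       # dark background
      frame[30:40, 50:60] += 100                       # two bright "touches"
      frame[80:95, 100:115] += 100
      print(extract_touch_blobs(frame))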

  1. Big data and high-performance analytics in structural health monitoring for bridge management

    NASA Astrophysics Data System (ADS)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources into a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple functionally related data sets to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, NoSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. It enables the computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics provide insights that help bridge owners address problems faster.

  2. A multi-scale ensemble-based framework for forecasting compound coastal-riverine flooding: The Hackensack-Passaic watershed and Newark Bay

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Ramaswamy, V.; Wang, Y.; Georgas, N.; Blumberg, A.; Pullen, J.

    2017-12-01

    Estuarine regions can experience compound impacts from coastal storm surge and riverine flooding. The challenges in forecasting flooding in such areas are multi-faceted due to uncertainties associated with meteorological drivers and interactions between hydrological and coastal processes. The objective of this work is to evaluate how uncertainties from meteorological predictions propagate through an ensemble-based flood prediction framework and translate into uncertainties in simulated inundation extents. A multi-scale framework, consisting of hydrologic, coastal and hydrodynamic models, was used to simulate two extreme flood events at the confluence of the Passaic and Hackensack rivers and Newark Bay. The events were Hurricane Irene (2011), a combination of inland flooding and coastal storm surge, and Hurricane Sandy (2012) where coastal storm surge was the dominant component. The hydrodynamic component of the framework was first forced with measured streamflow and ocean water level data to establish baseline inundation extents with the best available forcing data. The coastal and hydrologic models were then forced with meteorological predictions from 21 ensemble members of the Global Ensemble Forecast System (GEFS) to retrospectively represent potential future conditions up to 96 hours prior to the events. Inundation extents produced by the hydrodynamic model, forced with the 95th percentile of the ensemble-based coastal and hydrologic boundary conditions, were in good agreement with baseline conditions for both events. The USGS reanalysis of Hurricane Sandy inundation extents was encapsulated between the 50th and 95th percentile of the forecasted inundation extents, and that of Hurricane Irene was similar but with caveats associated with data availability and reliability. This work highlights the importance of accounting for meteorological uncertainty to represent a range of possible future inundation extents at high resolution (∼m).
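
    A small sketch of how ensemble forcing can be collapsed into percentile boundary conditions, as done above with the 21 GEFS-driven members; the member values here are synthetic stand-ins for the hydrologic and coastal model outputs.

      # Compute median and 95th-percentile water-level boundary conditions across
      # a synthetic 21-member, 96-hour ensemble.
      import numpy as np

      n_members, n_hours = 21, 96
      rng = np.random.default_rng(42)
      base = 1.0 + 2.0 * np.sin(np.linspace(0, np.pi, n_hours))     # storm-like signal (m)
      spread = np.linspace(0.1, 0.8, n_hours)                       # growing forecast spread
      members = base + rng.normal(0.0, 1.0, size=(n_members, n_hours)) * spread

      p50 = np.percentile(members, 50, axis=0)   # median boundary condition
      p95 = np.percentile(members, 95, axis=0)   # upper-bound boundary condition
      print("peak p50 = %.2f m, peak p95 = %.2f m" % (p50.max(), p95.max()))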

  3. Flood AI: An Intelligent System for Discovery and Communication of Disaster Knowledge

    NASA Astrophysics Data System (ADS)

    Demir, I.; Sermet, M. Y.

    2017-12-01

    Communities are not immune to extreme events or natural disasters that can lead to large-scale consequences for the nation and the public. Improving resilience to better prepare for, plan for, recover from, and adapt to disasters is critical to reducing the impacts of extreme events. The National Research Council (NRC) report discusses how to increase resilience to extreme events through a vision of a resilient nation in the year 2030. The report highlights the importance of data and information, identifies gaps and knowledge challenges that need to be addressed, and suggests that every individual should have access to risk and vulnerability information to make their communities more resilient. This project presents an intelligent system for flooding, Flood AI, to improve societal preparedness by providing a knowledge engine that uses voice recognition, artificial intelligence, and natural language processing based on a generalized ontology for disasters with a primary focus on flooding. The knowledge engine utilizes the flood ontology and concepts to connect user input to relevant knowledge discovery channels on flooding through a data acquisition and processing framework that draws on environmental observations, forecast models, and knowledge bases. Communication channels of the framework include web-based systems, agent-based chat bots, smartphone applications, automated web workflows, and smart home devices, opening up knowledge discovery for flooding to many unique use cases.

  4. How Does Distinctive Processing Reduce False Recall?

    PubMed Central

    Hunt, R. Reed; Smith, Rebekah E.; Dunlap, Kathryn R.

    2011-01-01

    False memories arising from associatively related lists are a robust phenomenon that resists many efforts to prevent it. However, a few variables have been shown to reduce this form of false memory. Explanations for how the reduction is accomplished have focused on either output monitoring processes or constraints on access, but neither idea alone is sufficient to explain extant data. Our research was driven by a framework that distinguishes item-based and event-based distinctive processing to account for the effects of different variables on both correct recall of study list items and false recall. We report the results of three experiments examining the effect of a deep orienting task and the effect of visual presentation of study items, both of which have been shown to reduce false recall. The experiments replicate those previous findings and add important new information about the effect of the variables on a recall test that eliminates the need for monitoring. The results clearly indicate that both post-access monitoring and constraints on access contribute to reductions in false memories. The results also showed that the manipulations of study modality and orienting task had different effects on correct and false recall, a pattern that was predicted by the item-based/event-based distinctive processing framework. PMID:22003267

  5. How Does Distinctive Processing Reduce False Recall?

    PubMed

    Hunt, R Reed; Smith, Rebekah E; Dunlap, Kathryn R

    2011-11-01

    False memories arising from associatively related lists are a robust phenomenon that resists many efforts to prevent it. However, a few variables have been shown to reduce this form of false memory. Explanations for how the reduction is accomplished have focused on either output monitoring processes or constraints on access, but neither idea alone is sufficient to explain extant data. Our research was driven by a framework that distinguishes item-based and event-based distinctive processing to account for the effects of different variables on both correct recall of study list items and false recall. We report the results of three experiments examining the effect of a deep orienting task and the effect of visual presentation of study items, both of which have been shown to reduce false recall. The experiments replicate those previous findings and add important new information about the effect of the variables on a recall test that eliminates the need for monitoring. The results clearly indicate that both post-access monitoring and constraints on access contribute to reductions in false memories. The results also showed that the manipulations of study modality and orienting task had different effects on correct and false recall, a pattern that was predicted by the item-based/event-based distinctive processing framework.

  6. POLARIS: Agent-based modeling framework development and implementation for integrated travel demand and network and operations simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auld, Joshua; Hope, Michael; Ley, Hubert

    This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model built with it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, an interprocess exchange engine, and a memory allocator, as well as a number of ancillary utilities: a visualization library, a database IO library, and a scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows several aspects of the transportation system that are typically handled by separate stand-alone software applications to be modeled in a high-performance and extensible manner. Integrating models such as dynamic traffic assignment and disaggregate demand models has been a long-standing issue for transportation modelers; the integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process that handles all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events, such as accidents, congestion and weather events, show the potential of the system.
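
    POLARIS itself is a parallel C++ kit, but the scheduling idea behind its discrete event engine can be sketched in a few lines: events are (time, action) pairs kept in a priority queue and executed in timestamp order. Everything below is illustrative rather than POLARIS code.

      # Minimal single-threaded discrete-event engine using a heap as the
      # future-event list.
      import heapq
      import itertools

      class DiscreteEventEngine:
          def __init__(self):
              self._queue = []
              self._counter = itertools.count()   # tie-breaker for equal timestamps
              self.now = 0.0

          def schedule(self, delay, action, *args):
              heapq.heappush(self._queue, (self.now + delay, next(self._counter), action, args))

          def run(self, until=float("inf")):
              while self._queue and self._queue[0][0] <= until:
                  self.now, _, action, args = heapq.heappop(self._queue)
                  action(*args)

      engine = DiscreteEventEngine()

      def vehicle_departs(vehicle_id):
          print(f"t={engine.now:5.1f}  vehicle {vehicle_id} departs")
          engine.schedule(12.5, vehicle_arrives, vehicle_id)

      def vehicle_arrives(vehicle_id):
          print(f"t={engine.now:5.1f}  vehicle {vehicle_id} arrives")

      for vid in range(3):
          engine.schedule(vid * 2.0, vehicle_departs, vid)
      engine.run(until=60.0)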

  7. A psychosocial risk assessment and management framework to enhance response to CBRN terrorism threats and attacks.

    PubMed

    Lemyre, Louise; Clément, Mélanie; Corneil, Wayne; Craig, Lorraine; Boutette, Paul; Tyshenko, Michael; Karyakina, Nataliya; Clarke, Robert; Krewski, Daniel

    2005-01-01

    Evidence in the disaster mental health literature indicates that psychosocial consequences of terrorism are a critical component of chemical, biological, radiological, and nuclear (CBRN) events, both at the clinical level and the normal behavioral and emotional levels. Planning for such psychosocial aspects should be an integral part of emergency preparedness. As Canada and other countries build the capacity to prevent, mitigate, and manage CBRN threats and events, it is important to recognize the range of social, psychological, emotional, spiritual, behavioral, and cognitive factors that may affect victims and their families, communities, children, the elderly, responders, decision makers, and others at all phases of terrorism, from threat to post-impact recovery. A structured process to assist CBRN emergency planners, decision makers, and responders in identifying psychosocial risks, vulnerable populations, resources, and interventions at various phases of a CBRN event to limit negative psychosocial impacts and promote resilience and adaptive responses is the essence of our psychosocial risk assessment and management (P-RAM) framework. This article presents the evidence base and conceptual underpinnings of the framework, the principles underlying its design, its key elements, and its use in the development of decision tools for responders, planners, decision makers, and the general public to better assess and manage psychosocial aspects of CBRN threats or attacks.

  8. Not the last word: dissemination strategies for patient-centred research in nursing.

    PubMed

    Hagan, Teresa L; Schmidt, Karen; Ackison, Guyanna R; Murphy, Megan; Jones, Jennifer R

    2017-08-01

    Research results hold value for many stakeholders including researchers, patient populations, advocacy organizations, and community groups. The aim of this study is to describe our research team's systematic process to designing a dissemination strategy for a completed research study. We organized a dissemination event to feed the results of our study to participants and stakeholders and collect feedback regarding our study. We applied the Agency for Healthcare Research and Quality's dissemination framework to guide the development of the event and collected participant feedback during the event. We describe our dissemination strategy along with attendees' feedback and suggestions for our research as an example of a way to design a patient- and community-focused dissemination. We explain the details of our dissemination strategy including (a) our process of reporting a large research study into a stakeholder event, (b) stakeholder feedback collected at the event, and (c) the translation of feedback into our research team's research. We also describe challenges encountered during the dissemination process and ways to handle issues such as logistics, funding, and staff. This analysis provides key insights and practical advice for researchers looking for innovative ways to disseminate their findings within the lay and scientific communities.

  9. Shaping Social Activity by Incentivizing Users

    PubMed Central

    Farajtabar, Mehrdad; Du, Nan; Rodriguez, Manuel Gomez; Valera, Isabel; Zha, Hongyuan; Song, Le

    2015-01-01

    Events in an online social network can be categorized roughly into endogenous events, where users just respond to the actions of their neighbors within the network, or exogenous events, where users take actions due to drives external to the network. How much external drive should be provided to each user, such that the network activity can be steered towards a target state? In this paper, we model social events using multivariate Hawkes processes, which can capture both endogenous and exogenous event intensities, and derive a time dependent linear relation between the intensity of exogenous events and the overall network activity. Exploiting this connection, we develop a convex optimization framework for determining the required level of external drive in order for the network to reach a desired activity level. We experimented with event data gathered from Twitter, and show that our method can steer the activity of the network more accurately than alternatives. PMID:26005312
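
    The stationary (long-run) version of the relation exploited above can be sketched directly: for a multivariate Hawkes process with branching matrix B (spectral radius below one), the mean intensity satisfies lam = mu + B lam, so the exogenous drive needed to reach a target activity level is mu = (I - B) lam_target. The numbers below are made up, and the paper itself works with the full time-dependent relation inside a convex optimization.

      # Steady-state exogenous-drive calculation for a 3-user toy network.
      import numpy as np

      B = np.array([[0.1, 0.3, 0.0],
                    [0.2, 0.1, 0.2],
                    [0.0, 0.4, 0.1]])          # user-to-user excitation (assumed)
      assert max(abs(np.linalg.eigvals(B))) < 1, "process must be stable"

      lam_target = np.array([2.0, 1.5, 3.0])   # desired events per unit time per user
      mu = (np.eye(3) - B) @ lam_target        # required exogenous (incentivized) drive
      print("required exogenous intensities:", np.round(mu, 3))
      print("implied stationary activity:   ",
            np.round(np.linalg.solve(np.eye(3) - B, mu), 3))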

  10. Tinnitus What and Where: An Ecological Framework

    PubMed Central

    Searchfield, Grant D.

    2014-01-01

    Tinnitus is an interaction of the environment, cognition, and plasticity. The connection between the individual with tinnitus and their world seldom receives attention in neurophysiological research. As well as changes in cell excitability, an individual’s culture and beliefs, and work and social environs may all influence how tinnitus is perceived. In this review, an ecological framework for current neurophysiological evidence is considered. The model defines tinnitus as the perception of an auditory object in the absence of an acoustic event. It is hypothesized that, following deafferentation, adaptive feature extraction, schema, and semantic object formation processes lead to tinnitus in a manner predicted by Adaptation Level Theory (1, 2). Evidence from physiological studies is compared to the tenets of the proposed ecological model. The consideration of diverse events within an ecological context may unite seemingly disparate neurophysiological models. PMID:25566177

  11. Quaternary geophysical framework of the northeastern North Carolina coastal system

    USGS Publications Warehouse

    Thieler, E.R.; Foster, D.S.; Mallinson, D.M.; Himmelstoss, E.A.; McNinch, J.E.; List, J.H.; Hammar-Klose, E.S.

    2013-01-01

    The northeastern North Carolina coastal system, from False Cape, Virginia, to Cape Lookout, North Carolina, has been studied by a cooperative research program that mapped the Quaternary geologic framework of the estuaries, barrier islands, and inner continental shelf. This information provides a basis to understand the linkage between geologic framework, physical processes, and coastal evolution at time scales from storm events to millennia. The study area attracts significant tourism to its parks and beaches, contains a number of coastal communities, and supports a local fishing industry, all of which are impacted by coastal change. Knowledge derived from this research program can be used to mitigate hazards and facilitate effective management of this dynamic coastal system.

  12. A Framework for Collaborative Review of Candidate Events in High Data Rate Streams: the V-Fastr Experiment as a Case Study

    NASA Astrophysics Data System (ADS)

    Hart, Andrew F.; Cinquini, Luca; Khudikyan, Shakeh E.; Thompson, David R.; Mattmann, Chris A.; Wagstaff, Kiri; Lazio, Joseph; Jones, Dayton

    2015-01-01

    “Fast radio transients” are defined here as bright millisecond pulses of radio-frequency energy. These short-duration pulses can be produced by known objects such as pulsars or potentially by more exotic objects such as evaporating black holes. The identification and verification of such an event would be of great scientific value. This is one major goal of the Very Long Baseline Array (VLBA) Fast Transient Experiment (V-FASTR), a software-based detection system installed at the VLBA. V-FASTR uses a “commensal” (piggy-back) approach, analyzing all array data continually during routine VLBA observations and identifying candidate fast transient events. Raw data can be stored from a buffer memory, which enables a comprehensive off-line analysis. This is invaluable for validating the astrophysical origin of any detection. Candidates discovered by the automatic system must be reviewed each day by analysts to identify any promising signals that warrant a more in-depth investigation. To support the timely analysis of fast transient detection candidates by V-FASTR scientists, we have developed a metadata-driven, collaborative candidate review framework. The framework consists of a software pipeline for metadata processing composed of both open source software components and project-specific code written expressly to extract and catalog metadata from the incoming V-FASTR data products, and a web-based data portal that facilitates browsing and inspection of the available metadata for candidate events extracted from the VLBA radio data.

  13. Towards Hybrid Online On-Demand Querying of Realtime Data with Stateful Complex Event Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    Emerging Big Data applications in areas like e-commerce and the energy industry require both online and on-demand queries to be performed over vast and fast data arriving as streams. These present novel challenges to Big Data management systems. Complex Event Processing (CEP) is recognized as a high-performance online query scheme that in particular addresses the velocity aspect of the 3-V’s of Big Data. However, traditional CEP systems do not consider data variety and lack the capability to embed ad hoc queries over the volume of data streams. In this paper, we propose H2O, a stateful complex event processing framework, to support hybrid online and on-demand queries over realtime data. We propose a semantically enriched event and query model to address data variety. A formal query algebra is developed to precisely capture the stateful and containment semantics of online and on-demand queries. We describe techniques for achieving interactive query processing over realtime data, featuring efficient online querying, dynamic stream data persistence and on-demand access. The system architecture is presented and the current implementation status reported.
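
    A minimal sketch of a stateful online CEP operator in the spirit described above, assuming a smart-meter event stream; H2O's actual query algebra and API are not reproduced, and all thresholds and field names are invented.

      # Per-meter sliding-window aggregation that emits an alert event when
      # consumption inside the window exceeds a threshold.
      from collections import defaultdict, deque

      WINDOW = 60 * 15          # 15-minute window (seconds)
      THRESHOLD = 5.0           # kWh within the window

      windows = defaultdict(deque)   # meter_id -> deque of (timestamp, kwh)

      def on_reading(meter_id, timestamp, kwh):
          """Online operator: update state, return an alert event or None."""
          w = windows[meter_id]
          w.append((timestamp, kwh))
          while w and w[0][0] < timestamp - WINDOW:      # expire old readings
              w.popleft()
          total = sum(v for _, v in w)
          if total > THRESHOLD:
              return {"type": "HighUsage", "meter": meter_id,
                      "window_kwh": round(total, 2), "at": timestamp}
          return None

      stream = [("m1", t, 0.9) for t in range(0, 900, 120)]   # 0.9 kWh every 2 min
      for meter, ts, kwh in stream:
          alert = on_reading(meter, ts, kwh)
          if alert:
              print(alert)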

  14. Preparedness and response to terrorism: a framework for public health action.

    PubMed

    Gofin, Rosa

    2005-02-01

    Political group violence in the form of terrorist actions has become a reality worldwide, affecting the health and economies of populations. As a consequence, preparedness and response are becoming an integral part of public health action. Risk appraisal, preservation of human and civil rights and communications within and between countries are all issues to be considered in the process. The combination of the natural history of terrorist actions and the epidemiological triangle model has been adapted in this paper and suggested as a comprehensive approach for preparedness and action. It covers preparedness (pre-event), response (event) and the consequences (post-event) of a terrorist attack. It takes into account the human factor, vectors and environment involved in each one of the phases. Terrorism is a global reality with varying underlying causes, manifestations and impact on the health of the public. Preparedness, response and rehabilitation are an integral part of public health action. Consideration of the pre-event, event and post-event phases in terrorist actions, together with the human factor, vector/agent and environment in each of these phases, offers a framework for public health preparedness, response and rehabilitation. Planning should consider risk assessment, risk communication, inter-sectorial cooperation, enactment of laws and regulations which consider protection of the public's health and civil liberties. Allocation of resources would need to make allowance for maintenance and development of ongoing public health activities.

  15. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

    …and full-scale experimental verifications towards ground-satellite quantum key distribution… Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. DISSERTATION. Jeffrey D. Morris. Presented to the Faculty, Department of Systems…

  16. Online data monitoring in the LHCb experiment

    NASA Astrophysics Data System (ADS)

    Callot, O.; Cherukuwada, S.; Frank, M.; Gaspar, C.; Graziani, G.; Herwijnen, E. v.; Jost, B.; Neufeld, N.; P-Altarelli, M.; Somogyi, P.; Stoica, R.

    2008-07-01

    The High Level Trigger and Data Acquisition system selects about 2 kHz of events out of the 40 MHz of beam crossings. The selected events are sent to permanent storage for subsequent analysis. In order to ensure the quality of the collected data, identify possible malfunctions of the detector and perform calibration and alignment checks, a small fraction of the accepted events is sent to a monitoring farm, which consists of a few tens of general purpose processors. This contribution introduces the architecture of the data stream splitting mechanism from the storage system to the monitoring farm, where the raw data are analyzed by dedicated tasks. It describes the collaborating software components that are all based on the Gaudi event processing framework.

  17. Researching Writing Events: Using Mediated Discourse Analysis to Explore How Students Write Together

    ERIC Educational Resources Information Center

    Rish, Ryan M.

    2015-01-01

    This article addresses how mediated discourse theory and related analytical tools can be used to explore how students write together. Considered within a sociocultural framework that conceptualises writing as involving distributed, mediated and dialogic processes of invention, this article presents an investigation of how three high school…

  18. Mixed Resilience: A Study of Multiethnic Mexican American Stress and Coping in Arizona

    ERIC Educational Resources Information Center

    Jackson, Kelly F.; Wolven, Thera; Aguilera, Kimberly

    2013-01-01

    Guided by an integrated framework of resilience, this in-depth qualitative study examined the major stressors persons of multiethnic Mexican American heritage encountered in their social environments related to their mixed identity and the resilience enhancing processes they employed to cope with these stressors. Life-story event narratives were…

  19. On the Early Left-Anterior Negativity (ELAN) in Syntax Studies

    ERIC Educational Resources Information Center

    Steinhauer, Karsten; Drury, John E.

    2012-01-01

    Within the framework of Friederici's (2002) neurocognitive model of sentence processing, the early left anterior negativity (ELAN) in event-related potentials (ERPs) has been claimed to be a brain marker of syntactic first-pass parsing. As ELAN components seem to be exclusively elicited by word category violations (phrase structure violations),…

  20. Automatic processing of induced events in the geothermal reservoirs Landau and Insheim, Germany

    NASA Astrophysics Data System (ADS)

    Olbert, Kai; Küperkoch, Ludger; Meier, Thomas

    2016-04-01

    Induced events can pose a risk to local infrastructure that needs to be understood and evaluated. They also represent a chance to learn more about reservoir behavior and characteristics. Prior to the analysis, the waveform data must be processed consistently and accurately to avoid erroneous interpretations. In the framework of the MAGS2 project, an automatic off-line event detection and a phase onset time determination algorithm are applied to induced seismic events in the geothermal systems in Landau and Insheim, Germany. The off-line detection algorithm is based on cross-correlating continuous data from the local seismic network with master events. It distinguishes between events from different reservoirs and within individual reservoirs, and it provides location and magnitude estimates. Data from 2007 to 2014 are processed and compared with detections from the SeisComp3 cross-correlation detector and an STA/LTA detector. The detected events are analyzed for spatial and temporal clustering, and the number of events is compared to existing detection lists. The automatic phase picking algorithm combines an AR-AIC approach with a cost function to find precise P1- and S1-phase onset times, which can be used for localization and tomography studies. 800 induced events are processed, yielding 5000 P1- and 6000 S1-picks. The phase onset times show high precision, with mean residuals relative to manual phase picks of 0 s (P1) to 0.04 s (S1) and standard deviations below ±0.05 s. The automatic picks are then used to relocate a selected number of events to evaluate the influence on location precision.
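
    A toy, single-channel sketch of the master-event detection idea: slide a template along continuous data, compute the normalized cross-correlation, and declare a detection above a threshold. Real detectors such as the one described here operate on multi-channel, filtered network data; the waveform, noise level, and threshold below are synthetic assumptions.

      import numpy as np

      def detect(continuous, template, threshold=0.8):
          """Return sample indices where normalized correlation exceeds threshold."""
          n = len(template)
          t = (template - template.mean()) / (template.std() * n)
          hits = []
          for i in range(len(continuous) - n + 1):
              seg = continuous[i:i + n]
              std = seg.std()
              if std == 0:
                  continue
              cc = np.sum(t * (seg - seg.mean()) / std)   # normalized cross-correlation
              if cc > threshold:
                  hits.append(i)
          return hits

      rng = np.random.default_rng(1)
      template = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)
      noise = rng.normal(0, 0.3, 5000)
      noise[1200:1400] += template          # hidden induced event
      noise[3600:3800] += 0.7 * template    # weaker repeat
      print("detections near samples:", detect(noise, template))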

  1. A Matched Field Processing Framework for Coherent Detection Over Local and Regional Networks

    DTIC Science & Technology

    2011-06-01

    Northern Finland Seismological Network, FN) and to the University of Helsinki for data from the VRF and HEF stations (part of the Finnish seismograph ...shows the results of classification with the FK measurement. Most of the events are incorrectly assigned to one particular mine (K2 – Rasvumchorr...generalization of the single-phase matched field processing method that encodes the full structure of the entire wavefield? What would this

  2. The Offline Software Framework of the NA61/SHINE Experiment

    NASA Astrophysics Data System (ADS)

    Sipos, Roland; Laszlo, Andras; Marcinek, Antoni; Paul, Tom; Szuba, Marek; Unger, Michael; Veberic, Darko; Wyszynski, Oskar

    2012-12-01

    NA61/SHINE (SHINE = SPS Heavy Ion and Neutrino Experiment) is an experiment at the CERN SPS using the upgraded NA49 hadron spectrometer. Among its physics goals are precise hadron production measurements for improving calculations of the neutrino beam flux in the T2K neutrino oscillation experiment as well as for more reliable simulations of cosmic-ray air showers. Moreover, p+p, p+Pb and nucleus+nucleus collisions will be studied extensively to allow for a study of the properties of the onset of deconfinement and a search for the critical point of strongly interacting matter. Currently NA61/SHINE uses the old NA49 software framework for reconstruction, simulation and data analysis. The core of this legacy framework was developed in the early 1990s. It is written in different programming and scripting languages (C, pgi-Fortran, shell) and provides several concurrent data formats for the event data model, which also includes obsolete parts. In this contribution we will introduce the new software framework, called Shine, that is written in C++ and designed to comprise three principal parts: a collection of processing modules which can be assembled and sequenced by the user via XML files, an event data model which contains all simulation and reconstruction information based on STL and ROOT streaming, and a detector description which provides data on the configuration and state of the experiment. To assure a quick migration to the Shine framework, wrappers were introduced that allow legacy code to run as modules in the new framework, and we will present first results on the cross-validation of the two frameworks.
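
    The "modules assembled and sequenced via XML" idea can be illustrated with a toy runner: a configuration lists module names in order, each module exposes a process(event) method, and the runner instantiates and sequences them per event. The XML schema and module names below are invented; Shine itself is a C++ framework and far richer than this sketch.

      import xml.etree.ElementTree as ET

      CONFIG = """
      <sequence>
        <module name="ClusterFinder"/>
        <module name="TrackFitter"/>
        <module name="VertexFinder"/>
      </sequence>
      """

      class ClusterFinder:
          def process(self, event):
              event["clusters"] = len(event["raw_hits"]) // 3

      class TrackFitter:
          def process(self, event):
              event["tracks"] = max(0, event["clusters"] - 1)

      class VertexFinder:
          def process(self, event):
              event["vertex_found"] = event["tracks"] >= 2

      REGISTRY = {cls.__name__: cls for cls in (ClusterFinder, TrackFitter, VertexFinder)}

      def build_sequence(xml_text):
          root = ET.fromstring(xml_text)
          return [REGISTRY[m.get("name")]() for m in root.findall("module")]

      modules = build_sequence(CONFIG)
      event = {"raw_hits": list(range(17))}
      for module in modules:        # run the configured module sequence on one event
          module.process(event)
      print(event)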

  3. Tracking Progress in Improving Diagnosis: A Framework for Defining Undesirable Diagnostic Events.

    PubMed

    Olson, Andrew P J; Graber, Mark L; Singh, Hardeep

    2018-01-29

    Diagnostic error is a prevalent, harmful, and costly phenomenon. Multiple national health care and governmental organizations have recently identified the need to improve diagnostic safety as a high priority. A major barrier, however, is the lack of standardized, reliable methods for measuring diagnostic safety. Given the absence of reliable and valid measures for diagnostic errors, we need methods to help establish some type of baseline diagnostic performance across health systems, as well as to enable researchers and health systems to determine the impact of interventions for improving the diagnostic process. Multiple approaches have been suggested but none widely adopted. We propose a new framework for identifying "undesirable diagnostic events" (UDEs) that health systems, professional organizations, and researchers could further define and develop to enable standardized measurement and reporting related to diagnostic safety. We propose an outline for UDEs that identifies both conditions prone to diagnostic error and the contexts of care in which these errors are likely to occur. Refinement and adoption of this framework across health systems can facilitate standardized measurement and reporting of diagnostic safety.

  4. Frameworks for organizing exposure and toxicity data - the Aggregate Exposure Pathway (AEP) and the Adverse Outcome Pathway (AOP)

    EPA Science Inventory

    The Adverse Outcome Pathway (AOP) framework organizes existing knowledge regarding a series of biological events, starting with a molecular initiating event (MIE) and ending at an adverse outcome. The AOP framework provides a biological context to interpret in vitro toxicity dat...

  5. An Efficient and Imperfect Model for Gravel-Bed Braided River Morphodynamics: Numerical Simulations as Exploratory Tools

    NASA Astrophysics Data System (ADS)

    Kasprak, A.; Brasington, J.; Hafen, K.; Wheaton, J. M.

    2015-12-01

    Numerical models that predict channel evolution through time are an essential tool for investigating processes that occur over timescales which render field observation intractable. However, available morphodynamic models generally take one of two approaches to the complex problem of computing morphodynamics, resulting in oversimplification of the relevant physics (e.g. cellular models) or faithful, yet computationally intensive, representations of the hydraulic and sediment transport processes at play. The practical implication of these approaches is that river scientists must often choose between unrealistic results, in the case of the former, or computational demands that render modeling realistic spatiotemporal scales of channel evolution impossible. Here we present a new modeling framework that operates at the timescale of individual competent flows (e.g. floods), and uses a highly-simplified sediment transport routine that moves volumes of material according to morphologically-derived characteristic transport distances, or path lengths. Using this framework, we have constructed an open-source morphodynamic model, termed MoRPHED, which is here applied, and its validity investigated, at timescales ranging from a single event to a decade on two braided rivers in the UK and New Zealand. We do not purport that MoRPHED is the best, nor even an adequate, tool for modeling braided river dynamics at this range of timescales. Rather, our goal in this research is to explore the utility, feasibility, and sensitivity of an event-scale, path-length-based modeling framework for predicting braided river dynamics. To that end, we further explore (a) which processes are naturally emergent and which must be explicitly parameterized in the model, (b) the sensitivity of the model to the choice of particle travel distance, and (c) whether an event-scale model timestep is adequate for producing braided channel dynamics. The results of this research may inform techniques for future morphodynamic modeling that seeks to maximize computational resources while modeling fluvial dynamics at the timescales of change.
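
    A one-dimensional sketch of the path-length concept: for each competent flow, eroded volume is displaced downstream by a characteristic transport distance and re-deposited. The cell size, erosion rule, and exponential path-length distribution below are assumptions for illustration, not MoRPHED's actual parameterization.

      import numpy as np

      rng = np.random.default_rng(7)
      n_cells, dx = 100, 5.0                 # 100 cells of 5 m
      bed = np.linspace(10.0, 0.0, n_cells)  # initial sloping bed elevation (m)

      def run_event(bed, erosion_depth=0.02, mean_path=40.0):
          """Apply one competent flow: erode each cell and re-deposit downstream."""
          new_bed = bed - erosion_depth
          for i in range(n_cells):
              path = rng.exponential(mean_path)            # travel distance (m)
              j = min(n_cells - 1, i + int(round(path / dx)))
              new_bed[j] += erosion_depth                  # conserve eroded volume
          return new_bed

      for _ in range(10):                                  # a decade of annual floods
          bed = run_event(bed)
      change = bed - np.linspace(10.0, 0.0, n_cells)
      print("net elevation change in first 10 cells (m):", np.round(change[:10], 3))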

  6. Pathological Narcissism and Interpersonal Behavior in Daily Life

    PubMed Central

    Roche, Michael J.; Pincus, Aaron L.; Conroy, David E.; Hyde, Amanda L.; Ram, Nilam

    2014-01-01

    The Cognitive-Affective Processing System (CAPS) has been proposed as a useful meta-framework for integrating contextual differences in situations with individual differences in personality pathology. In this article, we evaluated the potential of combining the CAPS meta-framework and contemporary interpersonal theory to investigate how individual differences in pathological narcissism influenced interpersonal functioning in daily life. University students (N = 184) completed event-contingent reports about interpersonal interactions across a 7-day diary study. Using multilevel regression models, we found that combinations of narcissistic expression (grandiosity, vulnerability) were associated with different interpersonal behavior patterns reflective of interpersonal dysfunction. These results are among the first to empirically demonstrate the usefulness of the CAPS model to conceptualize personality pathology through the patterning of if-then interpersonal processes. PMID:23205698

  7. Bonsai: an event-based framework for processing and controlling data streams

    PubMed Central

    Lopes, Gonçalo; Bonacchi, Niccolò; Frazão, João; Neto, Joana P.; Atallah, Bassam V.; Soares, Sofia; Moreira, Luís; Matias, Sara; Itskov, Pavel M.; Correia, Patrícia A.; Medina, Roberto E.; Calcaterra, Lorenza; Dreosti, Elena; Paton, Joseph J.; Kampff, Adam R.

    2015-01-01

    The design of modern scientific experiments requires the control and monitoring of many different data streams. However, the serial execution of programming instructions in a computer makes it a challenge to develop software that can deal with the asynchronous, parallel nature of scientific data. Here we present Bonsai, a modular, high-performance, open-source visual programming framework for the acquisition and online processing of data streams. We describe Bonsai's core principles and architecture and demonstrate how it allows for the rapid and flexible prototyping of integrated experimental designs in neuroscience. We specifically highlight some applications that require the combination of many different hardware and software components, including video tracking of behavior, electrophysiology and closed-loop control of stimulation. PMID:25904861

  8. The interactions of multisensory integration with endogenous and exogenous attention

    PubMed Central

    Tang, Xiaoyu; Wu, Jinglong; Shen, Yong

    2016-01-01

    Stimuli from multiple sensory organs can be integrated into a coherent representation through multiple phases of multisensory processing; this phenomenon is called multisensory integration. Multisensory integration can interact with attention. Here, we propose a framework in which attention modulates multisensory processing in both endogenous (goal-driven) and exogenous (stimulus-driven) ways. Moreover, multisensory integration exerts not only bottom-up but also top-down control over attention. Specifically, we propose the following: (1) endogenous attentional selectivity acts on multiple levels of multisensory processing to determine the extent to which simultaneous stimuli from different modalities can be integrated; (2) integrated multisensory events exert top-down control on attentional capture via multisensory search templates that are stored in the brain; (3) integrated multisensory events can capture attention efficiently, even in quite complex circumstances, due to their increased salience compared to unimodal events and can thus improve search accuracy; and (4) within a multisensory object, endogenous attention can spread from one modality to another in an exogenous manner. PMID:26546734

  9. The interactions of multisensory integration with endogenous and exogenous attention.

    PubMed

    Tang, Xiaoyu; Wu, Jinglong; Shen, Yong

    2016-02-01

    Stimuli from multiple sensory organs can be integrated into a coherent representation through multiple phases of multisensory processing; this phenomenon is called multisensory integration. Multisensory integration can interact with attention. Here, we propose a framework in which attention modulates multisensory processing in both endogenous (goal-driven) and exogenous (stimulus-driven) ways. Moreover, multisensory integration exerts not only bottom-up but also top-down control over attention. Specifically, we propose the following: (1) endogenous attentional selectivity acts on multiple levels of multisensory processing to determine the extent to which simultaneous stimuli from different modalities can be integrated; (2) integrated multisensory events exert top-down control on attentional capture via multisensory search templates that are stored in the brain; (3) integrated multisensory events can capture attention efficiently, even in quite complex circumstances, due to their increased salience compared to unimodal events and can thus improve search accuracy; and (4) within a multisensory object, endogenous attention can spread from one modality to another in an exogenous manner. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Activating clinical trials: a process improvement approach.

    PubMed

    Martinez, Diego A; Tsalatsanis, Athanasios; Yalcin, Ali; Zayas-Castro, José L; Djulbegovic, Benjamin

    2016-02-24

    The administrative process associated with clinical trial activation has been criticized as costly, complex, and time-consuming. Prior research has concentrated on identifying administrative barriers and proposing various solutions to reduce activation time, and consequently associated costs. Here, we expand on previous research by incorporating social network analysis and discrete-event simulation to support process improvement decision-making. We searched for all operational data associated with the administrative process of activating industry-sponsored clinical trials at the Office of Clinical Research of the University of South Florida in Tampa, Florida. We limited the search to those trials initiated and activated between July 2011 and June 2012. We described the process using value stream mapping, studied the interactions of the various process participants using social network analysis, and modeled potential process modifications using discrete-event simulation. The administrative process comprised 5 sub-processes, 30 activities, 11 decision points, 5 loops, and 8 participants. The mean activation time was 76.6 days. Rate-limiting sub-processes were those of contract and budget development. Key participants during contract and budget development were the Office of Clinical Research, sponsors, and the principal investigator. Simulation results indicate that slight increases in the number of trials arriving at the Office of Clinical Research would increase activation time by 11 %. Also, increasing the efficiency of contract and budget development would reduce activation time by 28 %. Finally, better synchronization between contract and budget development would reduce time spent on batching documentation; however, no improvements would be attained in total activation time. The presented process improvement analytic framework not only identifies administrative barriers, but also helps to devise and evaluate potential improvement scenarios. The strength of our framework lies in its system analysis approach that recognizes the stochastic duration of the activation process and the interdependence between process activities and entities.
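
    A hedged discrete-event sketch of such an activation pipeline, assuming the SimPy library is available; the durations, arrival rate, and two-stage structure below are invented, whereas the study's actual model covered 30 activities across 5 sub-processes.

      import random
      import simpy

      random.seed(0)
      activation_times = []

      def trial(env, contract_office):
          start = env.now
          with contract_office.request() as req:           # wait for a contract specialist
              yield req
              yield env.timeout(random.gammavariate(2.0, 15.0))   # contract development (days)
          yield env.timeout(random.gammavariate(2.0, 10.0))        # budget development (days)
          activation_times.append(env.now - start)

      def arrivals(env, contract_office):
          for _ in range(60):
              env.process(trial(env, contract_office))
              yield env.timeout(random.expovariate(1 / 6.0))       # a new trial every ~6 days

      env = simpy.Environment()
      contract_office = simpy.Resource(env, capacity=6)            # assumed staffing level
      env.process(arrivals(env, contract_office))
      env.run(until=730)                                           # two simulated years
      print("mean simulated activation time: %.1f days"
            % (sum(activation_times) / len(activation_times)))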

  11. An Ensemble-Based Forecasting Framework to Optimize Reservoir Releases

    NASA Astrophysics Data System (ADS)

    Ramaswamy, V.; Saleh, F.

    2017-12-01

    The increasing frequency of extreme precipitation events is stressing the need to manage water resources on shorter timescales. Short-term management of water resources becomes proactive when inflow forecasts are available and this information can be effectively used in the control strategy. This work investigates the utility of short-term hydrological ensemble forecasts for operational decision-making during extreme weather events. An advanced automated hydrologic prediction framework integrating a regional-scale hydrologic model, GIS datasets and the meteorological ensemble predictions from the European Center for Medium Range Weather Forecasting (ECMWF) was coupled to an implicit multi-objective dynamic programming model to optimize releases from a water supply reservoir. The proposed methodology was evaluated by retrospectively forecasting the inflows to the Oradell reservoir in the Hackensack River basin in New Jersey during the extreme hydrologic event, Hurricane Irene. Additionally, the flexibility of the forecasting framework was investigated by forecasting the inflows from a moderate rainfall event to provide important perspectives on using the framework to assist reservoir operations during moderate events. The proposed forecasting framework seeks to provide a flexible, assistive tool to alleviate the complexity of operational decision-making.
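
    A toy dynamic-programming sketch of the release optimization: given a forecast inflow sequence (for example, a high-percentile ensemble inflow), choose releases from a discretized set to balance spill risk, deviation from a target pool, and release cost. The study used an implicit multi-objective DP; all capacities, weights, and inflows below are illustrative assumptions.

      import numpy as np

      inflows = np.array([5, 8, 20, 45, 60, 40, 25, 12, 8, 6], dtype=float)  # forecast inflow per step
      releases = np.arange(0.0, 61.0, 5.0)          # allowed release decisions
      storages = np.arange(0.0, 501.0, 10.0)        # discretized storage grid
      cap, target = storages[-1], 250.0             # capacity and target pool

      def stage_cost(s_next, spill, r):
          return 100.0 * spill + (s_next - target) ** 2 / 1e3 + 0.01 * r ** 2

      T = len(inflows)
      cost_to_go = np.zeros((T + 1, len(storages)))
      policy = np.zeros((T, len(storages)), dtype=int)

      for t in range(T - 1, -1, -1):                # backward induction
          for si, s in enumerate(storages):
              best_c, best_r = float("inf"), 0
              for ri, r in enumerate(releases):
                  s_raw = s + inflows[t] - r
                  spill = max(0.0, s_raw - cap)     # uncontrolled overflow
                  s_next = min(max(s_raw, 0.0), cap)
                  ni = int(np.argmin(np.abs(storages - s_next)))
                  c = stage_cost(s_next, spill, r) + cost_to_go[t + 1, ni]
                  if c < best_c:
                      best_c, best_r = c, ri
              cost_to_go[t, si], policy[t, si] = best_c, best_r

      s = 250.0                                     # forward pass from initial storage
      for t in range(T):
          r = releases[policy[t, int(np.argmin(np.abs(storages - s)))]]
          print(f"step {t}: inflow {inflows[t]:4.0f}  release {r:4.0f}  storage {s:5.0f}")
          s = min(max(s + inflows[t] - r, 0.0), cap)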

  12. Industrial application of semantic process mining

    NASA Astrophysics Data System (ADS)

    Espen Ingvaldsen, Jon; Atle Gulla, Jon

    2012-05-01

    Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. Over the last years, process mining prototypes have incorporated elements from semantics and data mining and targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.

  13. Recent updates in developing a statistical pseudo-dynamic source-modeling framework to capture the variability of earthquake rupture scenarios

    NASA Astrophysics Data System (ADS)

    Song, Seok Goo; Kwak, Sangmin; Lee, Kyungbook; Park, Donghee

    2017-04-01

    It is a critical element to predict the intensity and variability of strong ground motions in seismic hazard assessment. The characteristics and variability of earthquake rupture process may be a dominant factor in determining the intensity and variability of near-source strong ground motions. Song et al. (2014) demonstrated that the variability of earthquake rupture scenarios could be effectively quantified in the framework of 1-point and 2-point statistics of earthquake source parameters, constrained by rupture dynamics and past events. The developed pseudo-dynamic source modeling schemes were also validated against the recorded ground motion data of past events and empirical ground motion prediction equations (GMPEs) at the broadband platform (BBP) developed by the Southern California Earthquake Center (SCEC). Recently we improved the computational efficiency of the developed pseudo-dynamic source-modeling scheme by adopting the nonparametric co-regionalization algorithm, introduced and applied in geostatistics initially. We also investigated the effect of earthquake rupture process on near-source ground motion characteristics in the framework of 1-point and 2-point statistics, particularly focusing on the forward directivity region. Finally we will discuss whether the pseudo-dynamic source modeling can reproduce the variability (standard deviation) of empirical GMPEs and the efficiency of 1-point and 2-point statistics to address the variability of ground motions.

  14. The memory remains: Understanding collective memory in the digital age

    PubMed Central

    García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha

    2017-01-01

    Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects. PMID:28435881

  15. The memory remains: Understanding collective memory in the digital age.

    PubMed

    García-Gavilanes, Ruth; Mollgaard, Anders; Tsvetkova, Milena; Yasseri, Taha

    2017-04-01

    Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using transactional large-scale data in a quantitative framework similar to the practice in natural sciences. We make use of online data by analyzing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events and particularly focus on understanding memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, topic, and the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by these remembering processes is larger than the primary attention flow to the current event. We report these previously unknown cascading effects.

  16. Chronology of processes in high-gradient channels of medium-high mountains and their influence on the properties of alluvial fans

    NASA Astrophysics Data System (ADS)

    Šilhán, Karel

    2014-02-01

    High-gradient channels are the locations of the greatest geomorphological activity in medium-high mountains. The channels' frequency and character influence the contemporary morphology and morphometry of alluvial fans. There is currently no detailed information regarding the frequency of these processes in high-gradient channels and the evolution of alluvial fans in medium-high mountains in Central Europe. This study in the Moravskoslezské Beskydy Mts. analysed 22 alluvial fans (10 debris flow fans and 12 fluvial fans). The processes occurring on the fans were dated using dendrogeomorphological methods. A total of 748 increment cores were taken from 374 trees to reconstruct 153 geomorphological process events (60 debris flow and 93 floods). The frequency of the processes has been considerably increasing in the last four decades, which can be related to extensive tree cutting since the 1970s. Processes in high-gradient channels in the region (affecting the alluvial fans across the mountain range) are predominantly controlled by cyclonal activity during the warm periods of the year. Probable triggers of local events are heavy downpours in the summer. In addition, spring snowmelt has been identified as occasionally important. This study of the relations affecting the type and frequency of the processes and their effect on the properties of alluvial fans led to the creation of a universal framework for the medium-high flysch mountains of Central Europe. The framework particularly reflects the influence of the character of hydrometeorological extremes on the frequency and type of processes and their reflection in the properties of alluvial fans.

  17. Affective bias as a rational response to the statistics of rewards and punishments.

    PubMed

    Pulcu, Erdem; Browning, Michael

    2017-10-04

    Affective bias, the tendency to differentially prioritise the processing of negative relative to positive events, is commonly observed in clinical and non-clinical populations. However, why such biases develop is not known. Using a computational framework, we investigated whether affective biases may reflect individuals' estimates of the information content of negative relative to positive events. During a reinforcement learning task, the information content of positive and negative outcomes was manipulated independently by varying the volatility of their occurrence. Human participants altered the learning rates used for the outcomes selectively, preferentially learning from the most informative. This behaviour was associated with activity of the central norepinephrine system, estimated using pupilometry, for loss outcomes. Humans maintain independent estimates of the information content of distinct positive and negative outcomes which may bias their processing of affective events. Normalising affective biases using computationally inspired interventions may represent a novel approach to treatment development.
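
    A small sketch of the learning-rate adjustment described above: a Rescorla-Wagner learner tracks win and loss probabilities with separate learning rates and learns faster from the more volatile (hence more informative) outcome stream. The volatility schedule and rate values are illustrative, not the paper's fitted parameters.

      import numpy as np

      rng = np.random.default_rng(3)
      n_trials = 200
      # Win probability is stable; loss probability reverses every 40 trials (volatile).
      p_win = np.full(n_trials, 0.7)
      p_loss = np.where((np.arange(n_trials) // 40) % 2 == 0, 0.8, 0.2)

      alpha_win, alpha_loss = 0.05, 0.30   # learn faster from the volatile outcome
      v_win, v_loss = 0.5, 0.5
      for t in range(n_trials):
          win = rng.random() < p_win[t]
          loss = rng.random() < p_loss[t]
          v_win += alpha_win * (win - v_win)      # prediction-error updates
          v_loss += alpha_loss * (loss - v_loss)

      print("final estimates: p(win)=%.2f, p(loss)=%.2f" % (v_win, v_loss))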

  18. Affective bias as a rational response to the statistics of rewards and punishments

    PubMed Central

    Pulcu, Erdem

    2017-01-01

    Affective bias, the tendency to differentially prioritise the processing of negative relative to positive events, is commonly observed in clinical and non-clinical populations. However, why such biases develop is not known. Using a computational framework, we investigated whether affective biases may reflect individuals’ estimates of the information content of negative relative to positive events. During a reinforcement learning task, the information content of positive and negative outcomes was manipulated independently by varying the volatility of their occurrence. Human participants altered the learning rates used for the outcomes selectively, preferentially learning from the most informative. This behaviour was associated with activity of the central norepinephrine system, estimated using pupilometry, for loss outcomes. Humans maintain independent estimates of the information content of distinct positive and negative outcomes which may bias their processing of affective events. Normalising affective biases using computationally inspired interventions may represent a novel approach to treatment development. PMID:28976304

  19. Topology of magnetic flux ropes and formation of fossil flux transfer events and boundary layer plasmas

    NASA Technical Reports Server (NTRS)

    Lee, L. C.; Ma, Z. W.; Fu, Z. F.; Otto, A.

    1993-01-01

    A mechanism for the formation of fossil flux transfer events and the low-latitude boundary layer within the framework of multiple X-line reconnection is proposed. Attention is given to conditions for which the bulk of magnetic flux in a flux rope of finite extent has a simple magnetic topology, where the four possible connections of magnetic field lines are: IMF to MSP, MSP to IMF, IMF to IMF, and MSP to MSP. For a sufficient relative shift of the X lines, magnetic flux may enter a flux rope from the magnetosphere and exit into the magnetosphere. This process leads to the formation of magnetic flux ropes which contain a considerable amount of magnetosheath plasma on closed magnetospheric field lines. This process is discussed as a possible explanation for the formation of fossil flux transfer events in the magnetosphere and the formation of the low-latitude boundary layer.

  20. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
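
    A sketch of the wavelet-in-a-relational-database idea, assuming the PyWavelets and sqlite3 packages: decompose each parameter trend, store only the coarse approximation coefficients, and answer a query by comparing stored coefficients against a template. The table layout, wavelet choice, and similarity measure below are assumptions, not the system described above.

      import sqlite3
      import numpy as np
      import pywt

      def coarse_coeffs(trend, wavelet="db4", level=4):
          """Coarse (approximation) wavelet coefficients of a trend."""
          return pywt.wavedec(np.asarray(trend, dtype=float), wavelet, level=level)[0]

      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE trends (id INTEGER PRIMARY KEY, patient TEXT, coeffs BLOB)")

      rng = np.random.default_rng(5)
      t = np.linspace(0, 1, 256)
      examples = {
          "stable":  60 + rng.normal(0, 1, 256),
          "falling": 60 - 25 * t + rng.normal(0, 1, 256),   # hypotensive-like episode
          "rising":  60 + 20 * t + rng.normal(0, 1, 256),
      }
      for name, trend in examples.items():
          db.execute("INSERT INTO trends (patient, coeffs) VALUES (?, ?)",
                     (name, coarse_coeffs(trend).astype(np.float64).tobytes()))

      query = coarse_coeffs(60 - 25 * t)   # idealized falling-pressure template
      for patient, blob in db.execute("SELECT patient, coeffs FROM trends"):
          stored = np.frombuffer(blob, dtype=np.float64)
          dist = np.linalg.norm(stored - query)
          print(f"{patient:8s} distance to falling-pressure template: {dist:7.2f}")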

  1. Psychological Trauma and LGBT Caregivers: A Conceptual Framework to Guide Practice.

    PubMed

    Glaesser, Richard S; Patel, Bina R

    2016-01-01

    LGBT adults face unique risk factors such as social isolation, discrimination, and victimization, and occasionally they engage in detrimental behaviors, like high alcohol and drug use and risky sexual activity, that negatively impact psychological and physical health. These risks can affect their overall health and stress the relationship with an older caregiver/recipient-partner following exposure to an acute medical event. The experience of an acute medical event by an LGBT caregiving partner can result in psychological trauma. In this article the authors present a conceptual framework involving stress process theory, life course theory, and a family systems perspective to understand the effect of stressors on LGBT caregiving partners. Implications for social work practice include assessing, coordinating care, counseling, and negotiating services at the micro level; engaging family-centered approaches to support a positive transition to the caregiving role at the mezzo level; and advocating for policy and cultural shifts that support this group and diminish stigma.

  2. Friendship Dissolution Within Social Networks Modeled Through Multilevel Event History Analysis

    PubMed Central

    Dean, Danielle O.; Bauer, Daniel J.; Prinstein, Mitchell J.

    2018-01-01

    A social network perspective can bring important insight into the processes that shape human behavior. Longitudinal social network data, measuring relations between individuals over time, has become increasingly common—as have the methods available to analyze such data. A friendship duration model utilizing discrete-time multilevel survival analysis with a multiple membership random effect structure is developed and applied here to study the processes leading to undirected friendship dissolution within a larger social network. While the modeling framework is introduced in terms of understanding friendship dissolution, it can be used to understand microlevel dynamics of a social network more generally. These models can be fit with standard generalized linear mixed-model software, after transforming the data to a pair-period data set. An empirical example highlights how the model can be applied to understand the processes leading to friendship dissolution between high school students, and a simulation study is used to test the use of the modeling framework under representative conditions that would be found in social network data. Advantages of the modeling framework are highlighted, and potential limitations and future directions are discussed. PMID:28463022
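
    A minimal sketch of the discrete-time hazard model on a pair-period data set is given below, using simulated dyad data and a plain logistic regression via statsmodels; the multiple-membership random-effect structure described in the paper needs specialised mixed-model software and is omitted here, and all variable names are invented.

```python
# Sketch of a discrete-time survival (pair-period) model of friendship dissolution.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for pair_id in range(200):                         # 200 simulated friendship dyads
    same_grade = int(rng.integers(0, 2))           # example dyad-level covariate
    for wave in range(1, 6):                       # up to 5 observation waves
        # Hazard of dissolution rises with time and is lower for same-grade pairs.
        hazard = 1.0 / (1.0 + np.exp(-(-2.0 + 0.3 * wave - 0.8 * same_grade)))
        event = rng.random() < hazard
        rows.append({"pair_id": pair_id, "wave": wave,
                     "same_grade": same_grade, "event": int(event)})
        if event:                                  # the pair leaves the risk set
            break
pair_period = pd.DataFrame(rows)                   # one row per pair per wave at risk

# Discrete-time hazard model: logistic regression on the pair-period data set.
model = smf.logit("event ~ wave + same_grade", data=pair_period).fit(disp=False)
print(model.params)
```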

  3. Parametric models to relate spike train and LFP dynamics with neural information processing.

    PubMed

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.
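
    As a toy illustration of the modelling idea (numpy only, not the unified spike-field model itself), the following simulates a spike train whose Poisson rate combines ongoing background activity with a stimulus-driven input at an unknown latency, then recovers the latency by a grid search over the likelihood; all rates, durations, and the latency grid are invented.

```python
# Sketch: background + stimulus-driven Poisson spiking, with latency estimated by grid search.
import numpy as np

rng = np.random.default_rng(3)
dt, T = 0.001, 2.0
t = np.arange(0.0, T, dt)
stimulus = ((t > 0.5) & (t < 1.0)).astype(float)        # stimulus on for 500 ms

def rate(latency, background=5.0, gain=40.0):
    """Firing rate (spikes/s) = ongoing background + delayed stimulus-driven input."""
    shifted = np.interp(t - latency, t, stimulus, left=0.0)
    return background + gain * shifted

true_latency = 0.12
spikes = rng.random(t.size) < rate(true_latency) * dt    # Bernoulli approximation per bin

def log_likelihood(latency):
    p = rate(latency) * dt
    return float(np.sum(spikes * np.log(p) + (~spikes) * np.log(1.0 - p)))

grid = np.arange(0.0, 0.3, 0.005)
print("estimated latency:", grid[np.argmax([log_likelihood(g) for g in grid])])
```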

  4. Linking Emotion Regulation Strategies to Affective Events and Negative Emotions at Work

    ERIC Educational Resources Information Center

    Diefendorff, James M.; Richard, Erin M.; Yang, Jixia

    2008-01-01

    This study examined the use of specific forms of emotion regulation at work, utilizing Gross's [Gross, J. J. (1998). "The emerging field of emotion regulation: An integrative review." "Review of General Psychology" 2, 271-299] process-based framework of emotion regulation as a guiding structure. In addition to examining employee self-reported…

  5. Classification and Feature Selection Algorithms for Modeling Ice Storm Climatology

    NASA Astrophysics Data System (ADS)

    Swaminathan, R.; Sridharan, M.; Hayhoe, K.; Dobbie, G.

    2015-12-01

    Ice storms account for billions of dollars of winter storm loss across the continental US and Canada. In the future, increasing concentration of human populations in areas vulnerable to ice storms, such as the northeastern US, will only exacerbate the impacts of these extreme events on infrastructure and society. Quantifying the potential impacts of global climate change on ice storm prevalence and frequency is challenging, as ice storm climatology is driven by complex and incompletely defined atmospheric processes, processes that are in turn influenced by a changing climate. This makes the underlying atmospheric and computational modeling of ice storm climatology a formidable task. We propose a novel computational framework that uses sophisticated stochastic classification and feature selection algorithms to model ice storm climatology and quantify storm occurrences from both reanalysis and global climate model outputs. The framework is based on an objective identification of ice storm events by key variables derived from vertical profiles of temperature, humidity and geopotential height. Historical ice storm records are used to identify days with synoptic-scale upper air and surface conditions associated with ice storms. Evaluation using NARR reanalysis and historical ice storm records for the northeastern US demonstrates that an objective computational model can, by standard performance measures, identify ice storm events from upper-air circulation patterns with a relatively high degree of accuracy and provide insights into the relationships between key climate variables associated with ice storms.
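
    A hypothetical sketch of this kind of pipeline is shown below: univariate feature selection followed by a stochastic (random forest) classifier, scored by cross-validation on invented profile-derived features; the feature names, label rule, and data are placeholders, not the study's NARR-derived variables.

```python
# Sketch of feature selection + stochastic classification for storm / non-storm days.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
n_days = 500
X = np.column_stack([
    rng.normal(-2, 3, n_days),     # surface temperature (deg C)
    rng.normal(2, 2, n_days),      # warm-layer maximum temperature aloft (deg C)
    rng.normal(70, 15, n_days),    # low-level relative humidity (%)
    rng.normal(5500, 100, n_days), # 500 hPa geopotential height (m)
])
# Invented label rule: freezing surface, melting layer aloft, moist low levels.
y = ((X[:, 0] < 0) & (X[:, 1] > 0) & (X[:, 2] > 75)).astype(int)

model = make_pipeline(SelectKBest(f_classif, k=3),
                      RandomForestClassifier(n_estimators=200, random_state=0))
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```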

  6. Overview of EVE - the event visualization environment of ROOT

    NASA Astrophysics Data System (ADS)

    Tadel, Matevž

    2010-04-01

    EVE is a high-level visualization library using ROOT's data-processing, GUI and OpenGL interfaces. It is designed as a framework for object management offering hierarchical data organization, object interaction and visualization via GUI and OpenGL representations. Automatic creation of 2D projected views is also supported. On the other hand, it can serve as an event visualization toolkit satisfying most HEP requirements: visualization of geometry, simulated and reconstructed data such as hits, clusters, tracks and calorimeter information. Special classes are available for visualization of raw data. The object-interaction layer allows for easy selection and highlighting of objects and their derived representations (projections) across several views (3D, Rho-Z, R-Phi). Object-specific tooltips are provided in both GUI and GL views. The visual-configuration layer of EVE is built around a database of template objects that can be applied to specific instances of visualization objects to ensure consistent object presentation. The database can be retrieved from a file, edited during framework operation and stored back to file. The EVE prototype was developed within the ALICE collaboration and was included in ROOT in December 2007. Since then all EVE components have reached maturity. EVE is used as the base of the AliEve visualization framework in ALICE, of the Fireworks physics-oriented event display in CMS, and as the visualization engine of FairRoot in FAIR.

  7. A study of an adaptive replication framework for orchestrated composite web services.

    PubMed

    Mohamed, Marwa F; Elyamany, Hany F; Nassar, Hamed M

    2013-01-01

    Replication is considered one of the most important techniques to improve the Quality of Service (QoS) of published Web Services. It has achieved impressive success in managing resource sharing and usage in order to moderate the energy consumed in IT environments. For a robust and successful replication process, attention should be paid to suitable timing as well as to the constraints and capabilities of the environment in which the process runs. The replication process is time-consuming, since outsourcing new replicas onto other hosts is lengthy. Furthermore, most of the business processes implemented over the Web today are composed of multiple Web services working together in two main styles: Orchestration and Choreography. Accomplishing replication over such business processes is a further challenge due to the complexity and flexibility involved. In this paper, we present an adaptive replication framework for regular and orchestrated composite Web services. The suggested framework includes a number of components for detecting unexpected and undesirable events, such as failure or overloading, that might occur when consuming the original published Web services. It also includes a specific replication controller to manage the replication process and select the best host that would encapsulate a new replica. In addition, it includes a component for predicting the incoming load in order to decrease the time needed for outsourcing new replicas, enhancing performance greatly. A simulation environment has been created to measure the performance of the suggested framework. The results indicate that adaptive replication with the prediction scenario is the best option for enhancing the performance of the replication process in an online business environment.
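
    A minimal sketch of the prediction-driven part of such a controller follows, assuming a simple moving-average load forecast and an invented per-replica capacity; the actual framework's predictor and host-selection logic are more involved.

```python
# Hypothetical sketch: forecast incoming load and decide how many replicas to provision.
def plan_replicas(request_history, per_replica_capacity=100, window=3):
    """Return the replica count needed for the forecast load (moving-average forecast)."""
    recent = request_history[-window:]
    predicted_load = sum(recent) / len(recent)                         # simple forecast
    needed = max(1, -(-int(predicted_load) // per_replica_capacity))   # ceiling division
    return needed

history = [120, 180, 260, 340]          # requests per interval, trending upward
print("replicas to provision:", plan_replicas(history))
```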

  8. Modelling tidewater glacier calving: from detailed process models to simple calving laws

    NASA Astrophysics Data System (ADS)

    Benn, Doug; Åström, Jan; Zwinger, Thomas; Todd, Joe; Nick, Faezeh

    2017-04-01

    The simple calving laws currently used in ice sheet models do not adequately reflect the complexity and diversity of calving processes. To be effective, calving laws must be grounded in a sound understanding of how calving actually works. We have developed a new approach to formulating calving laws, using a) the Helsinki Discrete Element Model (HiDEM) to explicitly model fracture and calving processes, and b) the full-Stokes continuum model Elmer/Ice to identify critical stress states associated with HiDEM calving events. A range of observed calving processes emerges spontaneously from HiDEM in response to variations in ice-front buoyancy and the size of subaqueous undercuts, and we show that HiDEM calving events are associated with characteristic stress patterns simulated in Elmer/Ice. Our results open the way to developing calving laws that properly reflect the diversity of calving processes, and provide a framework for a unified theory of the calving process continuum.

  9. Not the last word: dissemination strategies for patient-centred research in nursing

    PubMed Central

    Hagan, Teresa L.; Schmidt, Karen; Ackison, Guyanna R.; Murphy, Megan; Jones, Jennifer R.

    2017-01-01

    Introduction Research results hold value for many stakeholders, including researchers, patient populations, advocacy organizations, and community groups. The aim of this study is to describe our research team’s systematic process for designing a dissemination strategy for a completed research study. Methodology We organized a dissemination event to report the results of our study back to participants and stakeholders and to collect feedback regarding our study. We applied the Agency for Healthcare Research and Quality’s dissemination framework to guide the development of the event and collected participant feedback during the event. Results We describe our dissemination strategy along with attendees’ feedback and suggestions for our research as an example of a way to design patient- and community-focused dissemination. We explain the details of our dissemination strategy, including (a) our process of translating a large research study into a stakeholder event, (b) the stakeholder feedback collected at the event, and (c) the incorporation of that feedback into our research team’s ongoing work. We also describe challenges encountered during the dissemination process and ways to handle issues such as logistics, funding, and staffing. Conclusions This analysis provides key insights and practical advice for researchers looking for innovative ways to disseminate their findings within the lay and scientific communities. PMID:29081824

  10. Modeling Array Stations in SIG-VISA

    NASA Astrophysics Data System (ADS)

    Ding, N.; Moore, D.; Russell, S.

    2013-12-01

    We add support for array stations to SIG-VISA, a system for nuclear monitoring using probabilistic inference on seismic signals. Array stations comprise a large portion of the IMS network; they can provide increased sensitivity and more accurate directional information compared to single-component stations. Our existing model assumed that signals were independent at each station, which is false when many stations are close together, as in an array. The new model removes that assumption by jointly modeling signals across array elements. This is done by extending our existing Gaussian process (GP) regression models, also known as kriging, from a 3-dimensional single-component space of events to a 6-dimensional space of station-event pairs. For each array and each event attribute (including coda decay, coda height, amplitude transfer and travel time), we model the joint distribution across array elements using a Gaussian process that learns the correlation lengthscale across the array, thereby incorporating information from array stations into the probabilistic inference framework. To evaluate the effectiveness of our model, we perform 'probabilistic beamforming' on new events using our GP model, i.e., we compute the event azimuth having highest posterior probability under the model, conditioned on the signals at array elements. We compare the results from our probabilistic inference model to the beamforming currently performed by IMS station processing.
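
    The following sketch illustrates only the kernel idea, under invented lengthscales and coordinates: a squared-exponential covariance over the joint space of event location and array-element position, so that the same attribute at nearby elements of one array is modelled as correlated, with a separate lengthscale for the across-array dimensions.

```python
# Sketch of a 6-dimensional squared-exponential kernel over (event, array element) pairs.
import numpy as np

def kernel(x1, x2, event_scale=100.0, array_scale=2.0, variance=1.0):
    """x = [event_x, event_y, event_depth, elem_east, elem_north, elem_up] (km)."""
    d_event = np.sum(((x1[:3] - x2[:3]) / event_scale) ** 2)
    d_array = np.sum(((x1[3:] - x2[3:]) / array_scale) ** 2)
    return variance * np.exp(-0.5 * (d_event + d_array))

# Covariance of an event attribute across three elements of one array for the same event.
event = np.array([10.0, 50.0, 5.0])
elements = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [4.0, 0.0, 0.0]])
points = [np.r_[event, e] for e in elements]
K = np.array([[kernel(a, b) for b in points] for a in points])
print(np.round(K, 3))
```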

  11. A framework for collaborative review of candidate events in high data rate streams: The V-FASTR experiment as a case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Andrew F.; Cinquini, Luca; Khudikyan, Shakeh E.

    2015-01-01

    “Fast radio transients” are defined here as bright millisecond pulses of radio-frequency energy. These short-duration pulses can be produced by known objects such as pulsars or potentially by more exotic objects such as evaporating black holes. The identification and verification of such an event would be of great scientific value. This is one major goal of the Very Long Baseline Array (VLBA) Fast Transient Experiment (V-FASTR), a software-based detection system installed at the VLBA. V-FASTR uses a “commensal” (piggy-back) approach, analyzing all array data continually during routine VLBA observations and identifying candidate fast transient events. Raw data can be stored from a buffer memory, which enables a comprehensive off-line analysis. This is invaluable for validating the astrophysical origin of any detection. Candidates discovered by the automatic system must be reviewed each day by analysts to identify any promising signals that warrant a more in-depth investigation. To support the timely analysis of fast transient detection candidates by V-FASTR scientists, we have developed a metadata-driven, collaborative candidate review framework. The framework consists of a software pipeline for metadata processing composed of both open source software components and project-specific code written expressly to extract and catalog metadata from the incoming V-FASTR data products, and a web-based data portal that facilitates browsing and inspection of the available metadata for candidate events extracted from the VLBA radio data.

  12. Exploiting multicore compute resources in the CMS experiment

    NASA Astrophysics Data System (ADS)

    Ramírez, J. E.; Pérez-Calero Yzquierdo, A.; Hernández, J. M.; CMS Collaboration

    2016-10-01

    CMS has developed a strategy to efficiently exploit the multicore architecture of the compute resources accessible to the experiment. A coherent use of the multiple cores available in a compute node yields substantial gains in terms of resource utilization. The implemented approach makes use of the multithreading support of the event processing framework and the multicore scheduling capabilities of the resource provisioning system. Multicore slots are acquired and provisioned by means of multicore pilot agents which internally schedule and execute single and multicore payloads. Multicore scheduling and multithreaded processing are currently used in production for online event selection and prompt data reconstruction. More workflows are being adapted to run in multicore mode. This paper presents a review of the experience gained in the deployment and operation of the multicore scheduling and processing system, the current status and future plans.

  13. Automatic event recognition and anomaly detection with attribute grammar by learning scene semantics

    NASA Astrophysics Data System (ADS)

    Qi, Lin; Yao, Zhenyu; Li, Li; Dong, Junyu

    2007-11-01

    In this paper we present a novel framework for automatic event recognition and abnormal behavior detection with attribute grammar by learning scene semantics. This framework combines learning scene semantics through trajectory analysis with constructing an attribute grammar-based event representation. The scene and event information is learned automatically. Abnormal behaviors that disobey scene semantics or event grammar rules are detected. By this method, an approach to understanding video scenes is achieved. Furthermore, with this prior knowledge, the accuracy of abnormal event detection is increased.

  14. Application of a temporal reasoning framework tool in analysis of medical device adverse events.

    PubMed

    Clark, Kimberly K; Sharma, Deepak K; Chute, Christopher G; Tao, Cui

    2011-01-01

    The Clinical Narrative Temporal Relation Ontology (CNTRO) project offers a semantic-web based reasoning framework, which represents temporal events and relationships within clinical narrative texts and infers new knowledge over them. In this paper, the CNTRO reasoning framework is applied to temporal analysis of medical device adverse event files. One specific adverse event was used as a test case: late stent thrombosis. Adverse event narratives were obtained from the Food and Drug Administration's (FDA) Manufacturer and User Facility Device Experience (MAUDE) database. 15 adverse event files in which late stent thrombosis was confirmed were randomly selected across multiple drug eluting stent devices. From these files, 81 events and 72 temporal relations were annotated. 73 temporal questions were generated, of which 65 were correctly answered by the CNTRO system. This results in an overall accuracy of 89%. This system should be pursued further to continue assessing its potential benefits in temporal analysis of medical device adverse events.
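
    A tiny illustration of the kind of temporal inference involved (not CNTRO or its reasoner) is given below: given annotated before/after relations between events in a narrative, transitive closure lets a system answer ordering questions that were never stated explicitly; the event names are invented.

```python
# Sketch: transitive closure over annotated BEFORE relations between narrative events.
def closure(before_pairs):
    """Transitive closure of a set of (earlier, later) event pairs."""
    relations = set(before_pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(relations):
            for c, d in list(relations):
                if b == c and (a, d) not in relations:
                    relations.add((a, d))
                    changed = True
    return relations

annotated = {("stent implanted", "chest pain"), ("chest pain", "late stent thrombosis")}
inferred = closure(annotated)
print(("stent implanted", "late stent thrombosis") in inferred)   # True, though never annotated
```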

  15. Event Display for the Visualization of CMS Events

    NASA Astrophysics Data System (ADS)

    Bauerdick, L. A. T.; Eulisse, G.; Jones, C. D.; Kovalskyi, D.; McCauley, T.; Mrak Tadel, A.; Muelmenstaedt, J.; Osborne, I.; Tadel, M.; Tu, Y.; Yagil, A.

    2011-12-01

    During the last year the CMS experiment engaged in consolidation of its existing event display programs. The core of the new system is based on the Fireworks event display program, which was by design directly integrated with the CMS Event Data Model (EDM) and the light version of the software framework (FWLite). The Event Visualization Environment (EVE) of the ROOT framework is used to manage a consistent set of 3D and 2D views, selection, user feedback and user interaction with the graphics windows; several EVE components were developed by CMS in collaboration with the ROOT project. In event display operation, simple plugins are registered into the system to perform conversion from EDM collections into their visual representations, which are then managed by the application. Full event navigation and filtering as well as collection-level filtering are supported. The same data-extraction principle can also be applied when Fireworks eventually operates as a service within the full software framework.

  16. Magnetic storms and solar flares: can be analysed within similar mathematical framework with other extreme events?

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Potirakis, Stelios M.; Papadimitriou, Constantinos; Zitis, Pavlos I.; Eftaxias, Konstantinos

    2015-04-01

    The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches of different types of natural, artificial, and social systems. We apply concepts of nonextensive statistical physics to time-series data of observable manifestations of the underlying complex processes that culminate in different extreme events, in order to support the suggestion that a dynamical analogy characterizes the generation of a single magnetic storm, solar flare, earthquake (in terms of pre-seismic electromagnetic signals), epileptic seizure, and economic crisis. The analysis reveals that all of the above-mentioned extreme events can be analyzed within a similar mathematical framework. More precisely, we show that the populations of magnitudes of fluctuations included in all of the above-mentioned pulse-like time series follow the traditional Gutenberg-Richter law as well as a nonextensive model for earthquake dynamics, with similar nonextensive q-parameter values. Moreover, based on a multidisciplinary statistical analysis, we show that the extreme events are characterized by crucial common symptoms, namely: (i) high organization, high compressibility, low complexity, high information content; (ii) strong persistency; and (iii) existence of a clear preferred direction of emerged activities. These symptoms clearly discriminate the appearance of the extreme events under study from the corresponding background noise.

  17. Cross scale interactions, nonlinearities, and forecasting catastrophic events

    USGS Publications Warehouse

    Peters, Debra P.C.; Pielke, Roger A.; Bestelmeyer, Brandon T.; Allen, Craig D.; Munson-McGee, Stuart; Havstad, Kris M.

    2004-01-01

    Catastrophic events share characteristic nonlinear behaviors that are often generated by cross-scale interactions and feedbacks among system elements. These events result in surprises that cannot easily be predicted based on information obtained at a single scale. Progress on catastrophic events has focused on one of the following two areas: nonlinear dynamics through time without an explicit consideration of spatial connectivity [Holling, C. S. (1992) Ecol. Monogr. 62, 447–502] or spatial connectivity and the spread of contagious processes without a consideration of cross-scale interactions and feedbacks [Zeng, N., Neeling, J. D., Lau, L. M. & Tucker, C. J. (1999) Science 286, 1537–1540]. These approaches rarely have ventured beyond traditional disciplinary boundaries. We provide an interdisciplinary, conceptual, and general mathematical framework for understanding and forecasting nonlinear dynamics through time and across space. We illustrate the generality and usefulness of our approach by using new data and recasting published data from ecology (wildfires and desertification), epidemiology (infectious diseases), and engineering (structural failures). We show that decisions that minimize the likelihood of catastrophic events must be based on cross-scale interactions, and such decisions will often be counterintuitive. Given the continuing challenges associated with global change, approaches that cross disciplinary boundaries to include interactions and feedbacks at multiple scales are needed to increase our ability to predict catastrophic events and develop strategies for minimizing their occurrence and impacts. Our framework is an important step in developing predictive tools and designing experiments to examine cross-scale interactions.

  18. The relationship between Urbanisation and changes in flood regimes: the British case.

    NASA Astrophysics Data System (ADS)

    Prosdocimi, Ilaria; Miller, James; Kjeldsen, Thomas

    2013-04-01

    This pilot study investigates whether long-term changes in observed series of extreme flood events can be attributed to changes in climate and land-use drivers. We investigate, in particular, changes in winter and summer peaks extracted from gauged instantaneous flow records in selected British catchments. Using a Poisson process framework, the frequency and magnitude of extreme events above a threshold can be modelled simultaneously under the standard stationarity assumptions of constant location and scale. In the case of a non-stationary process, the framework is extended to include covariates to account for changes in the process parameters. By including covariates related to the physical process, such as increased urbanization or North Atlantic Oscillation (NAO) Index levels, rather than just time, an enhanced understanding of the changes in high flows is obtainable. Indeed, some variability is expected in any natural process and can be partially explained by large-scale measures like the NAO Index. The focus of this study is to understand, once natural variability is taken into account, how much of the remaining variability can be explained by increased urbanization levels. For this study, catchments are selected that have experienced significant growth in urbanisation in the past decades, typically the 1960s to present, and for which concurrent good-quality high-flow data are available. Temporal change in the urban extent within catchments is obtained using novel processing of historical mapping sources, whereby the urban, suburban and rural fractions are obtained for decadal periods. Suitable flow data from localised rural catchments are also included as control cases against which to compare observed changes in the flood regime of urbanised catchments, and to provide evidence of changes in regional climate. Initial results suggest that the effect of urbanisation can be detected in the rate of occurrence of flood events, especially in summer, whereas the impact on flood magnitude is less pronounced. Further tests across a greater number of catchments are necessary to validate these results.
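
    A minimal sketch of a covariate-dependent Poisson occurrence model of this kind is shown below, using simulated data and statsmodels rather than the study's actual data or model: yearly counts of peaks-over-threshold events are regressed on an urban fraction and a NAO-like index.

```python
# Sketch: Poisson regression of flood-event counts on urbanisation and a climate index.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
years = 50
urban_fraction = np.linspace(0.05, 0.30, years)        # growing urban extent over time
nao_index = rng.normal(0, 1, years)
true_rate = np.exp(0.2 + 3.0 * urban_fraction + 0.3 * nao_index)
event_counts = rng.poisson(true_rate)                  # peaks-over-threshold events per year

X = sm.add_constant(np.column_stack([urban_fraction, nao_index]))
fit = sm.GLM(event_counts, X, family=sm.families.Poisson()).fit()
print(fit.params)   # intercept, urbanisation effect, NAO effect on the log occurrence rate
```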

  19. Multi-Mission Automated Task Invocation Subsystem

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.

    2009-01-01

    Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned from experience with prior instrument-data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because the requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,

  20. Beyond the SCS-CN method: A theoretical framework for spatially lumped rainfall-runoff response

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-06-01

    Since its introduction in 1954, the Soil Conservation Service curve number (SCS-CN) method has become the standard tool, in practice, for estimating an event-based rainfall-runoff response. However, because of its empirical origins, the SCS-CN method is restricted to certain geographic regions and land use types. Moreover, it does not describe the spatial variability of runoff. To move beyond these limitations, we present a new theoretical framework for spatially lumped, event-based rainfall-runoff modeling. In this framework, we describe the spatially lumped runoff model as a point description of runoff that is upscaled to a watershed area based on probability distributions that are representative of watershed heterogeneities. The framework accommodates different runoff concepts and distributions of heterogeneities, and in doing so, it provides an implicit spatial description of runoff variability. Heterogeneity in storage capacity and soil moisture are the basis for upscaling a point runoff response and linking ecohydrological processes to runoff modeling. For the framework, we consider two different runoff responses for fractions of the watershed area: "prethreshold" and "threshold-excess" runoff. These occur before and after infiltration exceeds a storage capacity threshold. Our application of the framework results in a new model (called SCS-CNx) that extends the SCS-CN method with the prethreshold and threshold-excess runoff mechanisms and an implicit spatial description of runoff. We show proof of concept in four forested watersheds and further show that the resulting model may better represent geographic regions and site types that previously have been beyond the scope of the traditional SCS-CN method.
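
    For reference, the traditional SCS-CN relation that the proposed framework generalises can be written in a few lines (standard form with depths in inches and the conventional initial abstraction Ia = 0.2 S); this is the baseline method, not the SCS-CNx extension itself.

```python
# Traditional SCS-CN event runoff: S = 1000/CN - 10, Ia = 0.2 S, Q = (P - Ia)^2 / (P - Ia + S).
def scs_cn_runoff(precip_in, curve_number):
    """Event runoff depth Q (inches) for event rainfall P and a given curve number."""
    S = 1000.0 / curve_number - 10.0          # potential maximum retention
    Ia = 0.2 * S                              # initial abstraction
    if precip_in <= Ia:
        return 0.0
    return (precip_in - Ia) ** 2 / (precip_in - Ia + S)

# Example: a 3-inch storm on a watershed with CN = 75 yields roughly 0.96 inches of runoff.
print(round(scs_cn_runoff(3.0, 75), 2))
```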

  1. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling

    PubMed Central

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user-generated content from social media. Compared with traditional news media, social media services, such as Twitter, can provide more complete and timely information about real-world events. However, events are often like a puzzle: in order to solve the puzzle and understand the event, we must identify all the sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. As a result, existing systems are often unable to report sub-events in real time and often miss sub-events, or pieces of the broader event puzzle, entirely. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of newsworthy event content. In order to more comprehensively and accurately identify sub-events, this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event content contributes to better event detection by identifying additional valid sub-events. The novel combination of our proposed adaptive crawler and our stream division/recombination technique provides significant gains in event recall (44.44%) and event precision (9.57%). The addition of these sub-events, or pieces, allows us to get closer to solving the event puzzle. PMID:29107976
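
    The sketch below is a hypothetical, heavily simplified rendering of the two ideas, not the STRIM implementation: crawl keywords are expanded from terms that co-occur with the seed terms, and a burst is flagged when the per-minute message rate far exceeds an exponentially weighted baseline.

```python
# Sketch: adaptive keyword expansion plus a simple rate-based burst detector.
from collections import Counter

def expand_keywords(messages, seeds, top_n=3):
    """Add the terms that most often co-occur with the seed keywords."""
    counts = Counter()
    for text in messages:
        tokens = set(text.lower().split())
        if tokens & seeds:                       # message matches a seed keyword
            counts.update(tokens - seeds)
    return seeds | {w for w, _ in counts.most_common(top_n)}

def detect_bursts(counts_per_minute, factor=3.0):
    """Flag minutes whose message count exceeds factor times a running baseline."""
    bursts, mean = [], counts_per_minute[0] or 1.0
    for minute, c in enumerate(counts_per_minute):
        if c > factor * mean:
            bursts.append(minute)
        mean = 0.9 * mean + 0.1 * c              # exponentially weighted baseline
    return bursts

seeds = {"marathon"}
stream = ["marathon start line", "great marathon weather", "crash near finish marathon"]
print(expand_keywords(stream, seeds))
print(detect_bursts([5, 6, 5, 30, 28, 6, 5]))    # bursts at minutes 3 and 4
```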

  3. Prejudice, Social Stress, and Mental Health in Lesbian, Gay, and Bisexual Populations: Conceptual Issues and Research Evidence

    PubMed Central

    Meyer, Ilan H.

    2007-01-01

    In this article the author reviews research evidence on the prevalence of mental disorders in lesbians, gay men, and bisexuals (LGBs) and shows, using meta-analyses, that LGBs have a higher prevalence of mental disorders than heterosexuals. The author offers a conceptual framework for understanding this excess in prevalence of disorder in terms of minority stress—explaining that stigma, prejudice, and discrimination create a hostile and stressful social environment that causes mental health problems. The model describes stress processes, including the experience of prejudice events, expectations of rejection, hiding and concealing, internalized homophobia, and ameliorative coping processes. This conceptual framework is the basis for the review of research evidence, suggestions for future research directions, and exploration of public policy implications. PMID:12956539

  4. Geologic framework studies of South Carolina's Long Bay from Little River Inlet to Winyah Bay, 1999-2003: geospatial data release

    USGS Publications Warehouse

    Baldwin, W.E.; Denny, J.F.; Schwab, W.C.; Gayes, P.T.; Morton, R.; Driscoll, N.W.

    2007-01-01

    The northern South Carolina coast is a heavily developed region that supports a thriving tourism industry, large local populations and extensive infrastructure (Figure 1). The economic stability of the region is closely tied to the health of its beaches: primarily in providing support for local tourism and protection from storm events. Despite relatively low long-term shoreline erosion rates, and the implied stability of the beaches, the economic impact of storm events to coastal communities has been costly. For example, Hurricane Hugo made landfall on the central South Carolina coast in 1989. High winds and storm surge inflicted roughly $6 billion in property loss and damages, and Hugo remains the costliest storm event in South Carolina history. Localized erosion, commonly occurring around tidal inlets and erosion "hot spots", has also proved costly. Construction and maintenance of hard structures and beach nourishment, designed to mitigate the effects of erosion, have become annual or multi-annual expenditures. Providing a better understanding of the physical processes controlling coastal erosion and shoreline change will allow for more effective management of coastal resources. In 1999, the U.S. Geological Survey (USGS), in partnership with the South Carolina Sea Grant Consortium (SCSGC), began a study to investigate inner continental shelf and shoreface processes. The objectives of the USGS/SCSGC cooperative program are: 1) to provide a regional synthesis of the shallow geologic framework underlying the shoreface and inner continental shelf, and to define its role in coastal evolution and modern beach behavior; 2) to identify and model the physical processes affecting coastal ocean circulation and sediment transport, and to define their role in shaping the modern shoreline; and 3) to identify sediment sources and transport pathways in order to develop a regional sediment budget. This report contains the geospatial data used to define the geologic framework offshore of the northern South Carolina coast. The digital data presented herein accompany USGS Open-File Reports OFR 2004-1013 and OFR 2005-1345, describing the stratigraphic framework and modern sediment distribution within Long Bay, respectively. Direct on-line links to these publications are available within 'References' on the navigation bar to the left. Additional links to other publications and web sites are also available.

  5. Grounding theories of W(e)Learn: a framework for online interprofessional education.

    PubMed

    Casimiro, Lynn; MacDonald, Colla J; Thompson, Terrie Lynn; Stodel, Emma J

    2009-07-01

    Interprofessional care (IPC) is a prerequisite for enhanced communication between healthcare team members, improved quality of care, and better outcomes for patients. A move to an IPC model requires changing the learning experiences of healthcare providers during and after their qualification program. With the rapid growth of online and blended approaches to learning, the need for an educational framework that explains how to construct quality learning events to provide IPC is pressing. Such a framework would offer a quality standard to help educators design, develop, deliver, and evaluate online interprofessional education (IPE) programs. IPE is an extremely delicate process due to issues related to knowledge, status, power, accountability, personality traits, and culture that surround IPC. In this paper, a review of the pertinent literature that would inform the development of such a framework is presented. The review covers IPC, IPE, learning theories, and eLearning in healthcare.

  6. Data-Driven Information Extraction from Chinese Electronic Medical Records

    PubMed Central

    Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q.

    2015-01-01

    Objective This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Materials and Methods Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. Results The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. Discussion In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). Conclusions The framework is data-driven, weakly supervised, and robust against the variations and noises that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica. PMID:26295801

  7. Data-Driven Information Extraction from Chinese Electronic Medical Records.

    PubMed

    Xu, Dong; Zhang, Meizhuo; Zhao, Tianwan; Ge, Chen; Gao, Weiguo; Wei, Jia; Zhu, Kenny Q

    2015-01-01

    This study aims to propose a data-driven framework that takes unstructured free text narratives in Chinese Electronic Medical Records (EMRs) as input and converts them into structured time-event-description triples, where the description is either an elaboration or an outcome of the medical event. Our framework uses a hybrid approach. It consists of constructing cross-domain core medical lexica, an unsupervised, iterative algorithm to accrue more accurate terms into the lexica, rules to address Chinese writing conventions and temporal descriptors, and a Support Vector Machine (SVM) algorithm that innovatively utilizes Normalized Google Distance (NGD) to estimate the correlation between medical events and their descriptions. The effectiveness of the framework was demonstrated with a dataset of 24,817 de-identified Chinese EMRs. The cross-domain medical lexica were capable of recognizing terms with an F1-score of 0.896. 98.5% of recorded medical events were linked to temporal descriptors. The NGD SVM description-event matching achieved an F1-score of 0.874. The end-to-end time-event-description extraction of our framework achieved an F1-score of 0.846. In terms of named entity recognition, the proposed framework outperforms state-of-the-art supervised learning algorithms (F1-score: 0.896 vs. 0.886). In event-description association, the NGD SVM is superior to SVM using only local context and semantic features (F1-score: 0.874 vs. 0.838). The framework is data-driven, weakly supervised, and robust against the variations and noises that tend to occur in a large corpus. It addresses Chinese medical writing conventions and variations in writing styles through patterns used for discovering new terms and rules for updating the lexica.
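
    For concreteness, the Normalized Google Distance of Cilibrasi and Vitanyi can be computed from document frequencies as below; in the framework described above, an NGD-style score between a medical event and a candidate description serves as a feature for the SVM. The toy corpus, the crude substring matching, and the terms are invented stand-ins.

```python
# Sketch: Normalized Google Distance from document frequencies in a tiny corpus.
import math

corpus = [
    "patient underwent stent placement without complication",
    "late stent thrombosis confirmed after placement",
    "thrombosis treated with anticoagulation",
    "routine follow up visit",
]

def doc_freq(term):
    return sum(term in doc for doc in corpus)            # crude substring matching

def ngd(x, y):
    fx, fy = doc_freq(x), doc_freq(y)
    fxy = sum(x in doc and y in doc for doc in corpus)
    if fx == 0 or fy == 0 or fxy == 0:
        return float("inf")
    lx, ly, lxy, ln = math.log(fx), math.log(fy), math.log(fxy), math.log(len(corpus))
    return (max(lx, ly) - lxy) / (ln - min(lx, ly))

print(ngd("stent", "thrombosis"))   # related terms -> small distance
print(ngd("stent", "visit"))        # unrelated terms -> infinite distance in this tiny corpus
```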

  8. SpaceCubeX: A Framework for Evaluating Hybrid Multi-Core CPU FPGA DSP Architectures

    NASA Technical Reports Server (NTRS)

    Schmidt, Andrew G.; Weisz, Gabriel; French, Matthew; Flatley, Thomas; Villalpando, Carlos Y.

    2017-01-01

    The SpaceCubeX project is motivated by the need for high performance, modular, and scalable on-board processing to help scientists answer critical 21st century questions about global climate change, air quality, ocean health, and ecosystem dynamics, while adding new capabilities such as low-latency data products for extreme event warnings. These goals translate into on-board processing throughput requirements that are on the order of 100-1,000 times greater than those of previous Earth Science missions for standard processing, compression, storage, and downlink operations. To study possible future architectures that achieve these performance requirements, the SpaceCubeX project provides an evolvable testbed and framework that enables a focused design space exploration of candidate hybrid CPU/FPGA/DSP processing architectures. The framework includes ArchGen, an architecture generator tool populated with candidate architecture components, performance models, and IP cores, that allows an end user to specify the type, number, and connectivity of a hybrid architecture. The framework requires minimal extensions to integrate new processors, such as the anticipated High Performance Spaceflight Computer (HPSC), reducing the time to initiate benchmarking by months. To evaluate the framework, we leverage a wide suite of high performance embedded computing benchmarks and Earth science scenarios to ensure robust architecture characterization. We report on our project's Year 1 efforts and demonstrate the capabilities across four simulation testbed models: a baseline SpaceCube 2.0 system, a dual ARM A9 processor system, a hybrid quad ARM A53 and FPGA system, and a hybrid quad ARM A53 and DSP system.

  9. Research and Evaluations of the Health Aspects of Disasters, Part VII: The Relief/Recovery Framework.

    PubMed

    Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P

    2016-04-01

    The principal goal of research relative to disasters is to decrease the risk that a hazard will result in a disaster. Disaster studies pursue two distinct directions: (1) epidemiological (non-interventional); and (2) interventional. Both interventional and non-interventional studies require data/information obtained from assessments of function. Non-interventional studies examine the epidemiology of disasters. Interventional studies evaluate specific interventions/responses in terms of their effectiveness in meeting their respective objectives, their contribution to the overarching goal, other effects created, their respective costs, and the efficiency with which they achieved their objectives. The results of interventional studies should contribute to evidence that will be used to inform the decisions used to define standards of care and best practices for a given setting based on these standards. Interventional studies are based on the Disaster Logic Model (DLM) and are used to change or maintain levels of function (LOFs). Relief and Recovery interventional studies seek to determine the effects, outcomes, impacts, costs, and value of the intervention provided after the onset of a damaging event. The Relief/Recovery Framework provides the structure needed to systematically study the processes involved in providing relief or recovery interventions that result in a new LOF for a given Societal System and/or its component functions. It consists of the following transformational processes (steps): (1) identification of the functional state prior to the onset of the event (pre-event); (2) assessments of the current functional state; (3) comparison of the current functional state with the pre-event state and with the results of the last assessment; (4) needs identification; (5) strategic planning, including establishing the overall strategic goal(s), objectives, and priorities for interventions; (6) identification of options for interventions; (7) selection of the most appropriate intervention(s); (8) operational planning; (9) implementation of the intervention(s); (10) assessments of the effects and changes in LOFs resulting from the intervention(s); (11) determination of the costs of providing the intervention; (12) determination of the current functional status; (13) synthesis of the findings with current evidence to define the benefits and value of the intervention to the affected population; and (14) codification of the findings into new evidence. Each of these steps in the Framework is a production function that facilitates evaluation, and the outputs of the transformation process establish the current state for the next step in the process. The evidence obtained is integrated into augmenting the respective Response Capacities of a community-at-risk. The ultimate impact of enhanced Response Capacity is determined by studying the epidemiology of the next event.

  10. Building the framework for climate change adaptation in the urban areas using participatory approach: the Czech Republic experience

    NASA Astrophysics Data System (ADS)

    Emmer, Adam; Hubatová, Marie; Lupač, Miroslav; Pondělíček, Michael; Šafařík, Miroslav; Šilhánková, Vladimíra; Vačkář, David

    2016-04-01

    The Czech Republic has experienced numerous extreme hydrometeorological / climatological events such as floods (significant ones in 1997, 2002, 2010, 2013), droughts (2013, 2015), heat waves (2015) and windstorms (2007) during past decades. These events are generally attributed to ongoing climate change and have caused loss of life and significant material damage (up to several % of GDP in some years), especially in urban areas. To initiate the adaptation process of urban areas, the main objective was to prepare a framework for creating climate change adaptation strategies of individual cities reflecting the physical-geographical and socioeconomic conditions of the Czech Republic. Three pilot cities (Hradec Králové, Žďár nad Sázavou, Dobruška) were used to optimize the entire procedure. Two sets of participatory seminars were organised in order to involve all key stakeholders (the city council, department of the environment, department of crisis management, hydrometeorological institute, local experts, ...) in the process of creating the adaptation strategy from its early stage. Lessons learned for the framework related especially to its applicability on a local level, which is largely a matter of the understandability of the concept. Finally, this illustrative and widely applicable framework (the so-called 'road map to adaptation strategy') includes five steps: (i) analysis of existing strategies and plans on national, regional and local levels; (ii) analysing climate-change related hazards and key vulnerabilities; (iii) identification of adaptation needs, evaluation of existing adaptation capacity and formulation of future adaptation priorities; (iv) identification of limits and barriers to adaptation (economical, environmental, ...); and (v) selection of specific types of adaptation measures reflecting the identified adaptation needs and formulated adaptation priorities. Keywords: climate change adaptation (CCA); urban areas; participatory approach; road map

  11. A Framework of Simple Event Detection in Surveillance Video

    NASA Astrophysics Data System (ADS)

    Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao

    Video surveillance is playing a more and more important role in people's social life. Real-time alerting of threatening events and searching for interesting content in large stored video footage require a human operator to pay full attention to a monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; and mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and the easily checked rules enable the framework to work in real time. Future work is also discussed.
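
    A rough sketch of such a pipeline using OpenCV is shown below (the video path, thresholds, and restricted-zone rule are placeholders, and the motion-compensation and tracking stages are simplified away): frame differencing supplies a motion mask, a stock HOG person detector classifies the moving frame, and a simple rule fires an event when a detected person is inside a predefined zone.

```python
# Sketch: frame differencing + HOG person detection + a rule-based event check.
import cv2

cap = cv2.VideoCapture("surveillance.avi")          # placeholder video file
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
restricted = (200, 100, 400, 300)                   # hypothetical x1, y1, x2, y2 zone

ok, prev = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground from frame differencing (stands in for the motion-compensated step).
    diff = cv2.absdiff(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY))
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    if motion_mask.any():                           # only classify when something moved
        people, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        for (x, y, w, h) in people:
            cx, cy = x + w // 2, y + h // 2
            if restricted[0] < cx < restricted[2] and restricted[1] < cy < restricted[3]:
                print("rule fired: person inside restricted zone")
    prev = frame
```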

  12. The Event Chain of Survival in the Context of Music Festivals: A Framework for Improving Outcomes at Major Planned Events.

    PubMed

    Lund, Adam; Turris, Sheila

    2017-08-01

    Despite the best efforts of event producers and on-site medical teams, there are sometimes serious illnesses, life-threatening injuries, and fatalities related to music festival attendance. Producers, clinicians, and researchers are actively seeking ways to reduce the mortality and morbidity associated with these events. After analyzing the available literature on music festival health and safety, several major themes emerged. Principally, stakeholder groups planning in isolation from one another (ie, in silos) create fragmentation, gaps, and overlap in plans for major planned events (MPEs). The authors hypothesized that one approach to minimizing this fragmentation may be to create a framework to "connect the dots," or join together the many silos of professionals responsible for safety, security, health, and emergency planning at MPEs. Adapted from the well-established literature regarding the management of cardiac arrests, both in and out of hospital, the "chain of survival" concept is applied to the disparate groups providing services that support event safety in the context of music festivals. The authors propose this framework for describing, understanding, coordinating and planning around the integration of safety, security, health, and emergency service for events. The adapted Event Chain of Survival contains six interdependent links, including: (1) event producers; (2) police and security; (3) festival health; (4) on-site medical services; (5) ambulance services; and (6) off-site medical services. The authors argue that adapting and applying this framework in the context of MPEs in general, and music festivals specifically, has the potential to break down the current disconnected approach to event safety, security, health, and emergency planning. It offers a means of shifting the focus from a purely reactive stance to a more proactive, collaborative, and integrated approach. Improving health outcomes for music festival attendees, reducing gaps in planning, promoting consistency, and improving efficiency by reducing duplication of services will ultimately require coordination and collaboration from the beginning of event production to post-event reporting. Lund A , Turris SA . The Event Chain of Survival in the context of music festivals: a framework for improving outcomes at major planned events. Prehosp Disaster Med. 2017;32(4):437-443.

  13. Distinguishing How from Why the Mind Wanders: A Process-Occurrence Framework for Self-Generated Mental Activity

    ERIC Educational Resources Information Center

    Smallwood, Jonathan

    2013-01-01

    Cognition can unfold with little regard to the events taking place in the environment, and such self-generated mental activity poses a specific set of challenges for its scientific analysis in both cognitive science and neuroscience. One problem is that the spontaneous onset of self-generated mental activity makes it hard to distinguish the events…

  14. Psychophysical correlations, synchronicity and meaning.

    PubMed

    Atmanspacher, Harald

    2014-04-01

    The dual-aspect framework which Jung developed with Wolfgang Pauli implies that psychophysical phenomena are neither reducible to physical processes nor to conscious mental activity. Rather, they constitute a radically novel kind of phenomena, deriving from correlations between the physical and the mental. In synchronistic events, a particular subclass of psychophysical phenomena, these correlations are explicated as experienced meaning. © 2014, The Society of Analytical Psychology.

  15. Sudden Event Recognition: A Survey

    PubMed Central

    Suriani, Nor Surayahani; Hussain, Aini; Zulkifley, Mohd Asyraf

    2013-01-01

    Event recognition is one of the most active research areas in video surveillance fields. Advancement in event recognition systems mainly aims to provide convenience, safety and an efficient lifestyle for humanity. A precise, accurate and robust approach is necessary to enable event recognition systems to respond to sudden changes in various uncontrolled environments, such as the case of an emergency, physical threat and a fire or bomb alert. The performance of sudden event recognition systems depends heavily on the accuracy of low level processing, like detection, recognition, tracking and machine learning algorithms. This survey aims to detect and characterize a sudden event, which is a subset of an abnormal event in several video surveillance applications. This paper discusses the following in detail: (1) the importance of a sudden event over a general anomalous event; (2) frameworks used in sudden event recognition; (3) the requirements and comparative studies of a sudden event recognition system and (4) various decision-making approaches for sudden event recognition. The advantages and drawbacks of using 3D images from multiple cameras for real-time application are also discussed. The paper concludes with suggestions for future research directions in sudden event recognition. PMID:23921828

  16. Using the concept of Shannon's Entropy to evaluate impacts of climate extremes on interannual variability in ecosystem CO2 fluxes

    NASA Astrophysics Data System (ADS)

    Ma, S.; Baldocchi, D. D.

    2016-12-01

    Although interannual variability in ecosystem CO2 fluxes has been observed in the field and described with empirical or process-based models, we still lack tools for evaluating and comparing the impacts of climate extremes or unusual biogeophysical events on that variability. We examined a 15-year dataset of net ecosystem exchange of CO2 (NEE) measured at a woody savanna site and a grassland site in California from 2000 to 2015. We propose a conceptual framework that quantifies the relative contribution of each season to the annual anomalies of gross ecosystem productivity (GPP) and ecosystem respiration (Reco). Within this framework, we calculated the Shannon entropy for each year. Shannon entropy values were higher in years in which variations in GPP and Reco fell outside the predictions of empirical models established for the study site. We specifically examined the outliers relative to the model predictions and concluded that they were related to the occurrence of unexpected biogeophysical events in those years. This study offers a new application of Shannon entropy for understanding the complicated biophysical and ecological processes involved in ecosystem carbon cycling.
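
    A minimal sketch of the entropy calculation described above, assuming the relative (absolute) contribution of each season to the annual flux anomaly is treated as a probability-like weight; the seasonal values and the choice of log base are illustrative, not the study's data.

```python
# Sketch: Shannon entropy of seasonal contributions to an annual GPP anomaly.
import numpy as np

seasonal_anomaly = np.array([120.0, -40.0, 15.0, -5.0])   # hypothetical gC m-2 per season
weights = np.abs(seasonal_anomaly) / np.abs(seasonal_anomaly).sum()

# H = -sum(p * log2 p); low entropy means a single season dominates the anomaly.
p = weights[weights > 0]
entropy = -np.sum(p * np.log2(p))
print(f"Shannon entropy of seasonal contributions: {entropy:.2f} bits")
```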

  17. Matching next-to-leading order predictions to parton showers in supersymmetric QCD

    DOE PAGES

    Degrande, Céline; Fuks, Benjamin; Hirschi, Valentin; ...

    2016-02-03

    We present a fully automated framework based on the FeynRules and MadGraph5_aMC@NLO programs that allows for accurate simulations of supersymmetric QCD processes at the LHC. Starting directly from a model Lagrangian that features squark and gluino interactions, event generation is achieved at the next-to-leading order in QCD, matching short-distance events to parton showers and including the subsequent decay of the produced supersymmetric particles. As an application, we study the impact of higher-order corrections in gluino pair-production in a simplified benchmark scenario inspired by current gluino LHC searches.

  18. Matching next-to-leading order predictions to parton showers in supersymmetric QCD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Degrande, Céline; Fuks, Benjamin; Hirschi, Valentin

    We present a fully automated framework based on the FeynRules and MadGraph5_aMC@NLO programs that allows for accurate simulations of supersymmetric QCD processes at the LHC. Starting directly from a model Lagrangian that features squark and gluino interactions, event generation is achieved at the next-to-leading order in QCD, matching short-distance events to parton showers and including the subsequent decay of the produced supersymmetric particles. As an application, we study the impact of higher-order corrections in gluino pair-production in a simplified benchmark scenario inspired by current gluino LHC searches.

  19. The fundamental theorem of asset pricing under default and collateral in finite discrete time

    NASA Astrophysics Data System (ADS)

    Alvarez-Samaniego, Borys; Orrillo, Jaime

    2006-08-01

    We consider a financial market where time and uncertainty are modeled by a finite event-tree. The event-tree has a length of N, a unique initial node at the initial date, and a continuum of branches at each node of the tree. Prices and returns of J assets are modeled, respectively, by an R^{2J} x R^{2J}-valued stochastic process. In this framework we prove a version of the Fundamental Theorem of Asset Pricing which applies to defaultable securities backed by exogenous collateral suffering a contingent linear depreciation.
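
    For reference, the classical frictionless form of the theorem on a finite tree reads as follows; this is a sketch of the standard statement only, since the paper's version additionally handles default and exogenous collateral with linear depreciation, and the symbols (S, Q, F_t) are generic notation rather than the paper's.

```latex
% Classical frictionless Fundamental Theorem of Asset Pricing on a finite event-tree:
\[
  \text{no arbitrage} \iff \exists\, Q \sim P \ \text{such that}\
  S_t = \mathbb{E}_Q\!\left[\, S_{t+1} \mid \mathcal{F}_t \right]
  \quad \text{for } t = 0, \dots, N-1,
\]
% where S_t is the (discounted) price process adapted to the filtration (F_t)
% generated by the event-tree and Q is an equivalent martingale measure.
```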

  20. An Architecture for Automated Fire Detection Early Warning System Based on Geoprocessing Service Composition

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.

    2013-09-01

    Rapidly discovering, sharing, integrating, and applying geospatial information are key issues in the domain of emergency response and disaster management. Due to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides efficient, flexible, and reliable implementations for encountering different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing in an SOA platform to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow that acquires and processes remotely sensed data, detects fires, and sends notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are presented using web-based processing of remote sensing imagery from MODIS. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as an engine of composition. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigating the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.
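
    A single step of such a WPS chain can be invoked with a plain HTTP Execute request. The sketch below uses the Python requests library; the endpoint URL, the process identifier ExtractFireEvents, and the input reference are hypothetical, and only the WPS 1.0.0 key-value-pair syntax is standard OGC.

```python
# Sketch of invoking one WPS process in the kind of chain described above.
import requests

wps_endpoint = "http://example.org/wps"          # hypothetical geoprocessing server
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "ExtractFireEvents",           # hypothetical MODIS fire-detection process
    "datainputs": "scene=MODIS_MOD14_20130901",  # hypothetical input reference
}

response = requests.get(wps_endpoint, params=params, timeout=60)
response.raise_for_status()
print(response.text)   # ExecuteResponse XML describing detected fire events
```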

  1. A trajectory generation framework for modeling spacecraft entry in MDAO

    NASA Astrophysics Data System (ADS)

    D'Souza, Sarah N.; Sarigul-Klijn, Nesrin

    2016-04-01

    In this paper a novel trajectory generation framework was developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was developed to be adaptable via the use of high fidelity equations of motion and drag based analytical bank profiles. Within this framework, a novel technique was implemented that resolved the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and set of transition event conditions that are flight feasible and implementable in a Generalized Entry Guidance algorithm.

  2. Social media and disasters: a functional framework for social media use in disaster planning, response, and research.

    PubMed

    Houston, J Brian; Hawthorne, Joshua; Perreault, Mildred F; Park, Eun Hae; Goldstein Hode, Marlo; Halliwell, Michael R; Turner McGowen, Sarah E; Davis, Rachel; Vaid, Shivani; McElderry, Jonathan A; Griffith, Stanford A

    2015-01-01

    A comprehensive review of online, official, and scientific literature was carried out in 2012-13 to develop a framework of disaster social media. This framework can be used to facilitate the creation of disaster social media tools, the formulation of disaster social media implementation processes, and the scientific study of disaster social media effects. Disaster social media users in the framework include communities, government, individuals, organisations, and media outlets. Fifteen distinct disaster social media uses were identified, ranging from preparing and receiving disaster preparedness information and warnings and signalling and detecting disasters prior to an event to (re)connecting community members following a disaster. The framework illustrates that a variety of entities may utilise and produce disaster social media content. Consequently, disaster social media use can be conceptualised as occurring at a number of levels, even within the same disaster. Suggestions are provided on how the proposed framework can inform future disaster social media development and research. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.

  3. The Role of Omics in the Application of Adverse Outcome Pathways for Chemical Risk Assessment.

    PubMed

    Brockmeier, Erica K; Hodges, Geoff; Hutchinson, Thomas H; Butler, Emma; Hecker, Markus; Tollefsen, Knut Erik; Garcia-Reyero, Natalia; Kille, Peter; Becker, Dörthe; Chipman, Kevin; Colbourne, John; Collette, Timothy W; Cossins, Andrew; Cronin, Mark; Graystock, Peter; Gutsell, Steve; Knapen, Dries; Katsiadaki, Ioanna; Lange, Anke; Marshall, Stuart; Owen, Stewart F; Perkins, Edward J; Plaistow, Stewart; Schroeder, Anthony; Taylor, Daisy; Viant, Mark; Ankley, Gerald; Falciani, Francesco

    2017-08-01

    In conjunction with the second International Environmental Omics Symposium (iEOS) conference, held at the University of Liverpool (United Kingdom) in September 2014, a workshop was held to bring together experts in toxicology and regulatory science from academia, government and industry. The purpose of the workshop was to review the specific roles that high-content omics datasets (eg, transcriptomics, metabolomics, lipidomics, and proteomics) can hold within the adverse outcome pathway (AOP) framework for supporting ecological and human health risk assessments. In light of the growing number of examples of the application of omics data in the context of ecological risk assessment, we considered how omics datasets might continue to support the AOP framework. In particular, the role of omics in identifying potential AOP molecular initiating events and providing supportive evidence of key events at different levels of biological organization and across taxonomic groups was discussed. Areas with potential for short and medium-term breakthroughs were also discussed, such as providing mechanistic evidence to support chemical read-across, providing weight of evidence information for mode of action assignment, understanding biological networks, and developing robust extrapolations of species-sensitivity. Key challenges that need to be addressed were considered, including the need for a cohesive approach towards experimental design, the lack of a mutually agreed framework to quantitatively link genes and pathways to key events, and the need for better interpretation of chemically induced changes at the molecular level. This article was developed to provide an overview of ecological risk assessment process and a perspective on how high content molecular-level datasets can support the future of assessment procedures through the AOP framework. © The Author 2017. Published by Oxford University Press on behalf of the Society of Toxicology.

  4. The Role of Omics in the Application of Adverse Outcome Pathways for Chemical Risk Assessment

    PubMed Central

    Brockmeier, Erica K.; Hodges, Geoff; Hutchinson, Thomas H.; Butler, Emma; Hecker, Markus; Tollefsen, Knut Erik; Garcia-Reyero, Natalia; Kille, Peter; Becker, Dörthe; Chipman, Kevin; Colbourne, John; Collette, Timothy W.; Cossins, Andrew; Cronin, Mark; Graystock, Peter; Gutsell, Steve; Knapen, Dries; Katsiadaki, Ioanna; Lange, Anke; Marshall, Stuart; Owen, Stewart F.; Perkins, Edward J.; Plaistow, Stewart; Schroeder, Anthony; Taylor, Daisy; Viant, Mark; Ankley, Gerald; Falciani, Francesco

    2017-01-01

    Abstract In conjunction with the second International Environmental Omics Symposium (iEOS) conference, held at the University of Liverpool (United Kingdom) in September 2014, a workshop was held to bring together experts in toxicology and regulatory science from academia, government and industry. The purpose of the workshop was to review the specific roles that high-content omics datasets (eg, transcriptomics, metabolomics, lipidomics, and proteomics) can hold within the adverse outcome pathway (AOP) framework for supporting ecological and human health risk assessments. In light of the growing number of examples of the application of omics data in the context of ecological risk assessment, we considered how omics datasets might continue to support the AOP framework. In particular, the role of omics in identifying potential AOP molecular initiating events and providing supportive evidence of key events at different levels of biological organization and across taxonomic groups was discussed. Areas with potential for short and medium-term breakthroughs were also discussed, such as providing mechanistic evidence to support chemical read-across, providing weight of evidence information for mode of action assignment, understanding biological networks, and developing robust extrapolations of species-sensitivity. Key challenges that need to be addressed were considered, including the need for a cohesive approach towards experimental design, the lack of a mutually agreed framework to quantitatively link genes and pathways to key events, and the need for better interpretation of chemically induced changes at the molecular level. This article was developed to provide an overview of ecological risk assessment process and a perspective on how high content molecular-level datasets can support the future of assessment procedures through the AOP framework. PMID:28525648

  5. Trial Implementation of a Multihazard Risk Assessment Framework for High-Impact Low-Frequency Power Grid Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.

    The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.
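
    The hazard-fragility-consequence chain recapped above can be illustrated with a toy Monte Carlo sketch; the distributions, the lognormal fragility parameters, and the load-loss figures below are invented for illustration and are not the PNNL models or the benchmark test system.

```python
# Toy Monte Carlo sketch of a hazard -> fragility -> consequence chain.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_trials = 100_000

# Hazard analysis: sample a ground-motion intensity (e.g., PGA in g) per event.
pga = rng.lognormal(mean=np.log(0.2), sigma=0.6, size=n_trials)

# Fragility evaluation: lognormal fragility curve for a hypothetical substation.
median_capacity, beta = 0.5, 0.4
p_fail = norm.cdf(np.log(pga / median_capacity) / beta)
failed = rng.random(n_trials) < p_fail

# Consequence estimation: hypothetical load lost (MW) given substation failure.
load_lost = np.where(failed, rng.normal(300.0, 50.0, n_trials), 0.0)

print(f"P(substation failure)       : {failed.mean():.3f}")
print(f"Expected load lost per event: {load_lost.mean():.1f} MW")
```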

  6. Trial Implementation of a Multihazard Risk Assessment Framework for High-Impact Low-Frequency Power Grid Events

    DOE PAGES

    Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.; ...

    2017-08-25

    The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.

  7. What makes a thriver? Unifying the concepts of posttraumatic and postecstatic growth

    PubMed Central

    Mangelsdorf, Judith; Eid, Michael

    2015-01-01

    The thriver model is a novel framework that unifies the concepts of posttraumatic and postecstatic growth. According to the model, it is not the quality of an event, but the way it is processed, that is critical for the occurrence of post-event growth. The model proposes that meaning making, supportive relationships, and positive emotions facilitate growth processes after positive as well as traumatic experiences. The tenability of these propositions was investigated in two dissimilar cultures. In Study 1, participants from the USA (n = 555) and India (n = 599) answered an extended version of the Social Readjustment Rating Scale to rank the socioemotional impact of events. Results indicate that negative events are perceived as more impactful than positive ones in the USA, whereas the reverse is true in India. In Study 2, participants from the USA (n = 342) and India (n = 341) answered questions about the thriver model's main components. Results showed that posttraumatic and postecstatic growth are highly interrelated. All elements of the thriver model were key variables for the prediction of growth. Supportive relationships and positive emotions had a direct effect on growth, while meaning making mediated the direct effect of major life events. PMID:26157399

  8. Critical role of wind-wave induced erosion on the morphodynamic evolution of shallow tidal basins

    NASA Astrophysics Data System (ADS)

    D'Alpaos, Andrea; Carniello, Luca; Rinaldo, Andrea

    2014-05-01

    Wind-wave induced erosion processes are among the chief processes which govern the morphodynamic evolution of shallow tidal basins, both in the vertical and in the horizontal plane. Wind-wave induced bottom shear stresses can promote the disruption of the polymeric microphytobenthic biofilm and lead to the erosion of tidal-flat surfaces and to the increase in suspended sediment concentration which affects the stability of intertidal ecosystems. Moreover, the impact of wind-waves on salt-marsh margins can lead to the lateral erosion of marsh boundaries thus promoting the disappearance of salt-marsh ecosystems. Towards the goal of developing a synthetic theoretical framework to represent wind wave-induced resuspension events and account for their erosional effects on the long-term biomorphodynamic evolution of tidal systems, we have employed a complete, coupled finite element model accounting for the role of wind waves and tidal currents on the hydrodynamic circulation in shallow basins. Our analyses of the characteristics of combined current and wave-induced exceedances in bottom shear stress over a given threshold for erosion, suggest that wind wave-induced resuspension events can be modeled as a marked Poisson process. The interarrival time of wave-induced erosion events is, in fact, an exponentially distributed random variable, as well as the duration and intensity of overthreshold events. Moreover, the analysis of wind-wave induced resuspension events for different historical configurations of the Venice Lagoon from the 19th to the 21st century, shows that the interarrival times of erosion events have dramatically decreased through the last two centuries, whereas the intensities of erosion events have experienced a surprisingly high increase. This allows us to characterize the threatening erosion and degradation processes that the Venice Lagoon has been experiencing since the beginning of the last century.
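
    A minimal sketch of the marked Poisson representation suggested above: interarrival times, durations, and intensities of over-threshold events are all drawn from exponential distributions. The rates are illustrative placeholders, not values fitted to the Venice Lagoon.

```python
# Sketch: simulate wind-wave resuspension events as a marked Poisson process.
import numpy as np

rng = np.random.default_rng(1)

mean_interarrival_h = 72.0    # hypothetical mean time between over-threshold events
mean_duration_h     = 6.0     # hypothetical mean event duration
mean_excess_Pa      = 0.15    # hypothetical mean excess bottom shear stress

horizon_h, t, events = 24 * 365, 0.0, []
while True:
    t += rng.exponential(mean_interarrival_h)   # exponential interarrival time
    if t > horizon_h:
        break
    events.append((t,
                   rng.exponential(mean_duration_h),    # mark 1: duration
                   rng.exponential(mean_excess_Pa)))    # mark 2: intensity

print(f"{len(events)} erosion events in one simulated year")
```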

  9. What psychological process is reflected in the FN400 event-related potential component?

    PubMed

    Leynes, P Andrew; Bruett, Heather; Krizan, Jenna; Veloso, Ana

    2017-04-01

    During many recognition contexts, old items elicit more positive event-related potentials (ERPs) than new items at mid-frontal electrodes at about 300-500 ms. The psychological process reflected in this ERP component (i.e., the FN400) has been vigorously debated. Some argue that the FN400 reflects familiarity, whereas others argue that it reflects conceptual implicit memory. Three experiments contrasted these two hypotheses by presenting pre-experimentally familiar (i.e., name-brand) products and novel, off-brand products. In addition, some of the off-brand products were conceptually primed by the name-brand product to determine how FN400 amplitude would be affected by conceptually primed, but novel, products. The results provided mixed support for both theoretical views and are integrated with a broader theoretical framework to characterize the psychological processes captured by the FN400. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Virtual Sensor Web Architecture

    NASA Astrophysics Data System (ADS)

    Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.

    2006-12-01

    NASA envisions the development of smart sensor webs: intelligent and integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models, with event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iii) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.

  11. Fracture behavior of metal-ceramic fixed dental prostheses with frameworks from cast or a newly developed sintered cobalt-chromium alloy.

    PubMed

    Krug, Klaus-Peter; Knauber, Andreas W; Nothdurft, Frank P

    2015-03-01

    The aim of this study was to investigate the fracture behavior of metal-ceramic bridges with frameworks from cobalt-chromium-molybdenum (CoCrMo), which are manufactured using conventional casting or a new computer-aided design/computer-aided manufacturing (CAD/CAM) milling and sintering technique. A total of 32 metal-ceramic fixed dental prostheses (FDPs), which are based on a nonprecious metal framework, was produced using a conventional casting process (n = 16) or a new CAD/CAM milling and sintering process (n = 16). Eight unveneered frameworks were manufactured using each of the techniques. After thermal and mechanical aging of half of the restorations, all samples were subjected to a static loading test in a universal testing machine, in which acoustic emission monitoring was performed. Three different critical forces were revealed: the fracture force (F max), the force at the first reduction in force (F decr1), and the force at the critical acoustic event (F acoust1). With the exception of the veneered restorations with cast or sintered metal frameworks without artificial aging, which presented a statistically significant but slightly different F max, no statistically significant differences between cast and CAD/CAM sintered and milled FDPs were detected. Thermal and mechanical loading did not significantly affect the resulting forces. Cast and CAD/CAM milled and sintered metal-ceramic bridges were determined to be comparable with respect to the fracture behavior. FDPs based on CAD/CAM milled and sintered frameworks may be an applicable and less technique-sensitive alternative to frameworks that are based on conventionally cast frameworks.

  12. A Strategic Framework for Responding to Coral Bleaching Events in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Maynard, J. A.; Johnson, J. E.; Marshall, P. A.; Eakin, C. M.; Goby, G.; Schuttenberg, H.; Spillman, C. M.

    2009-07-01

    The frequency and severity of mass coral bleaching events are predicted to increase as sea temperatures continue to warm under a global regime of rising ocean temperatures. Bleaching events can be disastrous for coral reef ecosystems and, given the number of other stressors to reefs that result from human activities, there is widespread concern about their future. This article provides a strategic framework from the Great Barrier Reef to prepare for and respond to mass bleaching events. The framework presented has two main inter-related components: an early warning system and assessment and monitoring. Both include the need to proactively and consistently communicate information on environmental conditions and the level of bleaching severity to senior decision-makers, stakeholders, and the public. Managers, being the most timely and credible source of information on bleaching events, can facilitate the implementation of strategies that can give reefs the best chance to recover from bleaching and to withstand future disturbances. The proposed framework is readily transferable to other coral reef regions, and can easily be adapted by managers to local financial, technical, and human resources.

  13. Development of a GIS-based integrated framework for coastal seiches monitoring and forecasting: A North Jiangsu shoal case study

    NASA Astrophysics Data System (ADS)

    Qin, Rufu; Lin, Liangzhao

    2017-06-01

    Coastal seiches have become an increasingly important issue in coastal science and present many challenges, particularly when attempting to provide warning services. This paper presents the methodologies, techniques, and integrated services adopted for the design and implementation of a Seiches Monitoring and Forecasting Integration Framework (SMAF-IF). The SMAF-IF is an integrated system with different types of sensors and numerical models, incorporating Geographic Information System (GIS) and web techniques, which focuses on coastal seiche event detection and early warning in the North Jiangsu shoal, China. The in situ sensors perform automatic and continuous monitoring of the marine environment, and the numerical models provide the meteorological and physical oceanographic parameter estimates. Model-output processing software was developed in C# using ArcGIS Engine functions, which provides the capability to automatically generate visualization maps and warning information. Leveraging the ArcGIS Flex API and ASP.NET web services, a web-based GIS framework was designed to facilitate quasi-real-time data access, interactive visualization and analysis, and the provision of early warning services for end users. The integrated framework proposed in this study enables decision-makers and the public to respond quickly to emergency coastal seiche events and allows easy adaptation to other regional and scientific domains related to real-time monitoring and forecasting.

  14. Progresses with Net-VISA on Global Infrasound Association

    NASA Astrophysics Data System (ADS)

    Mialle, Pierrick; Arora, Nimar

    2017-04-01

    Global Infrasound Association algorithms are an important area of active development at the International Data Centre (IDC). These algorithms form an important part of the automatic processing system for verification technologies. A key focus at the IDC is to enhance association and signal characterization methods by incorporating the identification of signals of interest and the optimization of the network detection threshold. The overall objective is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the Reviewed Event Bulletins (REB), and hence reduce IDC analyst workload. Despite the good accuracy of the IDC categorization, a number of signal detections due to clutter sources such as microbaroms or surf are built into events. In this work we aim to optimize the association criteria based on knowledge acquired by the IDC in the last six years, and focus on the specificity of seismo-acoustic events. The resulting work has been incorporated into NETVISA [1], a Bayesian approach to network processing. The model that we propose is a fusion of seismic, hydroacoustic, and infrasound processing built on a unified probabilistic framework. References: [1] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013

  15. Progresses with Net-VISA on Global Infrasound Association

    NASA Astrophysics Data System (ADS)

    Mialle, P.; Arora, N. S.

    2016-12-01

    Global Infrasound Association algorithms are an important area of active development at the International Data Centre (IDC). These algorithms form an important part of the automatic processing system for verification technologies. A key focus at the IDC is to enhance association and signal characterization methods by incorporating the identification of signals of interest and the optimization of the network detection threshold. The overall objective is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the Reviewed Event Bulletins (REB), and hence reduce IDC analyst workload. Despite the good accuracy of the IDC categorization, a number of signal detections due to clutter sources such as microbaroms or surf are built into events. In this work we aim to optimize the association criteria based on knowledge acquired by the IDC in the last six years, and focus on the specificity of seismo-acoustic events. The resulting work has been incorporated into NETVISA [1], a Bayesian approach to network processing. The model that we propose is a fusion of seismic, hydroacoustic, and infrasound processing built on a unified probabilistic framework. References: [1] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013

  16. Assessing the quality of a deliberative democracy mini-public event about advanced biofuel production and development in Canada.

    PubMed

    Longstaff, Holly; Secko, David M

    2016-02-01

    The importance of evaluating deliberative public engagement events is well recognized, but such activities are rarely conducted for a variety of theoretical, political and practical reasons. In this article, we provide an assessment of the criteria presented in the 2008 National Research Council report on Public Participation in Environmental Assessment and Decision Making (NRC report) as explicit indicators of quality for the 2012 'Advanced Biofuels' deliberative democracy event. The National Research Council's criteria were selected to evaluate this event because they are decision oriented, are the products of an exhaustive review of similar past events, are intended specifically for environmental processes and encompass many of the criteria presented in other evaluation frameworks. It is our hope that the results of our study may encourage others to employ and assess the National Research Council's criteria as a generalizable benchmark that may justifiably be used in forthcoming deliberative events exploring different topics with different audiences. © The Author(s) 2014.

  17. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    PubMed

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins or RNA production. This information might augment current knowledge of drug reactions or the development of certain diseases. Nevertheless, the lack of explicit structure in life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has attracted community-wide efforts recently. Most approaches are based on statistical models and require large-scale annotated corpora to estimate model parameters precisely; however, such corpora are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning for biomedical event extraction is therefore a feasible solution and has attracted growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are elaborately and automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in the sentences are used to describe the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold-standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction when compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system, and that the similarity between sentences may be precisely described by hidden topics and sentence structures. Copyright © 2015 Elsevier B.V. All rights reserved.
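
    The distance computation described above can be sketched with scikit-learn as a stand-in for the paper's hidden-topic model: sentences are mapped to LDA topic distributions and each un-annotated sentence inherits the event annotation of its most similar annotated neighbour. The corpus, labels, and parameters are illustrative only, and structural features are omitted.

```python
# Sketch: assign event annotations to un-annotated sentences by topic similarity.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

annotated = ["protein A phosphorylates protein B",
             "gene X regulates transcription of gene Y"]
annotations = ["Phosphorylation", "Regulation"]          # hypothetical event labels
unannotated = ["kinase A phosphorylates substrate C"]

vec = CountVectorizer()
counts = vec.fit_transform(annotated + unannotated)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
topics = lda.fit_transform(counts)                        # hidden-topic distributions

# Each un-annotated sentence inherits the label of the closest annotated sentence.
sims = cosine_similarity(topics[len(annotated):], topics[:len(annotated)])
for sent, row in zip(unannotated, sims):
    print(sent, "->", annotations[row.argmax()])
```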

  18. Anxiety symptoms mediate the relationship between exposure to stressful negative life events and depressive symptoms: A conditional process modelling of the protective effects of resilience.

    PubMed

    Anyan, Frederick; Worsley, Lyn; Hjemdal, Odin

    2017-10-01

    Resilience has provided a useful framework that elucidates the effects of protective factors in overcoming psychological adversity, but studies that address the potential contingencies of resilience in protecting against direct and indirect negative effects are lacking. These gaps have also resulted in oversimplification of complex processes that can be clarified by moderated mediation associations. This study examines a conditional process model of the protective effects of resilience against indirect effects. Two separate samples were recruited in a cross-sectional survey from Australia and Norway to complete the Patient Health Questionnaire-9, the Generalized Anxiety Disorder scale, the Stressful Negative Life Events Questionnaire, and the Resilience Scale for Adults. The final sample sizes were 206 (females = 114; males = 91; other = 1) and 210 (females = 155; males = 55) for Australia and Norway, respectively. Moderated mediation analyses were conducted across the samples. Anxiety symptoms mediated the relationship between exposure to stressful negative life events and depressive symptoms in both samples. Conditional indirect effects of exposure to stressful negative life events on depressive symptoms, mediated by anxiety symptoms, showed that the high-resilience subgroup experienced a weaker effect of exposure to stressful negative life events, through anxiety symptoms, on depressive symptoms than the low-resilience subgroup. As a cross-sectional survey, the present study does not answer questions about causal processes despite the use of conditional process modelling. These findings support the view that resilience protective resources can protect against both direct and indirect (through other channels) psychological adversities. Copyright © 2017 Elsevier B.V. All rights reserved.
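
    The regression form of such a moderated mediation can be sketched with statsmodels: one model for the mediator (anxiety) with an events x resilience interaction, one for the outcome (depression), and conditional indirect effects computed at low and high resilience. The simulated data and coefficients below are purely illustrative, not the study's Australian or Norwegian samples.

```python
# Sketch: conditional indirect effects in a moderated mediation with simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
d = pd.DataFrame({
    "events":     rng.normal(size=n),       # stressful negative life events (standardized)
    "resilience": rng.normal(size=n),
})
d["anxiety"]    = 0.5 * d["events"] - 0.3 * d["events"] * d["resilience"] + rng.normal(size=n)
d["depression"] = 0.6 * d["anxiety"] + 0.2 * d["events"] + rng.normal(size=n)

# Mediator model (a-path, moderated by resilience) and outcome model (b-path).
a_model = smf.ols("anxiety ~ events * resilience", data=d).fit()
b_model = smf.ols("depression ~ anxiety + events", data=d).fit()

b = b_model.params["anxiety"]
for level in (-1.0, 1.0):                    # low vs high resilience (in SD units)
    a = a_model.params["events"] + a_model.params["events:resilience"] * level
    print(f"indirect effect at resilience = {level:+.0f} SD: {a * b:.3f}")
```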

  19. A visual analytic framework for data fusion in investigative intelligence

    NASA Astrophysics Data System (ADS)

    Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David

    2014-05-01

    Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and the loss of transparency in the data fusion process, analysts are left out of the data fusion cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework towards developing an alternative data fusion architecture. This idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, in which the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.

  20. Real-time threat assessment for critical infrastructure protection: data incest and conflict in evidential reasoning

    NASA Astrophysics Data System (ADS)

    Brandon, R.; Page, S.; Varndell, J.

    2012-06-01

    This paper presents a novel application of Evidential Reasoning to Threat Assessment for critical infrastructure protection. A fusion algorithm based on the PCR5 Dezert-Smarandache fusion rule is proposed which fuses alerts generated by a vision-based behaviour analysis algorithm and a-priori watch-list intelligence data. The fusion algorithm produces a prioritised event list according to a user-defined set of event-type severity or priority weightings. Results generated from application of the algorithm to real data and Behaviour Analysis alerts captured at London's Heathrow Airport under the EU FP7 SAMURAI programme are presented. A web-based demonstrator system is also described which implements the fusion process in real-time. It is shown that this system significantly reduces the data deluge problem, and directs the user's attention to the most pertinent alerts, enhancing their Situational Awareness (SA). The end-user is also able to alter the perceived importance of different event types in real-time, allowing the system to adapt rapidly to changes in priorities as the situation evolves. One of the key challenges associated with fusing information deriving from intelligence data is the issue of Data Incest. Techniques for handling Data Incest within Evidential Reasoning frameworks are proposed, and comparisons are drawn with respect to Data Incest management techniques that are commonly employed within Bayesian fusion frameworks (e.g. Covariance Intersection). The challenges associated with simultaneously dealing with conflicting information and Data Incest in Evidential Reasoning frameworks are also discussed.
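
    The PCR5 rule referenced above redistributes conflicting mass proportionally to the masses of the conflicting focal elements. A minimal two-source sketch on a two-element frame is given below; the mass values are illustrative and are not the SAMURAI alert or watch-list assignments.

```python
# Sketch: PCR5 combination of two basic belief assignments on a small frame.
from itertools import product

THREAT, BENIGN = frozenset({"threat"}), frozenset({"benign"})

m1 = {THREAT: 0.7, BENIGN: 0.3}           # e.g., behaviour-analysis alert (illustrative)
m2 = {THREAT: 0.2, BENIGN: 0.8}           # e.g., watch-list intelligence (illustrative)

def pcr5(m1, m2):
    fused = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            # Conjunctive (non-conflicting) part.
            fused[inter] = fused.get(inter, 0.0) + wa * wb
        else:
            # Conflicting mass redistributed proportionally to the two sources (PCR5).
            fused[a] = fused.get(a, 0.0) + wa**2 * wb / (wa + wb)
            fused[b] = fused.get(b, 0.0) + wb**2 * wa / (wa + wb)
    return fused

print(pcr5(m1, m2))   # fused masses still sum to 1
```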

  1. An Urban Resilience to Extreme Weather Events Framework for Development of Post Event Learning and Transformative Adaptation in Cities

    NASA Astrophysics Data System (ADS)

    Solecki, W. D.; Friedman, E. S.; Breitzer, R.

    2016-12-01

    Increasingly frequent extreme weather events are becoming an immediate priority for urban coastal practitioners and stakeholders, adding complexity to decisions concerning risk management for short-term action and long-term needs of city climate stakeholders. The conflict between the prioritization of short versus long-term events by decision-makers creates disconnect between climate science and its applications. The Consortium for Climate Risk in the Urban Northeast (CCRUN), a NOAA RISA team, is developing a set of mechanisms to help bridge this gap. The mechanisms are designed to promote the application of climate science on extreme weather events and their aftermath. It is in the post event policy window where significant opportunities for science-policy linkages exist. In particular, CCRUN is interested in producing actionable and useful information for city managers to use in decision-making processes surrounding extreme weather events and climate change. These processes include a sector specific needs assessment survey instrument and two tools for urban coastal practitioners and stakeholders. The tools focus on post event learning and connections between resilience and transformative adaptation. Elements of the two tools are presented. Post extreme event learning supports urban coastal practitioners and decision-makers concerned about maximizing opportunities for knowledge transfer and assimilation, and policy initiation and development following an extreme weather event. For the urban U.S. Northeast, post event learning helps coastal stakeholders build the capacity to adapt to extreme weather events, and inform and develop their planning capacity through analysis of past actions and steps taken in response to Hurricane Sandy. Connecting resilience with transformative adaptation is intended to promote resilience in urban Northeast coastal settings to the long-term negative consequences of extreme weather events. This is done through a knowledge co-production engagement process that links innovative and flexible adaptation pathways that can address requirements for short-term action and long-term needs.

  2. Enhancing the usability of seasonal to decadal (S2D) climate information - an evidence-based framework for the identification and assessment of sector-specific vulnerabilities

    NASA Astrophysics Data System (ADS)

    Funk, Daniel

    2016-04-01

    The successful provision of seasonal to decadal (S2D) climate service products to sector-specific users is dependent on specific problem characteristics and individual user needs and decision-making processes. Climate information requires an impact on decision making to have any value (Rodwell and Doblas-Reyes, 2006). For that reason, knowledge of sector-specific vulnerabilities to S2D climate variability is very valuable information for both climate service producers and users. In this context, a concept for a vulnerability assessment framework was developed to (i) identify climate events (and especially their temporal scales) critical for sector-specific problems, in order to assess the basic requirements for appropriate climate-service product development; and to (ii) assess the potential impact or value of related climate information for decision-makers. The concept was developed within the EUPORIAS project (European Provision of Regional Impacts Assessments on Seasonal and Decadal Timescales) based on ten project-related case studies from different sectors all over Europe. At its present stage, the framework may be useful as a preliminary assessment or 'quick-scan' of the vulnerability of specific systems to climate variability in the context of S2D climate service provision. The assessment strategy of the framework is user-focused, using predominantly a bottom-up approach (vulnerability as state) but also a top-down approach (vulnerability as outcome), generally based on qualitative data (surveys, interviews, etc.) and literature research for system understanding. The starting point of analysis is a climate-sensitive 'critical situation' of the considered system which requires a decision and is defined by the user. From this basis the related 'critical climate conditions' are assessed and 'climate information needs' are derived. This mainly refers to the critical period of time of the climate event or sequence of events. The relevant period of time of problem-specific critical climate conditions may be assessed by the resilience of the system of concern, the response time of an interconnected system (i.e. a top-down approach using a bottom-up methodology) or, alternatively, by the critical time-frame of decision-making processes (bottom-up approach). This approach counters the challenges of assessing the vulnerability of economic sectors to S2D climate events, which originate from the inherent role of climate for economic sectors: climate may affect economic sectors as a hazard, a resource, or a production or regulation factor. This implies that climate dependencies are often indirect and nonlinear. Consequently, climate events which are critical for affected systems do not necessarily correlate with common climatological extremes. One important output of the framework is a classification system of 'climate-impact types' which classifies sector-specific problems in a systemic way. This system proves to be promising because (i) it reflects and thus differentiates the cause of the climate relevance of a specific problem (compositions of buffer factors); (ii) it integrates decision-making processes, which proved to be a significant factor; (iii) it indicates the potential usability of S2D climate service products and thus integrates coping options; and (iv) it is a systemic approach which goes beyond the established 'snap-shot' of vulnerability assessments.

  3. Statistical characterization of wind-wave induced sediment resuspension events in shallow tidal basins

    NASA Astrophysics Data System (ADS)

    D'Alpaos, A.; Carniello, L.; Rinaldo, A.

    2013-12-01

    Wind-wave induced erosion processes play a critical role on the morphodynamic evolution of shallow tidal landscapes. Both in the horizontal and in the vertical planes, patterns of wind-induced bottom shear stresses contribute to control the morphological and biological features of the tidal landscape, through the erosion of tidal-flat surfaces and of salt-marsh margins, the disruption of the polymeric microphytobenthic biofilm, and the increase in suspended sediment concentration which affects the stability of intertidal ecosystems. Towards the goal of developing a synthetic theoretical framework to represent wind wave-induced resuspension events and account for their erosional effects on the long-term biomorphodynamic evolution of tidal systems, we have employed a complete, coupled finite element model accounting for the role of wind waves and tidal currents on the hydrodynamic circulation in shallow basins. Our analysis of the characteristics of combined current and wave-induced exceedances in bottom shear stress over a given threshold for erosion, suggest that wind wave-induced resuspension events can be modeled as a marked Poisson process. Moreover, the analysis of wind-wave induced resuspension events for different historical configurations of the Venice Lagoon shows that the interarrival times of erosion events have decreased through the last two centuries, whereas the intensities of erosion events have increased. This allows us to characterize the threatening erosion and degradation processes that the Venice Lagoon has been experiencing since the beginning of the last century.

  4. SU-E-T-191: PITSTOP: Process Improvement Techniques, Software Tools, and Operating Principles for a Quality Initiative Discovery Framework.

    PubMed

    Siochi, R

    2012-06-01

    To develop a quality initiative discovery framework using process improvement techniques, software tools and operating principles. Process deviations are entered into a radiotherapy incident reporting database. Supervisors use an in-house Event Analysis System (EASy) to discuss incidents with staff. Major incidents are analyzed with an in-house Fault Tree Analysis (FTA). A meta-Analysis is performed using association, text mining, key word clustering, and differential frequency analysis. A key operating principle encourages the creation of forcing functions via rapid application development. 504 events have been logged this past year. The results for the key word analysis indicate that the root cause for the top ranked key words was miscommunication. This was also the root cause found from association analysis, where 24% of the time that an event involved a physician it also involved a nurse. Differential frequency analysis revealed that sharp peaks at week 27 were followed by 3 major incidents, two of which were dose related. The peak was largely due to the front desk which caused distractions in other areas. The analysis led to many PI projects but there is still a major systematic issue with the use of forms. The solution we identified is to implement Smart Forms to perform error checking and interlocking. Our first initiative replaced our daily QA checklist with a form that uses custom validation routines, preventing therapists from proceeding with treatments until out of tolerance conditions are corrected. PITSTOP has increased the number of quality initiatives in our department, and we have discovered or confirmed common underlying causes of a variety of seemingly unrelated errors. It has motivated the replacement of all forms with smart forms. © 2012 American Association of Physicists in Medicine.
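
    The differential frequency idea above, flagging weeks in which an incident keyword rises well above its baseline rate, can be sketched with a simple counter; the report keywords, baseline rates, and threshold below are hypothetical, not the department's EASy data.

```python
# Sketch: differential keyword-frequency analysis over weekly incident reports.
from collections import Counter

weekly_reports = {
    26: ["schedule", "front desk", "phone"],
    27: ["front desk", "front desk", "distraction", "dose"],   # hypothetical week-27 spike
}
baseline = Counter({"front desk": 0.5, "schedule": 0.4, "dose": 0.1})  # mean counts per week

for week, words in weekly_reports.items():
    counts = Counter(words)
    for word, n in counts.items():
        if n - baseline.get(word, 0.0) >= 1.5:          # differential-frequency threshold
            print(f"week {week}: keyword '{word}' above baseline ({n} occurrences)")
```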

  5. A new statistical time-dependent model of earthquake occurrence: failure processes driven by a self-correcting model

    NASA Astrophysics Data System (ADS)

    Rotondi, Renata; Varini, Elisa

    2016-04-01

    The long-term recurrence of strong earthquakes is often modelled by the stationary Poisson process for the sake of simplicity, although renewal and self-correcting point processes (with non-decreasing hazard functions) are more appropriate. Short-term models mainly fit earthquake clusters due to the tendency of an earthquake to trigger other earthquakes; in this case, self-exciting point processes with non-increasing hazard are especially suitable. In order to provide a unified framework for analyzing earthquake catalogs, Schoenberg and Bolt proposed the SELC (Short-term Exciting Long-term Correcting) model (BSSA, 2000) and Varini employed a state-space model for estimating the different phases of a seismic cycle (PhD Thesis, 2005). Both attempts are combinations of long- and short-term models, but results are not completely satisfactory, due to the different scales at which these models appear to operate. In this study, we split a seismic sequence into two groups: the leader events, whose magnitude exceeds a threshold magnitude, and the remaining ones, considered as subordinate events. The leader events are assumed to follow a well-known self-correcting point process named the stress release model (Vere-Jones, J. Phys. Earth, 1978; Bebbington & Harte, GJI, 2003; Varini & Rotondi, Env. Ecol. Stat., 2015). In the interval between two subsequent leader events, subordinate events are expected to cluster at the beginning (aftershocks) and at the end (foreshocks) of that interval; hence, they are modeled by a failure process that allows a bathtub-shaped hazard function. In particular, we have examined the generalized Weibull distributions, a large family that contains distributions with different bathtub-shaped hazards as well as the standard Weibull distribution (Lai, Springer, 2014). The model is fitted to a dataset of Italian historical earthquakes and the results of Bayesian inference are shown.
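
    For orientation, a commonly cited form of the stress release model's conditional intensity is sketched below in standard notation; the exact parametrization fitted in the study (and its choice of stress-drop scaling) may differ, so this is an assumption-laden sketch rather than the authors' formula.

```latex
% Commonly cited conditional intensity of the stress release model (sketch only):
\[
  \lambda(t) \;=\; \exp\!\big\{ \alpha + \beta \,[\, X(0) + \rho\, t - S(t) \,] \big\},
  \qquad
  S(t) \;=\; \sum_{i:\, t_i < t} 10^{\,0.75\,(M_i - M_0)},
\]
% where X(0) is the initial stress level, \rho the tectonic loading rate, S(t) the
% stress released by leader events of magnitude M_i, and M_0 a reference magnitude.
% Subordinate events between leaders would instead follow a generalized Weibull
% failure process with a bathtub-shaped hazard.
```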

  6. A cyber-event correlation framework and metrics

    NASA Astrophysics Data System (ADS)

    Kang, Myong H.; Mayfield, Terry

    2003-08-01

    In this paper, we propose a cyber-event fusion, correlation, and situation assessment framework that, when instantiated, will allow cyber defenders to better understand the local, regional, and global cyber-situation. This framework, with associated metrics, can be used to guide assessment of our existing cyber-defense capabilities, and to help evaluate the state of cyber-event correlation research and where we must focus our future cyber-event correlation research. The framework, based on the cyber-event gathering activities and analysis functions, consists of five operational steps, each of which provides a richer set of contextual information to support greater situational understanding. The first three steps are categorically depicted as increasingly richer and broader-scoped contexts achieved through correlation activity, while in the final two steps, these richer contexts are achieved through analytical activities (situation assessment, and threat analysis & prediction). Category 1 Correlation focuses on the detection of suspicious activities and the correlation of events from a single cyber-event source. Category 2 Correlation clusters the same or similar events from multiple detectors that are located at close proximity and prioritizes them. Finally, the events from different time periods and event sources at different location/regions are correlated at Category 3 to recognize the relationship among different events. This is the category that focuses on the detection of large-scale and coordinated attacks. The situation assessment step (Category 4) focuses on the assessment of cyber asset damage and the analysis of the impact on missions. The threat analysis and prediction step (Category 5) analyzes attacks based on attack traces and predicts the next steps. Metrics that can distinguish correlation and cyber-situation assessment tools for each category are also proposed.

  7. Correlation Between the System Capabilities Analytic Process (SCAP) and the Missions and Means Framework (MMF)

    DTIC Science & Technology

    2013-05-01

    specifics of the correlation will be explored, followed by discussion of new paradigms, the ordered event list (OEL) and the decision tree, that result from... (The remainder of this record is table-of-contents and figure-caption residue: "Brief Overview of the Decision Tree Paradigm," "OEL Explained," and Figure 3, "A depiction of a notional fault/activation tree.")

  8. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    NASA Astrophysics Data System (ADS)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal patterns or known anomalies). As such, it works for diverse data sets and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
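
    The clustering-based outlier step can be sketched with DBSCAN as a stand-in for the (unspecified) clustering algorithm: observations left unclustered (label -1) are treated as candidate anomalies to be grouped into events. The feature construction, parameters, and simulated data below are illustrative.

```python
# Sketch: flag candidate anomalies as the points DBSCAN leaves unclustered.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# Hypothetical features per observation: (latitude, longitude, day-of-year, brightness).
normal    = rng.normal(loc=[45.0, -100.0, 180.0, 250.0], scale=[1, 1, 5, 3], size=(500, 4))
anomalous = rng.normal(loc=[45.0, -100.0, 181.0, 290.0], scale=[0.2, 0.2, 1, 2], size=(5, 4))
X = np.vstack([normal, anomalous])

labels = DBSCAN(eps=3.0, min_samples=10).fit_predict(X)
outliers = np.flatnonzero(labels == -1)      # -1 marks noise points (candidate anomalies)
print(f"{outliers.size} candidate anomalous observations out of {len(X)}")
```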

  9. Integrating social networks and human social motives to achieve social influence at scale

    PubMed Central

    Contractor, Noshir S.; DeChurch, Leslie A.

    2014-01-01

    The innovations of science often point to ideas and behaviors that must spread and take root in communities to have impact. Ideas, practices, and behaviors need to go from accepted truths on the part of a few scientists to commonplace beliefs and norms in the minds of the many. Moving from scientific discoveries to public good requires social influence. We introduce a structured influence process (SIP) framework to explain how social networks (i.e., the structure of social influence) and human social motives (i.e., the process of social influence wherein one person’s attitudes and behaviors affect another’s) are used collectively to enact social influence within a community. The SIP framework advances the science of scientific communication by positing social influence events that consider both the “who” and the “how” of social influence. This framework synthesizes core ideas from two bodies of research on social influence. The first is network research on social influence structures, which identifies who are the opinion leaders and who among their network of peers shapes their attitudes and behaviors. The second is research on social influence processes in psychology, which explores how human social motives such as the need for accuracy or the need for affiliation stimulate behavior change. We illustrate the practical implications of the SIP framework by applying it to the case of reducing neonatal mortality in India. PMID:25225373

  10. Integrating social networks and human social motives to achieve social influence at scale.

    PubMed

    Contractor, Noshir S; DeChurch, Leslie A

    2014-09-16

    The innovations of science often point to ideas and behaviors that must spread and take root in communities to have impact. Ideas, practices, and behaviors need to go from accepted truths on the part of a few scientists to commonplace beliefs and norms in the minds of the many. Moving from scientific discoveries to public good requires social influence. We introduce a structured influence process (SIP) framework to explain how social networks (i.e., the structure of social influence) and human social motives (i.e., the process of social influence wherein one person's attitudes and behaviors affect another's) are used collectively to enact social influence within a community. The SIP framework advances the science of scientific communication by positing social influence events that consider both the "who" and the "how" of social influence. This framework synthesizes core ideas from two bodies of research on social influence. The first is network research on social influence structures, which identifies who are the opinion leaders and who among their network of peers shapes their attitudes and behaviors. The second is research on social influence processes in psychology, which explores how human social motives such as the need for accuracy or the need for affiliation stimulate behavior change. We illustrate the practical implications of the SIP framework by applying it to the case of reducing neonatal mortality in India.

  11. Neural mechanisms of planning: A computational analysis using event-related fMRI

    PubMed Central

    Fincham, Jon M.; Carter, Cameron S.; van Veen, Vincent; Stenger, V. Andrew; Anderson, John R.

    2002-01-01

    To investigate the neural mechanisms of planning, we used a novel adaptation of the Tower of Hanoi (TOH) task and event-related functional MRI. Participants were trained in applying a specific strategy to an isomorph of the five-disk TOH task. After training, participants solved novel problems during event-related functional MRI. A computational cognitive model of the task was used to generate a reference time series representing the expected blood oxygen level-dependent response in brain areas involved in the manipulation and planning of goals. This time series was used as one term within a general linear modeling framework to identify brain areas in which the time course of activity varied as a function of goal-processing events. Two distinct time courses of activation were identified, one in which activation varied parametrically with goal-processing operations, and the other in which activation became pronounced only during goal-processing intensive trials. Regions showing the parametric relationship comprised a frontoparietal system and include right dorsolateral prefrontal cortex [Brodmann's area (BA 9)], bilateral parietal (BA 40/7), and bilateral premotor (BA 6) areas. Regions preferentially engaged only during goal-intensive processing include left inferior frontal gyrus (BA 44). The implications of these results for the current model, as well as for our understanding of the neural mechanisms of planning and functional specialization of the prefrontal cortex, are discussed. PMID:11880658
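
    As a minimal, hypothetical illustration of the analysis strategy described here (not the authors' actual pipeline), a voxel time series can be regressed on a model-generated reference time series within an ordinary least-squares GLM, and the regressor's t statistic used to decide whether activity in that voxel tracks goal processing; all data below are synthetic.

```python
import numpy as np

def glm_fit(voxel_ts, model_regressor):
    """Fit y = b0 + b1 * regressor by ordinary least squares and return the
    regressor's beta weight and its t statistic."""
    n = len(voxel_ts)
    X = np.column_stack([np.ones(n), model_regressor])  # design matrix: intercept + model prediction
    beta, res, _, _ = np.linalg.lstsq(X, voxel_ts, rcond=None)
    dof = n - X.shape[1]
    sigma2 = res[0] / dof                                # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_stat = beta[1] / np.sqrt(cov[1, 1])
    return beta[1], t_stat

# Toy "expected BOLD" reference series plus a noisy voxel that partly follows it.
rng = np.random.default_rng(1)
ref = np.convolve(rng.random(200) > 0.9, np.hanning(20), mode="same")
voxel = 0.8 * ref + rng.normal(0, 0.5, 200)
print(glm_fit(voxel, ref))
```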

  12. Perspectives of intellectual processing of large volumes of astronomical data using neural networks

    NASA Astrophysics Data System (ADS)

    Gorbunov, A. A.; Isaev, E. A.; Samodurov, V. A.

    2018-01-01

    In the process of astronomical observations, vast amounts of data are collected. The BSA (Big Scanning Antenna) of LPI, used in the study of impulse phenomena, logs 87.5 GB of data daily (32 TB per year). These data are important for both short- and long-term monitoring of various classes of radio sources (including radio transients of different nature), monitoring of the Earth’s ionosphere and the interplanetary and interstellar plasma, and the search for and monitoring of different classes of radio sources. Within these studies, 83,096 individual pulse events were discovered (in the studied interval of July 2012 - October 2013), which may correspond to pulsars, scintillating sources, and rapid radio transients. The detected impulse events are intended to be used to filter subsequent observations. The study proposes an approach based on a multilayer artificial neural network, which processes the raw input data; after processing by the hidden layers, the output layer produces a class label for the impulsive phenomenon.
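
    The abstract does not specify the network architecture or input features, so the following is only a hedged sketch: a small multilayer perceptron trained on fixed-length windows of raw samples around each detected pulse, producing one class label per event. All shapes, class counts, and data values are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical training set: each row is a fixed-length window of raw samples
# around a detected pulse; labels distinguish assumed event classes
# (e.g. pulsar, scintillating source, transient, interference).
rng = np.random.default_rng(42)
X_train = rng.normal(size=(2000, 128))
y_train = rng.integers(0, 4, size=2000)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
clf.fit(X_train, y_train)

# Classify newly detected pulse windows and inspect the predicted classes.
X_new = rng.normal(size=(10, 128))
print(clf.predict(X_new))
```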

  13. Brain substrates of implicit and explicit memory: the importance of concurrently acquired neural signals of both memory types.

    PubMed

    Voss, Joel L; Paller, Ken A

    2008-11-01

    A comprehensive understanding of human memory requires cognitive and neural descriptions of memory processes along with a conception of how memory processing drives behavioral responses and subjective experiences. One serious challenge to this endeavor is that an individual memory process is typically operative within a mix of other contemporaneous memory processes. This challenge is particularly disquieting in the context of implicit memory, which, unlike explicit memory, transpires without the subject necessarily being aware of memory retrieval. Neural correlates of implicit memory and neural correlates of explicit memory are often investigated in different experiments using very different memory tests and procedures. This strategy poses difficulties for elucidating the interactions between the two types of memory process that may result in explicit remembering, and for determining the extent to which certain neural processing events uniquely contribute to only one type of memory. We review recent studies that have succeeded in separately assessing neural correlates of both implicit memory and explicit memory within the same paradigm using event-related brain potentials (ERPs) and functional magnetic resonance imaging (fMRI), with an emphasis on studies from our laboratory. The strategies we describe provide a methodological framework for achieving valid assessments of memory processing, and the findings support an emerging conceptualization of the distinct neurocognitive events responsible for implicit and explicit memory.

  14. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans, and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.
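
    To make quantitative dose-response modeling for a key event concrete, here is a minimal sketch (not taken from the paper) that fits a four-parameter log-logistic, Hill-type curve to hypothetical key-event data with SciPy; all data values and starting parameters are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ec50, slope):
    """Four-parameter log-logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / np.maximum(dose, 1e-12)) ** slope)

# Hypothetical key-event measurements (e.g. % enzyme inhibition) at each dose.
dose = np.array([0.1, 0.3, 1, 3, 10, 30, 100], dtype=float)
response = np.array([2, 5, 12, 35, 62, 85, 95], dtype=float)

params, _ = curve_fit(hill, dose, response, p0=[0, 100, 5, 1])
bottom, top, ec50, slope = params
print(f"EC50 ~ {ec50:.1f}, Hill slope ~ {slope:.2f}")
```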

  15. Cognitive Invariants of Geographic Event Conceptualization: What Matters and What Refines?

    NASA Astrophysics Data System (ADS)

    Klippel, Alexander; Li, Rui; Hardisty, Frank; Weaver, Chris

    Behavioral experiments addressing the conceptualization of geographic events are few and far between. Our research seeks to address this deficiency by developing an experimental framework on the conceptualization of movement patterns. In this paper, we report on a critical experiment that is designed to shed light on the question of cognitively salient invariants in such conceptualization. Invariants have been identified as being critical to human information processing, particularly for the processing of dynamic information. In our experiment, we systematically address cognitive invariants of one class of geographic events: single entity movement patterns. To this end, we designed 72 animated icons that depict the movement patterns of hurricanes around two invariants: size difference and the topological equivalence class of the movement patterns' endpoints. While the endpoint hypothesis, put forth by Regier (2007), claims that human cognition focuses particularly on the ending relations of events, other research suggests that simplicity principles guide categorization and, additionally, that static information is easier to process than dynamic information. Our experiments show a clear picture: Size matters. Nonetheless, we also find categorization behaviors consistent with experiments in both the spatial and temporal domain, namely that topology refines these behaviors and that topological equivalence classes are categorized consistently. These results are critical steppingstones in validating spatial formalism from a cognitive perspective and cognitively grounding work on ontologies.

  16. A decision framework for coordinating bioterrorism planning: lessons from the BioNet program.

    PubMed

    Manley, Dawn K; Bravata, Dena M

    2009-01-01

    Effective disaster preparedness requires coordination across multiple organizations. This article describes a detailed framework developed through the BioNet program to facilitate coordination of bioterrorism preparedness planning among military and civilian decision makers. The authors and colleagues conducted a series of semistructured interviews with civilian and military decision makers from public health, emergency management, hazardous material response, law enforcement, and military health in the San Diego area. Decision makers used a software tool that simulated a hypothetical anthrax attack, which allowed them to assess the effects of a variety of response actions (eg, issuing warnings to the public, establishing prophylaxis distribution centers) on performance metrics. From these interviews, the authors characterized the information sources, technologies, plans, and communication channels that would be used for bioterrorism planning and responses. The authors used influence diagram notation to describe the key bioterrorism response decisions, the probabilistic factors affecting these decisions, and the response outcomes. The authors present an overview of the response framework and provide a detailed assessment of two key phases of the decision-making process: (1) pre-event planning and investment and (2) incident characterization and initial responsive measures. The framework enables planners to articulate current conditions; identify gaps in existing policies, technologies, information resources, and relationships with other response organizations; and explore the implications of potential system enhancements. Use of this framework could help decision makers execute a locally coordinated response by identifying the critical cues of a potential bioterrorism event, the information needed to make effective response decisions, and the potential effects of various decision alternatives.

  17. MADANALYSIS 5, a user-friendly framework for collider phenomenology

    NASA Astrophysics Data System (ADS)

    Conte, Eric; Fuks, Benjamin; Serret, Guillaume

    2013-01-01

    We present MADANALYSIS 5, a new framework for phenomenological investigations at particle colliders. Based on a C++ kernel, this program allows us to efficiently perform, in a straightforward and user-friendly fashion, sophisticated physics analyses of event files such as those generated by a large class of Monte Carlo event generators. MADANALYSIS 5 comes with two modes of running. The first one, easier to handle, uses the strengths of a powerful PYTHON interface in order to implement physics analyses by means of a set of intuitive commands. The second one requires one to implement the analyses in the C++ programming language, directly within the core of the analysis framework. This opens unlimited possibilities concerning the level of complexity which can be reached, being only limited by the programming skills and the originality of the user.

    Program summary
    Program title: MadAnalysis 5
    Catalogue identifier: AENO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENO_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Permission to use, copy, modify and distribute this program is granted under the terms of the GNU General Public License.
    No. of lines in distributed program, including test data, etc.: 31087
    No. of bytes in distributed program, including test data, etc.: 399105
    Distribution format: tar.gz
    Programming language: PYTHON, C++.
    Computer: All platforms on which Python version 2.7, Root version 5.27 and the g++ compiler are available. Compatibility with newer versions of these programs is also ensured. However, the Python version must be below version 3.0.
    Operating system: Unix, Linux and Mac OS operating systems on which the above-mentioned versions of Python and Root, as well as g++, are available.
    Classification: 11.1.
    External routines: ROOT (http://root.cern.ch/drupal/)
    Nature of problem: Implementing sophisticated phenomenological analyses in high-energy physics through a flexible, efficient and straightforward fashion, starting from event files such as those produced by Monte Carlo event generators. The event files can have been matched or not to parton-showering and can have been processed or not by a (fast) simulation of a detector. According to the sophistication level of the event files (parton-level, hadron-level, reconstructed-level), one must note that several input formats are possible.
    Solution method: We implement an interface allowing the production of predefined as well as user-defined histograms for a large class of kinematical distributions after applying a set of event selection cuts specified by the user. This therefore allows us to devise robust and novel search strategies for collider experiments, such as those currently running at the Large Hadron Collider at CERN, in a very efficient way.
    Restrictions: Unsupported event file format.
    Unusual features: The code is fully based on object representations for events, particles, reconstructed objects and cuts, which facilitates the implementation of an analysis.
    Running time: It depends on the purposes of the user and on the number of events to process. It varies from a few seconds to the order of a minute for several millions of events.

  18. Identifying and Mitigating the Impacts of Climate Change on Heritage Assets from Site to Catchment-Scale : Developing Landscape Analysis Toolkits within Geoarchaeological Frameworks.An example from the Trent catchment, UK

    NASA Astrophysics Data System (ADS)

    Howard, Andy; Knight, David

    2016-04-01

    In the UK, the devastating floods of the last few years, both summer and winter, have brought sharply into focus the changing nature of weather patterns, as well as the challenges of future flood risk management under such extreme scenarios. Inevitably, when such disasters happen, focus is often placed on individual localities or groups of built assets, as well as the development of solutions that consider contemporary and modelled future geomorphological processes. Whilst the impact of these major floods on heritage assets has gained some prominence in the media, often due to failure of historic bridges, the majority of the damage to the Historic Record goes unrecognised, since its impact is on (invisible) subsurface remains. As well as being directly affected by these flood events, identifying the character of heritage assets within river catchments has the potential to inform landscape managers of past climatic and environmental changes and human response to key geomorphic processes and events. Particularly in industrial landscapes, it also has the potential to identify the legacy of past pollution that can have significant impacts on ecosystems and future geomorphic thresholds. Clearly, whilst the historic environment record has the potential to greatly inform environmental managers, it is important that those responsible for providing such information (i.e. the archaeological community) take a holistic approach to examining landscapes within clearly identified research frameworks that provide equal weight to individual sites and more expansive terrain units. This paper provides an example of such a framework developed through a number of Historic England funded initiatives in the Trent catchment, UK, which have helped to develop toolkits to characterise geoarchaeological resources, consider their potential for informing environmental managers about past landscape change and therefore offer the potential to shape policy and societal response to future events.

  19. Linking MedDRA®-coded Clinical Phenotypes to Biological Mechanisms by The Ontology of Adverse Events: A pilot study on Tyrosine Kinase Inhibitors (TKIs)

    PubMed Central

    Sarntivijai, Sirarat; Zhang, Shelley; Jagannathan, Desikan G.; Zaman, Shadia; Burkhart, Keith K.; Omenn, Gilbert S.; He, Yongqun; Athey, Brian D.; Abernethy, Darrell R.

    2016-01-01

    Introduction A translational bioinformatics challenge lies in connecting population and individual’s clinical phenotypes in various formats to biological mechanisms. The Medical Dictionary for Regulatory Activities (MedDRA®) is the default dictionary for Adverse Event (AE) reporting in the FDA Adverse Event Reporting System (FAERS). The Ontology of Adverse Events (OAE) represents AEs as pathological processes occurring after drug exposures. Objectives The aim is to establish a semantic framework to link biological mechanisms to phenotypes of AEs by combining OAE with MedDRA® in FAERS data analysis. We investigated the AEs associated with Tyrosine Kinase Inhibitors (TKIs) and monoclonal antibodies (mAbs) targeting tyrosine kinases. The selected 5 TKIs/mAbs (i.e., dasatinib, imatinib, lapatinib, cetuximab, and trastuzumab) are known to induce impaired ventricular function (non-QT) cardiotoxicity. Results Statistical analysis of FAERS data identified 1,053 distinct MedDRA® terms significantly associated with TKIs/mAbs, where 884 did not have corresponding OAE terms. We manually annotated these terms, added them to OAE by the standard OAE development strategy, and mapped them to MedDRA®. The data integration to provide insights into molecular mechanisms for drug-associated AEs is performed by including linkages in OAE for all related AE terms to MedDRA® and existing ontologies including Human Phenotype Ontology (HP), Uber Anatomy Ontology (UBERON), and Gene Ontology (GO). Sixteen AEs are shared by all 5 TKIs/mAbs, and each of 17 cardiotoxicity AEs was associated with at least one TKI/mAb. As an example, we analyzed ‘cardiac failure’ using the relations established in OAE with other ontologies, and demonstrated that one of the biological processes associated with cardiac failure maps to the genes associated with heart contraction. Conclusion By expanding existing OAE ontological design, our TKI use case demonstrates that the combination of OAE and MedDRA® provides a semantic framework to link clinical phenotypes of adverse drug events to biological mechanisms. PMID:27003817

  20. Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees.

    PubMed

    Martínez-Aquino, Andrés

    2016-08-01

    Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host-parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence index and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetic inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, the advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages in cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a "compass" when "walking" through jungles of tangled phylogenetic trees.

  1. Accounting for the social triggers of sexual compulsivity.

    PubMed

    Parsons, Jeffrey T; Kelly, Brian C; Bimbi, David S; Muench, Frederick; Morgenstern, Jon

    2007-01-01

    To examine the social triggers of sexual compulsivity amongst a diverse sample of gay and bisexual men. Qualitative interviews were conducted with 180 gay and bisexual men in the United States who self-identified that their sex lives were spinning out of control. The data were analyzed using a grounded theory approach to explore the range of social triggers that were driving sexual compulsions. An open-ended interview and a structured clinical interview were conducted with each participant. The interviews examined their experiences with sexual compulsivity over time and the impact of their problematic sexual behaviors on their lives. Two types of social triggers emerged from the data: event-centered triggers and contextual triggers. Event-centered triggers arise from sudden, unforeseen events. Two major event-centered triggers were identified: relationship turmoil and catastrophes. Contextual triggers, on the other hand, have a certain element of predictability, and included such things as location, people, the use of drugs, and pornography. This framework of triggers has clinical implications for the prevention and treatment of sexual compulsivity. Clinicians can utilize the framework of social triggers in the therapeutic process to provide insight into ways to effectively work through symptoms of sexual compulsivity. Awareness of the contextual aspects of sexual compulsivity may be critical to understanding the behaviors of sexually compulsive clients. Thus, therapeutic assessments should focus upon the social context in addition to the psychological components of the disorder.

  2. Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees

    PubMed Central

    2016-01-01

    Abstract Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host–parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence index and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetic inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, the advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages in cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a “compass” when “walking” through jungles of tangled phylogenetic trees. PMID:29491928

  3. Applied Use of Safety Event Occurrence Control Charts of Harm and Non-Harm Events: A Case Study.

    PubMed

    Robinson, Susan N; Neyens, David M; Diller, Thomas

    Most hospitals use occurrence reporting systems that facilitate identifying serious events that lead to root cause investigations. Thus, the events catalyze improvement efforts to mitigate patient harm. A serious limitation is that only a few of the occurrences are investigated. A challenge is leveraging the data to generate knowledge. The goal is to present a methodology to supplement these incident assessment efforts. The framework affords an enhanced understanding of patient safety through the use of control charts to monitor non-harm and harm incidents simultaneously. This approach can identify harm and non-harm reporting rates and also can facilitate monitoring occurrence trends. This method also can expedite identifying changes in workflow, processes, or safety culture. Although unable to identify root causes, this approach can identify changes in near real time. This approach also supports evaluating safety or policy interventions that may not be observable in annual safety climate surveys.
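
    The article does not reproduce its chart formulas, so the sketch below shows one common construction, a u-chart for occurrence rates per patient-day with three-sigma limits, which could be computed separately for harm and non-harm events; the monthly counts and exposure figures are invented.

```python
import numpy as np

# Hypothetical monthly occurrence counts and exposure (patient-days).
counts = np.array([14, 9, 17, 11, 22, 13, 12, 30, 10, 15])
patient_days = np.array([4100, 3900, 4200, 4000, 4300, 4050, 3950, 4150, 4000, 4100])

rates = counts / patient_days
u_bar = counts.sum() / patient_days.sum()      # centre line: overall rate
sigma = np.sqrt(u_bar / patient_days)          # per-month standard error
ucl = u_bar + 3 * sigma                        # upper control limit
lcl = np.maximum(u_bar - 3 * sigma, 0)         # lower control limit (floored at 0)

for month, (r, hi, lo) in enumerate(zip(rates, ucl, lcl), start=1):
    flag = "SIGNAL" if (r > hi or r < lo) else ""
    print(f"month {month:2d}: rate={r:.4f} limits=[{lo:.4f}, {hi:.4f}] {flag}")
```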

  4. Testing the event witnessing status of micro-bloggers from evidence in their micro-blogs

    PubMed Central

    2017-01-01

    This paper demonstrates a framework of processes for identifying potential witnesses of events from evidence they post to social media. The research defines original evidence models for micro-blog content sources, the relative uncertainty of different evidence types, and models for testing evidence by combination. Methods to filter and extract evidence using automated and semi-automated means are demonstrated using a Twitter case study event. Further, an implementation to test extracted evidence using the Dempster-Shafer Theory of Evidence is presented. The results indicate that the inclusion of evidence from micro-blog text and linked image content can increase the number of micro-bloggers identified at events, in comparison to the number of micro-bloggers identified from geotags alone. Additionally, the number of micro-bloggers that can be tested for evidence corroboration or conflict is increased by incorporating evidence identified in their posting history. PMID:29232395
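
    As a small, self-contained sketch of the evidence-combination step (the paper's actual frames of discernment and mass assignments are not reproduced here), Dempster's rule of combination over two hypothetical mass functions can be written as follows.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions defined over frozenset focal elements
    using Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2          # mass assigned to contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("Total conflict: evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence that a micro-blogger was at the event:
# geotag evidence vs. image-content evidence over {witness, not_witness}.
W, N = frozenset({"witness"}), frozenset({"not_witness"})
theta = W | N                             # the full frame (represents ignorance)
m_geotag = {W: 0.6, theta: 0.4}
m_image = {W: 0.5, N: 0.2, theta: 0.3}
print(dempster_combine(m_geotag, m_image))
```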

  5. Thermodynamics of rare events and impulsive relaxation events in the magnetospheric substorm dynamics

    NASA Astrophysics Data System (ADS)

    Consolini, Giuseppe; Kretzschmar, Matthieu

    2007-12-01

    The magnetosphere dynamics shows fast relaxation events following power-law distributions for many observable quantities during magnetic substorms. The emergence of such power-law distributions has been widely discussed in the framework of self-organized criticality and/or turbulence. Here, a different approach to the statistical features of these impulsive dynamical events is proposed in the framework of the thermodynamics of rare events [Lavenda, B.H., Florio, A., 1992. Thermodynamics of rare events, Int. J. Theor. Phys. 31, 1455-1475; Lavenda, B.H., 1995. Thermodynamics of Extremes. Albion]. In detail, an application of such a novel approach to the magnetospheric substorm avalanching dynamics as monitored by the auroral electrojet index is discussed.

  6. Brains are not just neurons. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by Fitch

    NASA Astrophysics Data System (ADS)

    Huber, Ludwig

    2014-09-01

    This comment addresses the first component of Fitch's framework: the computational power of single neurons [3]. Although I agree that traditional models of neural computation have vastly underestimated the computational power of single neurons, I am hesitant to follow him completely. The exclusive focus on neurons is likely to underestimate the importance of other cells in the brain. In the last years, two such cell types have received appropriate attention by neuroscientists: interneurons and glia. Interneurons are small, tightly packed cells involved in the control of information processing in learning and memory. Rather than transmitting externally (like motor or sensory neurons), these neurons process information within internal circuits of the brain (therefore also called 'relay neurons'). Some specialized interneuron subtypes temporally regulate the flow of information in a given cortical circuit during relevant behavioral events [4]. In the human brain approx. 100 billion interneurons control information processing and are implicated in disorders such as epilepsy and Parkinson's.

  7. Online Meta-data Collection and Monitoring Framework for the STAR Experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.; Betts, W.; Van Buren, G.

    2012-12-01

    The STAR Experiment further exploits scalable message-oriented model principles to achieve a high level of control over online data streams. In this paper we present an AMQP-powered Message Interface and Reliable Architecture framework (MIRA), which allows STAR to orchestrate the activities of Meta-data Collection, Monitoring, Online QA and several Run-Time and Data Acquisition system components in a very efficient manner. The very nature of the reliable message bus suggests parallel usage of multiple independent storage mechanisms for our meta-data. We describe our experience with a robust data-taking setup employing MySQL- and HyperTable-based archivers for meta-data processing. In addition, MIRA has an AJAX-enabled web GUI, which allows real-time visualisation of online process flow and detector subsystem states, and doubles as a sophisticated alarm system when combined with complex event processing engines like Esper, Borealis or Cayuga. The performance data and our planned path forward are based on our experience during the 2011-2012 running of STAR.
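
    The abstract mentions an AMQP message bus but shows no code; the snippet below is a generic, hypothetical producer using the pika client to publish one meta-data record to a queue. It only illustrates the message-oriented pattern and is not STAR's actual MIRA interface; the queue name and payload fields are invented.

```python
import json
import pika  # AMQP 0-9-1 client for RabbitMQ-style brokers

# Connection parameters, queue name, and payload fields are all illustrative.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="detector.metadata", durable=True)

record = {
    "subsystem": "tpc",
    "run_number": 13049023,
    "timestamp": "2012-03-01T12:34:56Z",
    "values": {"anode_voltage": 1390.0, "gas_pressure": 1.02},
}
channel.basic_publish(
    exchange="",
    routing_key="detector.metadata",
    body=json.dumps(record),
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message on the broker
)
connection.close()
```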

  8. Arden Syntax Clinical Foundation Framework for Event Monitoring in Intensive Care Units: Report on a Pilot Study.

    PubMed

    de Bruin, Jeroen S; Zeckl, Julia; Adlassnig, Katharina; Blacky, Alexander; Koller, Walter; Rappelsberger, Andrea; Adlassnig, Klaus-Peter

    2017-01-01

    The creation of clinical decision support systems has received a strong impetus in recent years, but their integration into clinical routine has lagged behind, partly due to a lack of interoperability and trust by physicians. We report on the implementation of a clinical foundation framework in Arden Syntax, comprising knowledge units for (a) preprocessing raw clinical data, (b) the determination of single clinical concepts, and (c) more complex medical knowledge, which can be modeled through the composition and configuration of knowledge units in this framework. Thus, it can be tailored to clinical institutions or patients' caregivers. In the present version, we integrated knowledge units for several infection-related clinical concepts into the framework and developed a clinical event monitoring system over the framework that employs three different scenarios for monitoring clinical signs of bloodstream infection. The clinical event monitoring system was tested using data from intensive care units at Vienna General Hospital, Austria.

  9. Statistical framework for detection of genetically modified organisms based on Next Generation Sequencing.

    PubMed

    Willems, Sander; Fraiture, Marie-Alice; Deforce, Dieter; De Keersmaecker, Sigrid C J; De Loose, Marc; Ruttink, Tom; Herman, Philippe; Van Nieuwerburgh, Filip; Roosens, Nancy

    2016-02-01

    Because the number and diversity of genetically modified (GM) crops have significantly increased, their analysis based on real-time PCR (qPCR) methods is becoming increasingly complex and laborious. While several pioneers have already investigated Next Generation Sequencing (NGS) as an alternative to qPCR, its practical use has not been assessed for routine analysis. In this study a statistical framework was developed to predict the number of NGS reads needed to detect transgene sequences, to prove their integration into the host genome and to identify the specific transgene event in a sample with known composition. This framework was validated by applying it to experimental data from food matrices composed of pure GM rice, processed GM rice (noodles) or a 10% GM/non-GM rice mixture, revealing some influential factors. Finally, feasibility of NGS for routine analysis of GM crops was investigated by applying the framework to samples commonly encountered in routine analysis of GM crops. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
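
    A minimal sketch of the kind of calculation such a framework supports (the paper's statistical model is more elaborate): if transgene-derived reads occur with probability f per read, the smallest total read count N giving at least k transgene reads with a chosen confidence can be found from the binomial tail. The fraction, hit threshold, and confidence below are illustrative assumptions.

```python
from scipy.stats import binom

def reads_needed(target_fraction, min_hits=10, confidence=0.99):
    """Smallest total read count N such that P(at least `min_hits` transgene reads)
    >= `confidence`, assuming each read hits the transgene independently with
    probability `target_fraction`."""
    # Find an upper bound by doubling, then binary-search the smallest N.
    hi = min_hits
    while binom.sf(min_hits - 1, hi, target_fraction) < confidence:
        hi *= 2
    lo = hi // 2 if hi > min_hits else min_hits
    while lo < hi:
        mid = (lo + hi) // 2
        if binom.sf(min_hits - 1, mid, target_fraction) >= confidence:
            hi = mid
        else:
            lo = mid + 1
    return lo

# e.g. a transgene making up 0.001% of the sequenced reads in a mixture
print(reads_needed(1e-5, min_hits=10, confidence=0.99))
```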

  10. Biological hierarchies and the nature of extinction.

    PubMed

    Congreve, Curtis R; Falk, Amanda R; Lamsdell, James C

    2018-05-01

    Hierarchy theory recognises that ecological and evolutionary units occur in a nested and interconnected hierarchical system, with cascading effects occurring between hierarchical levels. Different biological disciplines have routinely come into conflict over the primacy of different forcing mechanisms behind evolutionary and ecological change. These disconnects arise partly from differences in perspective (with some researchers favouring ecological forcing mechanisms while others favour developmental/historical mechanisms), as well as differences in the temporal framework in which workers operate. In particular, long-term palaeontological data often show that large-scale (macro) patterns of evolution are predominantly dictated by shifts in the abiotic environment, while short-term (micro) modern biological studies stress the importance of biotic interactions. We propose that thinking about ecological and evolutionary interactions in a hierarchical framework is a fruitful way to resolve these conflicts. Hierarchy theory suggests that changes occurring at lower hierarchical levels can have unexpected, complex effects at higher scales due to emergent interactions between simple systems. In this way, patterns occurring on short- and long-term time scales are equally valid, as changes that are driven from lower levels will manifest in different forms at higher levels. We propose that the dual hierarchy framework fits well with our current understanding of evolutionary and ecological theory. Furthermore, we describe how this framework can be used to understand major extinction events better. Multi-generational attritional loss of reproductive fitness (MALF) has recently been proposed as the primary mechanism behind extinction events, whereby extinction is explainable solely through processes that result in extirpation of populations through a shutdown of reproduction. While not necessarily explicit, the push to explain extinction through solely population-level dynamics could be used to suggest that environmentally mediated patterns of extinction or slowed speciation across geological time are largely artefacts of poor preservation or a coarse temporal scale. We demonstrate how MALF fits into a hierarchical framework, showing that MALF can be a primary forcing mechanism at lower scales that still results in differential survivorship patterns at the species and clade level which vary depending upon the initial environmental forcing mechanism. Thus, even if MALF is the primary mechanism of extinction across all mass extinction events, the primary environmental cause of these events will still affect the system and result in differential responses. Therefore, patterns at both temporal scales are relevant. © 2017 Cambridge Philosophical Society.

  11. Establishing a Numerical Modeling Framework for Hydrologic Engineering Analyses of Extreme Storm Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    In this study a numerical modeling framework for simulating extreme storm events was established using the Weather Research and Forecasting (WRF) model. Such a framework is necessary for the derivation of engineering parameters such as probable maximum precipitation that are the cornerstone of large water management infrastructure design. Here this framework was built based on a heavy storm that occurred in Nashville (USA) in 2010, and verified using two other extreme storms. To achieve the optimal setup, several combinations of model resolutions, initial/boundary conditions (IC/BC), cloud microphysics and cumulus parameterization schemes were evaluated using multiple metrics of precipitation characteristics. The evaluation suggests that WRF is most sensitive to the IC/BC option. Simulation generally benefits from finer resolutions up to 5 km. At the 15 km level, NCEP2 IC/BC produces better results, while NAM IC/BC performs best at the 5 km level. The recommended model configuration from this study is: NAM or NCEP2 IC/BC (depending on data availability), 15 km or 15 km-5 km nested grids, Morrison microphysics and Kain-Fritsch cumulus schemes. Validation of the optimal framework suggests that these options are good starting choices for modeling extreme events similar to the test cases. This optimal framework is proposed in response to emerging engineering demands of extreme storm event forecasting and analyses for design, operations and risk assessment of large water infrastructures.

  12. The LHCb software and computing upgrade for Run 3: opportunities and challenges

    NASA Astrophysics Data System (ADS)

    Bozzi, C.; Roiser, S.; LHCb Collaboration

    2017-10-01

    The LHCb detector will be upgraded for the LHC Run 3 and will be read out at 30 MHz, corresponding to the full inelastic collision rate, with major implications on the full software trigger and offline computing. If the current computing model and software framework are kept, the data storage capacity and computing power required to process data at this rate, and to generate and reconstruct equivalent samples of simulated events, will exceed the current capacity by at least one order of magnitude. A redesign of the software framework, including scheduling, the event model, the detector description and the conditions database, is needed to fully exploit the computing power of multi-, many-core architectures, and coprocessors. Data processing and the analysis model will also change towards an early streaming of different data types, in order to limit storage resources, with further implications for the data analysis workflows. Fast simulation options will make it possible to obtain a reasonable parameterization of the detector response in considerably less computing time. Finally, the upgrade of LHCb will be a good opportunity to review and implement changes in the domains of software design, test and review, and analysis workflow and preservation. In this contribution, activities and recent results in all the above areas are presented.

  13. Modelling the evolution and diversity of cumulative culture

    PubMed Central

    Enquist, Magnus; Ghirlanda, Stefano; Eriksson, Kimmo

    2011-01-01

    Previous work on mathematical models of cultural evolution has mainly focused on the diffusion of simple cultural elements. However, a characteristic feature of human cultural evolution is the seemingly limitless appearance of new and increasingly complex cultural elements. Here, we develop a general modelling framework to study such cumulative processes, in which we assume that the appearance and disappearance of cultural elements are stochastic events that depend on the current state of culture. Five scenarios are explored: evolution of independent cultural elements, stepwise modification of elements, differentiation or combination of elements and systems of cultural elements. As one application of our framework, we study the evolution of cultural diversity (in time as well as between groups). PMID:21199845
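
    As a toy illustration of the framework's core assumption, namely that elements appear and disappear stochastically at rates depending on the current state of culture, the sketch below simulates a repertoire in which new elements arise both independently and as modifications of existing ones, while each element can also be lost; all rates and step counts are arbitrary assumptions.

```python
import random

def simulate_culture(steps=2000, base_innovation=0.05, spinoff_rate=0.002,
                     loss_rate=0.001, seed=0):
    """Track the number of cultural elements over time. New elements appear
    either independently (base rate) or as modifications of existing ones
    (proportional to the current repertoire); each element is lost independently."""
    rng = random.Random(seed)
    n_elements = 0
    history = []
    for _ in range(steps):
        gains = (rng.random() < base_innovation) + sum(
            rng.random() < spinoff_rate for _ in range(n_elements)
        )
        losses = sum(rng.random() < loss_rate for _ in range(n_elements))
        n_elements += gains - losses
        history.append(n_elements)
    return history

traj = simulate_culture()
print("final repertoire size:", traj[-1])
```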

  14. Modeling a Million-Node Slim Fly Network Using Parallel Discrete-Event Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, Noah; Carothers, Christopher; Mubarak, Misbah

    As supercomputers close in on exascale performance, the increased number of processors and processing power translates to an increased demand on the underlying network interconnect. The Slim Fly network topology, a new low-diameter and low-latency interconnection network, is gaining interest as one possible solution for next-generation supercomputing interconnect systems. In this paper, we present a high-fidelity flit-level Slim Fly model leveraging the Rensselaer Optimistic Simulation System (ROSS) and Co-Design of Exascale Storage (CODES) frameworks. We validate our Slim Fly model with the Kathareios et al. Slim Fly model results provided at moderately sized network scales. We further scale the model size up to an unprecedented 1 million compute nodes; and through visualization of network simulation metrics such as link bandwidth, packet latency, and port occupancy, we get an insight into the network behavior at the million-node scale. We also show linear strong scaling of the Slim Fly model on an Intel cluster, achieving a peak event rate of 36 million events per second using 128 MPI tasks to process 7 billion events. Detailed analysis of the underlying discrete-event simulation performance shows that a million-node Slim Fly model simulation can execute in 198 seconds on the Intel cluster.

  15. Why Consumers Misattribute Sponsorships to Non-Sponsor Brands: Differential Roles of Item and Relational Communications.

    PubMed

    Weeks, Clinton S; Humphreys, Michael S; Cornwell, T Bettina

    2018-02-01

    Brands engaged in sponsorship of events commonly have objectives that depend on consumer memory for the sponsor-event relationship (e.g., sponsorship awareness). Consumers however, often misattribute sponsorships to nonsponsor competitor brands, indicating erroneous memory for these relationships. The current research uses an item and relational memory framework to reveal sponsor brands may inadvertently foster this misattribution when they communicate relational linkages to events. Effects can be explained via differential roles of communicating item information (information that supports processing item distinctiveness) versus relational information (information that supports processing relationships among items) in contributing to memory outcomes. Experiment 1 uses event-cued brand recall to show that correct memory retrieval is best supported by communicating relational information when sponsorship relationships are not obvious (low congruence). In contrast, correct retrieval is best supported by communicating item information when relationships are obvious (high congruence). Experiment 2 uses brand-cued event recall to show that, against conventional marketing recommendations, relational information increases misattribution, whereas item information guards against misattribution. Results suggest sponsor brands must distinguish between item and relational communications to enhance correct retrieval and limit misattribution. Methodologically, the work shows that choice of cueing direction is critical in differentially revealing patterns of correct and incorrect retrieval with pair relationships. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  16. An agent-based approach to modelling the effects of extreme events on global food prices

    NASA Astrophysics Data System (ADS)

    Schewe, Jacob; Otto, Christian; Frieler, Katja

    2015-04-01

    Extreme climate events such as droughts or heat waves affect agricultural production in major food producing regions and therefore can influence the price of staple foods on the world market. There is evidence that recent dramatic spikes in grain prices were at least partly triggered by actual and/or expected supply shortages. The reaction of the market to supply changes is however highly nonlinear and depends on complex and interlinked processes such as warehousing, speculation, and export restrictions. Here we present for the first time an agent-based modelling framework that accounts, in simplified terms, for these processes and allows us to estimate the reaction of world food prices to supply shocks on a short (monthly) timescale. We test the basic model using observed historical supply, demand, and price data of wheat as a major food grain. Further, we illustrate how the model can be used in conjunction with biophysical crop models to assess the effect of future changes in extreme event regimes on the volatility of food prices. In particular, the explicit representation of storage dynamics makes it possible to investigate the potentially nonlinear interaction between simultaneous extreme events in different food producing regions, or between several consecutive events in the same region, which may both occur more frequently under future global warming.
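
    A deliberately simplified sketch of the kind of dynamics the framework captures (not the authors' model): monthly prices respond to the gap between demand and available supply, with a storage agent buffering part of any production shock; all parameter values and the production series are invented.

```python
def simulate_prices(production, demand=100.0, initial_stock=50.0,
                    initial_price=1.0, elasticity=0.3, release_fraction=0.5):
    """Toy monthly price dynamics: storage releases part of any shortfall,
    and the price moves with the remaining relative supply-demand gap."""
    stock, price, prices = initial_stock, initial_price, []
    for supply in production:
        shortfall = max(demand - supply, 0.0)
        released = min(stock, release_fraction * shortfall)   # storage buffers the shock
        stock = stock - released + max(supply - demand, 0.0)   # surpluses are stored
        gap = (demand - supply - released) / demand
        price *= (1.0 + elasticity * gap)                      # price rises when net supply falls short
        prices.append(price)
    return prices

# A drought-like shock in months 4-5 against otherwise normal harvests.
monthly_production = [100, 102, 99, 70, 75, 100, 103, 101]
print([round(p, 3) for p in simulate_prices(monthly_production)])
```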

  17. Doing Violence, Making Race: Southern Lynching and White Racial Group Formation.

    PubMed

    Smångs, Mattias

    2016-03-01

    This article presents a theoretical framework of how intergroup violence may figure into the activation and maintenance of group categories, boundaries, and identities, as well as the mediating role played by organizations in such processes. The framework's analytical advantages are demonstrated in an application to southern lynchings. Findings from event- and community-level analyses suggest that "public" lynchings, carried out by larger mobs with ceremonial violence, but not "private" ones, perpetrated by smaller bands without public or ceremonial violence, fed off and into the racial group boundaries, categories, and identities promoted by the southern Democratic Party at the turn of the 20th century and on which the emerging Jim Crow system rested. Highlighting that racialized inequalities cannot be properly understood apart from collective processes of racial group boundary and identity making, the article offers clues to the mechanisms by which past racial domination influences contemporary race relations.

  18. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the identification and/or classification of underlying contributors and causes of human error must be identified, in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
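
    As a hedged sketch of the HEART calculation that underlies such HEP predictions (the generic task probability and the error-producing-condition multipliers below are illustrative placeholders, not values from the NASA analysis), each condition scales the nominal probability by ((multiplier - 1) x assessed proportion of affect) + 1.

```python
def heart_hep(generic_error_probability, conditions):
    """HEART-style human error probability: the generic task error probability
    is multiplied, for each error-producing condition (EPC), by
    ((EPC multiplier - 1) * assessed proportion of affect) + 1."""
    hep = generic_error_probability
    for multiplier, proportion_of_affect in conditions:
        hep *= (multiplier - 1.0) * proportion_of_affect + 1.0
    return min(hep, 1.0)   # probabilities are capped at 1

# Illustrative only: a routine task (nominal HEP 0.003) affected by time
# pressure (EPC x11, judged 40% applicable) and inexperience (x3, 25%).
print(f"HEP = {heart_hep(0.003, [(11, 0.4), (3, 0.25)]):.4f}")
```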

  19. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Safety within space exploration ground processing operations, the identification and/or classification of underlying contributors and causes of human error must be identified, in order to manage human error. This research provides a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  20. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    NASA Technical Reports Server (NTRS)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps/incidents are attributed to human error. As a part of Quality within space exploration ground processing operations, the identification and/or classification of underlying contributors and causes of human error must be identified, in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS), as an analysis tool to identify contributing factors, their impact on human error events, and predict the Human Error probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.

  1. Cause-and-effect mapping of critical events.

    PubMed

    Graves, Krisanne; Simmons, Debora; Galley, Mark D

    2010-06-01

    Health care errors are routinely reported in the scientific and public press and have become a major concern for most Americans. In learning to identify and analyze errors, health care can develop some of the skills of a learning organization, including the concept of systems thinking. Modern experts in improving quality have been working in other high-risk industries since the 1920s, making structured organizational changes through various frameworks for quality methods including continuous quality improvement and total quality management. When using these tools, it is important to understand systems thinking and the concept of processes within an organization. Within these frameworks of improvement, several tools can be used in the analysis of errors. This article introduces a robust tool with a broad analytical view consistent with systems thinking, called CauseMapping (ThinkReliability, Houston, TX, USA), which can be used to systematically analyze the process and the problem at the same time. Copyright 2010 Elsevier Inc. All rights reserved.

  2. An Integrated Ensemble-Based Operational Framework to Predict Urban Flooding: A Case Study of Hurricane Sandy in the Passaic and Hackensack River Basins

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.

    2016-12-01

    Advances in computational resources and modeling techniques are opening the path to effectively integrate existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area including the New Jersey Transit's main storage and maintenance facility. The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.
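
    To illustrate how an ensemble of member forecasts supports probabilistic flood statements of the kind described above (this sketch is generic and not part of H3E), the exceedance probability of a flood threshold can be estimated directly from the member peak levels; the numbers below are synthetic.

```python
import numpy as np

def exceedance_probability(member_peaks, threshold):
    """Fraction of ensemble members whose forecast peak stage exceeds a
    flood threshold; a simple probabilistic flood statement."""
    member_peaks = np.asarray(member_peaks, dtype=float)
    return float((member_peaks > threshold).mean())

# Hypothetical peak water levels (m) from 125 meteorological ensemble members.
rng = np.random.default_rng(7)
peaks = rng.normal(loc=3.2, scale=0.6, size=125)
flood_stage = 3.5
print(f"P(peak > {flood_stage} m) ~ {exceedance_probability(peaks, flood_stage):.2f}")
```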

  3. Powering Up: Assessing the growing municipal energy resilience building efforts in North America

    NASA Astrophysics Data System (ADS)

    Schimmelfing, Kara

    Energy-related shortages and price volatilities can impact all levels of society. With coming fossil fuel depletion related to peak oil, it is expected that these shortages and volatilities will increase in frequency, duration, and intensity. Resilience building is a strategy to minimize the effects of these events by modifying systems so they are less impacted and/or recover more quickly from disruptive events. Resilience building is being used, particularly at the municipal scale, to prepare for these coming energy-related changes. These municipal efforts have only been in existence for five to ten years, and full implementation is still in progress. Little evaluation has been done of these municipal efforts to date, particularly in North America. Despite this, it is important to begin assessing the effectiveness of these efforts now, so that future efforts can be redirected to address weak areas and lessons learned by vanguard communities can be applied in other communities attempting to build energy resilience in the future. This thesis involved the creation of a hybrid framework to evaluate municipal energy resilience building efforts. The framework drew primarily from the planning process and from factors identified as important for building resilience in social-ecological systems. It consisted of the following categories to group resilience building efforts: Economy, Resource Systems & Infrastructure, Public Awareness, Social Services, Transportation, Built Environment, and Natural Environment. Within these categories the following process steps should be observed: Context, Goals, Needs, Processes, and Outcomes. This framework was then tested through application to four case-study communities (Bloomington, IN; Hamilton, ON; Oakland, CA; Victoria, BC) currently pursuing energy resilience building efforts in North America. This qualitative research involved document analysis primarily of municipal documents related to energy planning efforts. Supplementary interviews were also conducted to verify the findings from the documents and illuminate anything not captured by them. Once data were collected, categorized and analyzed using the framework, comparisons were made between case studies. Results showed the framework to be a successful, but time-consuming, tool for assessing municipal energy resilience building. Four revisions are recommended for the framework before further research. Analysis of the case study communities' efforts also identified five factors recommended for other communities attempting energy resilience planning at the municipal scale: consistent support from within the municipality, integration and information sharing, presence of key resources, access to information on energy use, and a two-tier planning process. Ultimately, as this is a preliminary attempt to address a young and growing area of municipal effort, there are many avenues for further research to build on the work of this thesis.

  4. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
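
    The core idea of BUS can be illustrated with a toy rejection-sampling example. The sketch below is a minimal illustration under simple assumptions (a one-dimensional standard-normal prior, a single Gaussian observation, and a constant c chosen so that c*L <= 1); it is not the paper's implementation, which couples this acceptance event to FORM, importance sampling or Subset Simulation rather than plain Monte Carlo.

      import numpy as np

      rng = np.random.default_rng(0)

      def log_likelihood(theta, y=2.0, sigma=0.5):
          # Gaussian likelihood of a single (hypothetical) observation y given mean theta
          return -0.5 * ((y - theta) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

      # Choose c so that c * L(theta) <= 1 for all theta (here c = 1 / max L)
      log_c = np.log(0.5 * np.sqrt(2.0 * np.pi))

      n = 200_000
      theta = rng.standard_normal(n)               # prior samples, theta ~ N(0, 1)
      u = rng.uniform(size=n)
      accepted = theta[np.log(u) <= log_c + log_likelihood(theta)]

      # The acceptance event {u <= c * L(theta)} plays the role of the rare event;
      # accepted samples follow the posterior (analytic posterior mean here is 1.6).
      print("acceptance rate:", accepted.size / n)
      print("posterior mean :", round(accepted.mean(), 3))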

  5. Inhomogeneous Point-Processes to Instantaneously Assess Affective Haptic Perception through Heartbeat Dynamics Information

    NASA Astrophysics Data System (ADS)

    Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.

    2016-06-01

    This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3-25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension.
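
    For reference, the inverse-Gaussian inter-beat interval model that this class of point-process frameworks builds on can be written as below. The notation (history-dependent mean mu, shape parameter xi, last beat time u_j) is generic rather than the paper's exact symbols, and the Laguerre-expanded Wiener-Volterra terms enter through mu.

      % Inverse-Gaussian density of the next beat time t given the history H_t
      % (u_j = time of the previous R peak; mu = history-dependent mean; xi = shape)
      f(t \mid H_t, \xi) =
        \sqrt{\frac{\xi}{2\pi (t - u_j)^{3}}}
        \exp\!\left[
          -\,\frac{\xi \bigl(t - u_j - \mu(t, H_t)\bigr)^{2}}
                  {2\,\mu(t, H_t)^{2}\,(t - u_j)}
        \right],
      \qquad t > u_j .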

  6. The fate of memory: Reconsolidation and the case of Prediction Error.

    PubMed

    Fernández, Rodrigo S; Boccia, Mariano M; Pedreira, María E

    2016-09-01

    The ability to make predictions based on stored information is a general coding strategy. A Prediction-Error (PE) is a mismatch between expected and current events. It was proposed as the process by which memories are acquired. But our memories, like ourselves, are subject to change. Thus, an acquired memory can become active and update its content or strength by a labilization-reconsolidation process. Within the reconsolidation framework, PE drives the updating of consolidated memories. Moreover, memory features, such as strength and age, are crucial boundary conditions that limit the initiation of the reconsolidation process. In order to disentangle these boundary conditions, we review the role of surprise, classical models of conditioning, and their neural correlates. Several forms of PE were found to be capable of inducing memory labilization-reconsolidation. Notably, many of the PE findings mirror those of memory-reconsolidation, suggesting a strong link between these signals and memory processes. Altogether, the aim of the present work is to integrate a psychological and neuroscientific analysis of PE into a general framework for memory-reconsolidation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. A Python object-oriented framework for the CMS alignment and calibration data

    NASA Astrophysics Data System (ADS)

    Dawes, Joshua H.; CMS Collaboration

    2017-10-01

    The Alignment, Calibrations and Databases group at the CMS Experiment delivers Alignment and Calibration Conditions Data to a large set of workflows which process recorded event data and produce simulated events. The current infrastructure for releasing and consuming Conditions Data was designed in the two years of the first LHC long shutdown to respond to use cases from the preceding data-taking period. During the second run of the LHC, new use cases were defined. For the consumption of Conditions Metadata, no common interface existed for the detector experts to use in Python-based custom scripts, resulting in many different querying and transaction management patterns. A new framework has been built to address such use cases: a simple object-oriented tool that detector experts can use to read and write Conditions Metadata when using Oracle and SQLite databases, that provides a homogeneous method of querying across all services. The tool provides mechanisms for segmenting large sets of conditions while releasing them to the production database, allows for uniform error reporting to the client-side from the server-side and optimizes the data transfer to the server. The architecture of the new service has been developed exploiting many of the features made available by the metadata consumption framework to implement the required improvements. This paper presents the details of the design and implementation of the new metadata consumption and data upload framework, as well as analyses of the new upload service’s performance as the server-side state varies.
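
    The uniform-querying idea can be sketched with the Python standard library alone. The example below is illustrative and is not the CMS tool: it assumes a hypothetical SQLite file and a single hypothetical 'tags' table, and wraps connection handling, commit/rollback and parameterized queries behind one small helper so that every caller follows the same transaction pattern.

      import sqlite3
      from contextlib import contextmanager

      @contextmanager
      def conditions_session(db_path):
          # Open a connection, commit on success, roll back on any error (illustrative)
          conn = sqlite3.connect(db_path)
          try:
              yield conn
              conn.commit()
          except Exception:
              conn.rollback()
              raise
          finally:
              conn.close()

      # Hypothetical schema: tag names mapped to interval-of-validity payload references
      with conditions_session("conditions.db") as conn:
          conn.execute("CREATE TABLE IF NOT EXISTS tags (name TEXT, since INTEGER, payload TEXT)")
          conn.execute("INSERT INTO tags VALUES (?, ?, ?)", ("BeamSpot_v1", 306125, "payload-ref"))
          rows = conn.execute("SELECT since, payload FROM tags WHERE name = ?",
                              ("BeamSpot_v1",)).fetchall()
          print(rows)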

  8. Developing a Framework for Seamless Prediction of Sub-Seasonal to Seasonal Extreme Precipitation Events in the United States.

    NASA Astrophysics Data System (ADS)

    Rosendahl, D. H.; Ćwik, P.; Martin, E. R.; Basara, J. B.; Brooks, H. E.; Furtado, J. C.; Homeyer, C. R.; Lazrus, H.; Mcpherson, R. A.; Mullens, E.; Richman, M. B.; Robinson-Cook, A.

    2017-12-01

    Extreme precipitation events cause significant damage to homes, businesses, infrastructure, and agriculture, as well as many injuries and fatalities as a result of fast-moving water or waterborne diseases. In the USA, these natural hazard events claimed the lives of more than 300 people during 2015-2016 alone, with total damage reaching $24.4 billion. Prior studies of extreme precipitation events have focused on the sub-daily to sub-weekly timeframes. However, many decisions for planning, preparing and resilience-building require sub-seasonal to seasonal timeframes (S2S; 14 to 90 days), but adequate forecasting tools for prediction do not exist. Therefore, the goal of this newly funded project is an enhancement in understanding of the large-scale forcing and dynamics of S2S extreme precipitation events in the United States, and improved capability for modeling and predicting such events. Here, we describe the project goals, objectives, and research activities that will take place over the next 5 years. In this project, a unique team of scientists and stakeholders will identify and understand weather and climate processes connected with the prediction of S2S extreme precipitation events by answering these research questions: 1) What are the synoptic patterns associated with, and characteristic of, S2S extreme precipitation events in the contiguous U.S.? 2) What role, if any, do large-scale modes of climate variability play in modulating these events? 3) How predictable are S2S extreme precipitation events across temporal scales? 4) How do we create an informative prediction of S2S extreme precipitation events for policymaking and planning? This project will use observational data, high-resolution radar composites, dynamical climate models and workshops that engage stakeholders (water resource managers, emergency managers and tribal environmental professionals) in co-production of knowledge. The overarching result of this project will be predictive models that reduce the societal and economic impacts of extreme precipitation events. Other outcomes will include statistical and co-production frameworks, which could be applied to other meteorological extremes, across all time scales and in other parts of the world, to increase resilience to extreme meteorological events.

  9. Mississippi River delta plain, Louisiana coast, and inner shelf Holocene geologic framework, processes, and resources

    USGS Publications Warehouse

    Williams, S. Jeffress; Kulp, Mark; Penland, Shea; Kindinger, Jack L.; Flocks, James G.; Buster, Noreen A.; Holmes, Charles W.

    2009-01-01

    Extending nearly 400 km from Sabine Pass on the Texas-Louisiana border east to the Chandeleur Islands, the Louisiana coastal zone (Fig. 11.1) along the north-central Gulf of Mexico is the southern terminus of the largest drainage basin in North America (>3.3 million km2), which includes the Mississippi River delta plain where approximately 6.2 million kilograms per year of sediment is delivered to the Gulf of Mexico (Coleman 1988). The Mississippi River, active since at least Late Jurassic time (Mann and Thomas 1968), is the main distributary channel of this drainage system and during the Holocene has constructed one of the largest delta plains in the world, larger than 30,000 km2 (Coleman and Prior 1980; Coleman 1981; Coleman et al. 1998). The subsurface geology and geomorphology of the Louisiana coastal zone reflects a complex history of regional tectonic events and fluvial, deltaic, and marine sedimentary processes affected by large sea-level fluctuations. Despite the complex geology of the north-central Gulf basin, a long history of engineering studies and scientific research investigations (see table 11.1) has led to substantial knowledge of the geologic framework and evolution of the delta plain region (see also Bird et al., chapter 1 in this volume).

  10. Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences.

    PubMed

    Koelsch, Stefan; Busch, Tobias; Jentschke, Sebastian; Rohrmeier, Martin

    2016-02-02

    Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic, and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities.
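
    A sequence with the described statistics is easy to simulate. The sketch below is a hypothetical illustration of the paradigm's transition structure: the 60/30/10% weights are taken from the abstract, while the fixed two-sound context and the ending labels are invented for brevity (in the actual paradigm the first two sounds are equiprobable timbres).

      import random

      random.seed(1)

      def triplet(context=("A", "B")):
          # Third ("ending") sound occurs with high (60%), intermediate (30%)
          # or low (10%) conditional probability given the first two sounds.
          ending = random.choices(["std", "mid", "dev"], weights=[60, 30, 10], k=1)[0]
          return context + (ending,)

      sequence = [triplet() for _ in range(10)]
      print(sequence)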

  11. Development of High Level Trigger Software for Belle II at SuperKEKB

    NASA Astrophysics Data System (ADS)

    Lee, S.; Itoh, R.; Katayama, N.; Mineo, S.

    2011-12-01

    The Belle collaboration has been trying for 10 years to reveal the mystery of the current matter-dominated universe. However, much more statistics is required to search for New Physics through quantum loops in decays of B mesons. In order to increase the experimental sensitivity, the next generation B-factory, SuperKEKB, is planned. The design luminosity of SuperKEKB is 8 x 10^35 cm^-2 s^-1, a factor of 40 above KEKB's peak luminosity. At this high luminosity, the level-1 trigger of the Belle II experiment will stream events of 300 kB size at a 30 kHz rate. To reduce the data flow to a manageable level, a high-level trigger (HLT) is needed, which will be implemented using the full offline reconstruction on a large-scale PC farm. There, physics-level event selection is performed, reducing the event rate by a factor of ~10, to a few kHz. To execute the reconstruction, the HLT uses the offline event processing framework basf2, which has parallel processing capabilities used for multi-core processing and PC clusters. The event data handling in the HLT is fully object-oriented, utilizing ROOT I/O with a new method of object passing over UNIX socket connections. Also under consideration is the use of the HLT output to reduce the pixel detector event size by saving only hits associated with a track, resulting in an additional data reduction of ~100 for the pixel detector. In this contribution, the design and implementation of the Belle II HLT are presented together with a report of preliminary testing results.
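
    As a rough illustration of how an offline-quality selection can be farmed out over many cores, the sketch below runs a stand-in reconstruction-plus-selection function over a pool of worker processes and keeps roughly one event in ten, mirroring the quoted ~10x rate reduction. It is a generic Python multiprocessing example, not basf2 code; the event dictionaries and function names are invented.

      import multiprocessing as mp

      def reconstruct_and_select(event):
          # Stand-in for full reconstruction plus physics-level selection (illustrative):
          # keep roughly 1 event in 10, mimicking the quoted ~10x HLT reduction.
          return event if event["id"] % 10 == 0 else None

      def run_hlt_farm(events, n_workers=4):
          with mp.Pool(n_workers) as pool:
              results = pool.map(reconstruct_and_select, events)
          return [e for e in results if e is not None]

      if __name__ == "__main__":
          raw = [{"id": i, "size_kB": 300} for i in range(1000)]   # hypothetical L1 output
          kept = run_hlt_farm(raw)
          print(f"kept {len(kept)} of {len(raw)} events")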

  12. Use of Archived Information by the United States National Data Center

    NASA Astrophysics Data System (ADS)

    Junek, W. N.; Pope, B. M.; Roman-Nieves, J. I.; VanDeMark, T. F.; Ichinose, G. A.; Poffenberger, A.; Woods, M. T.

    2012-12-01

    The United States National Data Center (US NDC) is responsible for monitoring international compliance with nuclear test ban treaties, acquiring data and data products from the International Data Center (IDC), and distributing data according to established policy. The archive of automated and reviewed event solutions residing at the US NDC is a valuable resource for assessing and improving the performance of signal detection, event formation, location, and discrimination algorithms. Numerous research initiatives are currently underway, focused on optimizing these processes using historic waveform data and alphanumeric information. Identification of optimum station processing parameters is routinely performed through the analysis of archived waveform data. Station-specific detector tuning studies produce and compare receiver operating characteristics for multiple detector configurations (e.g., detector type, filter passband) to identify an optimum set of processing parameters with an acceptable false alarm rate. Large aftershock sequences can inundate automated phase association algorithms with numerous detections that are closely spaced in time, which increases the number of false and/or mixed associations in automated event solutions and increases analyst burden. Archived waveform data and alphanumeric information are being exploited to develop an aftershock processor that will construct association templates to assist the Global Association (GA) application, reduce the number of false and merged phase associations, and lessen analyst burden. Statistical models are being developed and evaluated for potential use by the GA application for identifying and rejecting unlikely preliminary event solutions. Other uses of archived data at the US NDC include improved event locations using empirical travel time corrections and discrimination via a statistical framework known as the event classification matrix (ECM).
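
    Detector tuning of the kind described above comes down to comparing operating points of candidate configurations on the same labeled archive. The sketch below is a hypothetical example: given detection scores for two filter passbands and ground-truth labels, it reports one true-positive / false-alarm operating point per configuration. All numbers, including the score distributions, are invented.

      import numpy as np

      def operating_point(scores, labels, threshold):
          # True-positive rate and false-alarm rate for one detector threshold
          scores = np.asarray(scores)
          labels = np.asarray(labels, dtype=bool)
          detected = scores >= threshold
          tpr = float(np.mean(detected[labels]))     # fraction of real arrivals detected
          far = float(np.mean(detected[~labels]))    # fraction of noise windows flagged
          return tpr, far

      # Hypothetical detection scores for two filter passbands on the same labeled archive
      rng = np.random.default_rng(7)
      labels = rng.uniform(size=5000) < 0.1
      band_a = np.where(labels, rng.normal(4.0, 1.0, 5000), rng.normal(1.5, 1.0, 5000))
      band_b = np.where(labels, rng.normal(3.0, 1.0, 5000), rng.normal(1.5, 1.0, 5000))
      for name, s in (("band A", band_a), ("band B", band_b)):
          tpr, far = operating_point(s, labels, threshold=3.0)
          print(f"{name}: TPR={tpr:.2f}, FAR={far:.2f}")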

  13. Assessment of Uncertainty-Based Screening Volumes for NASA Robotic LEO and GEO Conjunction Risk Assessment

    NASA Technical Reports Server (NTRS)

    Narvet, Steven W.; Frigm, Ryan C.; Hejduk, Matthew D.

    2011-01-01

    Conjunction Assessment operations require screening assets against the space object catalog by placing a pre-determined spatial volume around each asset and predicting when another object will violate that volume. The selection of the screening volume used for each spacecraft is a trade-off between observing all conjunction events that may pose a potential risk to the primary spacecraft and the ability to analyze those predicted events. If the screening volumes are larger, then more conjunctions can be observed and therefore the probability of a missed detection of a high risk conjunction event is small; however, the amount of data which needs to be analyzed increases. This paper characterizes the sensitivity of screening volume size to capturing typical orbit uncertainties and the expected number of conjunction events observed. These sensitivities are quantified in the form of a trade space that allows for selection of appropriate screening volumes to fit the desired concept of operations, system limitations, and tolerable analyst workloads. This analysis will specifically highlight the screening volume determination and selection process for use in the NASA Conjunction Assessment Risk Analysis process but will also provide a general framework for other Owner/Operators faced with similar decisions.
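
    Operationally, the screening step reduces to a geometric test of the predicted relative state against the chosen volume. The sketch below is a hypothetical example using an ellipsoidal volume with fixed semi-axes in a radial / in-track / cross-track frame; the axis lengths are invented, whereas the real selection trades them against orbit uncertainty and analyst workload as described above.

      import numpy as np

      def violates_screening_volume(rel_ric_km, semi_axes_km=(2.0, 25.0, 25.0)):
          # True if the relative position (radial, in-track, cross-track) lies inside
          # an ellipsoidal screening volume with the given semi-axes (illustrative values).
          x = np.asarray(rel_ric_km, dtype=float) / np.asarray(semi_axes_km, dtype=float)
          return bool(np.sum(x ** 2) <= 1.0)

      print(violates_screening_volume([0.5, 10.0, 3.0]))    # inside  -> conjunction reported
      print(violates_screening_volume([3.0, 40.0, 0.0]))    # outside -> screened out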

  14. The Experience of Emotion

    PubMed Central

    Barrett, Lisa Feldman; Mesquita, Batja; Ochsner, Kevin N.; Gross, James J.

    2007-01-01

    Experiences of emotion are content-rich events that emerge at the level of psychological description, but must be causally constituted by neurobiological processes. This chapter outlines an emerging scientific agenda for understanding what these experiences feel like and how they arise. We review the available answers to what is felt (i.e., the content that makes up an experience of emotion) and how neurobiological processes instantiate these properties of experience. These answers are then integrated into a broad framework that describes, in psychological terms, how the experience of emotion emerges from more basic processes. We then discuss the role of such experiences in the economy of the mind and behavior. PMID:17002554

  15. The effects of climatic fluctuations and extreme events on running water ecosystems

    PubMed Central

    Woodward, Guy; Bonada, Núria; Brown, Lee E.; Death, Russell G.; Durance, Isabelle; Gray, Clare; Hladyz, Sally; Ledger, Mark E.; Milner, Alexander M.; Ormerod, Steve J.; Thompson, Ross M.

    2016-01-01

    Most research on the effects of environmental change in freshwaters has focused on incremental changes in average conditions, rather than fluctuations or extreme events such as heatwaves, cold snaps, droughts, floods or wildfires, which may have even more profound consequences. Such events are commonly predicted to increase in frequency, intensity and duration with global climate change, with many systems being exposed to conditions with no recent historical precedent. We propose a mechanistic framework for predicting potential impacts of environmental fluctuations on running-water ecosystems by scaling up effects of fluctuations from individuals to entire ecosystems. This framework requires integration of four key components: effects of the environment on individual metabolism, metabolic and biomechanical constraints on fluctuating species interactions, assembly dynamics of local food webs, and mapping the dynamics of the meta-community onto ecosystem function. We illustrate the framework by developing a mathematical model of environmental fluctuations on dynamically assembling food webs. We highlight (currently limited) empirical evidence for emerging insights and theoretical predictions. For example, widely supported predictions about the effects of environmental fluctuations are: high vulnerability of species with high per capita metabolic demands such as large-bodied ones at the top of food webs; simplification of food web network structure and impaired energetic transfer efficiency; and reduced resilience and top-down relative to bottom-up regulation of food web and ecosystem processes. We conclude by identifying key questions and challenges that need to be addressed to develop more accurate and predictive bio-assessments of the effects of fluctuations, and implications of fluctuations for management practices in an increasingly uncertain world. PMID:27114576

  16. A Multi-Level Approach to Modeling Rapidly Growing Mega-Regions as a Coupled Human-Natural System

    NASA Astrophysics Data System (ADS)

    Koch, J. A.; Tang, W.; Meentemeyer, R. K.

    2013-12-01

    The FUTure Urban-Regional Environment Simulation (FUTURES) integrates information on nonstationary drivers of land change (per capita land area demand, site suitability, and spatial structure of conversion events) into spatial-temporal projections of changes in landscape patterns (Meentemeyer et al., 2013). One striking feature of FUTURES is its patch-growth algorithm that includes feedback effects of former development events across several temporal and spatial scales: cell-level transition events are aggregated into patches of land change and their further growth is based on empirically derived parameters controlling their size, shape, and dispersion. Here, we augment the FUTURES modeling framework by expanding its multilevel structure and its representation of human decision making. The new modeling framework is hierarchically organized as nested subsystems including the latest theory on telecouplings in coupled human-natural systems (Liu et al., 2013). Each subsystem represents a specific level of spatial scale and embraces agents that have decision making authority at a particular level. The subsystems are characterized with regard to their spatial representation and are connected via flows of information (e.g. regulations and policies) or material (e.g. population migration). To provide a modeling framework that is applicable to a wide range of settings and geographical regions and to keep it computationally manageable, we implement a 'zooming factor' that allows one to enable or disable subsystems (and hence the represented processes), based on the extent of the study region. The implementation of the FUTURES modeling framework for a specific case study follows the observational modeling approach described in Grimm et al. (2005), starting from the analysis of empirical data in order to capture the processes relevant for specific scales and to allow a rigorous calibration and validation of the model application. In this paper, we give an introduction to the basic concept of our modeling approach and describe its strengths and weaknesses. We furthermore use empirical data for the states of North and South Carolina to demonstrate how the modeling framework can be applied to a large, heterogeneous study system with diverse decision-making agents. Grimm et al. (2005) Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology. Science 310, 987-991. Liu et al. (2013) Framing Sustainability in a Telecoupled World. Ecology and Society 18(2), 26. Meentemeyer et al. (2013) FUTURES: Multilevel Simulations of Merging Urban-Rural Landscape Structure Using a Stochastic Patch-Growing Algorithm. Annals of the Association of American Geographers 103(4), 785-807.

  17. [Consensus conference on providing information of adverse events to patients and relatives].

    PubMed

    Martín-Delgado, M C; Fernández-Maillo, M; Bañeres-Amella, J; Campillo-Artero, C; Cabré-Pericas, L; Anglés-Coll, R; Gutiérrez-Fernández, R; Aranaz-Andrés, J M; Pardo-Hernández, A; Wu, A

    2013-01-01

    The aim was to develop recommendations regarding «Information about adverse events to patients and their families», through the implementation of a consensus conference. A literature review was conducted to identify all relevant articles, the major policies and international guidelines, and the specific legislation developed in some countries on this process. The literature review was the basis for responding to a series of questions posed in a public session. A group of experts presented the best available evidence, interacting with stakeholders. At the end of the session, an interdisciplinary and multi-professional jury established the final recommendations of the consensus conference. The main recommendations advocate the need to develop policies and institutional guidelines in our field, favouring the process of disclosing adverse events to patients. The recommendations emphasize the need for the training of professionals in communication skills and patient safety, as well as the development of strategies for supporting professionals who are involved in an adverse event. The assessment of the interest and impact of specific legislation that would help the implementation of these policies was also considered. A cultural change is needed at all levels, one that is nuanced and adapted to the specific social and cultural aspects of our social and health spheres and that involves all stakeholders in the system, in order to create a framework of trust and credibility in which the disclosure of information about adverse events can become effective. Copyright © 2013 SECA. Published by Elsevier Espana. All rights reserved.

  18. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  19. Dynamical analogy between economical crisis and earthquake dynamics within the nonextensive statistical mechanics framework

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Zitis, Pavlos I.; Eftaxias, Konstantinos

    2013-07-01

    The study of complex systems considers that their dynamics are founded on universal principles that may be used to describe a great variety of natural, artificial, and social systems across science and technology. Several authors have suggested that earthquake dynamics and the dynamics of economic (financial) systems can be analyzed within similar mathematical frameworks. We apply concepts of nonextensive statistical physics to time-series data of observable manifestations of the underlying complex processes that end in these different extreme events, in order to support the suggestion that a dynamical analogy exists between a financial crisis (in the form of a share or index price collapse) and a single earthquake. We also investigate the existence of such an analogy by means of scale-free statistics (the Gutenberg-Richter distribution of event sizes). We show that the populations of (i) fracto-electromagnetic events rooted in the activation of a single fault, emerging prior to a significant earthquake, (ii) the trade volume events of different shares/economic indices, prior to a collapse, and (iii) the price fluctuation events (the difference between the maximum and minimum price within a day) of different shares/economic indices, prior to a collapse, follow both the traditional Gutenberg-Richter law and a nonextensive model for earthquake dynamics, with similar parameter values. The obtained results imply the existence of a dynamic analogy between earthquakes and economic crises, which moreover follow the dynamics of seizures, magnetic storms and solar flares.
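
    For orientation, the scale-free statistics referred to above are usually written as the Gutenberg-Richter relation, and nonextensive models replace the implied exponential decay with a q-exponential. The two standard expressions are sketched below in generic notation; a, b and the nonextensivity index q are empirical parameters, and the specific nonextensive magnitude distribution fitted in the paper is not reproduced here.

      % Gutenberg-Richter law: cumulative number of events with magnitude >= M
      \log_{10} N(\ge M) = a - b\,M

      % q-exponential underlying nonextensive (Tsallis-type) event-size models;
      % the ordinary exponential is recovered in the limit q -> 1
      \exp_q(x) = \bigl[\,1 + (1 - q)\,x\,\bigr]^{\tfrac{1}{1-q}}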

  20. Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.

    2010-11-01

    We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible lightweight component model that streamlines the integration of stand-alone codes into coupled simulations. Stand-alone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation, configuration, task, and data management, asynchronous event management, simulation monitoring, and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.
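
    The wrapping idea can be sketched in a few lines of Python. The example below is illustrative only: the class names, the init/step/finalize method set and the stubbed services calls are assumptions rather than the actual IPS API. A thin adapter launches a stand-alone executable at each coupled time step and notifies the framework through a services object.

      import subprocess

      class StubServices:
          # Minimal stand-in for the framework's services layer (illustrative only)
          def stage_input_files(self, names): print("staging", names)
          def update_shared_state(self, name): print("updating shared state from", name)
          def publish_event(self, topic, body): print("event", topic, body)

      class StandaloneCodeComponent:
          # Thin wrapper adapting a stand-alone executable to a component interface
          # with init/step/finalize methods (a sketch, not the actual IPS API).
          def __init__(self, services, config):
              self.services = services
              self.exe = config["EXECUTABLE"]

          def init(self, t):
              self.services.stage_input_files(["plasma_state.nc"])

          def step(self, t):
              # Launch the wrapped physics code for this coupled-simulation time step
              subprocess.run([self.exe, "--time", str(t)], check=True)
              self.services.update_shared_state("plasma_state.nc")

          def finalize(self, t):
              self.services.publish_event("component_done", {"time": t})

      comp = StandaloneCodeComponent(StubServices(), {"EXECUTABLE": "echo"})
      comp.init(0.0); comp.step(0.0); comp.finalize(0.0)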

  1. Neuroscience, moral reasoning, and the law.

    PubMed

    Knabb, Joshua J; Welsh, Robert K; Ziebell, Joseph G; Reimer, Kevin S

    2009-01-01

    Modern advancements in functional magnetic resonance imaging (fMRI) technology have given neuroscientists the opportunity to more fully appreciate the brain's contribution to human behavior and decision making. Morality and moral reasoning are relative newcomers to the growing literature on decision neuroscience. With recent attention given to the salience of moral factors (e.g. moral emotions, moral reasoning) in the process of decision making, neuroscientists have begun to offer helpful frameworks for understanding the interplay between the brain, morality, and human decision making. These frameworks are relatively unfamiliar to the community of forensic psychologists, despite the fact that they offer an improved understanding of judicial decision making from a biological perspective. This article presents a framework reviewing how event-feature-emotion complexes (EFEC) are relevant to jurors and understanding complex criminal behavior. Future directions regarding converging fields of neuroscience and legal decision making are considered. Copyright 2009 John Wiley & Sons, Ltd.

  2. Defining a Computational Framework for the Assessment of ...

    EPA Pesticide Factsheets

    The Adverse Outcome Pathway (AOP) framework describes the effects of environmental stressors across multiple scales of biological organization and function. This includes an evaluation of the potential for each key event to occur across a broad range of species in order to determine the taxonomic applicability of each AOP. Computational tools are needed to facilitate this process. Recently, we developed a tool that uses sequence homology to evaluate the applicability of molecular initiating events across species (Lalone et al., Toxicol. Sci., 2016). To extend our ability to make computational predictions at higher levels of biological organization, we have created the AOPdb. This database links molecular targets associated with key events in the AOP-Wiki to publicly available data (e.g. gene-protein, pathway, species orthology, ontology, chemical, disease) including ToxCast assay information. The AOPdb combines different data types in order to characterize the impacts of chemicals on human health and the environment and serves as a decision support tool for case study development in the area of taxonomic applicability. As a proof of concept, the AOPdb allows identification of relevant molecular targets, biological pathways, and chemical and disease associations across species for four AOPs from the AOP-Wiki (https://aopwiki.org): Estrogen receptor antagonism leading to reproductive dysfunction (Aop:30); Aromatase inhibition leading to reproductive d

  3. Hypothesis for cognitive effects of transcranial direct current stimulation: Externally- and internally-directed cognition.

    PubMed

    Greenwood, Pamela M; Blumberg, Eric J; Scheldrup, Melissa R

    2018-03-01

    A comprehensive explanation is lacking for the broad array of cognitive effects modulated by transcranial direct current stimulation (tDCS). We advanced the testable hypothesis that tDCS to the default mode network (DMN) increases processing of goals and stored information at the expense of external events. We further hypothesized that tDCS to the dorsal attention network (DAN) increases processing of external events at the expense of goals and stored information. A literature search (PsychINFO) identified 42 empirical studies and 3 meta-analyses examining effects of prefrontal and/or parietal tDCS on tasks that selectively required external and/or internal processing. Most, though not all, of the studies that met our search criteria supported our hypothesis. Three meta-analyses supported our hypothesis. The hypothesis we advanced provides a framework for the design and interpretation of results in light of the role of large-scale intrinsic networks that govern attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. A formal framework of scenario creation and analysis of extreme hydrological events

    NASA Astrophysics Data System (ADS)

    Lohmann, D.

    2007-12-01

    We present a formal framework for hydrological risk analysis. Different measures of risk are introduced, such as average annual loss and occurrence exceedance probability. These are important measures for insurance companies, for example, to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation and that also have correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
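
    The two risk measures named above have simple empirical estimators once an event-loss table has been simulated: average annual loss is total simulated loss divided by the number of simulated years, and the occurrence exceedance probability at a threshold is the fraction of years whose largest single-event loss reaches that threshold. The sketch below is a hypothetical illustration; the loss distribution, the number of years and the thresholds are all invented.

      import numpy as np

      def risk_metrics(year_of_event, event_loss, n_years, thresholds):
          # Average annual loss (AAL) and occurrence exceedance probabilities (OEP)
          # from a simulated event-loss table (illustrative only).
          year_of_event = np.asarray(year_of_event)
          event_loss = np.asarray(event_loss, dtype=float)
          aal = event_loss.sum() / n_years
          # OEP: probability that the largest single-event loss in a year exceeds a threshold
          max_loss_per_year = np.zeros(n_years)
          np.maximum.at(max_loss_per_year, year_of_event, event_loss)
          oep = {thr: float(np.mean(max_loss_per_year >= thr)) for thr in thresholds}
          return aal, oep

      rng = np.random.default_rng(3)
      n_years, n_events = 100_000, 250_000
      years = rng.integers(0, n_years, n_events)
      losses = rng.lognormal(mean=10, sigma=1.5, size=n_events)     # hypothetical losses
      aal, oep = risk_metrics(years, losses, n_years, thresholds=[1e5, 1e6])
      print(f"AAL = {aal:,.0f}")
      print(oep)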

  5. The concept of relative non-locality: theoretical implications in consciousness research.

    PubMed

    Neppe, Vernon M; Close, Edward R

    2015-01-01

    We argue that "non-local" events require further descriptors for us to understand the degree of non-locality, what the framework of the observer describing it is, and where we humans are located relative to the ostensible non-locality. This suggests three critical factors: Relative to, from the framework of, and a hierarchy of "to what degree?" "Non-locality" without the prefix "relative" compromises its description by making it an absolute: We must scientifically ensure that, qualitatively, we can describe events that correspond with each other-like with like-and differentiate these events from those that are hierarchically dissimilar. Recognition of these levels of "relative non-locality" is important: Non-locality from "the general framework of" the infinite, or mystic or near-death experient, markedly differs theoretically from "relative to our sentient reality in three dimensions of space in the present moment (3S-1t)". Specific events may be described "relative to" our living 3S-1t reality, but conceptualized differently from the framework of observers in altered states of consciousness experiencing higher dimensions. Hierarchical questions to ask would include IMMEDIACY PRINCIPLE: We also propose that events happening immediately, not even requiring light speed, are fundamental properties of non-local time involving more dimensions than just 3S-1t. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Using ProMED-Mail and MedWorm blogs for cross-domain pattern analysis in epidemic intelligence.

    PubMed

    Stewart, Avaré; Denecke, Kerstin

    2010-01-01

    In this work we motivate the use of medical blog user-generated content for gathering facts about disease reporting events to support biosurveillance investigation. Given the characteristics of blogs, the extraction of such events is made more difficult due to noise and data abundance. We address the problem of automatically inferring disease reporting event extraction patterns in this noisier setting. The sublanguage used in outbreak reports is exploited to align with the sequences of disease reporting sentences in blogs. Based on our Cross Domain Pattern Analysis Framework, experimental results show that Phase-Level sequences tend to produce more overlap across the domains than Word-Level sequences. The cross domain alignment process is effective at filtering noisy sequences from blogs and extracting good candidate sequence patterns from an abundance of text.

  7. Bayesian Monitoring Systems for the CTBT: Historical Development and New Results

    NASA Astrophysics Data System (ADS)

    Russell, S.; Arora, N. S.; Moore, D.

    2016-12-01

    A project at Berkeley, begun in 2009 in collaboration with CTBTO and more recently with LLNL, has reformulated the global seismic monitoring problem in a Bayesian framework. A first-generation system, NETVISA, has been built comprising a spatial event prior and generative models of event transmission and detection, as well as a Monte Carlo inference algorithm. The probabilistic model allows for seamless integration of various disparate sources of information, including negative information (the absence of detections). Working from arrivals extracted by traditional station processing from International Monitoring System (IMS) data, NETVISA achieves a reduction of around 60% in the number of missed events compared with the currently deployed network processing system. It also finds many events that are missed by the human analysts who postprocess the IMS output. Recent improvements include the integration of models for infrasound and hydroacoustic detections and a global depth model for natural seismicity trained from ISC data. NETVISA is now fully compatible with the CTBTO operating environment. A second-generation model called SIGVISA extends NETVISA's generative model all the way from events to raw signal data, avoiding the error-prone bottom-up detection phase of station processing. SIGVISA's model automatically captures the phenomena underlying existing detection and location techniques such as multilateration, waveform correlation matching, and double-differencing, and integrates them into a global inference process that also (like NETVISA) handles de novo events. Initial results for the Western US in early 2008 (when the transportable US Array was operating) show that SIGVISA finds, from IMS data only, more than twice the number of events recorded in the CTBTO Late Event Bulletin (LEB). For mb 1.0-2.5, the ratio is more than 10; put another way, for this data set, SIGVISA lowers the detection threshold by roughly one magnitude compared to the LEB. The broader message of this work is that probabilistic inference based on a vertically integrated generative model that directly expresses geophysical knowledge can be a much more effective approach for interpreting scientific data than the traditional bottom-up processing pipeline.

  8. Context-aware event detection smartphone application for first responders

    NASA Astrophysics Data System (ADS)

    Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.

    2013-05-01

    The rise of social networking platforms such as Twitter and Facebook has provided seamless sharing of information (as chat, video and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information regarding the events happening in their immediate vicinity in a real-time fashion. This human-centric sensed data, generated in a "human-as-sensor" approach, is tremendously valuable, as it is mostly delivered with apt annotations and ground truth that would be missing from traditional machine-centric sensors, besides a high redundancy factor (the same data arrives through multiple users). Further, when appropriately employed, this real-time data can support the detection of localized events such as fires, accidents, and shootings as they unfold, and can pinpoint the individuals affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at the AFRL Tec^Edge Discovery labs demonstrated the feasibility of developing smartphone applications that can provide an augmented-reality view of the appropriate detected events in a given geographical location (localized) and also provide an event search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework that deals with data challenges such as identifying and aggregating important events, analyzing and correlating the events temporally and spatially, and building a search-enabled event database. Further, the smartphone application, together with its backend data processing workflow, has been successfully field-tested with live user-generated feeds.

  9. A three-tiered approach for linking pharmacokinetic considerations to the adverse outcome pathway framework for chemical-specific risk assessment

    EPA Science Inventory

    The power of the adverse outcome pathway (AOP) framework arises from its utilization of pathway-based data to describe the initial interaction of a chemical with a molecular target (molecular initiating event; MIE), followed by a progression through a series of key events that l

  10. Bayesian analysis of rare events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.

  11. Geologic controls on the formation and evolution of quaternary coastal deposits of the northern Gulf of Mexico

    USGS Publications Warehouse

    Williams, S.J.; Penland, S.; Sallenger, A.H.; McBride, R.A.; Kindlinger, J.L.

    1991-01-01

    A study of the barrier islands and wetlands in the deltaic plain of Louisiana is presented. Its purpose was to document rapid changes and to learn more about the processes responsible and the geologic framework within which they operate. It included systematic collection and analysis of precision nearshore hydrographic data, high resolution seismic profiles, surface sediment samples, continuous vibracores, digital shoreline plots, records of storm overwash events, and analysis of tide gage records to quantify the rise in relative sea level. Results from these studies demonstrate that deltaic progradation, river channel switching, and subsequent rapid erosion accompanying the marine transgression are regular and predictable events along the Mississippi River delta plain and will likely continue in the future. Mitigation measures, such as shoreline nourishment and barrier restoration, that mimic the natural processes may slow the land loss.

  12. OAE: The Ontology of Adverse Events.

    PubMed

    He, Yongqun; Sarntivijai, Sirarat; Lin, Yu; Xiang, Zuoshuang; Guo, Abra; Zhang, Shelley; Jagannathan, Desikan; Toldo, Luca; Tao, Cui; Smith, Barry

    2014-01-01

    A medical intervention is a medical procedure or application intended to relieve or prevent illness or injury. Examples of medical interventions include vaccination and drug administration. After a medical intervention, adverse events (AEs) may occur which lie outside the intended consequences of the intervention. The representation and analysis of AEs are critical to the improvement of public health. The Ontology of Adverse Events (OAE), previously named Adverse Event Ontology (AEO), is a community-driven ontology developed to standardize and integrate data relating to AEs arising subsequent to medical interventions, as well as to support computer-assisted reasoning. OAE has over 3,000 terms with unique identifiers, including terms imported from existing ontologies and more than 1,800 OAE-specific terms. In OAE, the term 'adverse event' denotes a pathological bodily process in a patient that occurs after a medical intervention. Causal adverse events are defined by OAE as those events that are causal consequences of a medical intervention. OAE represents various adverse events based on patient anatomic regions and clinical outcomes, including symptoms, signs, and abnormal processes. OAE has been used in the analysis of several different sorts of vaccine and drug adverse event data. For example, using the data extracted from the Vaccine Adverse Event Reporting System (VAERS), OAE was used to analyse vaccine adverse events associated with the administrations of different types of influenza vaccines. OAE has also been used to represent and classify the vaccine adverse events cited in package inserts of FDA-licensed human vaccines in the USA. OAE is a biomedical ontology that logically defines and classifies various adverse events occurring after medical interventions. OAE has successfully been applied in several adverse event studies. The OAE ontological framework provides a platform for systematic representation and analysis of adverse events and of the factors (e.g., vaccinee age) important for determining their clinical outcomes.

  13. Characterize kinematic rupture history of large earthquakes with Multiple Haskell sources

    NASA Astrophysics Data System (ADS)

    Jia, Z.; Zhan, Z.

    2017-12-01

    Earthquakes are often regarded as continuous rupture along a single fault, but the occurrence of complex large events involving multiple faults and dynamic triggering challenges this view. Such rupture complexities cause difficulties in existing finite fault inversion algorithms, because they rely on specific parameterizations and regularizations to obtain physically meaningful solutions. Furthermore, it is difficult to assess the reliability and uncertainty of obtained rupture models. Here we develop a Multi-Haskell Source (MHS) method to estimate the rupture process of large earthquakes as a series of sub-events of varying location, timing and directivity. Each sub-event is characterized by a Haskell rupture model with uniform dislocation and constant unilateral rupture velocity. This flexible yet simple source parameterization allows us to constrain first-order rupture complexity of large earthquakes robustly. Additionally, the relatively few parameters in the inverse problem yield improved uncertainty analysis based on Markov chain Monte Carlo sampling in a Bayesian framework. Synthetic tests and application of the MHS method to real earthquakes show that our method can capture major features of the large-earthquake rupture process and provide information for more detailed rupture history analysis.

  14. Declarative Business Process Modelling and the Generation of ERP Systems

    NASA Astrophysics Data System (ADS)

    Schultz-Møller, Nicholas Poul; Hølmer, Christian; Hansen, Michael R.

    We present an approach to the construction of Enterprise Resource Planning (ERP) systems, which is based on the Resources, Events and Agents (REA) ontology. This framework deals with processes involving exchange and flow of resources in a declarative, graphically based manner, describing what the major entities are rather than how they engage in computations. We show how to develop a domain-specific language on the basis of REA, and a tool that can automatically generate running web applications. A main contribution is a proof-of-concept showing that business-domain experts can generate their own applications without worrying about implementation details.
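
    The flavour of an REA-style declarative model can be conveyed with a few plain data declarations. The sketch below is a hypothetical illustration, unrelated to the tooling described above, and its class and field names are invented: an economic exchange is expressed as a duality of events that move resources between agents, with no computational logic attached.

      from dataclasses import dataclass

      @dataclass
      class Agent:
          name: str

      @dataclass
      class Resource:
          name: str

      @dataclass
      class EconomicEvent:
          name: str
          resource: Resource
          provider: Agent
          receiver: Agent

      # A sale modelled as a duality: goods flow one way, payment flows the other
      shop, customer = Agent("shop"), Agent("customer")
      sale = (EconomicEvent("ship goods", Resource("widget"), shop, customer),
              EconomicEvent("receive payment", Resource("cash"), customer, shop))
      print(sale)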

  15. Approche du Processus d'Auto-Formation Inspiree de la Methode des "Histoires de vie" dans le cadre de la Formation continuee des institutrices maternelles en Communaute francaise de Belgique (Approach to the Process of Self-Instruction Inspired by the "Life History" Method in the Framework of the Continuing Education of Preschool Teachers in the French-Speaking Community in Belgium).

    ERIC Educational Resources Information Center

    Brandt, Madeleine

    1994-01-01

    Describes a study of the use of a critical events questionnaire and "educational biography" as means of training preschool teachers to question their role as teachers; to question their relationship with children; and to liberate their consciousness. Discusses the quantitative and qualitative results of the self-evaluation processes. (AC)

  16. A concept taxonomy and an instrument hierarchy: tools for establishing and evaluating the conceptual framework of a patient-reported outcome (PRO) instrument as applied to product labeling claims.

    PubMed

    Erickson, Pennifer; Willke, Richard; Burke, Laurie

    2009-01-01

    To facilitate development and evaluation of a PRO instrument conceptual framework, we propose two tools: a PRO concept taxonomy and a PRO instrument hierarchy. FDA's draft guidance on patient-reported outcome (PRO) measures states that a clear description of an instrument's conceptual framework is useful for evaluating its adequacy to support a treatment benefit claim for use in product labeling. The draft guidance, however, does not propose tools for establishing or evaluating a PRO instrument's conceptual framework. We draw from our review of PRO concepts and instruments that appear in prescription drug labeling approved in the United States from 1997 to 2007. We propose taxonomy terms that define relationships between PRO concepts, including "family," "compound concept," and "singular concept." Based on the range of complexity represented by the concepts, as defined by the taxonomy, we propose nine instrument orders for PRO measurement. The nine orders range from individual event counts to multi-item, multiscale instruments. This analysis of PRO concepts and instruments illustrates that the taxonomy and hierarchy are applicable to PRO concepts across a wide range of therapeutic areas and provide a basis for defining the complexity of an instrument's conceptual framework. Although the utility of these tools in the drug development, review, and approval processes has not yet been demonstrated, they could be useful for improving communication and enhancing efficiency in the instrument development and review process.

  17. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    PubMed

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades-off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
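    The abstaining behaviour described above can be sketched as a simple decision rule driven by an estimated event probability and its uncertainty. The cost-like thresholds below are illustrative values, not those derived in the paper.

```python
# Sketch of an abstaining decision rule: raise an alarm, stay quiet, or
# abstain depending on the estimated event probability and the width of its
# credible interval. Threshold values are illustrative only.
def decide(p_event, p_low, p_high, alarm_threshold=0.5, max_interval=0.2):
    """Return 'alarm', 'no_alarm', or 'abstain'.

    p_event          -- point estimate of the event probability
    (p_low, p_high)  -- e.g. a 90% credible interval from the joint model
    """
    if p_high - p_low > max_interval:
        return "abstain"          # interval too wide: defer the decision
    return "alarm" if p_event >= alarm_threshold else "no_alarm"

print(decide(0.7, 0.65, 0.75))   # -> alarm
print(decide(0.4, 0.15, 0.70))   # -> abstain
```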

  18. A General Framework of Persistence Strategies for Biological Systems Helps Explain Domains of Life

    PubMed Central

    Yafremava, Liudmila S.; Wielgos, Monica; Thomas, Suravi; Nasir, Arshan; Wang, Minglei; Mittenthal, Jay E.; Caetano-Anollés, Gustavo

    2012-01-01

    The nature and cause of the division of organisms in superkingdoms is not fully understood. Assuming that environment shapes physiology, here we construct a novel theoretical framework that helps identify general patterns of organism persistence. This framework is based on Jacob von Uexküll’s organism-centric view of the environment and James G. Miller’s view of organisms as matter-energy-information processing molecular machines. Three concepts describe an organism’s environmental niche: scope, umwelt, and gap. Scope denotes the entirety of environmental events and conditions to which the organism is exposed during its lifetime. Umwelt encompasses an organism’s perception of these events. The gap is the organism’s blind spot, the scope that is not covered by umwelt. These concepts bring organisms of different complexity to a common ecological denominator. Ecological and physiological data suggest organisms persist using three strategies: flexibility, robustness, and economy. All organisms use umwelt information to flexibly adapt to environmental change. They implement robustness against environmental perturbations within the gap generally through redundancy and reliability of internal constituents. Both flexibility and robustness improve survival. However, they also incur metabolic matter-energy processing costs, which otherwise could have been used for growth and reproduction. Lineages evolve unique tradeoff solutions among strategies in the space of what we call “a persistence triangle.” Protein domain architecture and other evidence support the preferential use of flexibility and robustness properties. Archaea and Bacteria gravitate toward the triangle’s economy vertex, with Archaea biased toward robustness. Eukarya trade economy for survivability. Protista occupy a saddle manifold separating akaryotes from multicellular organisms. Plants and the more flexible Fungi share an economic stratum, and Metazoa are locked in a positive feedback loop toward flexibility. PMID:23443991

  19. The Five A's: what do patients want after an adverse event?

    PubMed

    Cox, Wendy

    2007-01-01

    After an adverse event, the Five A's (Acknowledgment, Apology, All the Facts, Assurance, and Appropriate Compensation) serve to meet the essential needs of patients and their families. This simple mnemonic creates a clear framework of understanding for the actions health professionals need to take to manage errors and adverse events in an empathic and patient-oriented fashion. While not all patients demand or need compensation, most need at least the first four A's. Patient-centered communication using this simple framework following an adverse event will foster a climate of understanding and frank discussion, addressing the emotional and physical needs of the whole patient and family.

  20. Near Real Time Analytics of Human Sensor Networks in the Realm of Big Data

    NASA Astrophysics Data System (ADS)

    Aulov, O.; Halem, M.

    2012-12-01

    With the prolific development of social media, emergency responders have an increasing interest in harvesting social media from outlets such as Flickr, Twitter, and Facebook, in order to assess the scale and specifics of extreme events including wildfires, earthquakes, terrorist attacks, oil spills, etc. A number of experimental platforms have successfully been implemented to demonstrate the utilization of social media data in extreme events, including Twitter Earthquake Detector, which relied on tweets for earthquake monitoring; AirTwitter, which used tweets for air quality reporting; and our previous work, using Flickr data as boundary value forcings to improve the forecast of oil beaching in the aftermath of the Deepwater Horizon oil spill. The majority of these platforms addressed a narrow, specific type of emergency and harvested data from a particular outlet. We demonstrate an interactive framework for monitoring, mining and analyzing a plethora of heterogeneous social media sources for a diverse range of extreme events. Our framework consists of three major parts: a real-time social media aggregator, a data processing and analysis engine, and a web-based visualization and reporting tool. The aggregator gathers tweets, Facebook comments from fan pages, Google+ posts, forum discussions, blog posts (such as LiveJournal and Blogger.com), images from photo-sharing platforms (such as Flickr, Picasa), videos from video-sharing platforms (YouTube, Vimeo), and so forth. The data processing and analysis engine pre-processes the aggregated information and annotates it with geolocation and sentiment information. In many cases, the metadata of the social media posts does not contain geolocation information; however, a human reader can easily guess from the body of the text what location is discussed. We are automating this task by use of Named Entity Recognition (NER) algorithms and a gazetteer service. The visualization and reporting tool offers a web-based, user-friendly interface that provides time-series analysis and plotting tools, geo-spatial visualization tools with interactive maps, and cause-effect inference tools. We demonstrate how we address big data challenges of monitoring, aggregating and analyzing vast amounts of social media data in near real time. As a result, our framework not only allows emergency responders to augment their situational awareness with social media information, but can also allow them to extract geophysical data and incorporate it into their analysis models.
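    The NER-plus-gazetteer step described above can be illustrated with a small sketch: a general-purpose NER model extracts place-name entities and a toy lookup table maps them to coordinates. The spaCy model name assumes the small English model is installed, and the dictionary stands in for the gazetteer service mentioned in the abstract.

```python
# Illustrative geolocation of a social-media post: spaCy extracts place-name
# entities and a toy gazetteer maps them to coordinates. The gazetteer dict
# is a hypothetical stand-in for a real gazetteer service.
import spacy

nlp = spacy.load("en_core_web_sm")

GAZETTEER = {                      # hypothetical lookup table
    "New Orleans": (29.95, -90.07),
    "Gulf of Mexico": (25.0, -90.0),
}

def geolocate(text):
    doc = nlp(text)
    places = [ent.text for ent in doc.ents if ent.label_ in ("GPE", "LOC")]
    return [(name, GAZETTEER[name]) for name in places if name in GAZETTEER]

post = "Oil washing up on the beach near New Orleans this morning."
print(geolocate(post))  # [('New Orleans', (29.95, -90.07))]
```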

  1. Using Targeting Outcomes of Programs as a Framework to Target Photographic Events in Nonformal Educational Programs

    ERIC Educational Resources Information Center

    Rockwell, S. Kay; Albrecht, Julie A.; Nugent, Gwen C.; Kunz, Gina M.

    2012-01-01

    Targeting Outcomes of Programs (TOP) is a seven-step hierarchical programming model in which the program development and performance sides are mirror images of each other. It served as a framework to identify a simple method for targeting photographic events in nonformal education programs, indicating why, when, and how photographs would be useful…

  2. Multiple electron processes of He and Ne by proton impact

    NASA Astrophysics Data System (ADS)

    Terekhin, Pavel Nikolaevich; Montenegro, Pablo; Quinto, Michele; Monti, Juan; Fojon, Omar; Rivarola, Roberto

    2016-05-01

    A detailed investigation of multiple electron processes (single and multiple ionization, single capture, transfer-ionization) of He and Ne is presented for proton impact at intermediate and high collision energies. Exclusive absolute cross sections for these processes have been obtained by calculation of transition probabilities in the independent electron and independent event models as a function of impact parameter in the framework of the continuum distorted wave-eikonal initial state theory. A binomial analysis is employed to calculate exclusive probabilities. The comparison with available theoretical and experimental results shows that exclusive probabilities are needed for a reliable description of the experimental data. The developed approach can be used for obtaining the input database for modeling multiple electron processes of charged particles passing through matter.
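    In an independent-electron picture, the binomial analysis mentioned above builds exclusive probabilities from a single-electron probability p(b) at impact parameter b, and the exclusive cross section follows by integrating over b. The sketch below uses a made-up p(b) rather than the CDW-EIS probabilities of the paper.

```python
# Binomial construction of exclusive probabilities in an independent-electron
# picture: P_q(b) = C(N, q) p(b)^q (1 - p(b))^(N - q), then
# sigma_q = 2*pi * integral of b * P_q(b) db. p(b) is a placeholder.
import numpy as np
from math import comb

def p_single(b):
    return 0.6 * np.exp(-b / 1.5)        # hypothetical single-electron p(b)

def exclusive_probability(b, q, n_electrons):
    p = p_single(b)
    return comb(n_electrons, q) * p**q * (1.0 - p)**(n_electrons - q)

def exclusive_cross_section(q, n_electrons, b_max=20.0, nb=2000):
    b = np.linspace(0.0, b_max, nb)
    integrand = 2.0 * np.pi * b * exclusive_probability(b, q, n_electrons)
    return np.trapz(integrand, b)

for q in (1, 2):                          # He-like two-electron example
    print(q, exclusive_cross_section(q, n_electrons=2))
```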

  3. Properties of induced seismicity at the geothermal reservoir Insheim, Germany

    NASA Astrophysics Data System (ADS)

    Olbert, Kai; Küperkoch, Ludger; Thomas, Meier

    2017-04-01

    Within the framework of the German MAGS2 Project, the processing of induced events at the geothermal power plant Insheim, Germany, has been reassessed and evaluated. The power plant is located close to the western rim of the Upper Rhine Graben in a region with a strongly heterogeneous subsurface. Therefore, the location of seismic events, particularly the depth estimation, is challenging. The seismic network, consisting of up to 50 stations, has an aperture of approximately 15 km around the power plant. Consequently, the manual processing is time consuming. Using a waveform similarity detection algorithm, the existing dataset from 2012 to 2016 has been reprocessed to complete the catalog of induced seismic events. Based on the waveform similarity, clusters of similar events have been detected. Automated P- and S-arrival time determination using an improved multi-component autoregressive prediction algorithm yields approximately 14,000 P- and S-arrivals for 758 events. Using a dataset of manual picks as a reference, the automated picking algorithm has been optimized, resulting in a standard deviation of the residuals between automated and manual picks of about 0.02 s. The automated locations show uncertainties comparable to locations of the manual reference dataset. 90% of the automated relocations fall within the error ellipsoid of the manual locations. The remaining locations are either badly resolved due to low numbers of picks or so well resolved that the automatic location is outside the error ellipsoid although located close to the manual location. The developed automated processing scheme proved to be a useful tool to supplement real-time monitoring. The event clusters are located at small patches of faults known from reflection seismic studies. The clusters are observed close to both the injection and the production wells.
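    Waveform-similarity detection of the kind used above is commonly implemented as template matching with a normalized cross-correlation. The following toy sketch uses synthetic data and a fixed threshold; it is not the project's processing chain, which would operate on instrument-corrected, filtered traces.

```python
# Toy template-matching detector: slide a waveform template over a continuous
# trace, compute the normalized cross-correlation, and flag detections above
# a threshold. Data, sampling, and threshold are synthetic/illustrative.
import numpy as np

rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100))   # 1 s, 5 Hz wavelet
trace = 0.1 * rng.normal(size=5000)
trace[2000:2100] += template                                 # hidden event

def normalized_xcorr(trace, template):
    n = len(template)
    t0 = (template - template.mean()) / template.std()
    cc = np.empty(len(trace) - n)
    for i in range(len(cc)):
        w = trace[i:i + n]
        cc[i] = np.dot(t0, (w - w.mean()) / (w.std() + 1e-12)) / n
    return cc

cc = normalized_xcorr(trace, template)
detections = np.where(cc > 0.7)[0]
print("detections near sample:", detections[:5])
```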

  4. Incorporating ecogeomorphic feedbacks to better understand resiliency in streams: A review and directions forward

    NASA Astrophysics Data System (ADS)

    Atkinson, Carla L.; Allen, Daniel C.; Davis, Lisa; Nickerson, Zachary L.

    2018-03-01

    Decades of interdisciplinary research show that river form and function depend on interactions between the living and nonliving world, but a dominant paradigm underlying ecogeomorphic work consists of a top-down, unidirectional approach with abiotic forces driving biotic systems. Stream form and location within the stream network do dictate the habitat and resources available for organisms and overall community structure. Yet this traditional hierarchical framework on its own is inadequate in communicating information regarding the influence of biological systems on fluvial geomorphology that leads to changes in channel morphology, sediment cycling, and system-scale functions (e.g., sediment yield, biogeochemical nutrient cycling). Substantial evidence that organisms influence fluvial geomorphology exists, specifically the ability of aquatic vegetation and lotic animals to modify flow velocities and sediment deposition and transport - thus challenging the traditional hierarchical framework. Researchers recognize the need for ecogeomorphic frameworks that conceptualize feedbacks between organisms, sediment transport, and geomorphic structure. Furthermore, vital ecosystem processes, such as biogeochemical nutrient cycling, represent the conversations that are occurring between geomorphological and biological systems. Here we review and synthesize selected case studies highlighting the role organisms play in moderating geomorphic processes and how they likely interact with these processes to have an impact on an essential ecosystem process, biogeochemical nutrient recycling. We explore whether biophysical interactions can provide information essential to improving predictions of system-scale river functions, specifically sediment transport and biogeochemical cycling, and discuss tools used to study these interactions. We suggest that current conceptual frameworks should acknowledge that hydrologic, geomorphologic, and ecologic processes operate on different temporal scales, generating bidirectional feedback loops over space and time. Hydro- and geomorphologic processes, operating episodically during bankfull conditions, influence ecological processes (e.g., biogeochemical cycling) occurring over longer time periods during base-flow conditions. This ecological activity generates the antecedent conditions that influence the hydro- and geomorphologic processes occurring during the next high flow event, creating a bidirectional feedback. This feedback should enhance the resiliency of fluvial landforms and ecosystem processes, allowing physical and biological processes to pull and push against each other over time.

  5. CBM First-level Event Selector Input Interface Demonstrator

    NASA Astrophysics Data System (ADS)

    Hutter, Dirk; de Cuveland, Jan; Lindenstruth, Volker

    2017-10-01

    CBM is a heavy-ion experiment at the future FAIR facility in Darmstadt, Germany. As the experiment features self-triggered front-end electronics and free-streaming read-out, event selection will be done exclusively by the First Level Event Selector (FLES). Designed as an HPC cluster with several hundred nodes, its task is the online analysis and selection of the physics data at a total input data rate exceeding 1 TByte/s. To allow efficient event selection, the FLES performs timeslice building, which combines the data from all given input links into self-contained, potentially overlapping processing intervals and distributes them to compute nodes. Partitioning the input data streams into specialized containers allows performing this task very efficiently. The FLES Input Interface defines the linkage between the FEE and the FLES data transport framework. A custom FPGA PCIe board, the FLES Interface Board (FLIB), is used to receive data via optical links and transfer them via DMA to the host's memory. The current prototype of the FLIB features a Kintex-7 FPGA and provides up to eight 10 GBit/s optical links. A custom FPGA design has been developed for this board. DMA transfers and data structures are optimized for subsequent timeslice building. Index tables generated by the FPGA enable fast random access to the written data containers. In addition, the DMA target buffers can directly serve as InfiniBand RDMA source buffers without copying the data. The usage of POSIX shared memory for these buffers allows data access from multiple processes. An accompanying HDL module has been developed to integrate the FLES link into the front-end FPGA designs. It implements the front-end logic interface as well as the link protocol. Prototypes of all Input Interface components have been implemented and integrated into the FLES test framework. This allows the implementation and evaluation of the foreseen CBM read-out chain.
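    The timeslice-building idea can be conveyed with a small conceptual sketch: data arriving on several input links are grouped into fixed-length, overlapping processing intervals. The interval lengths and container layout below are illustrative; the real FLES does this in C++ with FPGA-generated index tables and zero-copy buffers.

```python
# Conceptual sketch of timeslice building: items from several input links are
# grouped into fixed-length, overlapping intervals ("timeslices"). All sizes
# and the container layout are illustrative only.
from collections import defaultdict

CORE = 1000      # core timeslice length (arbitrary time units)
OVERLAP = 100    # overlap appended to each timeslice

def build_timeslices(per_link_items, n_timeslices):
    """per_link_items: {link_id: [(timestamp, payload), ...]} sorted by time."""
    timeslices = []
    for ts_index in range(n_timeslices):
        start, end = ts_index * CORE, (ts_index + 1) * CORE + OVERLAP
        contents = defaultdict(list)
        for link, items in per_link_items.items():
            contents[link] = [it for it in items if start <= it[0] < end]
        timeslices.append({"index": ts_index, "start": start,
                           "end": end, "data": dict(contents)})
    return timeslices

links = {0: [(t, f"l0-{t}") for t in range(0, 2200, 50)],
         1: [(t, f"l1-{t}") for t in range(0, 2200, 70)]}
for ts in build_timeslices(links, 2):
    print(ts["index"], ts["start"], ts["end"],
          len(ts["data"][0]), len(ts["data"][1]))
```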

  6. A Framework for Assessing Health Risk of Environmental ...

    EPA Pesticide Factsheets

    EPA released the final report entitled, A Framework for Assessing Health Risk of Environmental Exposures to Children, which examines the impact of potential exposures during developmental lifestages and subsequent lifestages, while emphasizing the iterative nature of the analysis phase with a multidisciplinary team. Major findings and conclusions: This report outlines the framework in which mode of action(s) (MOA) can be considered across life stages. The framework is based upon existing approaches adopted in the Framework on Cumulative Risk Assessment and identifies existing guidance, guidelines and policy papers that relate to children's health risk assessment. It emphasizes the importance of an iterative approach between hazard, dose response, and exposure analyses. In addition, it includes discussion of principles for weight of evidence consideration across life stages for the hazard characterization database. Key science/assessment issues: This framework addresses the questions of why and how an improved children's health risk assessment will strengthen the overall risk assessment process across the Agency. This approach improves the scientific explanation of children's risk and will add value by: 1) providing for a more complete evaluation of the potential for vulnerability at different life stages, including a focus on the underlying biological events and critical developmental periods for incorporating MOA considerations; 2) evaluating the potential fo

  7. National Economic Development Procedures Manual - Agricultural Flood Damage,

    DTIC Science & Technology

    1987-10-01

    Based on the conceptual framework of the Economic and Environmental Principles and Guidelines for Water and Related Land Resources Implementation... The planning process and the NED evaluation procedures for agriculture, as described in the P&G, are then presented. Also identified are some... The flood loss computation approach develops the flood damage for hypothetical frequency flood events and weights the result...

  8. An active monitoring method for flood events

    NASA Astrophysics Data System (ADS)

    Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya

    2018-07-01

    Timely and active detecting and monitoring of a flood event are critical for a quick response, effective decision-making and disaster reduction. To achieve this purpose, this paper proposes an active service framework for flood monitoring based on Sensor Web services and an active model for the concrete implementation of the active service framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and plans the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 is conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, is a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done to take full advantage of the active service framework and the active model.

  9. Green Is the New Black: The Need for a New Currency That Values Water Resources in Rapidly Developing Landscapes

    NASA Astrophysics Data System (ADS)

    Creed, I. F.; Webster, K. L.; Kreutzweiser, D. P.; Beall, F.

    2014-12-01

    Canada's boreal forest supports many aquatic ecosystem services (AES) due to the intimate linkage between aquatic systems and their surrounding terrestrial watersheds in forested landscapes. There is an increasing risk to AES because natural development activities (forest management, mining, energy) have resulted in disruptions that deteriorate aquatic ecosystems at local (10s of km2) to regional (100s of km2) scales. These activities are intensifying and expanding, placing at risk the healthy aquatic ecosystems that provide AES, and may threaten the continued development of the energy, forest, and mining sectors. Remarkably, we know little about the consequences of these activities on AES. The idea that AES should be explicitly integrated into modern natural resource management regulations is gaining broad acceptance. A major need is the ability to measure cumulative effects and determine thresholds (the points where aquatic ecosystems and their services cannot recover to a desired state within a reasonable time frame) in these cumulative effects. However, there is no single conceptual approach to assessing cumulative effects that is widely accepted by both scientists and managers. We present an integrated science-policy framework that enables the integration of AES into forest management risk assessment and prevention/mitigation strategies. We use this framework to explore the risk of further deterioration of AES by (1) setting risk criteria; (2) using emerging technologies to map process-based indicators representing causes and consequences of risk events for the deterioration of AES; (3) assessing existing prevention and mitigation policies in place to avoid risk events; and (4) identifying priorities for policy change needed to reduce risk events. Ultimately, the success of this framework requires that higher value be placed on AES and, in turn, that the science and management of the boreal forest be improved.

  10. Evacuation performance evaluation tool.

    PubMed

    Farra, Sharon; Miller, Elaine T; Gneuhs, Matthew; Timm, Nathan; Li, Gengxin; Simon, Ashley; Brady, Whittney

    2016-01-01

    Hospitals conduct evacuation exercises to improve performance during emergency events. An essential aspect in this process is the creation of reliable and valid evaluation tools. The objective of this article is to describe the development and implications of a disaster evacuation performance tool that measures one portion of the very complex process of evacuation. Through the application of the Delphi technique and DeVellis's framework, disaster and neonatal experts provided input in developing this performance evaluation tool. Following development, the content validity and reliability of this tool were assessed. The setting was a large pediatric hospital and medical center in the Midwest. The tool was pilot tested with an administrative, medical, and nursing leadership group and then implemented with a group of 68 healthcare workers during a disaster exercise of a neonatal intensive care unit (NICU). The tool has demonstrated high content validity with a scale validity index of 0.979 and an inter-rater reliability G coefficient (0.984, 95% CI: 0.948-0.9952). The Delphi process based on the conceptual framework of DeVellis yielded a psychometrically sound evacuation performance evaluation tool for a NICU.

  11. Factors Affecting P Loads to Surface Waters: Comparing the Roles of Precipitation and Land Management Practices

    NASA Astrophysics Data System (ADS)

    Motew, M.; Booth, E.; Carpenter, S. R.; Kucharik, C. J.

    2014-12-01

    Surface water quality is a major concern in the Yahara watershed (YW) of southern Wisconsin, home to a thriving dairy industry, the city of Madison, and five highly valued lakes that are eutrophic. Despite management interventions to mitigate runoff, there has been no significant trend in P loading to the lakes since 1975. Increases in manure production and heavy rainfall events over this time period may have offset any effects of management. We developed a comprehensive, integrated modeling framework that can simulate the effects of multiple drivers on ecosystem services, including surface water quality. The framework includes process-based representation of terrestrial ecosystems (Agro-IBIS) and groundwater flow (MODFLOW), hydrologic routing of water and nutrients across the landscape (THMB), and assessment of lake water quality (YWQM). Biogeochemical cycling and hydrologic transport of P have been added to the framework to enable detailed simulation of P dynamics within the watershed, including interactions with climate and management. The P module features in-soil cycling of organic, inorganic, and labile forms of P; manure application, decomposition, and subsequent loss of dissolved P in runoff; loss of particulate-bound P with erosion; and transport of dissolved and particulate P within waterways. Model results will compare the effects of increased heavy rainfall events, increased manure production, and implementation of best management practices on P loads to the Yahara lakes.

  12. Unified theory for stochastic modelling of hydroclimatic processes: Preserving marginal distributions, correlation structures, and intermittency

    NASA Astrophysics Data System (ADS)

    Papalexiou, Simon Michael

    2018-05-01

    Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
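    The "parent Gaussian" strategy described above can be sketched in a few lines: simulate a correlated Gaussian series, then map it through the Gaussian CDF and the inverse CDF of a target marginal, with an atom at zero to represent intermittency. The AR(1) correlation, the zero-inflated gamma marginal, and all parameter values below are illustrative choices, not fitted results from the paper.

```python
# Sketch of the parent-Gaussian simulation strategy: an AR(1) Gaussian series
# is back-transformed to a target marginal (zero-inflated gamma here) through
# its CDF. Parameters are illustrative, not fitted values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def parent_gaussian_ar1(n, rho):
    z = np.empty(n)
    z[0] = rng.normal()
    for t in range(1, n):
        z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.normal()
    return z

def to_target_marginal(z, p_dry=0.6, shape=0.8, scale=5.0):
    u = stats.norm.cdf(z)                       # uniform scores
    x = np.zeros_like(u)
    wet = u > p_dry                             # intermittency: an atom at zero
    # rescale wet probabilities and invert the gamma CDF
    x[wet] = stats.gamma.ppf((u[wet] - p_dry) / (1 - p_dry), a=shape, scale=scale)
    return x

z = parent_gaussian_ar1(10000, rho=0.7)
x = to_target_marginal(z)
print("fraction of zeros:", np.mean(x == 0), " mean wet value:", x[x > 0].mean())
```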

  13. Hierarchy Software Development Framework (h-dp-fwk) project

    NASA Astrophysics Data System (ADS)

    Zaytsev, A.

    2010-04-01

    The Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in a batch mode. Design and development activities on the project began in March 2005, and from the very beginning it targeted building experimental data processing applications for the CMD-3 experiment, which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). Its design addresses the generic case of a modular data processing application operating within a well-defined distributed computing environment. The main features of the framework are modularity, built-in message and data exchange mechanisms, XInclude and XML schema enabled XML configuration management tools, dedicated log management tools, internal debugging tools, both dynamic and static module chains support, internal DSO version and consistency checking, and a well-defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and is planned to be ported to other platforms as well. The project is provided with a comprehensive set of technical documentation and users' guides. The licensing schema for the source code, binaries and documentation implies that the product is free for non-commercial use. Although the development phase is not over and many features are yet to be implemented, the project is considered ready for public use and for creating applications in various fields, including the development of event reconstruction software for small and moderate scale HEP experiments.

  14. An iceberg model implementation in ACME.

    NASA Astrophysics Data System (ADS)

    Comeau, D.; Turner, A. K.; Hunke, E. C.

    2017-12-01

    Icebergs represent approximately half of the mass flux from the Antarctic ice sheet, transporting freshwater and nutrients away from the coast to the Southern Ocean. Icebergs impact the surrounding ocean and sea ice environment, and serve as nutrient sources for biogeochemical activity, yet these processes are typically not resolved in current climate models. We have implemented a parameterization for iceberg drift and decay into the Department of Energy's Accelerated Climate Model for Energy (ACME), where the ocean, sea ice, and land ice components are based on the unstructured-grid modeling framework Model for Prediction Across Scales (MPAS), to improve the representation of Antarctic mass flux to the Southern Ocean and its impacts on ocean stratification and circulation, sea ice, and biogeochemical processes in a fully coupled global climate model. The iceberg model is implemented in two frameworks: Lagrangian and Eulerian. The Lagrangian framework embeds individual icebergs into the ocean and sea ice grids, and will be useful in modeling 'giant' (>10 nautical miles) iceberg events, which may have highly localized impacts on ocean and sea ice. The Eulerian framework allows us to model a realistic population of Antarctic icebergs without the computational expense of individual particle tracking to simulate the aggregate impact on the Southern Ocean climate system. This capability, together with under-ice-shelf ocean cavities and dynamic ice-shelf fronts, will allow for extremely high fidelity simulation of the southern cryosphere within ACME.

  15. Statistical similarities of pre-earthquake electromagnetic emissions to biological and economic extreme events

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos

    2014-05-01

    A phenomenon is considered "complex" when the phenomenological laws that describe the global behavior of the system are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those used for economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since the fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. In the search for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities between the pre-earthquake EM emissions related to an EQ, the pre-ictal electrical brain activity related to an epileptic seizure, and the pre-crisis economic observables related to the collapse of a share. It is demonstrated that all three dynamical systems' observables can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event related to each one of these different systems present striking quantitative similarities. It is also demonstrated that, for the considered systems, the nonextensive parameter q increases as the extreme event approaches, indicating that the strength of the long-memory/long-range interactions between the constituents of the system increases.
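    For reference, the nonextensive formalism invoked above is commonly built on the Tsallis entropy and the associated q-exponential; the two expressions below are the standard textbook forms, not the authors' specific frequency-size relation.

```latex
% Tsallis entropy and the q-exponential commonly used in nonextensive
% analyses of precursory time series (standard forms; the authors'
% frequency-size relation is not reproduced here).
\begin{align}
  S_q &= k \, \frac{1 - \sum_{i} p_i^{\,q}}{q - 1},
        \qquad S_q \xrightarrow{\,q \to 1\,} -k \sum_i p_i \ln p_i, \\
  e_q(x) &= \bigl[\, 1 + (1 - q)\, x \,\bigr]^{1/(1-q)},
        \qquad e_q(x) \xrightarrow{\,q \to 1\,} e^{x}.
\end{align}
```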

  16. How to distinguish between 'business as usual' and 'significant business disruptions' and plan accordingly.

    PubMed

    Halliwell, Peter

    2008-01-01

    This paper seeks to provide an insight into Air New Zealand and how business continuity is managed in an industry with inherent disruptions. The differences between 'business as usual' and 'significant business disruptions' are outlined along with their associated criteria, response and escalation processes. The paper describes why the company incorporates the four 'R's of the Civil Defence Emergency Management Act within its BCM framework and how this aids resilience. A case study is provided that details a 'significant disruption' that occurred in November 2006. This event resulted in the total loss of a sales office and cargo shed after unrest in the Kingdom of Tonga escalated to widespread rioting, looting and destruction of their central business district. The lessons from this event have been captured and provide some essential mitigation measures that will assist in future events.

  17. Importing perceived features into false memories.

    PubMed

    Lyle, Keith B; Johnson, Marcia K

    2006-02-01

    False memories sometimes contain specific details, such as location or colour, about events that never occurred. Based on the source-monitoring framework, we investigated one process by which false memories acquire details: the reactivation and misattribution of feature information from memories of similar perceived events. In Experiments 1A and 1B, when imagined objects were falsely remembered as seen, participants often reported that the objects had appeared in locations where visually or conceptually similar objects, respectively, had actually appeared. Experiment 2 indicated that colour and shape features of seen objects were misattributed to false memories of imagined objects. Experiment 3 showed that perceived details were misattributed to false memories of objects that had not been explicitly imagined. False memories that imported perceived features, compared to those that presumably did not, were subjectively more like memories for perceived events. Thus, perception may be even more pernicious than imagination in contributing to false memories.

  18. Integrating Remote and Social Sensing Data for a Scenario on Secure Societies in Big Data Platform

    NASA Astrophysics Data System (ADS)

    Albani, Sergio; Lazzarini, Michele; Koubarakis, Manolis; Taniskidou, Efi Karra; Papadakis, George; Karkaletsis, Vangelis; Giannakopoulos, George

    2016-08-01

    In the framework of the Horizon 2020 project BigDataEurope (Integrating Big Data, Software & Communities for Addressing Europe's Societal Challenges), a pilot for the Secure Societies Societal Challenge was designed considering the requirements coming from relevant stakeholders. The pilot is focusing on the integration in a Big Data platform of data coming from remote and social sensing. The information on land changes coming from the Copernicus Sentinel-1A sensor (Change Detection workflow) is integrated with information coming from selected Twitter and news agency accounts (Event Detection workflow) in order to provide the user with multiple sources of information. The Change Detection workflow implements a processing chain in a distributed parallel manner, exploiting the Big Data capabilities in place; the Event Detection workflow implements parallel and distributed social media and news agency monitoring as well as suitable mechanisms to detect and geo-annotate the related events.

  19. Developing an event stratigraphy for Heinrich Event 4 at Eirik Drift, South of Greenland

    NASA Astrophysics Data System (ADS)

    Stanford, Jennifer; Abbott, Peter; Davies, Siwan

    2014-05-01

    Heinrich events are characterised in North Atlantic sediments by horizons with increased Ice Rafted Debris (IRD) concentrations, low foraminiferal abundances, and light planktonic foraminiferal calcite δ18O (meltwater dilution). They occurred quasi-periodically with a spacing of 5,000-14,000 yrs (Hemming, 2004). It is commonly believed that large iceberg/meltwater injections likely caused slowdowns of the Atlantic Meridional Overturning Circulation (AMOC). However, Stanford et al. (2011) showed, using a basin-wide reconstruction of Heinrich Event 1 (~19-15 ka BP) based upon marine and terrestrial records on carefully scrutinised age models, that the main iceberg discharge event occurred some ~1000 years after the initial AMOC slowdown. The study highlighted the importance of robust chronological constraints in order to permit the development of a process understanding of the evolution of such climate events, by evaluation of statistical uncertainty and robust quantification of leads and lags in the ocean-climate system. Here, we present initial results from a marine sediment core recovered from Eirik Drift, South of Greenland, that span the time period that encompasses Heinrich Event 4 (35-45 ka BP). Today, sediments on Eirik Drift are deposited and reworked by the Deep Western Boundary Current (DWBC) and are also located beneath the pathway of the East Greenland and East Greenland Coastal Currents. Hence, Eirik Drift is a crucial monitoring site of surface and deep waters that exit the Arctic via the Denmark Strait. We here combine a proxy record for North Atlantic Deep Water (NADW) flow intensity (κARM/κ) with co-registered records of surface water conditions and place these on a palaeomagnetic and tephrochronologic stratigraphic framework. Given that this chronological framework is independent of environmental influences, basin-wide signal comparison is therefore permissible. Hemming, S. R. (2004), Heinrich Events: Massive Late Pleistocene detritus layers of the North Atlantic and their global imprint, Rev. Geophys., 42, 1-43. Stanford, J. D., Rohling, E. J., Bacon, S., Roberts, A. P., Grousset, F. E. & Bolshaw, M. (2011), A new concept for the paleoceanographic evolution of Heinrich event 1 in the North Atlantic, Quaternary Science Reviews, 20, 1047-1066.

  20. Demonstrating the Value of Near Real-time Satellite-based Earth Observations in a Research and Education Framework

    NASA Astrophysics Data System (ADS)

    Chiu, L.; Hao, X.; Kinter, J. L.; Stearn, G.; Aliani, M.

    2017-12-01

    The launch of the GOES-16 series provides an opportunity to advance near real-time applications in natural hazard detection, monitoring and warning. This study demonstrates the capability and value of receiving real-time satellite-based Earth observations over fast terrestrial networks and processing high-resolution remote sensing data in a university environment. The demonstration system includes 4 components: 1) near real-time data receiving and processing; 2) data analysis and visualization; 3) event detection and monitoring; and 4) information dissemination. Various tools are developed and integrated to receive and process GRB data in near real-time, produce images and value-added data products, and detect and monitor extreme weather events such as hurricanes, fires, flooding, fog, lightning, etc. A web-based application system is developed to disseminate near real-time satellite images and data products. The images are generated in a GIS-compatible format (GeoTIFF) to enable convenient use and integration in various GIS platforms. This study enhances the capacities for undergraduate and graduate education in Earth system and climate sciences, and related applications to understand the basic principles and technology in real-time applications with remote sensing measurements. It also provides an integrated platform for near real-time monitoring of extreme weather events, which is helpful for various user communities.
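    Writing a gridded product as a GIS-compatible GeoTIFF, as mentioned above, can be done with rasterio in a few lines. The array, geographic extent, grid spacing, and file name below are made up for illustration; they are not the study's products.

```python
# Minimal sketch of writing a gridded field as a GeoTIFF with rasterio;
# the data array, extent, resolution, and output name are all hypothetical.
import numpy as np
import rasterio
from rasterio.transform import from_origin

data = np.random.rand(500, 800).astype("float32")     # stand-in gridded field
transform = from_origin(west=-100.0, north=50.0, xsize=0.02, ysize=0.02)

with rasterio.open(
    "sample_product.tif", "w",
    driver="GTiff", height=data.shape[0], width=data.shape[1],
    count=1, dtype="float32", crs="EPSG:4326", transform=transform,
) as dst:
    dst.write(data, 1)   # write the field into band 1
```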

  1. Inhomogeneous Point-Processes to Instantaneously Assess Affective Haptic Perception through Heartbeat Dynamics Information

    PubMed Central

    Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.

    2016-01-01

    This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3–25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension. PMID:27357966
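    For orientation, point-process models of this kind typically place an inverse-Gaussian density on the waiting time to the next heartbeat; the expression below is the generic form of that density, with the history dependence entering through the mean. It is shown for illustration only and omits the paper's specific Laguerre/Wiener-Volterra parameterization.

```latex
% Generic inverse-Gaussian density for the waiting time w to the next
% heartbeat; in the full framework the mean \mu depends on past R-R
% intervals through the (nonlinear) Wiener-Volterra expansion.
\begin{equation}
  f(w \mid \mathcal{H}_t) \;=\;
  \sqrt{\frac{\lambda}{2\pi w^{3}}}\,
  \exp\!\left\{ -\,\frac{\lambda\,\bigl(w - \mu(\mathcal{H}_t)\bigr)^{2}}
                        {2\,\mu(\mathcal{H}_t)^{2}\, w} \right\},
  \qquad w > 0 .
\end{equation}
```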

  2. EventSlider User Manual

    DTIC Science & Technology

    2016-09-01

    EventSlider is a Windows Presentation Foundation (WPF) control developed using the .NET framework in Microsoft Visual Studio. As a WPF control, it can be used in any WPF application as a graphical visual element. The purpose of the control is to visually display time-related events as vertical lines on a... available on the control.

  3. The Collaborative Heliophysics Events Knowledgebase

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Schuler, D.; Cheung, C.

    2010-12-01

    The Collaborative Heliophysics Events Knowledgebase (CHEK) leverages and integrates the existing resources developed by HEK for SDO (Hurlburt et al. 2010) to provide a collaborative framework for heliophysics researchers. This framework will enable an environment where researchers can not only identify and locate relevant data but can also deploy a social network for sharing and expanding knowledge about heliophysical events. CHEK will expand the HEK and key HEK clients into the heliosphere and geospace, and create a heliophysics social network. We describe our design and goals of the CHEK project and discuss its relation to Citizen Science in the heliosphere. Hurlburt, N. et al. (2010), "A Heliophysics Event Knowledgebase for Solar Dynamics Observatory," Sol. Phys., in press.

  4. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    PubMed

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.
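    The hyperedge construction described above can be illustrated with a small numerical sketch: each media item spawns a hyperedge over its K nearest neighbours, and vertices are assigned to the hyperedge with a soft (Gaussian) probability. The feature vectors and bandwidth are synthetic, and the subsequent weight optimization and ranking are omitted.

```python
# Sketch of building a probabilistic KNN hypergraph incidence matrix H:
# rows are vertices (media items), columns are hyperedges, and each vertex
# belongs to a hyperedge with a Gaussian soft probability. Features are
# random stand-ins; the ranking/weight optimization is not shown.
import numpy as np

rng = np.random.default_rng(3)
features = rng.normal(size=(50, 16))        # 50 media items, 16-d features
K, sigma = 5, 1.0

def probabilistic_incidence(features, k, sigma):
    n = len(features)
    d = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
    H = np.zeros((n, n))                    # rows: vertices, cols: hyperedges
    for e in range(n):                      # hyperedge e centred on item e
        nn = np.argsort(d[e])[:k + 1]       # the item itself plus K neighbours
        w = np.exp(-d[e, nn] ** 2 / (2 * sigma ** 2))
        H[nn, e] = w / w.sum()              # probabilistic assignment
    return H

H = probabilistic_incidence(features, K, sigma)
print(H.shape, H[:, 0].sum())               # each hyperedge's assignments sum to 1
```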

  5. xQuake: A Modern Approach to Seismic Network Analytics

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.; Aikin, K. E.

    2017-12-01

    While seismic networks have expanded over the past few decades, and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation introduces the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework that is essentially a self-organizing graph database. An xGraph instance provides both the analytics as well as the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach for event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store and forward rings; this is not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g. using a Kafka Broker for store and forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable seismic community support in further development of its capabilities.

  6. Emotion and the prefrontal cortex: An integrative review.

    PubMed

    Dixon, Matthew L; Thiruchselvam, Ravi; Todd, Rebecca; Christoff, Kalina

    2017-10-01

    The prefrontal cortex (PFC) plays a critical role in the generation and regulation of emotion. However, we lack an integrative framework for understanding how different emotion-related functions are organized across the entire expanse of the PFC, as prior reviews have generally focused on specific emotional processes (e.g., decision making) or specific anatomical regions (e.g., orbitofrontal cortex). Additionally, psychological theories and neuroscientific investigations have proceeded largely independently because of the lack of a common framework. Here, we provide a comprehensive review of functional neuroimaging, electrophysiological, lesion, and structural connectivity studies on the emotion-related functions of 8 subregions spanning the entire PFC. We introduce the appraisal-by-content model, which provides a new framework for integrating the diverse range of empirical findings. Within this framework, appraisal serves as a unifying principle for understanding the PFC's role in emotion, while relative content-specialization serves as a differentiating principle for understanding the role of each subregion. A synthesis of data from affective, social, and cognitive neuroscience studies suggests that different PFC subregions are preferentially involved in assigning value to specific types of inputs: exteroceptive sensations, episodic memories and imagined future events, viscero-sensory signals, viscero-motor signals, actions, others' mental states (e.g., intentions), self-related information, and ongoing emotions. We discuss the implications of this integrative framework for understanding emotion regulation, value-based decision making, emotional salience, and refining theoretical models of emotion. This framework provides a unified understanding of how emotional processes are organized across PFC subregions and generates new hypotheses about the mechanisms underlying adaptive and maladaptive emotional functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. A joint modelling approach for multistate processes subject to resolution and under intermittent observations.

    PubMed

    Yiu, Sean; Tom, Brian

    2017-02-10

    Multistate processes provide a convenient framework when interest lies in characterising the transition intensities between a set of defined states. If, however, there is an unobserved event of interest (not known if and when the event occurs), which when it occurs stops future transitions in the multistate process from occurring, then drawing inference from the joint multistate and event process can be problematic. In health studies, a particular example of this could be resolution, where a resolved patient can no longer experience any further symptoms, and this is explored here for illustration. A multistate model that includes the state space of the original multistate process but partitions the state representing absent symptoms into a latent absorbing resolved state and a temporary transient state of absent symptoms is proposed. The expanded state space explicitly distinguishes between resolved and temporary spells of absent symptoms through disjoint states and allows the uncertainty of not knowing if resolution has occurred to be easily captured when constructing the likelihood; observations of absent symptoms can be considered to be temporary or having resulted from resolution. The proposed methodology is illustrated on a psoriatic arthritis data set where the outcome of interest is a set of intermittently observed disability scores. Estimated probabilities of resolving are also obtained from the model. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  8. Development and implementation of a peer-based mental health support programme for adolescents orphaned by HIV/AIDS in South Africa.

    PubMed

    Thupayagale-Tshweneagae, Gloria

    2011-12-01

    The article describes a framework and the process for the development of the peer-based mental health support programme and its implementation. The development of the peer-based mental health support programme is based on Erikson's theory on the adolescent phase of development, psycho-educational processes, the peer approach, and the orphaned adolescents' lived experiences as a conceptual framework. A triangulation of five qualitative methods (photography, reflective diaries, focus groups, an event history calendar and field notes) was used to capture the lived experiences of adolescents orphaned by HIV and AIDS. Analysis of data followed Colaizzi's method of data analysis. The combination of psycho-education, Erikson's stages of development and peer support assisted the participants in gaining knowledge and skills to overcome adversity and to become more resilient. The peer-based mental health support programme, if used, would enhance the mental health of adolescent orphans.

  9. Building a Community Framework for Adaptation to Sea Level Rise and Inundation

    NASA Astrophysics Data System (ADS)

    Culver, M. E.; Schubel, J.; Davidson, M. A.; Haines, J.

    2010-12-01

    Sea level rise and inundation pose a substantial risk to many coastal communities, and the risk is projected to increase because of continued development, changes in the frequency and intensity of inundation events, and acceleration in the rate of sea-level rise. Calls for action at all levels acknowledge that a viable response must engage federal, state and local expertise, perspectives, and resources in a coordinated and collaborative effort. Representatives from a variety of these agencies and organizations have developed a shared framework to help coastal communities structure and facilitate community-wide adaptation processes and to help agencies determine where investments should be made to enable states and local governments to assess impacts and initiate adaptation strategies over the next decade. For sea level rise planning and implementation, the requirements for high-quality data and information are vast and the availability is limited. Participants stressed the importance of data interoperability to ensure that users are able to apply data from a variety of sources and to improve availability and confidence in the data. Participants were able to prioritize the following six categories of data needed to support future sea level rise planning and implementation: data to understand land forms and where and how water will flow; monitoring data and environmental drivers; consistent sea level rise scenarios and projections across agencies to support local planning; data to characterize vulnerabilities and impacts of sea level rise; community characteristics; and legal frameworks and administrative structure. To develop a meaningful and effective sea level rise adaptation plan, state and local planners must understand how the availability, scale, and uncertainty of these types of data will impact new guidelines or adaptation measures. The tools necessary to carry out the adaptation planning process need to be understood in terms of data requirements, assumptions of the method, and the reliability and utility of the outputs. This type of information will assist the community in choosing among the available options. Communities have experience with storm and hazardous events, and the response typically is to return to pre-event conditions. With sea level rise, there will need to be a shift in perception and response from storm events, and people must collectively arrive at a new vision for their community in light of a changing environment. Understanding the possible scenarios and the uncertainty or probability of those scenarios is a critical component in the adaptation process. Although there is a broad constituency that does not know the issue well, many communities are savvy about the impacts of sea level rise. Armed with the available information and resources and an understanding of the uncertainties, many communities are ready to take action. Successful adaptation planning will require that all sectors — local, state, federal, academic, nongovernmental, and the private sector — work together throughout the process to provide local communities with resources, scientific and political support.

  10. Discrete diffusion models to study the effects of Mg2+ concentration on the PhoPQ signal transduction system

    PubMed Central

    2010-01-01

    Background: The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain a reasonable accuracy of the biological process and also reduces the computational overhead. This objective motivates the use of new methods that can transform the problem from energy and affinity based modeling to information theory based modeling. To achieve this, we transform all dynamics within the cell into a random event time, which is specified through an information domain measure like probability distribution. This allows us to use the “in silico” stochastic event based modeling approach to find the molecular dynamics of the system. Results: In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extra-cellular Mg2+ concentration in the two component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information domain measure of the molecular transport process by estimating the statistical parameters of inter-arrival time between molecules/ions coming to a cell receptor as external signal. This model transforms the diffusion process into the information theory measure of stochastic event completion time to get the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in-silico effects of this external trigger on the PhoPQ system. Conclusions: Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. Also, the proposed simulation framework can incorporate the stochasticity in cellular environments to a certain degree of accuracy. We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics. PMID:21143785

  11. Discrete diffusion models to study the effects of Mg2+ concentration on the PhoPQ signal transduction system.

    PubMed

    Ghosh, Preetam; Ghosh, Samik; Basu, Kalyan; Das, Sajal K; Zhang, Chaoyang

    2010-12-01

    The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain a reasonable accuracy of the biological process and also reduces the computational overhead. This objective motivates the use of new methods that can transform the problem from energy and affinity based modeling to information theory based modeling. To achieve this, we transform all dynamics within the cell into a random event time, which is specified through an information domain measure like probability distribution. This allows us to use the "in silico" stochastic event based modeling approach to find the molecular dynamics of the system. In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extra-cellular Mg2+ concentration in the two component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information domain measure of the molecular transport process by estimating the statistical parameters of inter-arrival time between molecules/ions coming to a cell receptor as external signal. This model transforms the diffusion process into the information theory measure of stochastic event completion time to get the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in-silico effects of this external trigger on the PhoPQ system. Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. Also, the proposed simulation framework can incorporate the stochasticity in cellular environments to a certain degree of accuracy. We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics.
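
    The event-based view described above can be illustrated with a minimal sketch (not the authors' implementation): inter-arrival times of ions at a receptor are drawn from an assumed exponential distribution, and the resulting event-time series is the "information domain" description of the transport process. All parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical parameters: mean inter-arrival time of ions at the receptor
# (not taken from the paper; chosen only to illustrate the event-based view).
mean_interarrival = 0.2   # arbitrary time units
horizon = 1000.0          # total simulated time

# Draw inter-arrival times from an exponential (memoryless) distribution and
# accumulate them into absolute event times.
interarrivals = rng.exponential(mean_interarrival, size=int(2 * horizon / mean_interarrival))
event_times = np.cumsum(interarrivals)
event_times = event_times[event_times <= horizon]

# The "information domain measure" of the transport process is then simply the
# empirical distribution of inter-arrival (event completion) times.
print("events observed:", event_times.size)
print("empirical mean inter-arrival:", np.diff(event_times).mean())
```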

  12. Dual-stage periodic event-triggered output-feedback control for linear systems.

    PubMed

    Ruan, Zhen; Chen, Wu-Hua; Lu, Xiaomei

    2018-05-01

    This paper proposes an event-triggered control framework, called dual-stage periodic event-triggered control (DSPETC), which unifies periodic event-triggered control (PETC) and switching event-triggered control (SETC). Specifically, two period parameters h1 and h2 are introduced to characterize the new event-triggering rule, where h1 denotes the sampling period and h2 denotes the monitoring period. By choosing specific values of h2, the proposed control scheme reduces to the PETC or SETC scheme. In the DSPETC framework, the controlled system is represented as a switched system model and its stability is analyzed via a switching-time-dependent Lyapunov functional. Both cases, with and without network-induced delays, are investigated. Simulation and experimental results show that the DSPETC scheme is superior to the PETC scheme and the SETC scheme. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
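
    As a rough illustration of the two-period idea (a fast sampling period and a coarser monitoring period at which the event rule is checked), the sketch below simulates a scalar plant under event-triggered state feedback. The plant, gains, threshold, and triggering rule are invented stand-ins and do not reproduce the DSPETC rule from the paper.

```python
# Illustrative scalar plant dx/dt = a*x + b*u under state feedback u = -k*xhat,
# where xhat is the most recently transmitted state sample (all values invented).
a, b, k = 0.5, 1.0, 2.0
h1, h2 = 0.01, 0.05          # sampling period and coarser monitoring period
sigma = 0.1                  # relative triggering threshold (illustrative)
T = 5.0

monitor_every = round(h2 / h1)
x, xhat, transmissions = 1.0, 1.0, 0
steps = int(T / h1)
for i in range(steps):
    # The event rule is only evaluated at monitoring instants (multiples of h2).
    if i % monitor_every == 0 and abs(x - xhat) > sigma * abs(x):
        xhat = x             # transmit a fresh state sample to the controller
        transmissions += 1
    u = -k * xhat
    x += h1 * (a * x + b * u)   # forward-Euler step of the plant dynamics

print(f"transmissions: {transmissions} of {steps} samples, final |x| = {abs(x):.4f}")
```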

  13. Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT

    NASA Astrophysics Data System (ADS)

    Wynne, Ben; ATLAS Collaboration

    2017-10-01

    We present an implementation of the ATLAS High Level Trigger, HLT, that provides parallel execution of trigger algorithms within the ATLAS multithreaded software framework, AthenaMT. This development will enable the ATLAS HLT to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the HLT input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that each execute algorithms sequentially for different events. AthenaMT will provide a fully multi-threaded environment that will additionally enable concurrent execution of algorithms within an event. This has the potential to significantly reduce the memory footprint on future manycore devices. An additional benefit of the HLT implementation within AthenaMT is that it facilitates the integration of offline code into the HLT. The trigger must retain high rejection in the face of increasing numbers of pileup collisions. This will be achieved by greater use of offline algorithms that are designed to maximize the discrimination of signal from background. Therefore a unification of the HLT and offline reconstruction software environment is required. This has been achieved while at the same time retaining important HLT-specific optimisations that minimize the computation performed to reach a trigger decision. Such optimizations include early event rejection and reconstruction within restricted geometrical regions. We report on an HLT prototype in which the need for HLT-specific components has been reduced to a minimum. Promising results have been obtained with a prototype that includes the key elements of trigger functionality including regional reconstruction and early event rejection. We report on the first experience of migrating trigger selections to this new framework and present the next steps towards a full implementation of the ATLAS trigger.

  14. Systems Reliability Framework for Surface Water Sustainability and Risk Management

    NASA Astrophysics Data System (ADS)

    Myers, J. R.; Yeghiazarian, L.

    2016-12-01

    With microbial contamination posing a serious threat to the availability of clean water across the world, it is necessary to develop a framework that evaluates the safety and sustainability of water systems with respect to non-point source fecal microbial contamination. The concept of water safety is closely related to the concept of failure in reliability theory. In water quality problems, the event of failure can be defined as the concentration of microbial contamination exceeding a certain standard for usability of water. It is pertinent in watershed management to know the likelihood of such an event of failure occurring at a particular point in space and time. Microbial fate and transport are driven by environmental processes taking place in complex, multi-component, interdependent environmental systems that are dynamic and spatially heterogeneous, which means these processes and therefore their influences upon microbial transport must be considered stochastic and variable through space and time. A physics-based stochastic model of microbial dynamics is presented that propagates uncertainty using a unique sampling method based on artificial neural networks to produce a correlation between watershed characteristics and spatial-temporal probabilistic patterns of microbial contamination. These results are used to address the question of water safety through several sustainability metrics: reliability, vulnerability, resilience and a composite sustainability index. System reliability is described uniquely through the temporal evolution of risk along watershed points or pathways. Probabilistic resilience describes how long the system is above a certain probability of failure, and the vulnerability metric describes how the temporal evolution of risk changes throughout a hierarchy of failure levels. Additionally our approach allows for the identification of contributions in microbial contamination and uncertainty from specific pathways and sources. We expect that this framework will significantly improve the efficiency and precision of sustainable watershed management strategies through providing a better understanding of how watershed characteristics and environmental parameters affect surface water quality and sustainability.

  15. New Insights into the Estimation of Extreme Geomagnetic Storm Occurrences

    NASA Astrophysics Data System (ADS)

    Ruffenach, Alexis; Winter, Hugo; Lavraud, Benoit; Bernardara, Pietro

    2017-04-01

    Space weather events such as intense geomagnetic storms are major disturbances of the near-Earth environment that may lead to serious impacts on our modern society. As such, it is of great importance to estimate their probability, and in particular that of extreme events. One approach widely used in statistical sciences for extreme-event probability estimates is Extreme Value Analysis (EVA). Using this rigorous statistical framework, estimations of the occurrence of extreme geomagnetic storms are performed here based on the most relevant global parameters related to geomagnetic storms, such as ground parameters (e.g. geomagnetic Dst and aa indexes), and space parameters related to the characteristics of Coronal Mass Ejections (CME) (velocity, southward magnetic field component, electric field). Using our fitted model, we estimate the annual probability of a Carrington-type event (Dst = -850 nT) to be on the order of 10^-3, with a lower limit of the uncertainties on the return period of ~500 years. Our estimate is significantly higher than that of most past studies, which typically obtained a return period of a few hundred years at most. Thus, precautions are required when extrapolating to intense values. Currently, the complexity of the processes and the limited length of available data inevitably lead to significant uncertainties in return period estimates for the occurrence of extreme geomagnetic storms. However, our application of extreme value models for extrapolating into the tail of the distribution provides a mathematically justified framework for the estimation of extreme return periods, thereby enabling the determination of more accurate estimates and reduced associated uncertainties.
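
    For readers unfamiliar with the peaks-over-threshold flavour of EVA, the sketch below fits a generalized Pareto distribution to exceedances of a synthetic storm-intensity record and converts the fitted tail into an annual exceedance probability. The data, threshold choice, and resulting numbers are invented; they are not the study's data or results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic stand-in for a multi-decade catalogue of storm intensities (|Dst|-like,
# in nT); the distribution and values are illustrative only.
n_years = 60
storms = 100.0 + stats.genpareto.rvs(c=0.15, scale=45.0, size=n_years * 20, random_state=rng)

# Peaks-over-threshold: keep exceedances above a high threshold and fit a GPD.
threshold = np.quantile(storms, 0.95)
excess = storms[storms > threshold] - threshold
shape, _, scale = stats.genpareto.fit(excess, floc=0.0)

# Annual probability of exceeding a Carrington-like level, combining the
# exceedance rate per year with the fitted tail.
level = 850.0
events_per_year = excess.size / n_years
p_annual = events_per_year * stats.genpareto.sf(level - threshold, shape, loc=0.0, scale=scale)
print(f"fitted shape = {shape:.2f}, scale = {scale:.1f}")
print(f"estimated annual probability of exceeding {level:.0f} nT: {p_annual:.1e}")
```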

  16. A database de-identification framework to enable direct queries on medical data for secondary use.

    PubMed

    Erdal, B S; Liu, J; Ding, J; Chen, J; Marsh, C B; Kamal, J; Clymer, B D

    2012-01-01

    To qualify the use of patient clinical records as non-human-subject research, electronic medical record data must be de-identified so that there is minimal risk of protected health information exposure. This study demonstrated a robust framework for structured data de-identification that can be applied to any relational data source that needs to be de-identified. Using a real-world clinical data warehouse, a pilot implementation covering a limited set of subject areas was used to demonstrate and evaluate this new de-identification process. Query results and performance were compared between the source and target systems to validate data accuracy and usability. The combination of hashing, pseudonyms, and a session-dependent randomizer provides a rigorous de-identification framework to guard against 1) source identifier exposure; 2) internal data analysts manually linking to source identifiers; and 3) identifier cross-linking among different researchers or multiple query sessions by the same researcher. In addition, a query rejection option is provided to refuse queries resulting in fewer than preset numbers of subjects and total records, to prevent users from accidentally identifying subjects due to low volume of data. This framework does not prevent subject re-identification based on prior knowledge and sequence of events. Also, it does not deal with medical free-text de-identification, although text de-identification using natural language processing can be included due to its modular design. We demonstrated a framework resulting in HIPAA-compliant databases that can be directly queried by researchers. This technique can be augmented to facilitate inter-institutional research data sharing through existing middleware such as caGrid.
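
    A toy sketch of the hashing, pseudonym, and session-dependent-randomizer idea follows; the function names, key-handling scheme, and threshold are hypothetical and greatly simplified relative to the framework described above.

```python
import hashlib
import secrets

# Illustrative only: key management, salting policy, and the query-rejection rule
# in the actual framework are more involved than this sketch.
INSTITUTION_SECRET = b"replace-with-protected-key"   # hypothetical protected key

def stable_pseudonym(patient_id: str) -> str:
    """Same pseudonym for a patient across the de-identified warehouse, but not
    reversible to the source identifier without the protected key."""
    return hashlib.sha256(INSTITUTION_SECRET + patient_id.encode()).hexdigest()[:16]

def session_pseudonym(patient_id: str, session_salt: bytes) -> str:
    """Session-dependent pseudonym: identifiers cannot be cross-linked between
    different researchers or between query sessions of the same researcher."""
    return hashlib.sha256(session_salt + patient_id.encode()).hexdigest()[:16]

def run_query(rows, min_subjects=10):
    """Query-rejection option: refuse result sets with too few distinct subjects."""
    subjects = {r["patient_id"] for r in rows}
    if len(subjects) < min_subjects:
        raise ValueError("query rejected: result set too small to release")
    salt = secrets.token_bytes(16)   # fresh randomizer for this query session
    return [{**r, "patient_id": session_pseudonym(r["patient_id"], salt)} for r in rows]
```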

  17. CAVIAR: a 45k neuron, 5M synapse, 12G connects/s AER hardware sensory-processing-learning-actuating system for high-speed visual object recognition and tracking.

    PubMed

    Serrano-Gotarredona, Rafael; Oster, Matthias; Lichtsteiner, Patrick; Linares-Barranco, Alejandro; Paz-Vicente, Rafael; Gomez-Rodriguez, Francisco; Camunas-Mesa, Luis; Berner, Raphael; Rivas-Perez, Manuel; Delbruck, Tobi; Liu, Shih-Chii; Douglas, Rodney; Hafliger, Philipp; Jimenez-Moreno, Gabriel; Civit Ballcels, Anton; Serrano-Gotarredona, Teresa; Acosta-Jimenez, Antonio J; Linares-Barranco, Bernabé

    2009-09-01

    This paper describes CAVIAR, a massively parallel hardware implementation of a spike-based sensing-processing-learning-actuating system inspired by the physiology of the nervous system. CAVIAR uses the asynchronous address-event representation (AER) communication framework and was developed in the context of a European Union-funded project. It has four custom mixed-signal AER chips, five custom digital AER interface components, 45k neurons (spiking cells), up to 5M synapses, performs 12G synaptic operations per second, and achieves millisecond object recognition and tracking latencies.

  18. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.

  19. Minimizing medical litigation, part 2.

    PubMed

    Harold, Tan Keng Boon

    2006-01-01

    Provider-patient disputes are inevitable in the healthcare sector. Healthcare providers and regulators should recognize this and plan opportunities to enforce alternative dispute resolution (ADR) as early as possible in the care delivery process. Negotiation is often the main dispute resolution method used by local healthcare providers, failing which litigation would usually follow. The role of mediation in resolving malpractice disputes has been minimal. Healthcare providers, administrators, and regulators should therefore look toward a post-event communication-cum-mediation framework as the key national strategy for resolving malpractice disputes.

  20. Making Sense in the Edge of Chaos: A Framework for Effective Initial Response Efforts to Large-Scale Incidents

    DTIC Science & Technology

    2010-09-01

    working with equally experienced partners who can, cumulatively, help each other make sense of chaotic situations. “Human brains collect, organize...but a process reinforced by years of Fire Department training. No matter what we do, even an optimally functioning human brain will prepare for...trick or reorganize the brain of those who will be first responding incident commanders to an edge-of-chaos event into creatively making sense of

  1. Motivation, cognition and pseudoscience.

    PubMed

    Lindeman, M

    1998-12-01

    The article proposes a framework that views pseudoscientific beliefs as a joint function of the basic social motives and the default way of processing everyday information. The interplay between the basic motives and experiential thinking is illustrated with three examples. The first concerns comprehension of self via astrology and graphology, and the second involves the comprehension of unexpected events (one domain of the motive to comprehend the world). The last example describes health control by alternative medicine, as a modern way of controlling future outcomes.

  2. The role of ensemble post-processing for modeling the ensemble tail

    NASA Astrophysics Data System (ADS)

    Van De Vyver, Hans; Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2016-04-01

    Over the past decades, the numerical weather prediction community has witnessed a paradigm shift from deterministic to probabilistic forecasting and state estimation (Buizza and Leutbecher, 2015; Buizza et al., 2008), in an attempt to quantify the uncertainties associated with initial-condition and model errors. An important benefit of a probabilistic framework is the improved prediction of extreme events. However, one may ask to what extent such model estimates contain information on the occurrence probability of extreme events and how this information can be optimally extracted. Different approaches have been proposed and applied to real-world systems which, based on extreme value theory, allow the estimation of extreme-event probabilities conditional on forecasts and state estimates (Ferro, 2007; Friederichs, 2010). Using ensemble predictions generated with a model of low dimensionality, a thorough investigation is presented quantifying how the predictability of extreme events changes with ensemble post-processing and other influencing factors, including the finite ensemble size, the lead time, the model assumptions, and the use of different covariates (ensemble mean, maximum, spread, ...) for modeling the tail distribution. Tail modeling is performed by deriving extreme-quantile estimates using a peak-over-threshold representation (generalized Pareto distribution) or quantile regression. Common ensemble post-processing methods aim mostly to improve the ensemble mean and spread of a raw forecast (Van Schaeybroeck and Vannitsem, 2015). Conditional tail modeling, on the other hand, is a post-processing step in itself, focusing on the tails only. Therefore, it is unclear how applying ensemble post-processing prior to conditional tail modeling affects the skill of extreme-event predictions. This work investigates this question in detail. Buizza, Leutbecher, and Isaksen, 2008: Potential use of an ensemble of analyses in the ECMWF Ensemble Prediction System, Q. J. R. Meteorol. Soc. 134: 2051-2066. Buizza and Leutbecher, 2015: The forecast skill horizon, Q. J. R. Meteorol. Soc. 141: 3366-3382. Ferro, 2007: A probability model for verifying deterministic forecasts of extreme events. Weather and Forecasting 22 (5), 1089-1100. Friederichs, 2010: Statistical downscaling of extreme precipitation events using extreme value theory. Extremes 13, 109-132. Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q. J. R. Meteorol. Soc., 141: 807-818.
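
    As one concrete example of conditional tail modeling with a covariate, the sketch below runs a quantile regression of observations on a synthetic ensemble-mean predictor and estimates a high conditional quantile. The data, variable names, and quantile level are invented and do not reproduce the study's low-order model or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Synthetic stand-in: verifying observations and an ensemble-mean covariate.
# The study also considers other covariates (ensemble maximum, spread, ...);
# this sketch uses the mean only.
n = 2000
ens_mean = rng.normal(0.0, 1.0, size=n)
obs = ens_mean + rng.standard_t(df=4, size=n)      # heavy-tailed forecast error
df = pd.DataFrame({"obs": obs, "ens_mean": ens_mean})

# Conditional tail modeling via quantile regression: estimate a high quantile
# of the observation given the ensemble covariate.
model = smf.quantreg("obs ~ ens_mean", df).fit(q=0.95)
print(model.params)

# Extreme-quantile estimate conditional on a strong forecast signal.
print("95th percentile given ens_mean = 2:",
      model.params["Intercept"] + model.params["ens_mean"] * 2.0)
```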

  3. A Rich Client-Server Based Framework for Convenient Security and Management of Mobile Applications

    NASA Astrophysics Data System (ADS)

    Badan, Stephen; Probst, Julien; Jaton, Markus; Vionnet, Damien; Wagen, Jean-Frédéric; Litzistorf, Gérald

    Contact lists, emails, SMS messages, or custom applications on a professional smartphone could hold very confidential or sensitive information. What could happen in case of theft or accidental loss of such devices? Such events could be detected by the separation of the smartphone from a Bluetooth companion device. Detecting this event should typically block the applications and trigger deletion of personal and sensitive data. Here, a solution is proposed based on a secure framework application running on the mobile phone as a rich client connected to a security server. The framework offers strong and customizable authentication and secure connectivity. A security server manages all security issues. User applications are then loaded via the framework. User data can be secured, synchronized, pushed, or pulled via the framework. This contribution proposes a convenient yet secure environment based on a client-server architecture using external authentication. Several features of the proposed system are presented and a practical demonstrator is described.

  4. Mining Multi-Aspect Reflection of News Events in Twitter: Discovery, Linking and Presentation

    PubMed Central

    Wang, Jingjing; Tong, Wenzhu; Yu, Hongkun; Li, Min; Ma, Xiuli; Cai, Haoyan; Hanratty, Tim; Han, Jiawei

    2015-01-01

    A major event often has repercussions on both news media and microblogging sites such as Twitter. Reports from mainstream news agencies and discussions from Twitter complement each other to form a complete picture. An event can have multiple aspects (sub-events) describing it from multiple angles, each of which attracts opinions/comments posted on Twitter. Mining such reflections is interesting to both policy makers and ordinary people seeking information. In this paper, we propose a unified framework to mine multi-aspect reflections of news events in Twitter. We propose a novel and efficient dynamic hierarchical entity-aware event discovery model to learn news events and their multiple aspects. The aspects of an event are linked to their reflections in Twitter by a bootstrapped dataless classification scheme, which elegantly handles the challenges of selecting informative tweets under overwhelming noise and bridging the vocabularies of news and tweets. In addition, we demonstrate that our framework naturally generates an informative presentation of each event with entity graphs, time spans, news summaries and tweet highlights to facilitate user digestion. PMID:27034625

  5. Mining Multi-Aspect Reflection of News Events in Twitter: Discovery, Linking and Presentation.

    PubMed

    Wang, Jingjing; Tong, Wenzhu; Yu, Hongkun; Li, Min; Ma, Xiuli; Cai, Haoyan; Hanratty, Tim; Han, Jiawei

    2015-11-01

    A major event often has repercussions on both news media and microblogging sites such as Twitter. Reports from mainstream news agencies and discussions from Twitter complement each other to form a complete picture. An event can have multiple aspects (sub-events) describing it from multiple angles, each of which attracts opinions/comments posted on Twitter. Mining such reflections is interesting to both policy makers and ordinary people seeking information. In this paper, we propose a unified framework to mine multi-aspect reflections of news events in Twitter. We propose a novel and efficient dynamic hierarchical entity-aware event discovery model to learn news events and their multiple aspects. The aspects of an event are linked to their reflections in Twitter by a bootstrapped dataless classification scheme, which elegantly handles the challenges of selecting informative tweets under overwhelming noise and bridging the vocabularies of news and tweets. In addition, we demonstrate that our framework naturally generates an informative presentation of each event with entity graphs, time spans, news summaries and tweet highlights to facilitate user digestion.

  6. A Framework to Understand Extreme Space Weather Event Probability.

    PubMed

    Jonas, Seth; Fronczyk, Kassandra; Pratt, Lucas M

    2018-03-12

    An extreme space weather event has the potential to disrupt or damage infrastructure systems and technologies that many societies rely on for economic and social well-being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualize space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments. © 2018 Society for Risk Analysis.

  7. Hierarchical Event Descriptors (HED): Semi-Structured Tagging for Real-World Events in Large-Scale EEG

    PubMed Central

    Bigdely-Shamlo, Nima; Cockfield, Jeremy; Makeig, Scott; Rognon, Thomas; La Valle, Chris; Miyakoshi, Makoto; Robbins, Kay A.

    2016-01-01

    Real-world brain imaging by EEG requires accurate annotation of complex subject-environment interactions in event-rich tasks and paradigms. This paper describes the evolution of the Hierarchical Event Descriptor (HED) system for systematically describing both laboratory and real-world events. HED version 2, first described here, provides the semantic capability of describing a variety of subject and environmental states. HED descriptions can include stimulus presentation events on screen or in virtual worlds, experimental or spontaneous events occurring in the real world environment, and events experienced via one or multiple sensory modalities. Furthermore, HED 2 can distinguish between the mere presence of an object and its actual (or putative) perception by a subject. Although the HED framework has implicit ontological and linked data representations, the user-interface for HED annotation is more intuitive than traditional ontological annotation. We believe that hiding the formal representations allows for a more user-friendly interface, making consistent, detailed tagging of experimental, and real-world events possible for research users. HED is extensible while retaining the advantages of having an enforced common core vocabulary. We have developed a collection of tools to support HED tag assignment and validation; these are available at hedtags.org. A plug-in for EEGLAB (sccn.ucsd.edu/eeglab), CTAGGER, is also available to speed the process of tagging existing studies. PMID:27799907

  8. Post-event Processing Predicts Impaired Cortisol Recovery Following Social Stressor: The Moderating Role of Social Anxiety.

    PubMed

    Maeda, Shunta; Sato, Tomoya; Shimada, Hironori; Tsumura, Hideki

    2017-01-01

    There is growing evidence that individuals with social anxiety show impaired cortisol recovery after experiencing social-evaluative stressors. Yet, little is known regarding the cognitive processes underlying such impaired cortisol recovery. The present study examined the effect of post-event processing (PEP), that is, repetitive thinking about social situations, on cortisol recovery following a social stressor. Forty-two non-clinical university students (23 women, 19 men, mean age = 22.0 ± 2.0 years) completed the Trier Social Stress Test (TSST), followed by a thought-sampling procedure that assessed the frequency of PEP about the TSST. A growth curve model showed that PEP and social anxiety interactively predicted cortisol recovery. In particular, PEP predicted impaired cortisol recovery in those with low levels of social anxiety but not in those with high levels of social anxiety, which contradicted the initial hypothesis. These findings suggest that PEP is differentially associated with cortisol recovery depending on levels of social anxiety. The possible mechanisms underlying these findings were discussed in terms of a protective inhibition framework.

  9. Auditory Scene Analysis: An Attention Perspective

    PubMed Central

    2017-01-01

    Purpose: This review article provides a new perspective on the role of attention in auditory scene analysis. Method: A framework for understanding how attention interacts with stimulus-driven processes to facilitate task goals is presented. Previously reported data obtained through behavioral and electrophysiological measures in adults with normal hearing are summarized to demonstrate attention effects on auditory perception—from passive processes that organize unattended input to attention effects that act at different levels of the system. Data will show that attention can sharpen stream organization toward behavioral goals, identify auditory events obscured by noise, and limit passive processing capacity. Conclusions: A model of attention is provided that illustrates how the auditory system performs multilevel analyses that involve interactions between stimulus-driven input and top-down processes. Overall, these studies show that (a) stream segregation occurs automatically and sets the basis for auditory event formation; (b) attention interacts with automatic processing to facilitate task goals; and (c) information about unattended sounds is not lost when selecting one organization over another. Our results support a neural model that allows multiple sound organizations to be held in memory and accessed simultaneously through a balance of automatic and task-specific processes, allowing flexibility for navigating noisy environments with competing sound sources. Presentation Video: http://cred.pubs.asha.org/article.aspx?articleid=2601618 PMID:29049599

  10. Marketing in nursing organizations.

    PubMed

    Chambers, S B

    1989-05-01

    The purpose of chapter 3 is to provide a conceptual framework for understanding marketing. Although it is often considered to be one, marketing is not really a new activity for nursing organizations. What is perhaps new to most nursing organizations is the conduct of marketing activities as a series of interrelated events that are part of a strategic marketing process. The increasingly volatile nursing environment requires a comprehensive approach to marketing. This chapter presents definitions of marketing, the marketing mix, the characteristics of nonprofit marketing, the relationship of strategic planning and strategic marketing, portfolio analysis, and a detailed description of the strategic marketing process. While this chapter focuses on marketing concepts, essential components, and presentation of the strategic marketing process, chapter 4 presents specific methods and techniques for implementing the strategic marketing process.

  11. Engineering risk assessment for emergency disposal projects of sudden water pollution incidents.

    PubMed

    Shi, Bin; Jiang, Jiping; Liu, Rentao; Khan, Afed Ullah; Wang, Peng

    2017-06-01

    Without an engineering risk assessment for emergency disposal in response to sudden water pollution incidents, responders are likely to be challenged during emergency decision making. To address this gap, the concept and framework of emergency disposal engineering risks are reported in this paper. The proposed risk index system covers three stages consistent with the progress of an emergency disposal project. Fuzzy fault tree analysis (FFTA), a logical and diagrammatic method, was developed to evaluate the potential failure during the process of emergency disposal. The probabilities of basic events, and of their combinations that can cause the failure of an emergency disposal project, were calculated based on the case of an emergency disposal project for an aniline pollution incident in the Zhuozhang River, Changzhi, China, in 2014. The critical events that can cause the occurrence of a top event (TE) were identified according to their contribution. Finally, advice on how to take measures with limited resources to prevent the occurrence of a TE is given according to the quantified results of risk magnitude. The proposed approach could be a potential useful safeguard for the implementation of an emergency disposal project during the process of emergency response.
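
    The crisp gate arithmetic that fault tree analysis (and its fuzzy extension) builds on can be sketched as follows; the basic events, probabilities, and tree structure are invented for illustration and are not those of the Zhuozhang River case.

```python
from functools import reduce

def and_gate(probs):
    """All input events must occur (independence assumed)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """The output event occurs if any input event occurs (independence assumed)."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical basic events for an emergency disposal project (illustrative values).
basic = {"placement_failure": 0.05, "dosing_error": 0.10, "monitoring_delay": 0.02}

def top_event(p):
    # Intermediate event: the treatment step fails if placement OR dosing fails;
    # top event: treatment failure AND a monitoring delay that lets it go unnoticed.
    return and_gate([or_gate([p["placement_failure"], p["dosing_error"]]), p["monitoring_delay"]])

p_top = top_event(basic)
print(f"P(top event) = {p_top:.4f}")

# Crude importance measure: how much P(top event) drops if a basic event is eliminated.
for name in basic:
    reduced = {**basic, name: 0.0}
    print(f"eliminating {name}: P(top event) = {top_event(reduced):.4f}")
```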

  12. Sensemaking of patient safety risks and hazards.

    PubMed

    Battles, James B; Dixon, Nancy M; Borotkanics, Robert J; Rabin-Fastmen, Barbara; Kaplan, Harold S

    2006-08-01

    In order for organizations to become learning organizations, they must make sense of their environment and learn from safety events. Sensemaking, as described by Weick (1995), literally means making sense of events. The ultimate goal of sensemaking is to build the understanding that can inform and direct actions to eliminate risk and hazards that are a threat to patient safety. True sensemaking in patient safety must use both retrospective and prospective approaches to learning. Sensemaking is an essential part of the design process leading to risk-informed design. Sensemaking serves as a conceptual framework to bring together well-established approaches to assessment of risk and hazards: (1) at the single event level using root cause analysis (RCA), (2) at the process level using failure modes effects analysis (FMEA) and (3) at the system level using probabilistic risk assessment (PRA). The results of these separate or combined approaches are most effective when end users in conversation-based meetings add their expertise and knowledge to the data produced by the RCA, FMEA, and/or PRA in order to make sense of the risks and hazards. Without ownership engendered by such conversations, the possibility of effective action to eliminate or minimize them is greatly reduced.

  13. Sensemaking of Patient Safety Risks and Hazards

    PubMed Central

    Battles, James B; Dixon, Nancy M; Borotkanics, Robert J; Rabin-Fastmen, Barbara; Kaplan, Harold S

    2006-01-01

    In order for organizations to become learning organizations, they must make sense of their environment and learn from safety events. Sensemaking, as described by Weick (1995), literally means making sense of events. The ultimate goal of sensemaking is to build the understanding that can inform and direct actions to eliminate risk and hazards that are a threat to patient safety. True sensemaking in patient safety must use both retrospective and prospective approaches to learning. Sensemaking is an essential part of the design process leading to risk-informed design. Sensemaking serves as a conceptual framework to bring together well-established approaches to assessment of risk and hazards: (1) at the single event level using root cause analysis (RCA), (2) at the process level using failure modes effects analysis (FMEA) and (3) at the system level using probabilistic risk assessment (PRA). The results of these separate or combined approaches are most effective when end users in conversation-based meetings add their expertise and knowledge to the data produced by the RCA, FMEA, and/or PRA in order to make sense of the risks and hazards. Without ownership engendered by such conversations, the possibility of effective action to eliminate or minimize them is greatly reduced. PMID:16898979

  14. A Bayesian analysis of the 2016 Pedernales (Ecuador) earthquake rupture process

    NASA Astrophysics Data System (ADS)

    Gombert, B.; Duputel, Z.; Jolivet, R.; Rivera, L. A.; Simons, M.; Jiang, J.; Liang, C.; Fielding, E. J.

    2017-12-01

    The 2016 Mw = 7.8 Pedernales earthquake is the largest event to strike Ecuador since 1979. Long-period W-phase and Global CMT solutions suggest that slip is not perpendicular to the trench axis, in agreement with the convergence obliquity of the Ecuadorian subduction. In this study, we propose a new co-seismic kinematic slip model obtained from the joint inversion of multiple observations in an unregularized and fully Bayesian framework. We use a comprehensive static dataset composed of several InSAR scenes, GPS static offsets, and tsunami waveforms from two nearby DART stations. The kinematic component of the rupture process is constrained by an extensive network of High-Rate GPS and accelerometers. Our solution includes the ensemble of all plausible models that are consistent with our prior information and fit the available observations within data and prediction uncertainties. We analyse the source process in light of the historical seismicity, in particular the Mw = 7.8 1942 earthquake, whose rupture extent overlaps with that of the 2016 event. In addition, we conduct a probabilistic comparison of co-seismic slip with a stochastic interseismic coupling model obtained from GPS data, shedding light on the processes at play within the Ecuadorian subduction margin.

  15. DEPENDENCE OF THE Sr-TO-Ba AND Sr-TO-Eu RATIO ON THE NUCLEAR EQUATION OF STATE IN METAL-POOR HALO STARS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Famiano, M. A.; Kajino, T.; Aoki, W.

    A model is proposed in which the dependence on the equation of state (EOS) of the scatter of [Sr/Ba] in metal-poor stars is studied. Light r-process element enrichment in these stars has been explained via a truncated r-process, or “tr-process.” The truncation of the r-process from a generic core-collapse event followed by a collapse into an accretion-induced black hole is examined in the framework of a galactic chemical evolution model. The constraints on this model imposed by observations of extremely metal-poor stars are explained, and the upper limits in the [Sr/Ba] distributions are found to be related to the nuclear EOS in a collapse scenario. The scatter in [Sr/Ba] and [Sr/Eu] as a function of metallicity has been found to be consistent with turbulent ejection in core-collapse supernovae. Adaptations of this model are evaluated to account for the scatter in isotopic observables. This is done by assuming mixing in ejecta in a supernova event. Stiff EOS are eliminated by this model.

  16. Conditioning from an information processing perspective.

    PubMed

    Gallistel, C R.

    2003-04-28

    The framework provided by Claude Shannon's [Bell Syst. Technol. J. 27 (1948) 623] theory of information leads to a quantitatively oriented reconceptualization of the processes that mediate conditioning. The focus shifts from processes set in motion by individual events to processes sensitive to the information carried by the flow of events. The conception of what properties of the conditioned and unconditioned stimuli are important shifts from the tangible properties to the intangible properties of number, duration, frequency and contingency. In this view, a stimulus becomes a CS if its onset substantially reduces the subject's uncertainty about the time of occurrence of the next US. One way to represent the subject's knowledge of that time of occurrence is by the cumulative probability function, which has two limiting forms: (1) The state of maximal uncertainty (minimal knowledge) is represented by the inverse exponential function for the random rate condition, in which the US is equally likely at any moment. (2) The limit to the subject's attainable certainty is represented by the cumulative normal function, whose momentary expectation is the CS-US latency minus the time elapsed since CS onset. Its standard deviation is the Weber fraction times the CS-US latency.
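
    The two limiting forms mentioned above can be written out explicitly; the notation below is added here for clarity and is not Gallistel's own.

```latex
% Knowledge of the time t to the next US, expressed as cumulative probability functions.
% Random-rate limit (maximal uncertainty): the US is equally likely at any moment,
\[
  F_{\mathrm{random}}(t) = 1 - e^{-\lambda t} .
\]
% Timed limit (maximal attainable certainty): the remaining wait has mean equal to the
% CS-US latency T minus the time already elapsed since CS onset, and standard deviation
% equal to the Weber fraction w times T,
\[
  F_{\mathrm{timed}}(t) = \Phi\!\left(\frac{t - (T - t_{\mathrm{elapsed}})}{w\,T}\right).
\]
```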

  17. Concise biomarker for spatial-temporal change in three-dimensional ultrasound measurement of carotid vessel wall and plaque thickness based on a graph-based random walk framework: Towards sensitive evaluation of response to therapy.

    PubMed

    Chiu, Bernard; Chen, Weifu; Cheng, Jieyu

    2016-12-01

    Rapid progression in total plaque area and volume measured from ultrasound images has been shown to be associated with an elevated risk of cardiovascular events. Since atherosclerosis is focal and predominantly occurs at the bifurcation, biomarkers that are able to quantify the spatial distribution of vessel-wall-plus-plaque thickness (VWT) change may allow for more sensitive detection of treatment effect. The goal of this paper is to develop simple and sensitive biomarkers to quantify the responsiveness to therapies based on the spatial distribution of VWT-Change on the entire 2D carotid standardized map previously described. Point-wise VWT-Changes computed for each patient were reordered lexicographically into a high-dimensional data node in a graph. A graph-based random walk framework was applied with a novel Weighted Cosine (WCos) similarity function introduced and tailored for quantifying responsiveness to therapy. The converging probability of each data node to the VWT regression template in the random walk process served as a scalar descriptor of VWT responsiveness to treatment. The WCos-based biomarker was 14 times more sensitive than the mean VWT-Change in discriminating responsive and unresponsive subjects based on the p-values obtained in T-tests. The proposed framework was extended to quantify where VWT-Change occurred by including multiple VWT-Change distribution templates representing focal changes at different regions. Experimental results show that the framework was effective in classifying carotid arteries with focal VWT-Change at different locations and may facilitate future investigations to correlate risk of cardiovascular events with the location where focal VWT-Change occurs. Copyright © 2016 Elsevier Ltd. All rights reserved.
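
    The "converging probability" idea can be illustrated with an ordinary absorbing random walk on a similarity graph; the sketch below uses plain cosine similarity rather than the paper's Weighted Cosine function, and all VWT-Change maps are synthetic.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(3)

# Synthetic point-wise VWT-Change maps flattened to vectors: two labelled template
# nodes (regression vs. progression) plus three unlabelled patients (all invented).
regression = -np.abs(rng.normal(0.3, 0.1, size=50))     # wall thinning
progression = np.abs(rng.normal(0.3, 0.1, size=50))     # wall thickening
patients = [rng.normal(m, 0.2, size=50) for m in (-0.2, 0.05, 0.25)]
nodes = [regression, progression] + patients
n = len(nodes)

# Non-negative similarity weights between distinct nodes, with a small floor so
# every row of the transition matrix is well defined.
W = np.array([[max(cosine(nodes[i], nodes[j]), 0.0) if i != j else 0.0
               for j in range(n)] for i in range(n)]) + 1e-6
np.fill_diagonal(W, 0.0)

# Make the two template nodes absorbing, then row-normalize into a random walk.
W[0, :], W[1, :] = 0.0, 0.0
W[0, 0] = W[1, 1] = 1.0
P = W / W.sum(axis=1, keepdims=True)

# Probability that a walk started at each patient converges to the regression
# template, read off the limiting transition matrix.
P_inf = np.linalg.matrix_power(P, 500)
for k, p in enumerate(P_inf[2:, 0], start=1):
    print(f"patient {k}: P(converges to regression template) = {p:.2f}")
```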

  18. A retrospective streamflow ensemble forecast for an extreme hydrologic event: a case study of Hurricane Irene and on the Hudson River basin

    NASA Astrophysics Data System (ADS)

    Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie

    2016-07-01

    This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional-scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin (~36,000 km2) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were used to force HEC-HMS to generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. The uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h before the event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts of short-term streamflow forecasting at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.

  19. Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data

    PubMed Central

    Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos

    2015-01-01

    Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993
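
    A generic sketch of the first step, temporal abstraction of a numeric series into state intervals, is shown below; the state thresholds and values are invented, and the paper's pattern-construction step (temporal operators applied backwards in time) is not shown.

```python
from itertools import groupby

def temporal_abstraction(times, values, low, high):
    """Convert a numeric time series into a time-interval sequence of abstract
    states (L = low, N = normal, H = high), merging runs of equal states."""
    states = ["L" if v < low else "H" if v > high else "N" for v in values]
    intervals = []
    for state, run in groupby(range(len(states)), key=lambda i: states[i]):
        idx = list(run)
        intervals.append((state, times[idx[0]], times[idx[-1]]))
    return intervals

# Hypothetical lab-value series (times in hours, thresholds illustrative only).
times = [0, 1, 2, 3, 4, 5, 6]
values = [80, 85, 150, 210, 205, 120, 90]
print(temporal_abstraction(times, values, low=70, high=180))
# -> [('N', 0, 2), ('H', 3, 4), ('N', 5, 6)]
```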

  20. Support Vector Hazards Machine: A Counting Process Framework for Learning Risk Scores for Censored Outcomes.

    PubMed

    Wang, Yuanjia; Chen, Tianle; Zeng, Donglin

    2016-01-01

    Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at-risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using the kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating covariate-specific hazard functions from the population average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and standard conventional approaches. Finally, we analyze data from two real-world biomedical studies, where we use clinical markers and neuroimaging biomarkers to predict age-at-onset of a disease, and demonstrate the superiority of SVHM in distinguishing high-risk versus low-risk subjects.

  1. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    PubMed

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing a key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  2. Cyber Security Research Frameworks For Coevolutionary Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rush, George D.; Tauritz, Daniel Remy

    Several architectures have been created for developing and testing systems used in network security, but most are meant to provide a platform for running cyber security experiments as opposed to automating experiment processes. In the first paper, we propose a framework termed Distributed Cyber Security Automation Framework for Experiments (DCAFE) that enables experiment automation and control in a distributed environment. Predictive analysis of adversaries is another thorny issue in cyber security. Game theory can be used to mathematically analyze adversary models, but its scalability limitations restrict its use. Computational game theory allows us to scale classical game theory to larger, more complex systems. In the second paper, we propose a framework termed Coevolutionary Agent-based Network Defense Lightweight Event System (CANDLES) that can coevolve attacker and defender agent strategies and capabilities and evaluate potential solutions with a custom network defense simulation. The third paper is a continuation of the CANDLES project in which we rewrote key parts of the framework. Attackers and defenders have been redesigned to evolve pure strategy, and a new network security simulation is devised which specifies network architecture and adds a temporal aspect. We also add a hill climber algorithm to evaluate the search space and justify the use of a coevolutionary algorithm.

  3. Origin and dynamics of depositionary subduction margins

    USGS Publications Warehouse

    Vannucchi, Paola; Morgan, Jason P.; Silver, Eli; Kluesner, Jared W.

    2016-01-01

    Here we propose a new framework for forearc evolution that focuses on the potential feedbacks between subduction tectonics, sedimentation, and geomorphology that take place during an extreme event of subduction erosion. These feedbacks can lead to the creation of a “depositionary forearc,” a forearc structure that extends the traditional division of forearcs into accretionary or erosive subduction margins by demonstrating a mode of rapid basin accretion during an erosive event at a subduction margin. A depositionary mode of forearc evolution occurs when terrigenous sediments are deposited directly on the forearc while it is being removed from below by subduction erosion. In the most extreme case, an entire forearc can be removed by a single subduction erosion event followed by depositionary replacement without involving transfer of sediments from the incoming plate. We need to further recognize that subduction forearcs are often shaped by interactions between slow, long-term processes and sudden extreme events reflecting the influence of large-scale morphological variations in the incoming plate. Both types of processes contribute to the large-scale architecture of the forearc, with extreme events associated with a replacive depositionary mode that rapidly creates sections of a typical forearc margin. The persistent upward diversion of the megathrust is likely to affect its geometry, frictional nature, and hydrogeology. Therefore, the stresses along the fault and individual earthquake rupture characteristics are also expected to be more variable in these erosive systems than in systems with long-lived megathrust surfaces.

  4. Repetitive deliberate fires: Development and validation of a methodology to detect series.

    PubMed

    Bruenisholz, Eva; Delémont, Olivier; Ribaux, Olivier; Wilson-Wilde, Linzi

    2017-08-01

    The detection of repetitive deliberate fire events is challenging and still often ineffective due to a case-by-case approach. A previous study provided a critical review of the situation and an analysis of the main challenges. This study suggested that the intelligence process, integrating forensic data, could be a valid framework for providing follow-up and systematic analysis, provided it is adapted to the specificities of repetitive deliberate fires. In this current manuscript, a specific methodology to detect deliberate fire series, i.e. fires set by the same perpetrators, is presented and validated. It is based on case profiles relying on specific elements previously identified. The method was validated using a dataset of approximately 8000 deliberate fire events collected over 12 years in a Swiss state. Twenty possible series were detected, including 6 of 9 known series. These results are very promising and lead the way to a systematic implementation of this methodology in an intelligence framework, whilst demonstrating the need and benefit of increasing the collection of forensic-specific information to strengthen the value of links between cases. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  5. Challenges in leveraging existing human performance data for quantifying the IDHEAS HRA method

    DOE PAGES

    Liao, Huafei N.; Groth, Katrina; Stevens-Adams, Susan

    2015-07-29

    Our article documents an exploratory study for collecting and using human performance data to inform human error probability (HEP) estimates for a new human reliability analysis (HRA) method, the IntegrateD Human Event Analysis System (IDHEAS). The method was based on cognitive models and mechanisms underlying human behaviour and employs a framework of 14 crew failure modes (CFMs) to represent human failures typical for human performance in nuclear power plant (NPP) internal, at-power events [1]. A decision tree (DT) was constructed for each CFM to assess the probability of the CFM occurring in different contexts. Data needs for IDHEAS quantification are discussed. Then, the data collection framework and process is described and how the collected data were used to inform HEP estimation is illustrated with two examples. Next, five major technical challenges are identified for leveraging human performance data for IDHEAS quantification. Furthermore, these challenges reflect the data needs specific to IDHEAS. More importantly, they also represent the general issues with current human performance data and can provide insight for a path forward to support HRA data collection, use, and exchange for HRA method development, implementation, and validation.

  6. A dynamical pattern recognition model of gamma activity in auditory cortex

    PubMed Central

    Zavaglia, M.; Canolty, R.T.; Schofield, T.M.; Leff, A.P.; Ursino, M.; Knight, R.T.; Penny, W.D.

    2012-01-01

    This paper describes a dynamical process which serves both as a model of temporal pattern recognition in the brain and as a forward model of neuroimaging data. This process is considered at two separate levels of analysis: the algorithmic and implementation levels. At an algorithmic level, recognition is based on the use of Occurrence Time features. Using a speech digit database we show that for noisy recognition environments, these features rival standard cepstral coefficient features. At an implementation level, the model is defined using a Weakly Coupled Oscillator (WCO) framework and uses a transient synchronization mechanism to signal a recognition event. In a second set of experiments, we use the strength of the synchronization event to predict the high gamma (75–150 Hz) activity produced by the brain in response to word versus non-word stimuli. Quantitative model fits allow us to make inferences about parameters governing pattern recognition dynamics in the brain. PMID:22327049

  7. A marked point process approach for identifying neural correlates of tics in Tourette Syndrome.

    PubMed

    Loza, Carlos A; Shute, Jonathan B; Principe, Jose C; Okun, Michael S; Gunduz, Aysegul

    2017-07-01

    We propose a novel interpretation of local field potentials (LFP) based on a marked point process (MPP) framework that models relevant neuromodulations as shifted weighted versions of prototypical temporal patterns. In particular, the MPP samples are categorized according to the well-known oscillatory rhythms of the brain in an effort to elucidate spectrally specific behavioral correlates. The result is a transient model for LFP. We exploit data-driven techniques to fully estimate the model parameters, with the added feature of exceptional temporal resolution of the resulting events. We utilize the learned features in the alpha and beta bands to assess correlations to tic events in patients with Tourette Syndrome (TS). The final results show stronger coupling between LFP recorded from the centromedian-parafascicular complex of the thalamus and the tic marks, in comparison to electrocorticogram (ECoG) recordings from the hand area of the primary motor cortex (M1), as measured by the area under the receiver operating characteristic (ROC) curve (AUC).

  8. The Superstatistical Nature and Interoccurrence Time of Atmospheric Mercury Concentration Fluctuations

    NASA Astrophysics Data System (ADS)

    Carbone, F.; Bruno, A. G.; Naccarato, A.; De Simone, F.; Gencarelli, C. N.; Sprovieri, F.; Hedgecock, I. M.; Landis, M. S.; Skov, H.; Pfaffhuber, K. A.; Read, K. A.; Martin, L.; Angot, H.; Dommergue, A.; Magand, O.; Pirrone, N.

    2018-01-01

    The probability density function (PDF) of the time intervals between subsequent extreme events in atmospheric Hg0 concentration data series from different latitudes has been investigated. The Hg0 dynamic possesses a long-term memory autocorrelation function. Above a fixed threshold Q in the data, the PDFs of the interoccurrence time of the Hg0 data are well described by a Tsallis q-exponential function. This PDF behavior has been explained in the framework of superstatistics, where the competition between multiple mesoscopic processes affects the macroscopic dynamics. An extensive parameter μ, encompassing all possible fluctuations related to mesoscopic phenomena, has been identified. It follows a χ2 distribution, indicative of the superstatistical nature of the overall process. Shuffling the data series destroys the long-term memory, the distributions become independent of Q, and the PDFs collapse on to the same exponential distribution. The possible central role of atmospheric turbulence on extreme events in the Hg0 data is highlighted.
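
    For reference, the Tsallis q-exponential form referred to above has the standard expression below, with β a scale parameter; it reduces to the ordinary exponential as q → 1. The parameter values fitted to the Hg0 series are not reproduced here.

    ```latex
    % Standard Tsallis q-exponential used for interoccurrence-time PDFs (schematic form)
    P_Q(\tau) \;\propto\; e_q\!\left(-\tau/\beta\right)
            \;=\; \bigl[\,1 - (1-q)\,\tau/\beta\,\bigr]^{\frac{1}{1-q}},
    \qquad
    \lim_{q \to 1} e_q(-\tau/\beta) \;=\; e^{-\tau/\beta}.
    ```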

  9. U.S. History Framework for the 2010 National Assessment of Educational Progress

    ERIC Educational Resources Information Center

    National Assessment Governing Board, 2009

    2009-01-01

    This framework identifies the main ideas, major events, key individuals, and unifying themes of American history as a basis for preparing the 2010 assessment. The framework recognizes that U.S. history includes powerful ideas, common and diverse traditions, economic developments, technological and scientific innovations, philosophical debates,…

  10. Developing Brain Vital Signs: Initial Framework for Monitoring Brain Function Changes Over Time

    PubMed Central

    Ghosh Hajra, Sujoy; Liu, Careesa C.; Song, Xiaowei; Fickling, Shaun; Liu, Luke E.; Pawlowski, Gabriela; Jorgensen, Janelle K.; Smith, Aynsley M.; Schnaider-Beeri, Michal; Van Den Broek, Rudi; Rizzotti, Rowena; Fisher, Kirk; D'Arcy, Ryan C. N.

    2016-01-01

    Clinical assessment of brain function relies heavily on indirect behavior-based tests. Unfortunately, behavior-based assessments are subjective and therefore susceptible to several confounding factors. Event-related brain potentials (ERPs), derived from electroencephalography (EEG), are often used to provide objective, physiological measures of brain function. Historically, ERPs have been characterized extensively within research settings, with limited but growing clinical applications. Over the past 20 years, we have developed clinical ERP applications for the evaluation of functional status following serious injury and/or disease. This work has identified an important gap: the need for a clinically accessible framework to evaluate ERP measures. Crucially, this enables baseline measures before brain dysfunction occurs, and might enable the routine collection of brain function metrics in the future much like blood pressure measures today. Here, we propose such a framework for extracting specific ERPs as potential “brain vital signs.” This framework enabled the translation/transformation of complex ERP data into accessible metrics of brain function for wider clinical utilization. To formalize the framework, three essential ERPs were selected as initial indicators: (1) the auditory N100 (Auditory sensation); (2) the auditory oddball P300 (Basic attention); and (3) the auditory speech processing N400 (Cognitive processing). First step validation was conducted on healthy younger and older adults (age range: 22–82 years). Results confirmed specific ERPs at the individual level (86.81–98.96%), verified predictable age-related differences (P300 latency delays in older adults, p < 0.05), and demonstrated successful linear transformation into the proposed brain vital sign (BVS) framework (basic attention latency sub-component of BVS framework reflects delays in older adults, p < 0.05). The findings represent an initial critical step in developing, extracting, and characterizing ERPs as vital signs, critical for subsequent evaluation of dysfunction in conditions like concussion and/or dementia. PMID:27242415

  11. The impact of humanitarian emergencies on the prevalence of violence against children: an evidence-based ecological framework.

    PubMed

    Rubenstein, Beth L; Stark, Lindsay

    2017-03-01

    Little is known about the patterns and mechanisms by which humanitarian emergencies may exacerbate violence against children. In this article, we propose using the ecological framework to examine the impact of humanitarian emergencies on interpersonal violence against children. We consider the literature that supports this framework and suggest future directions for research to fill identified gaps in the framework. The relationship between humanitarian emergencies and violence against children depends on risk factors at multiple levels, including a breakdown of child protection systems, displacement, threats to livelihoods, changing gender roles, changing household composition, overcrowded living conditions, early marriage, exposure to conflict or other emergency events, and alcohol abuse. The empirical evidence supporting the proposed emergency/violence framework is limited by cross-sectional study designs and a propensity to predominantly examine individual-level determinants of violence, especially exposure to conflict or emergency events. Thus, there is a pressing need to contextualize the relationship between conflict or emergency events and violence against children within the wider ecological and household dynamics that occur during humanitarian emergencies. Ultimately, this will require longitudinal observations of children, families and communities from before the emergency through recovery and improvements to ongoing global surveillance systems. More complete data will enable the humanitarian community to design effective, appropriate and well-targeted interventions.

  12. Communication and flood risk awareness in the framework of DRIHM project

    NASA Astrophysics Data System (ADS)

    Llasat, Maria-Carmen; Llasat-Botija, Montserrat; Gilabert, Joan; Marcos, Raül; Parodi, Antonio; Rebora, Nicola; Garrote, Luís

    2014-05-01

    One of the main objectives of the United Nations Hyogo Framework for Action 2005-2015 is to increase public awareness of risks, vulnerabilities and disaster reduction globally. Floods are a major hazard in Spain: in the last 30 years alone, more than 300 flood and flash-flood events have been recorded. These events usually produce minor damage and, occasionally, some deaths, often due to imprudent behavior. In this context, improvements in forecasting and warning systems, in the communication process and in public knowledge supported by new technologies are welcome. The starting point of this communication is the analysis of the treatment of flood events by the press, the risk perception of the population, and the communication tools and protocols of Civil Protection and the Catalan Water Agency (ACA) in Catalonia (NE Iberian Peninsula). Afterwards, the application of new tools developed by the University of Barcelona is analysed, with specific emphasis on collaboration with the population. La Rambla is an informative flood-prevention portal that shares knowledge and experiences with the population. It is also a historical flood site to which everyone can contribute by sending experiences, data, records, pictures and much more. In La Rambla we can find information such as flood prevention plans, acts, scientific vocabulary ... There are also sections on historical floods, photo galleries, quizzes, flood news, and much more. The blog will also be used as a platform to distribute post-event questionnaires in order to analyze social impact as well as the population's behavior when faced with a flood. Besides this, social networks are among the most important channels through which warnings and flood risk situations can be communicated. In the case of Facebook and Twitter, we use the platforms as a warning channel, to provide simple monitoring of the event, to introduce explanations that help understand the situation, and to recommend scientific lectures or present new achievements. This work has been developed in the framework of the FP7 DRIHM (Distributed Research Infrastructure for Hydro-Meteorology, www.drihm.eu) project, which intends to develop a prototype e-Science environment to facilitate this collaboration and provide end-to-end hydrometeorological services (models, datasets and post-processing tools) at the European level, with the ability to expand to global scale. The objectives of DRIHM are to lead the definition of a common long-term strategy, to foster the development of new HMR models and observational archives for the study of severe hydrometeorological events, to promote the execution and analysis of high-end simulations, and to support the dissemination of predictive models as decision analysis tools. The project also aims to give students and professionals tools to simulate flood events by combining different meteorological models with different hydrological models. Some of the case studies are also used as examples for the communication tools, which include, besides those previously shown, a newsletter and some videos.

  13. Reactive system verification case study: Fault-tolerant transputer communication

    NASA Technical Reports Server (NTRS)

    Crane, D. Francis; Hamory, Philip J.

    1993-01-01

    A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.

  14. Ensuring the inclusion of sexual and reproductive health and rights under a sustainable development goal on health in the post-2015 human rights framework for development.

    PubMed

    Haslegrave, Marianne

    2013-11-01

    Since the 1994 International Conference on Population and Development (ICPD) in Cairo placed reproductive health and rights firmly on the international agenda, civil society and other advocates have worked ceaselessly to ensure that they remain central to women's empowerment and have taken all opportunities to expand the framework to include sexual health and rights. When the development process changed with the introduction of the Millennium Development Goals (MDGs) in 2000, sexual and reproductive health and rights were excluded, and only in 2007 was universal access to reproductive health added back in. In 2014 and 2015, the future of ICPD Beyond 2014, the MDGs and the post-2015 development framework will be decided, following consultations and meetings across the globe. This paper takes stock of the key influences on efforts to achieve the ICPD agenda and summarises the past, current and planned future events, reports and processes between 1994 and 2014, leading up to the determination of the post-2015 development framework and sustainable development goals. It concludes that the one thing we cannot afford to allow is what happened with the MDGs in 2000. We must not leave the room empty-handed, but must instead ensure the inclusion of sexual and reproductive health and rights as a priority under a new health goal. Copyright © 2013 Reproductive Health Matters. Published by Elsevier Ltd. All rights reserved.

  15. Climate change & extreme weather vulnerability assessment framework.

    DOT National Transportation Integrated Search

    2012-12-01

    The Federal Highway Administration's (FHWA's) Climate Change and Extreme Weather Vulnerability Assessment Framework is a guide for transportation agencies interested in assessing their vulnerability to climate change and extreme weather event...

  16. [Neuroscience and collective memory: memory schemas linking brain, societies and cultures].

    PubMed

    Legrand, Nicolas; Gagnepain, Pierre; Peschanski, Denis; Eustache, Francis

    2015-01-01

    During the last two decades, the effect of intersubjective relationships on cognition has been an emerging topic in cognitive neuroscience, leading through a so-called "social turn" to the formation of new domains integrating society and culture into this research area. Such inquiry has recently been extended to collective memory studies. Collective memory refers to shared representations that are constitutive of the identity of a group and distributed among all its members, connected by a common history. After briefly describing these evolutions in the study of the human brain and behavior, we review recent research that has brought together cognitive psychology, neuroscience and the social sciences in collective memory studies. Using the reemerging concept of the memory schema, we propose a theoretical framework that accounts for the formation of collective memories, with a specific focus on the encoding of historical events. We suggest that (1) although the concept of the schema has mainly been used to describe rather passive frameworks of knowledge, such structures may also be involved in more active ways in the understanding of significant collective events; and (2) although some schema research has restricted itself to the individual level of inquiry, we describe a strong coherence between memory and cultural frameworks. Integrating the neural basis and properties of memory schemas into collective memory studies may pave the way toward a better understanding of the reciprocal interaction between individual memories and cultural resources such as media or education. © Société de Biologie, 2016.

  17. Long-term knowledge acquisition using contextual information in a memory-inspired robot architecture

    NASA Astrophysics Data System (ADS)

    Pratama, Ferdian; Mastrogiovanni, Fulvio; Lee, Soon Geul; Chong, Nak Young

    2017-03-01

    In this paper, we present a novel cognitive framework allowing a robot to form memories of relevant traits of its perceptions and to recall them when necessary. The framework is based on two main principles: on the one hand, we propose an architecture inspired by current knowledge of human memory organisation; on the other hand, we integrate such an architecture with the notion of context, which is used to modulate the knowledge acquisition process when consolidating memories and forming new ones, as well as with the notion of familiarity, which is employed to retrieve proper memories given relevant cues. Although much research has been carried out that exploits Machine Learning approaches to provide robots with internal models of their environment (including the objects and events occurring therein), we argue that such approaches may not be the right direction to follow if long-term, continuous knowledge acquisition is to be achieved. As a case study scenario, we focus on both robot-environment and human-robot interaction processes. In the case of robot-environment interaction, a robot performs pick-and-place movements using the objects in the workspace, at the same time observing their displacement on a table in front of it, and progressively forms memories defined by relevant cues (e.g. colour, shape or relative position) in a context-aware fashion. As far as human-robot interaction is concerned, the robot can recall specific snapshots representing past events using both sensory information and contextual cues upon request by humans.

  18. Adapting current Arden Syntax knowledge for an object oriented event monitor.

    PubMed

    Choi, Jeeyae; Lussier, Yves A; Mendoça, Eneida A

    2003-01-01

    The Arden Syntax for Medical Logic Modules (MLMs) [1] was designed in 1989 for writing and sharing task-specific health knowledge. Several researchers have developed frameworks to improve the shareability and adaptability of Arden Syntax MLMs, addressing an issue known as the "curly braces" problem. Karadimas et al. proposed an Arden Syntax MLM-based decision support system that uses an object-oriented model and the dynamic linking features of the Java platform [2]. Peleg et al. proposed creating a Guideline Expression Language (GEL) based on Arden Syntax's logic grammar [3]. The New York Presbyterian Hospital (NYPH) has a collection of about 200 MLMs. In the process of adapting the current MLMs for an object-oriented event monitor, we identified two problems that may contribute to the "curly braces" problem: (1) the query expressions within the curly braces of the Arden Syntax used in our institution are cryptic to physicians, institution dependent and written ineffectively (unpublished results), and (2) the events are coded individually within curly braces, sometimes resulting in a large number of events - up to 200.

  19. The research of .NET framework based on delegate of the LCE

    NASA Astrophysics Data System (ADS)

    Chen, Yi-peng

    2011-10-01

    Programmers can realize the LCE enterprise services provided by the .NET Framework when developing applications in the C# programming language using object-oriented component technology. In traditional program design, large amounts of boilerplate code had to be written; nowadays the same result can be achieved simply by adding the corresponding attributes to classes, interfaces, methods and assemblies through simple declarative programming. This paper mainly expounds the mechanism for realizing LCE event services with the delegate model in C#. It also introduces the procedure for applying event classes, event publishers, subscribers and clients in LCE technology, and analyses the technical points of delegate-based LCE using accessible language and practical cases.
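
    Purely as an illustration, the loosely coupled publisher/subscriber idea behind delegate-based events can be sketched in Python; the class and method names below are illustrative and are not the .NET LCE or C# delegate API described in the paper.

    ```python
    # Minimal publish/subscribe sketch illustrating loosely coupled events.
    # Names are illustrative only; this is not the .NET LCE / C# delegate API.

    from typing import Callable

    class EventPublisher:
        """Minimal event source: keeps a list of subscriber callables."""

        def __init__(self) -> None:
            self._subscribers = []   # list of callables taking the event payload

        def subscribe(self, handler: Callable[[str], None]) -> None:
            # Roughly analogous to adding a delegate to an event's invocation list.
            self._subscribers.append(handler)

        def publish(self, message: str) -> None:
            # The publisher does not know or care who the subscribers are.
            for handler in self._subscribers:
                handler(message)

    def log_handler(message: str) -> None:
        print(f"logger received: {message}")

    def mail_handler(message: str) -> None:
        print(f"mailer received: {message}")

    publisher = EventPublisher()
    publisher.subscribe(log_handler)
    publisher.subscribe(mail_handler)
    publisher.publish("order created")
    ```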

  20. Scaling an urban emergency evacuation framework : challenges and practices.

    DOT National Transportation Integrated Search

    2014-01-01

    Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems ...

  1. Analytical Framework for Identifying and Differentiating Recent Hitchhiking and Severe Bottleneck Effects from Multi-Locus DNA Sequence Data

    DOE PAGES

    Sargsyan, Ori

    2012-05-25

    Hitchhiking and severe bottleneck effects have an impact on the dynamics of genetic diversity of a population by inducing homogenization at a single locus and at the genome-wide scale, respectively. As a result, identification and differentiation of the signatures of such events from DNA sequence data at a single locus is challenging. This study develops an analytical framework for identifying and differentiating recent homogenization events at multiple neutral loci in low-recombination regions. The dynamics of genetic diversity at a locus after a recent homogenization event is modeled according to the infinite-sites mutation model and the Wright-Fisher model of reproduction with constant population size. In this setting, I derive analytical expressions for the distribution, mean, and variance of the number of polymorphic sites in a random sample of DNA sequences from a locus affected by a recent homogenization event. Based on this framework, three likelihood-ratio based tests are presented for identifying and differentiating recent homogenization events at multiple loci. Lastly, I apply the framework to two data sets. First, I consider human DNA sequences from four non-coding loci on different chromosomes to infer the evolutionary history of modern human populations. The results suggest, in particular, that recent homogenization events at the loci are identifiable when the effective human population size is 50000 or greater, in contrast to 10000, and the estimates of the recent homogenization events agree with the “Out of Africa” hypothesis. Second, I use HIV DNA sequences from HIV-1-infected patients to infer the times of HIV seroconversions. The estimates are contrasted with other estimates derived as the mid-time point between the last HIV-negative and first HIV-positive screening tests. The results show that significant discrepancies can exist between the estimates.
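
    To illustrate the model ingredients named in the abstract (Wright-Fisher reproduction with constant population size, infinite-sites mutation, and a recent homogenization event represented by a monomorphic starting population), a small forward-time Monte Carlo can be sketched as below; the parameter values are arbitrary, and the paper's analytical distributions and likelihood-ratio tests are not reproduced.

    ```python
    # Forward-time Wright-Fisher / infinite-sites sketch: count polymorphic sites in a
    # sample drawn some generations after a homogenization event (monomorphic start).
    # Parameter values are arbitrary illustrations, not taken from the study.

    import random

    def polymorphic_sites_after_homogenization(pop_size=500, generations=200,
                                               mut_prob=0.1, sample_size=20, seed=1):
        """Count polymorphic sites in a sample drawn `generations` after homogenization."""
        random.seed(seed)
        population = [frozenset() for _ in range(pop_size)]    # monomorphic start
        next_site = 0
        for _ in range(generations):
            new_population = []
            for _ in range(pop_size):
                parent = random.choice(population)             # Wright-Fisher resampling
                child = set(parent)
                if random.random() < mut_prob:                 # per-sequence mutation this generation
                    child.add(next_site)                       # infinite sites: each mutation hits a new site
                    next_site += 1
                new_population.append(frozenset(child))
            population = new_population
        sample = random.sample(population, sample_size)
        all_sites = set().union(*sample)
        # A site is polymorphic if some, but not all, sampled sequences carry it.
        return sum(1 for s in all_sites
                   if 0 < sum(s in seq for seq in sample) < sample_size)

    print(polymorphic_sites_after_homogenization())
    ```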

  2. Neural processing of emotional-intensity predicts emotion regulation choice.

    PubMed

    Shafir, Roni; Thiruchselvam, Ravi; Suri, Gaurav; Gross, James J; Sheppes, Gal

    2016-12-01

    Emotional-intensity is a core characteristic of affective events that strongly determines how individuals choose to regulate their emotions. Our conceptual framework suggests that in high emotional-intensity situations, individuals prefer to disengage attention using distraction, which can more effectively block highly potent emotional information, as compared with engagement reappraisal, which is preferred in low emotional-intensity. However, existing supporting evidence remains indirect because prior intensity categorization of emotional stimuli was based on subjective measures that are potentially biased and only represent the endpoint of emotional-intensity processing. Accordingly, this study provides the first direct evidence for the role of online emotional-intensity processing in predicting behavioral regulatory-choices. Utilizing the high temporal resolution of event-related potentials, we evaluated online neural processing of stimuli's emotional-intensity (late positive potential, LPP) prior to regulatory-choices between distraction and reappraisal. Results showed that enhanced neural processing of intensity (enhanced LPP amplitudes) uniquely predicted (above subjective measures of intensity) increased tendency to subsequently choose distraction over reappraisal. Additionally, regulatory-choices led to adaptive consequences, demonstrated in finding that actual implementation of distraction relative to reappraisal-choice resulted in stronger attenuation of LPPs and self-reported arousal. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  3. Dysregulation in level of goal and action identification across psychological disorders.

    PubMed

    Watkins, Edward

    2011-03-01

    Goals, events, and actions can be mentally represented within a hierarchical framework that ranges from more abstract to more concrete levels of identification. A more abstract level of identification involves general, superordinate, and decontextualized mental representations that convey the meaning of goals, events, and actions, "why" an action is performed, and its purpose, ends, and consequences. A more concrete level of identification involves specific and subordinate mental representations that include contextual details of goals, events, and actions, and the specific "how" details of an action. This review considers three lines of evidence for considering that dysregulation of level of goal/action identification may be a transdiagnostic process. First, there is evidence that different levels of identification have distinct functional consequences and that in non-clinical samples level of goal/action identification appears to be regulated in a flexible and adaptive way to match the level of goal/action identification to circumstances. Second, there is evidence that level of goal/action identification causally influences symptoms and processes involved in psychological disorders, including emotional response, repetitive thought, impulsivity, problem solving and procrastination. Third, there is evidence that the level of goal/action identification is biased and/or dysregulated in certain psychological disorders, with a bias towards more abstract identification for negative events in depression, GAD, PTSD, and social anxiety. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Dysregulation in level of goal and action identification across psychological disorders

    PubMed Central

    Watkins, Edward

    2011-01-01

    Goals, events, and actions can be mentally represented within a hierarchical framework that ranges from more abstract to more concrete levels of identification. A more abstract level of identification involves general, superordinate, and decontextualized mental representations that convey the meaning of goals, events, and actions, “why” an action is performed, and its purpose, ends, and consequences. A more concrete level of identification involves specific and subordinate mental representations that include contextual details of goals, events, and actions, and the specific “how” details of an action. This review considers three lines of evidence for considering that dysregulation of level of goal/action identification may be a transdiagnostic process. First, there is evidence that different levels of identification have distinct functional consequences and that in non-clinical samples level of goal/action identification appears to be regulated in a flexible and adaptive way to match the level of goal/action identification to circumstances. Second, there is evidence that level of goal/action identification causally influences symptoms and processes involved in psychological disorders, including emotional response, repetitive thought, impulsivity, problem solving and procrastination. Third, there is evidence that the level of goal/action identification is biased and/or dysregulated in certain psychological disorders, with a bias towards more abstract identification for negative events in depression, GAD, PTSD, and social anxiety. PMID:20579789

  5. BioASF: a framework for automatically generating executable pathway models specified in BioPAX.

    PubMed

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap

    2016-06-15

    Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF CONTACT: j.heringa@vu.nl. © The Author 2016. Published by Oxford University Press.
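
    The discrete-event, multi-agent execution principle (though not BioASF's actual Java implementation) can be sketched as agents that react to timestamped activation events drawn from a shared queue; the entity names and the all-inputs-active firing rule below are assumptions made only for this sketch.

    ```python
    # Toy discrete-event, multi-agent pathway simulation sketch (not BioASF itself).
    # Agents react to activation events of their inputs and emit their own events.

    import heapq

    class Agent:
        def __init__(self, name, inputs, delay):
            self.name, self.inputs, self.delay = name, set(inputs), delay
            self.active_inputs = set()
            self.fired = False

        def receive(self, source):
            """Record an input activation; return True the first time all inputs are active."""
            self.active_inputs.add(source)
            if not self.fired and self.active_inputs == self.inputs:
                self.fired = True
                return True
            return False

    def simulate(agents, initial_events, horizon=100.0):
        by_input = {}
        for agent in agents:
            for inp in agent.inputs:
                by_input.setdefault(inp, []).append(agent)
        queue = list(initial_events)           # (time, activated_entity)
        heapq.heapify(queue)
        trace = []
        while queue:
            time, entity = heapq.heappop(queue)
            if time > horizon:
                break
            trace.append((time, entity))
            for agent in by_input.get(entity, []):
                if agent.receive(entity):      # all inputs present -> agent activates later
                    heapq.heappush(queue, (time + agent.delay, agent.name))
        return trace

    # Tiny signalling chain: ligand -> receptor -> effector (illustrative names only).
    agents = [Agent("receptor", ["ligand"], delay=1.0),
              Agent("effector", ["receptor"], delay=2.5)]
    for t, e in simulate(agents, [(0.0, "ligand")]):
        print(f"t={t:.1f}: {e} activated")
    ```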

  6. Modeling Dynamic Food Choice Processes to Understand Dietary Intervention Effects.

    PubMed

    Marcum, Christopher Steven; Goldring, Megan R; McBride, Colleen M; Persky, Susan

    2018-02-17

    Meal construction is largely governed by nonconscious and habit-based processes that can be represented as a collection of individual, micro-level food choices that eventually give rise to a final plate. Despite this, dietary behavior intervention research rarely captures these micro-level food choice processes, instead measuring outcomes at aggregated levels. This is due in part to a dearth of analytic techniques to model these dynamic time-series events. The current article addresses this limitation by applying a generalization of the relational event framework to model micro-level food choice behavior following an educational intervention. Relational event modeling was used to model the food choices that 221 mothers made for their child following receipt of an information-based intervention. Participants were randomized to receive either (a) control information; (b) childhood obesity risk information; or (c) childhood obesity risk information plus a personalized family history-based risk estimate for their child. Participants then made food choices for their child in a virtual reality-based food buffet simulation. Micro-level aspects of the built environment, such as the ordering of each food in the buffet, were influential. Other dynamic processes such as choice inertia also influenced food selection. Among participants receiving the strongest intervention condition, choice inertia decreased and the overall rate of food selection increased. Modeling food selection processes can elucidate the points at which interventions exert their influence. Researchers can leverage these findings to gain insight into nonconscious and uncontrollable aspects of food selection that influence dietary outcomes, which can ultimately improve the design of dietary interventions.

  7. Bridging gaps in handoffs: a continuity of care based approach.

    PubMed

    Abraham, Joanna; Kannampallil, Thomas G; Patel, Vimla L

    2012-04-01

    Handoff among healthcare providers has been recognized as a major source of medical errors. Most prior research has often focused on the communication aspects of handoff, with limited emphasis on the overall handoff process, especially from a clinician workflow perspective. Such a workflow perspective that is based on the continuity of care model provides a framework required to identify and support an interconnected trajectory of care events affecting handoff communication. To this end, we propose a new methodology, referred to as the clinician-centered approach that allows us to investigate and represent the entire clinician workflow prior to, during and, after handoff communication. This representation of clinician activities supports a comprehensive analysis of the interdependencies in the handoff process across the care continuum, as opposed to a single discrete, information sharing activity. The clinician-centered approach is supported by multifaceted methods for data collection such as observations, shadowing of clinicians, audio recording of handoff communication, semi-structured interviews and artifact identification and collection. The analysis followed a two-stage mixed inductive-deductive method. The iterative development of clinician-centered approach was realized using a multi-faceted study conducted in the Medical Intensive Care Unit (MICU) of an academic hospital. Using the clinician-centered approach, we (a) identify the nature, inherent characteristics and the interdependencies between three phases of the handoff process and (b) develop a descriptive framework of handoff communication in critical care that captures the non-linear, recursive and interactive nature of collaboration and decision-making. The results reported in this paper serve as a "proof of concept" of our approach, emphasizing the importance of capturing a coordinated and uninterrupted succession of clinician information management and transfer activities in relation to patient care events. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. A Bayesian framework for infrasound location

    NASA Astrophysics Data System (ADS)

    Modrak, Ryan T.; Arrowsmith, Stephen J.; Anderson, Dale N.

    2010-04-01

    We develop a framework for location of infrasound events using backazimuth and infrasonic arrival times from multiple arrays. Bayesian infrasonic source location (BISL) developed here estimates event location and associated credibility regions. BISL accounts for unknown source-to-array path or phase by formulating infrasonic group velocity as random. Differences between observed and predicted source-to-array traveltimes are partitioned into two additive Gaussian sources, measurement error and model error, the second of which accounts for the unknown influence of wind and temperature on path. By applying the technique to both synthetic tests and ground-truth events, we highlight the complementary nature of back azimuths and arrival times for estimating well-constrained event locations. BISL is an extension to methods developed earlier by Arrowsmith et al. that provided simple bounds on location using a grid-search technique.
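
    A grid-search sketch of this kind of Bayesian location is shown below, assuming independent Gaussian errors on backazimuth and travel-time residuals, a single fixed group velocity, a known origin time and flat-Earth geometry; the array positions and observations are synthetic and this is not the BISL formulation itself.

    ```python
    # Grid-search sketch of Bayesian event location from backazimuths and arrival times.
    # Flat-Earth geometry, one group velocity, known origin time, Gaussian errors;
    # all values are synthetic illustrations, not the BISL implementation.

    import numpy as np

    arrays = np.array([[0.0, 0.0], [100.0, 10.0], [40.0, 120.0]])   # array positions, km
    obs_backazimuth = np.radians([40.0, 315.0, 170.0])               # direction array -> source
    obs_arrival = np.array([260.0, 236.0, 203.0])                    # s after assumed origin time
    group_velocity = 0.30                                            # km/s
    sigma_az, sigma_t = np.radians(5.0), 10.0                        # assumed error scales

    x = np.linspace(-50, 150, 201)
    y = np.linspace(-50, 150, 201)
    X, Y = np.meshgrid(x, y)

    log_post = np.zeros_like(X)
    for (ax, ay), baz, t_obs in zip(arrays, obs_backazimuth, obs_arrival):
        dx, dy = X - ax, Y - ay
        pred_baz = np.arctan2(dx, dy)                  # azimuth clockwise from north
        pred_t = np.hypot(dx, dy) / group_velocity
        daz = np.angle(np.exp(1j * (pred_baz - baz)))  # wrap angular residual to [-pi, pi]
        log_post += -0.5 * (daz / sigma_az) ** 2 - 0.5 * ((pred_t - t_obs) / sigma_t) ** 2

    best = np.unravel_index(np.argmax(log_post), log_post.shape)
    print("MAP location (km):", X[best], Y[best])
    ```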

  9. Using Antelope and Seiscomp in the framework of the Romanian Seismic Network

    NASA Astrophysics Data System (ADS)

    Marius Craiu, George; Craiu, Andreea; Marmureanu, Alexandru; Neagoe, Cristian

    2014-05-01

    The National Institute for Earth Physics (NIEP) operates a real-time seismic network designed to monitor the seismic activity on Romanian territory, dominated by the Vrancea intermediate-depth (60-200 km) earthquakes. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, KS2000, CMG3T, STS2, SH-1, S13, Mark l4c, Ranger, Gs21, Mark 22) and acceleration sensors (Episensor Kinemetrics). The primary goal of the real-time seismic network is to provide earthquake parameters from more broadband stations with a high dynamic range, for more rapid and accurate computation of the locations and magnitudes of earthquakes. Data are acquired with the SeedLink and Antelope™ program packages, and the completely automated Antelope seismological system is run at the Data Center in Măgurele. The Antelope data acquisition and processing software is used for real-time processing and post-processing. The Antelope real-time system provides automatic event detection, arrival picking, event location, and magnitude calculation. It also provides graphical displays and automatic location in near real time after a local, regional or teleseismic event has occurred. SeisComP 3 is another automated system that is run at the NIEP and which provides the following features: data acquisition, data quality control, real-time data exchange and processing, network status monitoring, issuing event alerts, waveform archiving and data distribution, automatic event detection and location, and easy access to relevant information about stations, waveforms, and recent earthquakes. The main goal of this paper is to compare these two data acquisition systems in order to improve their detection capabilities, location accuracy, and magnitude and depth determination, and to reduce the RMS and other location errors.

  10. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies.
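
    The core event-driven idea (advancing the simulation from spike to spike via a queue of pending events rather than in fixed time steps) can be sketched in a few lines; this toy network of deterministic, delay-only units is an illustration and does not use NEVESIM's classes or its Python API.

    ```python
    # Toy event-driven spiking-network sketch (not the NEVESIM API): the simulation
    # jumps from spike event to spike event via a priority queue of pending spikes.

    import heapq

    # Network: synapse list per neuron as (target, delay_ms); every input spike
    # deterministically triggers the target after the delay (toy dynamics only).
    synapses = {0: [(1, 2.0), (2, 5.0)],
                1: [(2, 1.0)],
                2: []}

    def run(initial_spikes, t_end=20.0):
        events = list(initial_spikes)          # (time_ms, neuron_id)
        heapq.heapify(events)
        spike_record = []
        while events:
            t, neuron = heapq.heappop(events)
            if t > t_end:
                break
            spike_record.append((t, neuron))
            for target, delay in synapses[neuron]:
                heapq.heappush(events, (t + delay, target))
        return spike_record

    for t, n in run([(0.0, 0)]):
        print(f"{t:5.1f} ms  neuron {n} spiked")
    ```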

  11. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  12. An electrophysiological insight into visual attention mechanisms underlying schizotypy.

    PubMed

    Fuggetta, Giorgio; Bennett, Matthew A; Duke, Philip A

    2015-07-01

    A theoretical framework has been put forward to understand attention deficits in schizophrenia (Luck SJ & Gold JM. Biological Psychiatry. 2008; 64:34-39). We adopted this framework to evaluate any deficits in attentional processes in schizotypy. Sixteen low-schizotypal (LoS) and sixteen high-schizotypal (HiS) individuals performed a novel paradigm combining a match-to-sample task with inhibition of return (using spatially uninformative cues) and memory-guided efficient visual search within one trial sequence. Behavioural measures and event-related potentials (ERPs) were recorded. Behaviourally, HiS individuals exhibited a spatial cueing effect while LoS individuals showed the more typical inhibition of return effect. These results suggest HiS individuals have a relative deficit in rule selection - the endogenous control process involved in disengaging attention from the uninformative location cue. ERP results showed that the late phase of the N2pc evoked by the target stimulus had greater peak latency and amplitude in HiS individuals. This suggests a relative deficit in the implementation of selection - the process of focusing attention onto target features, which enhances relevant inputs and suppresses irrelevant ones. This conclusion differs from that reached when the same theoretical framework has been applied to schizophrenia, where little or no deficit in the implementation of selection amongst patients is argued. Also, HiS individuals exhibited earlier onset and greater amplitude of the mismatch-triggered negativity component. In summary, our results indicate deficits of both control and implementation of selection in HiS individuals. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Simulation and experimental studies of operators' decision styles and crew composition while using an ecological and traditional user interface for the control room of a nuclear power plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meshkati, N.; Buller, B.J.; Azadeh, M.A.

    1995-04-01

    The goal of this research is threefold: (1) to use the Skill-, Rule-, and Knowledge-based levels of cognitive control -- the SRK framework -- to develop an integrated information processing conceptual framework (for integration of workstation, job, and team design); (2) to evaluate the user interface component of this framework -- the Ecological display; and (3) to analyze the effect of operators' individual information processing behavior and decision styles on handling plant disturbances, as well as their performance on, and preference for, Traditional and Ecological user interfaces. A series of studies were conducted. In Part I, a computer simulation model and a mathematical model were developed. In Part II, an experiment was designed and conducted at the EBR-II plant of the Argonne National Laboratory-West in Idaho Falls, Idaho. It is concluded that: the integrated SRK-based information processing model for control room operations is superior to the conventional rule-based model; operators' individual decision styles and the combination of their styles play a significant role in effective handling of nuclear power plant disturbances; use of the Ecological interface results in significantly more accurate event diagnosis and recall of various plant parameters, faster response to plant transients, and higher ratings of subject preference; and operators' decision styles affect both their performance on and their preference for the Ecological interface.

  14. EpiGeNet: A Graph Database of Interdependencies Between Genetic and Epigenetic Events in Colorectal Cancer.

    PubMed

    Balaur, Irina; Saqi, Mansoor; Barat, Ana; Lysenko, Artem; Mazein, Alexander; Rawlings, Christopher J; Ruskin, Heather J; Auffray, Charles

    2017-10-01

    The development of colorectal cancer (CRC)-the third most common cancer type-has been associated with deregulations of cellular mechanisms stimulated by both genetic and epigenetic events. StatEpigen is a manually curated and annotated database, containing information on interdependencies between genetic and epigenetic signals, and specialized currently for CRC research. Although StatEpigen provides a well-developed graphical user interface for information retrieval, advanced queries involving associations between multiple concepts can benefit from more detailed graph representation of the integrated data. This can be achieved by using a graph database (NoSQL) approach. Data were extracted from StatEpigen and imported to our newly developed EpiGeNet, a graph database for storage and querying of conditional relationships between molecular (genetic and epigenetic) events observed at different stages of colorectal oncogenesis. We illustrate the enhanced capability of EpiGeNet for exploration of different queries related to colorectal tumor progression; specifically, we demonstrate the query process for (i) stage-specific molecular events, (ii) most frequently observed genetic and epigenetic interdependencies in colon adenoma, and (iii) paths connecting key genes reported in CRC and associated events. The EpiGeNet framework offers improved capability for management and visualization of data on molecular events specific to CRC initiation and progression.
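
    The path-oriented queries described above can be mimicked on a small in-memory graph, as sketched below; the node names, stage labels and networkx-based query are assumptions made for the sketch and do not reflect the actual EpiGeNet schema or its queries.

    ```python
    # Illustrative in-memory sketch of a path query over genetic/epigenetic events
    # (not the actual EpiGeNet schema or its Neo4j queries); uses networkx.

    import networkx as nx

    g = nx.DiGraph()
    # Hypothetical stage-annotated edges between molecular events.
    g.add_edge("APC mutation", "MLH1 hypermethylation", stage="adenoma")
    g.add_edge("MLH1 hypermethylation", "microsatellite instability", stage="adenoma")
    g.add_edge("APC mutation", "KRAS mutation", stage="adenoma")
    g.add_edge("KRAS mutation", "TP53 mutation", stage="carcinoma")

    # Query: all paths connecting two key events, with the stage annotating each step.
    for path in nx.all_simple_paths(g, "APC mutation", "TP53 mutation"):
        steps = [f"{a} -[{g[a][b]['stage']}]-> {b}" for a, b in zip(path, path[1:])]
        print("; ".join(steps))
    ```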

  15. Bridging the semantic gap in sports

    NASA Astrophysics Data System (ADS)

    Li, Baoxin; Errico, James; Pan, Hao; Sezan, M. Ibrahim

    2003-01-01

    One of the major challenges facing current media management systems and the related applications is the so-called "semantic gap" between the rich meaning that a user desires and the shallowness of the content descriptions that are automatically extracted from the media. In this paper, we address the problem of bridging this gap in the sports domain. We propose a general framework for indexing and summarizing sports broadcast programs. The framework is based on a high-level model of sports broadcast video using the concept of an event, defined according to domain-specific knowledge for different types of sports. Within this general framework, we develop automatic event detection algorithms that are based on automatic analysis of the visual and aural signals in the media. We have successfully applied the event detection algorithms to different types of sports including American football, baseball, Japanese sumo wrestling, and soccer. Event modeling and detection contribute to the reduction of the semantic gap by providing rudimentary semantic information obtained through media analysis. We further propose a novel approach, which makes use of independently generated rich textual metadata, to fill the gap completely through synchronization of the information-laden textual data with the basic event segments. An MPEG-7 compliant prototype browsing system has been implemented to demonstrate semantic retrieval and summarization of sports video.

  16. Analysing Surface Exposure to Climate Dynamics in the Himalayas to Adopt a Planning Framework for Landslide Risk Reduction

    NASA Astrophysics Data System (ADS)

    Tiwari, A.

    2017-12-01

    The Himalayas rank first among the most densely populated and congested high-altitude mountain regions of the planet. The region is mostly characterized by inadequate infrastructure and a lack of mitigation tools, along with terrain constraints that undermine the carrying capacity and resilience of urban ecosystems. Moreover, climate change has increased the vulnerability of the poor and marginalized population living in rapidly urbanizing mountain towns to the increased frequency and severity of extreme weather events. Such events pose a multifold threat by easily translating into hazards when the ability to respond and mitigate is lacking. Additionally, recent extreme climate dynamics such as changing rainfall patterns have influenced the natural rate of surface/slope processes in the Himalaya. The aim of the study was to analyze the extent of interaction between climate dynamics and the upland surface in order to develop a participatory planning framework for landslide risk reduction using an Integral Geographic Information System (integral GIS). At this stage, the study is limited to rainfall-triggered landslides (RTL). The study region lies in the middle Himalayan range (Himachal). The research utilized terrain analysis tools in integral GIS and identified the risk-susceptible surface without (1) adding to its often complex fragmentation or (2) interfering in surface/slope processes. The analysis covered most of the relevant surface factors, including geology, slope instability, infrastructure development, natural and urban drainage systems, and land cover and land use. The outcome included an exposure-reduced model of the existing terrain and the surface processes accommodated by it, using local technical tools available to the poor and fragile mountain community. The final participatory planning framework successfully harmonized people's perception and adaptation knowledge, and incorporated the priorities of local authorities. This research is significant because it rises above the fundamental challenges arising from the often conflicting perspectives, interests, and approaches of a multiplicity of stakeholders; it therefore has vast potential to be replicated and upscaled in mountains beyond the study region, as it ensures barrier-free risk communication through affordable and innovative tools.

  17. Remotely Monitored Sealing Array Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-12

    The Remotely Monitored Sealing Array (RMSA) utilizes the Secure Sensor Platform (SSP) framework to establish the fundamental operating capabilities for communication, security, power management, and cryptography. In addition to the SSP framework, the RMSA software has unique capabilities to support monitoring a fiber optic seal. Fiber monitoring includes open and closed state detection as well as parametric monitoring to detect tampering attacks. The fiber monitoring techniques, using the SSP power management processes, allow the seals to last for years while maintaining the security requirements of the monitoring application. The seal is enclosed in a tamper-resistant housing with software to support active tamper monitoring. New features include LED notification of fiber closure, the ability to retrieve the entire fiber optic history via translator command, separate memory storage for fiber optic events, and a more robust method for tracking and resending failed messages.

  18. Developing an Evaluation Framework of Spatial Understanding through GIS Analysis of Volunteered Geographic Information (VGI)

    ERIC Educational Resources Information Center

    Wu, Bing Sheng

    2013-01-01

    This study integrates volunteered geographic information (VGI) into GIS and contextual analyses, and develops a framework to evaluate students' understanding of "locations and places in order to set national and international events within a geographical framework and to understand basic spatial relationships" as proposed by the…

  19. Development of a Next Generation Concurrent Framework for the ATLAS Experiment

    NASA Astrophysics Data System (ADS)

    Calafiura, P.; Lampl, W.; Leggett, C.; Malon, D.; Stewart, G.; Wynne, B.

    2015-12-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000, and the software and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. With current memory consumption for 64-bit ATLAS reconstruction in a high-luminosity environment approaching 4 GB, it will become impossible to fully occupy all cores in a machine without exhausting available memory. However, since maximizing performance per watt will be a key metric, a mechanism must be found to use all cores as efficiently as possible. In this paper we report on our progress with a practical demonstration of the use of multithreading in the ATLAS reconstruction software, using the GaudiHive framework. We have expanded support to the Calorimeter, Inner Detector, and Tracking code, discussing what changes were necessary, both to the framework and to the tools and algorithms used, in order to allow the serially designed ATLAS code to run. We report on the performance gains and on the general lessons learned about the code patterns that had been employed in the software, identifying which patterns are particularly problematic for multi-threading. We also present our findings on implementing a hybrid multi-threaded / multi-process framework, to take advantage of the strengths of each type of concurrency, while avoiding some of their corresponding limitations.
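
    The underlying idea of scheduling algorithms as tasks constrained by their data dependencies can be sketched with a simple wave-by-wave scheduler on a thread pool; the algorithm names and dependency graph below are invented placeholders and this is not the GaudiHive scheduler.

    ```python
    # Schematic sketch of task scheduling with data dependencies inside one event
    # (placeholder algorithm names; not the GaudiHive scheduler).

    from concurrent.futures import ThreadPoolExecutor

    # Each "algorithm" declares the data products it needs and the one it produces.
    ALGORITHMS = {
        "Decode":   {"needs": [],                       "makes": "RawHits"},
        "Cluster":  {"needs": ["RawHits"],              "makes": "Clusters"},
        "Track":    {"needs": ["Clusters"],             "makes": "Tracks"},
        "CaloReco": {"needs": ["RawHits"],              "makes": "CaloTowers"},
        "Combine":  {"needs": ["Tracks", "CaloTowers"], "makes": "Candidates"},
    }

    def run_algorithm(name):
        # Stand-in for real reconstruction work; returns the produced data product.
        return ALGORITHMS[name]["makes"], f"{name}-output"

    def process_event(max_workers=4):
        store = {}                     # per-event data store (merged in the main thread)
        remaining = set(ALGORITHMS)
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            while remaining:
                # All algorithms whose inputs are already available run concurrently.
                ready = [n for n in remaining
                         if all(need in store for need in ALGORITHMS[n]["needs"])]
                for key, value in pool.map(run_algorithm, ready):
                    store[key] = value
                remaining -= set(ready)
        return store

    print(sorted(process_event().keys()))
    ```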

  20. [Relational Frame Theory--A Theoretical Framework for Contextual Behavioral Science].

    PubMed

    Kensche, M; Schweiger, U

    2015-07-01

    Therapists have to deal with verbal systems and often work with verbal exchange. Therefore, a psychological theory is required, which teaches the therapist how to accomplish this task. The BRT is a theory of human language and cognition that explains how people use their verbal behavior as stimuli in their interrelations and how they act and react, based on the resulting relationships. This behavior is learned very early in the course of language acquisition and functions as a generalized operant. A prerequisite for this is the ability of people to undergo mental simulation. This enables them to construct diverse relational frameworks between individual stimuli. Without relational frameworks, people cannot function. The ability to establish a relational framework is a prerequisite for the formation of rule-governed behavior. Rule-governed behavior economizes complex decision processes, creates interpersonal security and enables dealing with events before they take place. On the other hand, the same properties that enable people to solve problems effectively can also contribute to rigid adherence to rules and experience avoidance. Relational frameworks, once established, outweigh other sources of behavioral regulation. Thus, it can become the basis of psychopathology. Poor contextual control makes it difficult for people to devote flexible, focused and voluntary attention to the present and align their actions with the immediate present. Contextual psychotherapy methods that are based on the BRT start precisely at this point: Targeted establishment of new contingencies in the therapeutic interaction through systematic strengthening of metacognitive mode and through the establishment of new rules that make possible a change in the rule-governed behavior enable undermining of dysfunctional rule-governed behavior and build up desirable behavior. This allows any therapeutic process to be more effective--regardless of the patient's expressed symptoms. © Georg Thieme Verlag KG Stuttgart · New York.

  1. [Relational frame theory - a theoretical framework for contextual behavioral science].

    PubMed

    Kensche, M; Schweiger, U

    2015-05-01

    Therapists have to deal with verbal systems and often work through verbal exchange. A psychological theory is therefore required that teaches the therapist how to accomplish this task. RFT (Relational Frame Theory) is a theory of human language and cognition that explains how people use their verbal behavior as stimuli in their interrelations and how they act and react based on the resulting relationships. This behavior is learned very early in the course of language acquisition and functions as a generalized operant. A prerequisite for this is the ability of people to undergo mental simulation, which enables them to construct diverse relational frameworks between individual stimuli. Without relational frameworks, people cannot function. The ability to establish a relational framework is a prerequisite for the formation of rule-governed behavior. Rule-governed behavior economizes complex decision processes, creates interpersonal security, and enables dealing with events before they take place. On the other hand, the same properties that enable people to solve problems effectively can also contribute to rigid adherence to rules and to experiential avoidance. Relational frameworks, once established, outweigh other sources of behavioral regulation and can thus become the basis of psychopathology. Poor contextual control makes it difficult for people to devote flexible, focused, and voluntary attention to the present and to align their actions with the immediate present. Contextual psychotherapy methods based on RFT start precisely at this point: by establishing new contingencies in the therapeutic interaction, systematically strengthening the metacognitive mode, and introducing new rules that make a change in rule-governed behavior possible, they undermine dysfunctional rule-governed behavior and build up desirable behavior. This allows any therapeutic process to be more effective, regardless of the patient's expressed symptoms. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Positive interventions: An emotion regulation perspective.

    PubMed

    Quoidbach, Jordi; Mikolajczak, Moïra; Gross, James J

    2015-05-01

    The rapid growth of the literature on positive interventions to increase "happiness" has suggested the need for an overarching conceptual framework to integrate the many and apparently disparate findings. In this review, we used the process model of emotion regulation (Gross, 1998) to organize the existing literature on positive interventions and to advance theory by clarifying the mechanisms underlying their effectiveness. We have proposed that positive emotions can be increased both in the short- and longer-term through 5 families of emotion regulation strategies (i.e., situation selection, situation modification, attentional deployment, cognitive change, and response modulation), showing how these emotion regulation strategies can be applied before, during, and after positive emotional events. Regarding short-term increases in positive emotions, our review found that attentional deployment, cognitive change, and response modulation strategies have received the most empirical support, whereas more work is needed to establish the effectiveness of situation selection and situation modification strategies. Regarding longer-term increases in positive emotions, strategies such as situation selection during an event and attentional deployment before, during, and after an event have received strong empirical support and are at the center of many positive interventions. However, more work is needed to establish the specific benefits of the other strategies, especially situation modification. We argue that our emotion regulation framework clarifies existing interventions and points the way for new interventions that might be used to increase positive emotions in both nonclinical and clinical populations. (c) 2015 APA, all rights reserved.

  3. Critical Events in the Lives of Interns

    PubMed Central

    Graham, Mark; Schmidt, Hilary; Stern, David T.; Miller, Steven Z.

    2008-01-01

    BACKGROUND Early residency is a crucial time in the professional development of physicians. As interns assume primary care for their patients, they take on new responsibilities. The events they find memorable during this time could provide us with insight into their developing professional identities. OBJECTIVE To evaluate the most critical events in the lives of interns. PARTICIPANTS Forty-one internal medicine residents at one program participated in a two-day retreat in the fall of their first year. Each resident provided a written description of a recent high point, low point, and patient conflict. MEASUREMENTS We used a variant of grounded theory to analyze these critical incidents and determine the underlying themes of early internship. Independent inter-rater agreement of >90% was achieved for the coding of excerpts. MAIN RESULTS The 123 critical incidents were clustered into 23 categories. The categories were further organized into six themes: confidence, life balance, connections, emotional responses, managing expectations, and facilitating teamwork. High points were primarily in the themes of confidence and connections. Low points were dispersed more generally throughout the conceptual framework. Conflicts with patients were about negotiating the expectations inherent in the physician–patient relationship. CONCLUSION The high points, low points, and conflicts reported by early residents provide us with a glimpse into the lives of interns. The themes we have identified reflect critical challenges interns face in the development of their professional identity. Program directors could use this process and conceptual framework to guide the development and promotion of residents’ emerging professional identities. PMID:18972091

  4. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation together with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
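
    The chance-constraint idea in this record can be sketched in a few lines: estimate, from replications of a terminating simulation, the probability that a performance target is met at a given resource level, and search for the smallest level whose estimated probability (with a confidence adjustment) satisfies the constraint. The sketch below is illustrative only; the surrogate simulation model, target, and confidence level are invented and are not from the paper.

```python
# Minimal sketch (not the paper's code) of chance-constrained optimization over
# a terminating discrete-event simulation: find the smallest resource level
# whose estimated probability of meeting a turnaround target satisfies the
# chance constraint. The "simulation" is a stand-in stochastic model.
import random
import statistics

def simulate_turnaround(resources, seed):
    """One replication of a hypothetical terminating simulation model."""
    rng = random.Random(seed)
    base = 100.0 / resources                    # more resources -> faster service
    return base + rng.expovariate(1.0 / 10.0)   # plus random processing delays

def meets_chance_constraint(resources, target, alpha=0.90, reps=200):
    """Estimate P(turnaround <= target) and test it against the constraint."""
    hits = [simulate_turnaround(resources, s) <= target for s in range(reps)]
    p_hat = statistics.mean(hits)
    # Normal-approximation lower confidence bound on the success probability.
    half_width = 1.645 * (p_hat * (1.0 - p_hat) / reps) ** 0.5
    return (p_hat - half_width) >= alpha

def minimize_resources(target, low=1, high=50):
    """Smallest resource level satisfying the chance constraint (linear scan)."""
    for r in range(low, high + 1):
        if meets_chance_constraint(r, target):
            return r
    return None

print("Minimum resources meeting the chance constraint:",
      minimize_resources(target=35.0))
```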

  5. A data skimming service for locally resident analysis data

    NASA Astrophysics Data System (ADS)

    Cranshaw, J.; Gardner, R. W.; Gieraltowski, J.; Malon, D.; Mambelli, M.; May, E.

    2008-07-01

    A Data Skimming Service (DSS) is a site-level service for rapid event filtering and selection from locally resident datasets based on metadata queries to associated 'tag' databases. In US ATLAS, we expect most if not all of the AOD-based datasets to be replicated to each of the five Tier 2 regional facilities in the US Tier 1 'cloud' coordinated by Brookhaven National Laboratory. Entire datasets will consist of on the order of several terabytes of data, and providing easy, quick access to skimmed subsets of these data will be vital to physics working groups. Typically, physicists will be interested in portions of the complete datasets, selected according to event-level attributes (number of jets, missing Et, etc) and content (specific analysis objects for subsequent processing). In this paper we describe methods used to classify data (metadata tag generation) and to store these results in a local database. Next we discuss a general framework which includes methods for accessing this information, defining skims, specifying event output content, accessing locally available storage through a variety of interfaces (SRM, dCache/dccp, gridftp), accessing remote storage elements as specified, and user job submission tools through local or grid schedulers. The advantages of the DSS are the ability to quickly 'browse' datasets and design skims, for example, pre-adjusting cuts to get to a desired skim level with minimal use of compute resources, and to encode these analysis operations in a database for re-analysis and archival purposes. Additionally the framework has provisions to operate autonomously in the event that external, central resources are not available, and to provide, as a reduced package, a minimal skimming service tailored to the needs of small Tier 3 centres or individual users.
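
    A minimal sketch of the tag-based skimming idea follows: event-level metadata are stored in a local database, a skim is defined as a query over those tags, and only the selected events are read from the locally resident dataset. This is not the DSS code; the schema, cut values, and storage format are invented, and a real service would access storage through SRM/dCache/gridftp rather than an in-memory list.

```python
# Illustrative sketch (not the DSS implementation) of tag-based skimming:
# event-level metadata ("tags") live in a local database; a skim is a query over
# those tags, and matching events are then copied out of the local dataset.
# Column names and cut values are invented for the example.
import sqlite3

# Build a toy tag database: one row of metadata per event.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tags (run INT, event INT, n_jets INT, missing_et REAL)")
db.executemany("INSERT INTO tags VALUES (?, ?, ?, ?)", [
    (1, 1, 2, 15.0), (1, 2, 5, 80.0), (1, 3, 4, 55.0), (2, 1, 1, 5.0),
])

def define_skim(min_jets, min_met):
    """Return the (run, event) identifiers selected by the skim cuts."""
    rows = db.execute(
        "SELECT run, event FROM tags WHERE n_jets >= ? AND missing_et >= ?",
        (min_jets, min_met))
    return set(rows)

def apply_skim(dataset, selected):
    """Write out only the events whose identifiers passed the tag query."""
    return [ev for ev in dataset if (ev["run"], ev["event"]) in selected]

# Locally resident "AOD" events, keyed by the same identifiers as the tags.
local_dataset = [{"run": 1, "event": i, "payload": "..."} for i in (1, 2, 3)]
selected = define_skim(min_jets=4, min_met=50.0)
print(apply_skim(local_dataset, selected))   # events (1, 2) and (1, 3)
```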

  6. Bayesian selection of Markov models for symbol sequences: application to microsaccadic eye movements.

    PubMed

    Bettenbühl, Mario; Rusconi, Marco; Engbert, Ralf; Holschneider, Matthias

    2012-01-01

    Complex biological dynamics often generate sequences of discrete events which can be described as a Markov process. The order of the underlying Markovian stochastic process is fundamental for characterizing statistical dependencies within sequences. As an example for this class of biological systems, we investigate the Markov order of sequences of microsaccadic eye movements from human observers. We calculate the integrated likelihood of a given sequence for various orders of the Markov process and use this in a Bayesian framework for statistical inference on the Markov order. Our analysis shows that data from most participants are best explained by a first-order Markov process. This is compatible with recent findings of a statistical coupling of subsequent microsaccade orientations. Our method might prove to be useful for a broad class of biological systems.
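
    The Bayesian order-selection step can be illustrated compactly. Under a symmetric Dirichlet(1) prior on each row of transition probabilities, the integrated (marginal) likelihood of a symbol sequence for a Markov model of order k has a closed form in the context and transition counts; computing it for several candidate orders and comparing the values is the core of the inference. The sketch below uses an invented toy sequence and scores every order on the same symbols so that the comparison is fair.

```python
# Minimal sketch (uniform Dirichlet prior, alpha = 1) of Bayesian selection of
# the Markov order of a symbol sequence via the integrated likelihood.
from collections import Counter
from math import lgamma

def log_integrated_likelihood(seq, order, alphabet, start):
    """Log marginal likelihood of seq[start:] under a Markov model of the given
    order, with a symmetric Dirichlet(1) prior on each transition distribution."""
    K = len(alphabet)
    trans, ctx = Counter(), Counter()
    for i in range(start, len(seq)):
        c = tuple(seq[i - order:i])
        trans[(c, seq[i])] += 1
        ctx[c] += 1
    logml = 0.0
    for c, n in ctx.items():
        logml += lgamma(K) - lgamma(K + n)       # Dirichlet-multinomial term
        for s in alphabet:
            logml += lgamma(1 + trans[(c, s)])   # lgamma(1) = 0 in the denominator
    return logml

sequence = "LRRLRLLRRLRLRRLLRL" * 5        # toy sequence of microsaccade directions
alphabet = sorted(set(sequence))
max_order = 3                              # score the same symbols under every order
for k in range(max_order + 1):
    score = log_integrated_likelihood(sequence, k, alphabet, start=max_order)
    print(f"Markov order {k}: log integrated likelihood = {score:.1f}")
```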

  7. Simplifying operations with an uplink/downlink integration toolkit

    NASA Technical Reports Server (NTRS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes (often called 'closing the loop') in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non-real-time sources of information to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also to provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.

  8. Simplifying operations with an uplink/downlink integration toolkit

    NASA Astrophysics Data System (ADS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-11-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes (often called 'closing the loop') in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non-real-time sources of information to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also to provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.

  9. A Bridge between Traumatic Life Events and Losses by Death.

    ERIC Educational Resources Information Center

    Trolley, Barbara C.

    1994-01-01

    Provides support for connection between grief reactions and traumatic life events. Discusses reactions to life traumas (alcoholism, abuse, disability, divorce, infertility) within context of grief framework. Applies literature pertaining to responses to suicide and murder to traumatic life events. Discusses and dissolves proposed differences…

  10. Description and detection of burst events in turbulent flows

    NASA Astrophysics Data System (ADS)

    Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.

    2018-04-01

    A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique will use an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistencies. This latter information will be used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; however, it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), for varying levels of complexity.
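
    A rough sketch of the ingredients described above, under simplifying assumptions (a scalar toy signal, a fixed delay embedding, and grid-based coarse-graining in place of the clustering and graph-community steps): embed the series in phase space, discretize it into states, estimate state-transition probabilities, and rank the states by persistence, with the least persistent states acting as candidate precursors of bursts.

```python
# Rough sketch (not the authors' implementation): delay-embed a scalar series,
# coarse-grain the phase space into discrete states, estimate state-transition
# probabilities, and rank states by persistence (self-transition probability).
# The least persistent states are candidate precursors of burst events.
import numpy as np
from collections import Counter, defaultdict

rng = np.random.default_rng(0)
# Toy signal: quiescent noise interrupted by occasional bursts.
x = rng.normal(0.0, 0.3, 3000)
for start in range(200, 3000, 500):
    x[start:start + 20] += 3.0

def embed(series, dim=3, lag=2):
    """Delay-embed a scalar series into phase-space vectors of dimension dim."""
    n = len(series) - (dim - 1) * lag
    return np.column_stack([series[i * lag:i * lag + n] for i in range(dim)])

def coarse_grain(vectors, cell=1.0):
    """Map each phase-space vector to a discrete state (a grid cell)."""
    return [tuple(np.floor(v / cell).astype(int)) for v in vectors]

states = coarse_grain(embed(x))
transitions = defaultdict(Counter)
for a, b in zip(states[:-1], states[1:]):
    transitions[a][b] += 1

# Persistence of a state = probability of remaining in it at the next step.
persistence = {s: c[s] / sum(c.values()) for s, c in transitions.items()}
for s in sorted(persistence, key=persistence.get)[:5]:
    print(s, f"self-transition probability = {persistence[s]:.2f}")
```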

  11. Research and Evaluations of the Health Aspects of Disasters, Part II: The Disaster Health Conceptual Framework Revisited.

    PubMed

    Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Loretti, Alessandro

    2015-10-01

    A Conceptual Framework upon which the study of disasters can be organized is essential for understanding the epidemiology of disasters, as well as the interventions/responses undertaken. Application of the structure provided by the Conceptual Framework should facilitate the development of the science of Disaster Health. This Framework is based on deconstructions of the commonly used Disaster Management Cycle. The Conceptual Framework incorporates the steps that occur as a hazard progresses to a disaster. It describes an event that results from the changes in the release of energy from a hazard that may cause Structural Damages that in turn, may result in Functional Damages (decreases in levels of function) that produce needs (goods and services required). These needs can be met by the goods and services that are available during normal, day-to-day operations of the community, or the resources that are contained within the community's Response Capacity (ie, an Emergency), or by goods and services provided from outside of the affected area (outside response capacities). Whenever the Local Response Capacity is unable to meet the needs, and the Response Capacities from areas outside of the affected community are required, a disaster occurs. All responses, whether in the Relief or Recovery phases of a disaster, are interventions that use the goods, services, and resources contained in the Response Capacity (local or outside). Responses may be directed at preventing/mitigating further deterioration in levels of functions (damage control, deaths, injuries, diseases, morbidity, and secondary events) in the affected population and filling the gaps in available services created by Structural Damages (compromise in available goods, services, and/or resources; ie, Relief Responses), or may be directed toward returning the affected community and its components to the pre-event functional state (ie, Recovery Responses). Hazard Mitigation includes interventions designed to decrease the likelihood that a hazard will cause an event, and should an event occur, that the amount of energy released will be reduced. Capacity Building consists of all interventions undertaken before an event occurs in order to increase the resilience of the community to an event related to a hazard that exists in an area-at-risk. Resilience is the combination of the Absorbing, Buffering, and Response Capacities of a community-at-risk, and is enhanced through Capacity-Building efforts. A disaster constitutes a failure of resilience.

  12. When parsimony is not enough: Considering dual processes and dual levels of influence in sexual decision making

    PubMed Central

    Rendina, H. Jonathon

    2015-01-01

    The literature on sexual decision making that has been used to understand behaviors relevant to HIV and STI risk has relied primarily on cognitive antecedents of behavior. In contrast, several prominent models of decision making outside of the sexual behavior literature rely on dual process models, in which both affective and cognitive processing are considered important precursors to behavior. Moreover, much of the literature on sexual behavior utilizes individual-level traits and characteristics to predict aggregated sexual behavior, despite decision making itself being a situational or event-level process. This paper proposes a framework for understanding sexual decision making as the result of dual processes (affective and cognitive) operating at dual levels of influence (individual and situational). Finally, the paper ends with a discussion of the conceptual and methodological benefits and challenges to its use and future directions for research. PMID:26168978

  13. Detection and characterization of lightning-based sources using continuous wavelet transform: application to audio-magnetotellurics

    NASA Astrophysics Data System (ADS)

    Larnier, H.; Sailhac, P.; Chambodut, A.

    2018-01-01

    Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of the continuous wavelet transform. We focus specifically on three types of sources, namely atmospherics, slow tails, and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it to real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, an overseas French territory. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess the quality of response functions obtained through processing.
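
    The detection step can be illustrated with a generic continuous-wavelet-transform sketch (not the authors' pipeline, which also recovers polarization attributes): convolve the signal with Morlet wavelets at a few analysis frequencies and locate the times where the transform magnitude stands out. The sampling rate, analysis frequencies, and injected transient below are all invented.

```python
# Generic CWT detection sketch: convolve a noisy series with Morlet wavelets and
# find the time where the wavelet magnitude peaks (an impulsive "event").
import numpy as np

def morlet(freq, fs, n_cycles=6.0):
    """Complex Morlet wavelet centred on `freq` (Hz), sampled at rate fs."""
    sigma = n_cycles / (2 * np.pi * freq)
    t = np.arange(-4 * sigma, 4 * sigma, 1.0 / fs)
    return np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))

def cwt_magnitude(signal, freqs, fs):
    """|CWT| of the signal at each analysis frequency (rows) and time (columns)."""
    return np.array([np.abs(np.convolve(signal, morlet(f, fs), mode="same"))
                     for f in freqs])

fs = 1000.0                                        # sampling rate in Hz (assumed)
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(1)
signal = rng.normal(0.0, 1.0, t.size)
signal[500:520] += 8.0 * np.sin(2 * np.pi * 60.0 * t[500:520])   # injected transient

scalogram = cwt_magnitude(signal, freqs=[30.0, 60.0, 120.0], fs=fs)
energy = scalogram.max(axis=0)                     # strongest response across scales
peak = int(energy.argmax())
print(f"strongest transient detected near sample {peak} (t = {peak / fs:.3f} s)")
```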

  14. The media and genetically modified foods: evidence in support of social amplification of risk.

    PubMed

    Frewer, Lynn J; Miles, Susan; Marsh, Roy

    2002-08-01

    Empirical examinations of the "social amplification of risk" framework are rare, partly because of the difficulties in predicting when conditions likely to result in amplification effects will occur. This means that it is difficult to examine changes in risk perception that are contemporaneous with increases and/or decreases in social or media discussion of the risks associated with a particular risk event. However, the collection of attitude data before, during, and after the increased reporting of the risks of genetically modified food in the United Kingdom (spring 1999) has demonstrated that people's risk perceptions do increase and decrease in line with what might be expected upon examination of the amplification and attenuation mechanisms integral to the framework. Perceptions of benefit, however, appeared to be permanently depressed by negative reporting about genetically modified food. Trust in regulatory institutions with responsibility for protecting the public was not affected. It was concluded that the social amplification of risk framework is a useful framework for beginning to explain the potential impact on risk perceptions of a risk event, particularly if that risk event is presented to the public as a new hazard occurring in a crisis context.

  15. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches, including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the developed approach is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection system framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which is neglected in many existing event detection models, is confirmed to be a crucial part of the system and can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
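
    The fusion idea can be sketched as a plain logit model: the per-indicator alarm probabilities are the inputs, and a logistic model fitted by maximum likelihood produces a single event probability that is thresholded into a fused alarm. The sketch below uses synthetic data and gradient ascent rather than the genetic-algorithm calibration described in the record; indicator names and numbers are invented.

```python
# Conceptual sketch (not the published system): fuse per-indicator alarm
# probabilities into one event decision with a logit model fitted by maximum
# likelihood. Indicator names and data are invented.
import numpy as np

rng = np.random.default_rng(2)
n = 400
# Columns: single-indicator event probabilities (e.g. chlorine, turbidity, pH).
X = rng.uniform(0.0, 1.0, size=(n, 3))
true_w, true_b = np.array([3.0, 2.0, 0.5]), -3.5
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-(X @ true_w + true_b)))).astype(float)

def fit_logit(X, y, lr=0.5, iters=2000):
    """Gradient ascent on the log-likelihood of a logistic (logit) model."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(iters):
        p = 1 / (1 + np.exp(-(X @ w + b)))
        w += lr * X.T @ (y - p) / len(y)
        b += lr * np.mean(y - p)
    return w, b

w, b = fit_logit(X, y)
p_event = 1 / (1 + np.exp(-(X @ w + b)))
alarm = p_event > 0.5                          # fused alarm decision
print("fitted weights:", np.round(w, 2), "accuracy:", np.mean(alarm == y))
```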

  16. [Change in the event-related skin conductivity: an indicator of the immediate importance of elaborate information processing?].

    PubMed

    Zimmer, H

    1992-01-01

    In recent psychophysiological conceptualizations of the orienting response (OR) within the framework of information processing, the OR is increasingly considered a "call for processing resources", something which is especially inferred from variations in the event-related skin conductance response (SCR). The present study, therefore, was concerned with certain implications arising from this framework or perspective, particularly in regard to the question of whether stimuli eliciting skin conductance responses obligatorily receive/evoke processing priority or not. In order to examine whether these electrodermal responses denote a capturing of attention or merely a call for processing resources, short (1 s) pure sine tones of 65 dB with sudden onset (commonly used as orienting stimuli) were inserted in a reaction time paradigm with an additional memory load. This demand was primarily given because memory processes play a key role in theories of orienting and habituation. The task was run under two different conditions of complexity, factorially combined with a novelty variation of the added auditory stimuli. The results revealed a substantial deterioration of task performance subsequent to the occurrence of the tones, which, however, was dependent on task complexity and on novelty of the tones. The task impairment is particularly remarkable as subjects were asked to avoid distractions by paying attention to the task and as the tones were introduced as subsidiary and task-irrelevant. Together with the missing effects of task complexity on phasic and tonic electrodermal activity, results suggest that information-processing conceptualizations of the OR can only be a meaningful heuristic contribution to theoretical developments about human orienting and its habituation if the setting of processing priority, its conditions, as well as its implications are adequately taken into account. In addition, it seems to be promising to consider the strength of the SCR as an index of urgency of elaborate, attention-demanding processing and not as a peripheral physiological manifestation of the OR, or, respectively, of a call for unspecific processing resources. Such a view would also do justice to the aspect of prioritization. The sufficient conditions for an OR's occurrence could, in this context, be equated with, among others, some of those which activate a mechanism subserving selective attention and, as a possible result, which lead to further and more elaborate processing of potentially important information.

  17. The neurobiology of syntax: beyond string sets.

    PubMed

    Petersson, Karl Magnus; Hagoort, Peter

    2012-07-19

    The human capacity to acquire language is an outstanding scientific challenge to understand. Somehow our language capacities arise from the way the human brain processes, develops and learns in interaction with its environment. To set the stage, we begin with a summary of what is known about the neural organization of language and what our artificial grammar learning (AGL) studies have revealed. We then review the Chomsky hierarchy in the context of the theory of computation and formal learning theory. Finally, we outline a neurobiological model of language acquisition and processing based on an adaptive, recurrent, spiking network architecture. This architecture implements an asynchronous, event-driven, parallel system for recursive processing. We conclude that the brain represents grammars (or more precisely, the parser/generator) in its connectivity, and its ability for syntax is based on neurobiological infrastructure for structured sequence processing. The acquisition of this ability is accounted for in an adaptive dynamical systems framework. Artificial language learning (ALL) paradigms might be used to study the acquisition process within such a framework, as well as the processing properties of the underlying neurobiological infrastructure. However, it is necessary to combine and constrain the interpretation of ALL results by theoretical models and empirical studies on natural language processing. Given that the faculty of language is captured by classical computational models to a significant extent, and that these can be embedded in dynamic network architectures, there is hope that significant progress can be made in understanding the neurobiology of the language faculty.

  18. The neurobiology of syntax: beyond string sets

    PubMed Central

    Petersson, Karl Magnus; Hagoort, Peter

    2012-01-01

    The human capacity to acquire language is an outstanding scientific challenge to understand. Somehow our language capacities arise from the way the human brain processes, develops and learns in interaction with its environment. To set the stage, we begin with a summary of what is known about the neural organization of language and what our artificial grammar learning (AGL) studies have revealed. We then review the Chomsky hierarchy in the context of the theory of computation and formal learning theory. Finally, we outline a neurobiological model of language acquisition and processing based on an adaptive, recurrent, spiking network architecture. This architecture implements an asynchronous, event-driven, parallel system for recursive processing. We conclude that the brain represents grammars (or more precisely, the parser/generator) in its connectivity, and its ability for syntax is based on neurobiological infrastructure for structured sequence processing. The acquisition of this ability is accounted for in an adaptive dynamical systems framework. Artificial language learning (ALL) paradigms might be used to study the acquisition process within such a framework, as well as the processing properties of the underlying neurobiological infrastructure. However, it is necessary to combine and constrain the interpretation of ALL results by theoretical models and empirical studies on natural language processing. Given that the faculty of language is captured by classical computational models to a significant extent, and that these can be embedded in dynamic network architectures, there is hope that significant progress can be made in understanding the neurobiology of the language faculty. PMID:22688633

  19. Nearly suppressed photoluminescence blinking of small-sized, blue-green-orange-red emitting single CdSe-based core/gradient alloy shell/shell quantum dots: correlation between truncation time and photoluminescence quantum yield.

    PubMed

    Roy, Debjit; Mandal, Saptarshi; De, Chayan K; Kumar, Kaushalendra; Mandal, Prasun K

    2018-04-18

    CdSe-based core/gradient alloy shell/shell semiconductor quantum dots (CGASS QDs) have been shown to be optically quite superior compared to core-shell QDs. However, very little is known about CGASS QDs at the single particle level. Photoluminescence blinking dynamics of four differently emitting (blue (λem = 510), green (λem = 532), orange (λem = 591), and red (λem = 619)) single CGASS QDs having average sizes <∼7 nm have been probed in our home-built total internal reflection fluorescence (TIRF) microscope. All four samples possess an average ON-fraction of 0.70-0.85, which hints towards nearly suppressed PL blinking in these gradiently alloyed systems. Suppression of blinking has been so far achieved with QDs having sizes greater than 10 nm and mostly emitting in the red region (λem > 600 nm). In this manuscript, we report nearly suppressed PL blinking behaviour of CGASS QDs with average sizes <∼7 nm and emitting in the entire range of the visible spectrum, i.e. from blue to green to orange to red. The probability density distribution of both ON- and OFF-event durations for all of these CGASS QDs could be fitted well with a modified inverse truncated power law with an additional exponential model equation. It has been found that unlike most of the literature reports, the power law exponent for OFF-event durations is greater than the power law exponent for ON-event durations for all four samples. This suggests that relatively large ON-event durations are interrupted by comparatively small OFF-event durations. This in turn is indicative of a suppressed non-radiative Auger recombination process for these CGASS systems. However, in these four different samples the ON-event truncation time varies inversely with the OFF-event truncation time, which hints that both the ON- and OFF-event truncation processes are dictated by some common factor. We have employed 2D joint probability distribution analysis to probe the correlation between the event durations and found that residual memory exists in both the ON- and OFF-event durations. Positively correlated successive ON-ON and OFF-OFF event durations and negatively correlated (anti-correlated) ON-OFF event durations perhaps suggest the involvement of more than one type of trapping process within the blinking framework. The timescale corresponding to the additional exponential term has been assigned to hole trapping for ON-event duration statistics. Similarly, for OFF-event duration statistics, this component suggests hole detrapping. We found that the average duration of the exponential process for the ON-event durations is an order of magnitude higher than that of the OFF-event durations. This indicates that the holes are trapped for a significantly long time. When electron trapping is followed by such a hole trapping, long ON-event durations result. We have observed long ON-event durations, as high as 50 s. The competing charge tunnelling model has been used to account for the observed blinking behaviour in these CGASS QDs. Quite interestingly, the PLQY of all of these differently emitting QDs (an ensemble level property) could be correlated with the truncation time (a property at the single particle level). A respective concomitant increase-decrease of ON-OFF event truncation times with increasing PLQY is also indicative of a varying degree of suppression of the Auger recombination processes in these four different CGASS QDs.
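
    The duration statistics described here follow a power law with an exponential cutoff, P(t) proportional to t^(-alpha) exp(-t/tau), where tau is the truncation time. As a hedged illustration (not the authors' analysis; all numbers are synthetic), the sketch below simulates durations from such a law and recovers alpha and tau from a log-binned histogram by linear least squares.

```python
# Illustrative sketch (not the authors' analysis): model ON/OFF durations with a
# power law truncated by an exponential, P(t) ~ t**(-alpha) * exp(-t / tau), and
# recover alpha and the truncation time tau from a log-binned histogram of
# simulated durations by linear least squares. All numbers are invented.
import numpy as np

rng = np.random.default_rng(3)
alpha_true, tau_true = 1.6, 5.0                 # exponent and truncation time (s)

# Draw durations from the target density on [0.01 s, 50 s] via a fine grid.
t_grid = np.logspace(-2, np.log10(50.0), 200_000)
density = t_grid ** -alpha_true * np.exp(-t_grid / tau_true)
prob = density * np.gradient(t_grid)            # probability mass per grid point
durations = rng.choice(t_grid, size=20_000, p=prob / prob.sum())

# Log-spaced bins give a usable histogram across several decades of duration.
edges = np.logspace(-2, np.log10(50.0), 30)
counts, _ = np.histogram(durations, bins=edges, density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
mask = counts > 0

# log P(t) = c - alpha*log(t) - t/tau  is linear in (c, alpha, 1/tau).
A = np.column_stack([np.ones(mask.sum()), -np.log(centers[mask]), -centers[mask]])
coef, *_ = np.linalg.lstsq(A, np.log(counts[mask]), rcond=None)
print(f"alpha ~ {coef[1]:.2f} (true {alpha_true}), "
      f"tau ~ {1.0 / coef[2]:.2f} s (true {tau_true})")
```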

  20. Bayes Nets and Babies: Infants' Developing Statistical Reasoning Abilities and Their Representation of Causal Knowledge

    ERIC Educational Resources Information Center

    Sobel, David M.; Kirkham, Natasha Z.

    2007-01-01

    A fundamental assumption of the causal graphical model framework is the Markov assumption, which posits that learners can discriminate between two events that are dependent because of a direct causal relation between them and two events that are independent conditional on the value of another event(s). Sobel and Kirkham (2006) demonstrated that…

  1. Forecasts, warnings and social response to flash floods: Is temporality a major problem? The case of the September 2005 flash flood in the Gard region (France)

    NASA Astrophysics Data System (ADS)

    Lutoff, C.; Anquetin, S.; Ruin, I.; Chassande, M.

    2009-09-01

    Flash floods are complex phenomena. The atmospheric and hydrological mechanisms that generate them are not completely understood, leading to highly uncertain forecasts of, and warnings for, these events. On the other hand, warning and crisis response to such violent and fast events is not a straightforward process. In both the social and the physical aspects of the problem, the space and time scales involved in hydrometeorology, human behavior, and social organization are of crucial importance. Forecasters, emergency managers, mayors, school superintendents, school transportation managers, first responders, and road users all have different time and space frameworks that they use to make emergency decisions for themselves, their group, or their community. Integrating the space and time scales of both the phenomenon and human activities is therefore necessary to better address questions such as forecasting lead time and warning efficiency. The aim of this oral presentation is to focus on the spatio-temporal aspects of flash floods in order to improve our understanding of the event dynamics relative to the different scales of the social response. The authors propose a framework of analysis to compare the temporality of: i) the forecasts (from Météo-France and from EFAS (Thielen et al., 2008)), ii) the meteorological and hydrological parameters, and iii) the social response at different scales. The September 2005 event is particularly interesting for such an analysis. The rainfall episode lasted nearly a week, with two distinct phases separated by low-intensity precipitation. The Météo-France vigilance bulletins were therefore somewhat disconnected from the local flood impacts. Our analysis focuses on the timings of the different types of local response, including the delicate issue of school transportation, with regard to the forecasts and the actual dynamics of the event.

  2. A Framework of Knowledge Integration and Discovery for Supporting Pharmacogenomics Target Predication of Adverse Drug Events: A Case Study of Drug-Induced Long QT Syndrome.

    PubMed

    Jiang, Guoqian; Wang, Chen; Zhu, Qian; Chute, Christopher G

    2013-01-01

    Knowledge-driven text mining is becoming an important research area for identifying pharmacogenomics target genes. However, few such studies have focused on the pharmacogenomics targets of adverse drug events (ADEs). The objective of the present study is to build a framework of knowledge integration and discovery that aims to support pharmacogenomics target prediction of ADEs. We integrate a semantically annotated literature corpus, Semantic MEDLINE, with a semantically coded ADE knowledgebase known as ADEpedia using a semantic web based framework. We developed a knowledge discovery approach combining a network analysis of a protein-protein interaction (PPI) network and a gene functional classification approach. We performed a case study of drug-induced long QT syndrome for demonstrating the usefulness of the framework in predicting potential pharmacogenomics targets of ADEs.

  3. Application of a framework for extrapolating chemical effects ...

    EPA Pesticide Factsheets

    Cross-species extrapolation of toxicity data from limited surrogate test organisms to all wildlife with potential of chemical exposure remains a key challenge in ecological risk assessment. A number of factors affect extrapolation, including the chemical exposure, pharmacokinetics, life-stage, and pathway similarities/differences. Here we propose a framework using a tiered approach for species extrapolation that enables a transparent weight-of-evidence driven evaluation of pathway conservation (or lack thereof) in the context of adverse outcome pathways. Adverse outcome pathways describe the linkages from a molecular initiating event, defined as the chemical-biomolecule interaction, through subsequent key events leading to an adverse outcome of regulatory concern (e.g., mortality, reproductive dysfunction). Tier 1 of the extrapolation framework employs in silico evaluations of sequence and structural conservation of molecules (e.g., receptors, enzymes) associated with molecular initiating events or upstream key events. Such evaluations make use of available empirical and sequence data to assess taxonomic relevance. Tier 2 uses in vitro bioassays, such as enzyme inhibition/activation, competitive receptor binding, and transcriptional activation assays to explore functional conservation of pathways across taxa. Finally, Tier 3 provides a comparative analysis of in vivo responses between species utilizing well-established model organisms to assess departure from

  4. Event Driven Messaging with Role-Based Subscriptions

    NASA Technical Reports Server (NTRS)

    Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Zendejas, Silvino; Sadaqathulla, Syed

    2009-01-01

    Event Driven Messaging with Role-Based Subscriptions (EDM-RBS) is a framework integrated into the Service Management Database (SMDB) to allow for role-based and subscription-based delivery of synchronous and asynchronous messages over JMS (Java Message Service), SMTP (Simple Mail Transfer Protocol), or SMS (Short Messaging Service). This allows for 24/7 operation with users in all parts of the world. The software classifies messages by triggering data type, application source, owner of the data triggering the event (mission), classification, sub-classification, and various other secondary classifying tags. Messages are routed to applications or users based on subscription rules using a combination of the above message attributes. This program provides a framework for identifying connected users and their applications for targeted delivery of messages over JMS to the client applications the user is logged into. EDM-RBS provides the ability to send notifications over e-mail or pager rather than having to rely on a live human to do it. It is implemented as an Oracle application that uses Oracle relational database management system intrinsic functions. It is configurable to use the Oracle AQ JMS API or an external JMS provider for messaging. It fully integrates into the event-logging framework of SMDB (Subnet Management Database).
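
    The routing logic can be sketched independently of Oracle, JMS, or SMTP: each subscription names a role, a delivery channel, and a set of message-attribute filters, and an incoming event is delivered on every channel whose filters all match. The roles, channels, and attributes below are invented; this is not the EDM-RBS implementation.

```python
# Simplified sketch (invented API, not the EDM-RBS implementation) of
# role/subscription based routing: a subscription carries a role, a delivery
# channel, and attribute filters; a message is delivered wherever it matches.
from dataclasses import dataclass, field

@dataclass
class Subscription:
    role: str
    channel: str                       # e.g. "jms", "smtp", "sms"
    filters: dict = field(default_factory=dict)

    def matches(self, message: dict) -> bool:
        return all(message.get(k) == v for k, v in self.filters.items())

SUBSCRIPTIONS = [
    Subscription("ops_controller", "jms", {"classification": "alarm"}),
    Subscription("mission_manager", "smtp", {"mission": "MRO", "classification": "alarm"}),
    Subscription("on_call_engineer", "sms", {"classification": "alarm", "severity": "critical"}),
]

def route(message: dict):
    """Return the (role, channel) pairs that should receive this message."""
    return [(s.role, s.channel) for s in SUBSCRIPTIONS if s.matches(message)]

event = {"source": "telemetry_monitor", "mission": "MRO",
         "classification": "alarm", "severity": "critical"}
for role, channel in route(event):
    print(f"deliver to {role} via {channel}")
```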

  5. Climate change and mental health: a causal pathways framework.

    PubMed

    Berry, Helen Louise; Bowen, Kathryn; Kjellstrom, Tord

    2010-04-01

    Climate change will bring more frequent, longer lasting, and more severe adverse weather events, and these changes will affect mental health. We propose an explanatory framework to enhance consideration of how these effects may operate and to encourage debate about this important aspect of the health impacts of climate change. Literature review. Climate change may affect mental health directly by exposing people to trauma. It may also affect mental health indirectly, by affecting (1) physical health (for example, extreme heat exposure causes heat exhaustion in vulnerable people, and associated mental health consequences) and (2) community wellbeing. Within community wellbeing is a sub-process in which climate change erodes physical environments which, in turn, damage social environments. Vulnerable people and places, especially in low-income countries, will be particularly badly affected. Different aspects of climate change may affect mental health through direct and indirect pathways, leading to serious mental health problems, possibly including increased suicide mortality. We propose that it is helpful to integrate these pathways in an explanatory framework, which may assist in developing public health policy, practice and research.

  6. Web Video Event Recognition by Semantic Analysis From Ubiquitous Documents.

    PubMed

    Yu, Litao; Yang, Yang; Huang, Zi; Wang, Peng; Song, Jingkuan; Shen, Heng Tao

    2016-12-01

    In recent years, the task of event recognition from videos has attracted increasing interest in the multimedia area. While most of the existing research has mainly focused on exploring visual cues to handle relatively small-granular events, it is difficult to directly analyze video content without any prior knowledge. Therefore, synthesizing both the visual and semantic analysis is a natural way for video event understanding. In this paper, we study the problem of Web video event recognition, where Web videos often describe large-granular events and carry limited textual information. Key challenges include how to accurately represent event semantics from incomplete textual information and how to effectively explore the correlation between visual and textual cues for video event understanding. We propose a novel framework to perform complex event recognition from Web videos. In order to compensate for the insufficient expressive power of visual cues, we construct an event knowledge base by deeply mining semantic information from ubiquitous Web documents. This event knowledge base is capable of describing each event with comprehensive semantics. By utilizing this base, the textual cues for a video can be significantly enriched. Furthermore, we introduce a two-view adaptive regression model, which explores the intrinsic correlation between the visual and textual cues of the videos to learn reliable classifiers. Extensive experiments on two real-world video data sets show the effectiveness of our proposed framework and prove that the event knowledge base indeed helps improve the performance of Web video event recognition.

  7. Modeling sports highlights using a time-series clustering framework and model interpretation

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay

    2005-01-01

    In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events in a background of a "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and the commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.
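
    A toy version of the modeling idea (not the MDL-GMM system itself): fit a small Gaussian mixture to a one-dimensional audio feature with plain EM, then treat the low-weight component as the "highlight" audio class and flag the frames it explains. The feature values, number of components, and thresholds below are invented.

```python
# Toy sketch of the idea (not the MDL-GMM system described above): fit a small
# 1-D Gaussian mixture with plain EM, then treat the low-weight component as
# the "highlight" class and flag the frames it explains. Data are simulated.
import numpy as np

rng = np.random.default_rng(4)
# Background frames (e.g. commentary level) plus rare loud audience reactions.
features = np.concatenate([rng.normal(0.0, 1.0, 1900), rng.normal(6.0, 0.5, 100)])

def fit_gmm_1d(x, k=2, iters=100):
    """Plain EM for a one-dimensional Gaussian mixture with k components."""
    mu = np.quantile(x, np.linspace(0.0, 1.0, k))          # spread initial means
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        dens = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: update weights, means, and variances.
        nk = resp.sum(axis=0)
        w, mu = nk / len(x), (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return w, mu, var

w, mu, var = fit_gmm_1d(features)
# The low-prior-weight component plays the role of the "highlight" audio class.
rare = int(np.argmin(w))
dens = w * np.exp(-(features[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
resp = dens / dens.sum(axis=1, keepdims=True)
highlight_frames = np.flatnonzero(resp[:, rare] > 0.5)
print(f"highlight component: mean {mu[rare]:.2f}, weight {w[rare]:.2f}, "
      f"{highlight_frames.size} frames flagged")
```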

  8. Updating the Framework Geology of Padre Island National Seashore: Validation of Geophysical Surveys through Sediment Cores

    NASA Astrophysics Data System (ADS)

    Tuttle, L. F., II; Wernette, P. A.; Houser, C.

    2016-12-01

    Framework geology has been demonstrated to influence the geomorphology and affect the response of barrier islands to extreme storm events. Therefore, it is vital that we understand the framework geology before we can accurately assess the vulnerability and resiliency of the coast. Geophysical surveys consisting of ground-penetrating radar (GPR) and electromagnetic induction (EMI) were collected along the length of Padre Island National Seashore (PAIS) to map subsurface infilled paleochannels identified in previous research. The most extensive published survey of PAIS framework geology was conducted in the 1950s as part of dredging the Intracoastal Waterway through Laguna Madre. Using cores and seismic surveys, the previous study identified a series of relict infilled paleochannels dissecting PAIS. The sediment cores presented in our poster were collected in Fall 2016 with a Geoprobe 6712DT. Cores were stored and processed using an X-ray fluorescence (XRF) scanner at the International Ocean Discovery Program repository in College Station, Texas. The XRF data were used to examine mineralogical differences that provide valuable insight into the evolutionary history of the island. This poster presents results from sediment cores collected to validate the geophysical survey data. The broader purpose of this research is to validate the subsurface framework geology features (i.e. infilled paleochannels) in order to more accurately predict future changes to the environmental and economic longevity of PAIS.

  9. Testing decision rules for categorizing species' extinction risk to help develop quantitative listing criteria for the U.S. Endangered Species Act.

    PubMed

    Regan, Tracey J; Taylor, Barbara L; Thompson, Grant G; Cochrane, Jean Fitts; Ralls, Katherine; Runge, Michael C; Merrick, Richard

    2013-08-01

    Lack of guidance for interpreting the definitions of endangered and threatened in the U.S. Endangered Species Act (ESA) has resulted in case-by-case decision making leaving the process vulnerable to being considered arbitrary or capricious. Adopting quantitative decision rules would remedy this but requires the agency to specify the relative urgency concerning extinction events over time, cutoff risk values corresponding to different levels of protection, and the importance given to different types of listing errors. We tested the performance of 3 sets of decision rules that use alternative functions for weighting the relative urgency of future extinction events: a threshold rule set, which uses a decision rule of x% probability of extinction over y years; a concave rule set, where the relative importance of future extinction events declines exponentially over time; and a shoulder rule set that uses a sigmoid shape function, where relative importance declines slowly at first and then more rapidly. We obtained decision cutoffs by interviewing several biologists and then emulated the listing process with simulations that covered a range of extinction risks typical of ESA listing decisions. We evaluated performance of the decision rules under different data quantities and qualities on the basis of the relative importance of misclassification errors. Although there was little difference between the performance of alternative decision rules for correct listings, the distribution of misclassifications differed depending on the function used. Misclassifications for the threshold and concave listing criteria resulted in more overprotection errors, particularly as uncertainty increased, whereas errors for the shoulder listing criteria were more symmetrical. We developed and tested the framework for quantitative decision rules for listing species under the U.S. ESA. If policy values can be agreed on, use of this framework would improve the implementation of the ESA by increasing transparency and consistency. Conservation Biology © 2013 Society for Conservation Biology No claim to original US government works.
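
    The three weighting schemes can be made concrete with a short sketch: given an annual extinction-probability profile from a population viability analysis, each rule set weights future extinction events differently (a hard threshold at y years, an exponentially decaying weight, or a sigmoid "shoulder") before comparing the weighted risk to listing cutoffs. The profile, parameters, and cutoff values below are invented, not the values elicited in the study.

```python
# Hedged sketch of the three weighting schemes (threshold, concave/exponential,
# sigmoid "shoulder") applied to a toy annual extinction-probability profile.
# The horizon, profile, parameters, and cutoffs are all invented.
import math

HORIZON = 100                                   # years considered
# Toy annual (incremental) extinction probabilities from a hypothetical PVA.
annual_ext = [0.004 if t < 40 else 0.001 for t in range(HORIZON)]

def weight_threshold(t, y=20):                  # "x% within y years" rule
    return 1.0 if t < y else 0.0

def weight_concave(t, rate=0.05):               # importance decays exponentially
    return math.exp(-rate * t)

def weight_shoulder(t, mid=40, steep=0.15):     # slow decline, then rapid
    return 1.0 / (1.0 + math.exp(steep * (t - mid)))

def weighted_risk(weight_fn):
    return sum(weight_fn(t) * p for t, p in enumerate(annual_ext))

CUTOFFS = {"endangered": 0.10, "threatened": 0.05}   # invented cutoff risks

for name, fn in [("threshold", weight_threshold),
                 ("concave", weight_concave),
                 ("shoulder", weight_shoulder)]:
    risk = weighted_risk(fn)
    status = next((s for s, c in CUTOFFS.items() if risk >= c), "not listed")
    print(f"{name:9s} weighted risk = {risk:.3f} -> {status}")
```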

  10. Decision-making in crisis: Applying a healthcare triage methodology to business continuity management.

    PubMed

    Moore, Bethany; Bone, Eric A

    2017-01-01

    The concept of triage in healthcare has been around for centuries and continues to be applied today so that scarce resources are allocated according to need. A business impact analysis (BIA) is a form of triage in that it identifies which processes are most critical, which to address first and how to allocate limited resources. On its own, however, the BIA provides only a roadmap of the impacts and interdependencies of an event. When disaster strikes, organisational decision-makers often face difficult decisions with regard to allocating limited resources between multiple 'mission-critical' functions. Applying the concept of triage to business continuity provides those decision-makers navigating a rapidly evolving and unpredictable event with a path that protects the fundamental priorities of the organisation. A business triage methodology aids decision-makers in times of crisis by providing a simplified framework for decision-making based on objective, evidence-based criteria, which is universally accepted and understood. When disaster strikes, the survival of the organisation depends on critical decision-making and quick actions to stabilise the incident. This paper argues that organisations need to supplement BIA processes with a decision-making triage methodology that can be quickly applied during the chaos of an actual event.

  11. Breaking Ice: Fracture Processes in Floating Ice on Earth and Elsewhere

    NASA Astrophysics Data System (ADS)

    Scambos, T. A.

    2016-12-01

    Rapid, intense fracturing events in the ice shelves of the Antarctic Peninsula reveal a set of processes that were not fully appreciated prior to the series of ice shelf break-ups observed in the late 1990s and early 2000s. A series of studies have uncovered a fascinating array of relationships between climate, ocean, and ice: intense widespread hydrofracture; repetitive hydrofracture induced by ice plate bending; the ability for sub-surface flooded firn to support hydrofracture; potential triggering by long-period wave action; accelerated fracturing by trapped tsunamic waves; iceberg disintegration, and a remarkable ice rebound process from lake drainage that resembles runaway nuclear fission. The events and subsequent studies have shown that rapid regional warming in ice shelf areas leads to catastrophic changes in a previously stable ice mass. More typical fracturing of thick ice plates is a natural consequence of ice flow in a complex geographic setting, i.e., it is induced by shear and divergence of spreading plate flow around obstacles. While these are not a result of climate or ocean change, weather and ocean processes may impact the exact timing of final separation of an iceberg from a shelf. Taking these terrestrial perspectives to other ice-covered ocean worlds, cautiously, provides an observational framework for interpreting features on Europa and Enceladus.

  12. Assessment of wildland fire impacts on watershed annual water yield: Analytical framework and case studies in the United States

    DOE PAGES

    Hallema, Dennis W.; Sun, Ge; Caldwell, Peter V.; ...

    2016-11-29

    More than 50% of water supplies in the conterminous United States originate on forestland or rangeland and are potentially under increasing stress as a result of larger and more severe wildfires. Little is known, however, about the long-term impacts of fire on annual water yield and the role of climate variability within this context. We here propose a framework for evaluating wildland fire impacts on streamflow that combines double-mass analysis with new methods (change point analysis, climate elasticity modeling, and process-based modeling) to distinguish between multiyear fire and climate impacts. The framework captures a wide range of fire types, watershed characteristics, and climate conditions using streamflow data, as opposed to other approaches requiring paired watersheds. The process is illustrated with three case studies. A watershed in Arizona experienced a +266% increase in annual water yield in the 5 years after a wildfire, where +219% was attributed to wildfire and +24% to precipitation trends. In contrast, a California watershed had a lower (–64%) post-fire net water yield, comprised of enhanced flow (+38%) attributed to wildfire offset (–102%) by lower precipitation in the post-fire period. Changes in streamflow within a watershed in South Carolina had no apparent link to periods of prescribed burning but matched a very wet winter and reports of storm damage. As a result, the presented framework is unique in its ability to detect and quantify fire or other disturbances, even if the date or nature of the disturbance event is uncertain, and regardless of precipitation trends.
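
    The double-mass and change-point steps of the framework can be sketched as follows (a conceptual toy, not the authors' code; the synthetic data assume a step change in runoff ratio after a hypothetical fire in year 15): fit the baseline relationship between cumulative streamflow and cumulative precipitation, compute annual departures from it, and locate the year where a simple two-segment model best splits those departures.

```python
# Conceptual toy (not the authors' code): double-mass baseline plus a simple
# change-point search on the annual departures from that baseline.
import numpy as np

rng = np.random.default_rng(6)
years = np.arange(30)
precip = rng.normal(1000.0, 150.0, years.size)            # mm/yr
runoff_ratio = np.where(years < 15, 0.30, 0.45)           # fire raises water yield
flow = runoff_ratio * precip + rng.normal(0.0, 20.0, years.size)

# Double-mass analysis: baseline slope of cumulative flow vs cumulative
# precipitation over an assumed undisturbed period (first 10 years).
cum_p, cum_q = np.cumsum(precip), np.cumsum(flow)
slope = cum_q[9] / cum_p[9]
departure = flow - slope * precip                          # annual departure (mm)

def change_point(x):
    """Split year minimising within-segment variance (simple two-mean model)."""
    cost = [k * np.var(x[:k]) + (len(x) - k) * np.var(x[k:])
            for k in range(2, len(x) - 1)]
    return int(np.argmin(cost)) + 2

cp = change_point(departure)
shift = departure[cp:].mean()
print(f"change point detected at year {cp}; mean post-change departure "
      f"{shift:+.0f} mm/yr relative to the baseline double-mass relationship")
```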

  13. Assessment of wildland fire impacts on watershed annual water yield: Analytical framework and case studies in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallema, Dennis W.; Sun, Ge; Caldwell, Peter V.

    More than 50% of water supplies in the conterminous United States originate on forestland or rangeland and are potentially under increasing stress as a result of larger and more severe wildfires. Little is known, however, about the long-term impacts of fire on annual water yield and the role of climate variability within this context. Here we propose a framework for evaluating wildland fire impacts on streamflow that combines double-mass analysis with new methods (change point analysis, climate elasticity modeling, and process-based modeling) to distinguish between multiyear fire and climate impacts. The framework captures a wide range of fire types, watershed characteristics, and climate conditions using streamflow data, as opposed to other approaches requiring paired watersheds. The process is illustrated with three case studies. A watershed in Arizona experienced a +266% change in annual water yield in the 5 years after a wildfire, of which +219% was attributed to wildfire and +24% to precipitation trends. In contrast, a California watershed had a lower (–64%) net post-fire water yield, comprising a flow enhancement (+38%) attributed to wildfire that was offset (–102%) by lower precipitation in the post-fire period. Changes in streamflow within a watershed in South Carolina had no apparent link to periods of prescribed burning but matched a very wet winter and reports of storm damage. The presented framework is thus unique in its ability to detect and quantify the effects of fire or other disturbances, even if the date or nature of the disturbance event is uncertain, and regardless of precipitation trends.

  14. Spatial and temporal patterns of bank failure during extreme flood events: Evidence of nonlinearity and self-organised criticality at the basin scale?

    NASA Astrophysics Data System (ADS)

    Thompson, C. J.; Croke, J. C.; Grove, J. R.

    2012-04-01

    Non-linearity in physical systems provides a conceptual framework to explain complex patterns and forms that derive from complex internal dynamics rather than external forcings, and it can be used to inform modeling and improve landscape management. One process that has previously been investigated to explore the existence of self-organised criticality (SOC) in river systems at the basin scale is bank failure. Spatial trends in bank failure have been quantified to determine whether the distribution of bank failures at the basin scale exhibits the necessary power-law magnitude/frequency distribution. More commonly, bank failures are investigated at a small scale using several cross-sections, with strong emphasis on local-scale factors such as bank height, cohesion and hydraulic properties. Advancing our understanding of non-linearity in such processes, however, requires many more studies in which both spatial and temporal measurements of the process can be used to investigate the existence or otherwise of non-linearity and self-organised criticality. This study presents measurements of bank failure throughout the Lockyer catchment in southeast Queensland, Australia, which experienced an extreme flood event in January 2011 resulting in the loss of human lives and geomorphic channel change. The most dominant form of fluvial adjustment consisted of changes in channel geometry, notably widespread bank failures, which were readily identifiable as 'scalloped' failure scarps. Their spatial extents were mapped using a high-resolution LiDAR-derived digital elevation model and were verified by field surveys and air photos. Pre-flood LiDAR coverage of the catchment also existed, allowing direct comparison of the magnitude and frequency of bank failures between the pre- and post-flood periods. Data were collected and analysed within a GIS framework and investigated for power-law relationships. Bank failures appeared random and occurred throughout the basin, but plots of magnitude and frequency did display power-law scaling of failures. In addition, there was a lack of site-specific correlation between bank failure and other factors such as channel width, bank height and stream power. The data are used here to discuss the existence of SOC in fluvial systems and the relative role of local and basin-wide processes in influencing their distribution in space and time.
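
    The power-law magnitude-frequency check described here can be sketched in a few lines: rank the failure magnitudes, plot rank against magnitude in log-log space, and estimate the slope. The sketch below uses synthetic failure-scar areas in place of the mapped Lockyer data.

      # Power-law scaling check on a magnitude-frequency distribution (synthetic areas).
      import numpy as np

      rng = np.random.default_rng(1)
      # Synthetic failure-scar areas drawn from a Pareto (power-law) distribution, in m^2.
      areas = (rng.pareto(a=1.5, size=2000) + 1.0) * 10.0

      areas_sorted = np.sort(areas)[::-1]
      exceedance = np.arange(1, areas_sorted.size + 1)   # rank = number of events >= area

      # A power law appears as a straight line in log-log space; estimate its exponent.
      slope, intercept = np.polyfit(np.log10(areas_sorted), np.log10(exceedance), 1)
      print(f"estimated scaling exponent: {-slope:.2f}")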

  15. Framework for probabilistic flood risk assessment in an Alpine region

    NASA Astrophysics Data System (ADS)

    Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2014-05-01

    Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, especially local authorities and insurance companies, in order to estimate possible flood losses. A framework for assessing flood risk has therefore been developed and is introduced in the present contribution. Flood risk is thereby defined as the combination of the probability of flood events and the potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of flooding. The potential flood damages are determined in the course of a vulnerability assessment, in which the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework that takes the probability of flood events and their spatio-temporal characteristics into account. Usually the probability of flooding is determined by means of recurrence intervals for an entire catchment without any spatial variation, which may lead to a misinterpretation of the flood risk. Within the presented framework the probabilistic flood risk assessment is based on the analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004), which is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on modelling multivariate extremes while considering the spatial dependence structure of flood events; its inputs are time series derived from river gauging stations. In a next step the historic and synthetic flood events are spatially interpolated from the point scale (i.e. river gauges) to the river network. For this purpose, topological kriging (Top-kriging) proposed by Skøien et al. (2006) is applied. Top-kriging considers the nested structure of river networks and is therefore suitable for regionalising flood characteristics. Thus, the characteristics of a large number of possible flood events can be transferred to arbitrary locations (e.g. community level) along the river network within a study region. This framework has been used to generate a set of spatially correlated river flood events in the Austrian Federal Province of Vorarlberg. In addition, loss-probability curves for each community have been calculated based on official inundation maps of the public authorities, the elements at risk and their vulnerability. One location along the river network within each community serves as the interface between the set of flood events and the individual loss-probability relationships for that community. Consequently, every flood event from the historic and synthetically generated dataset can be evaluated in monetary terms, and a time series comprising a large number of flood events and their corresponding monetary losses serves as the basis for a probabilistic flood risk assessment. This includes expected annual losses and estimates of extreme event losses that occur over the course of a certain time period. The results provide essential decision support for primary insurers, reinsurance companies and public authorities in setting up risk management at an adequate scale.
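
    The last step, turning a long event set with per-event monetary losses into expected annual losses and extreme-event estimates, reduces to simple statistics over the simulated years. A minimal sketch follows; the event counts and the loss distribution are purely illustrative and not taken from the Vorarlberg study.

      # Expected annual loss and a loss quantile from a synthetic event set (illustrative).
      import numpy as np

      rng = np.random.default_rng(2)
      n_years = 10_000                                 # length of the synthetic event series
      events_per_year = rng.poisson(0.8, n_years)      # hypothetical flood-event counts
      annual_loss = np.array([
          rng.lognormal(mean=13.0, sigma=1.0, size=k).sum() for k in events_per_year
      ])

      eal = annual_loss.mean()                             # expected annual loss
      loss_100yr = np.quantile(annual_loss, 1 - 1 / 100)   # annual loss exceeded ~once in 100 years
      print(f"expected annual loss: {eal:,.0f}")
      print(f"1-in-100-year annual loss: {loss_100yr:,.0f}")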

  16. Intelligent earthquake data processing for global adjoint tomography

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Hill, J.; Li, T.; Lei, W.; Ruan, Y.; Lefebvre, M. P.; Tromp, J.

    2016-12-01

    Due to the increased computational capability afforded by modern and future computing architectures, the seismology community is demanding a more comprehensive use of the full waveform information contained in recorded earthquake seismograms. Global waveform tomography is a complex workflow that matches observed seismic data with synthesized seismograms by iteratively updating the earth model parameters based on the adjoint-state method. This methodology allows us to compute a very accurate model of the Earth's interior. The synthetic data are simulated by solving the wave equation for the entire globe using a spectral-element method. In order to ensure the accuracy and stability of the inversion, both the synthesized and observed seismograms must be carefully pre-processed. Because the scale of the inversion problem is extremely large and there is a very large volume of data to be both read and written, an efficient and reliable pre-processing workflow must be developed. We are investigating intelligent algorithms based on a machine-learning (ML) framework that will automatically tune parameters for the data processing chain. One straightforward application of ML in data processing is to classify all candidate misfit calculation windows into usable and unusable ones, based on ML models such as neural networks, support vector machines or principal component analysis. The intelligent earthquake data processing framework will enable the seismology community to compute global waveform tomography using seismic data from an arbitrarily large number of earthquake events in the fastest, most efficient way.
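
    The window-classification idea can be made concrete with a toy example: given a few per-window quality features and analyst labels, train a classifier to separate usable from unusable windows. The sketch below (Python/scikit-learn) uses hypothetical features and a synthetic labelling rule, not the authors' pipeline.

      # Classify misfit windows as usable/unusable (hypothetical features and labels).
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(3)
      n = 5000
      # Hypothetical per-window features: cross-correlation, signal-to-noise ratio, time shift.
      X = np.column_stack([
          rng.uniform(0.0, 1.0, n),          # normalized cross-correlation
          rng.uniform(0.5, 20.0, n),         # signal-to-noise ratio
          np.abs(rng.normal(0.0, 5.0, n)),   # absolute time shift, seconds
      ])
      # Toy labelling rule standing in for analyst-reviewed window selections.
      y = (X[:, 0] > 0.7) & (X[:, 1] > 3.0) & (X[:, 2] < 8.0)

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
      clf.fit(X_train, y_train)
      print("held-out accuracy:", clf.score(X_test, y_test))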

  17. United We Stand: Emphasizing Commonalities Across Cognitive-Behavioral Therapies

    PubMed Central

    Mennin, Douglas S.; Ellard, Kristen K.; Fresco, David M.; Gross, James J.

    2016-01-01

    Cognitive behavioral therapy (CBT) has a rich history of alleviating the suffering associated with mental disorders. Recently, there have been exciting new developments, including multi-component approaches, incorporated alternative therapies (e.g., meditation), targeted and cost-effective technologies, and integrated biological and behavioral frameworks. These field-wide changes have led some to emphasize the differences among variants of CBT. Here, we draw attention to commonalities across cognitive-behavioral therapies, including shared goals, change principles, and therapeutic processes. Specifically, we offer a framework for examining common CBT characteristics that emphasizes behavioral adaptation as a unifying goal and three core change principles, namely (1) context engagement to promote adaptive imagining and enacting of new experiences; (2) attention change to promote adaptive sustaining, shifting, and broadening of attention; and (3) cognitive change to promote adaptive perspective taking on events so as to alter verbal meanings. Further, we argue that specific intervention components including behavioral exposure/activation, attention training, acceptance/tolerance, decentering/defusion, and cognitive reframing may be emphasized to a greater or lesser degree by different treatment packages but are still fundamentally common therapeutic processes that are present across approaches and are best understood by their relationships to these core CBT change principles. We conclude by arguing for shared methodological and design frameworks for investigating unique and common characteristics to advance a unified and strong voice for CBT in a widening, increasingly multimodal and interdisciplinary, intervention science. PMID:23611074

  18. Systematic evaluation of atmospheric chemistry-transport model CHIMERE

    NASA Astrophysics Data System (ADS)

    Khvorostyanov, Dmitry; Menut, Laurent; Mailler, Sylvain; Siour, Guillaume; Couvidat, Florian; Bessagnet, Bertrand; Turquety, Solene

    2017-04-01

    Regional-scale atmospheric chemistry-transport models (CTM) are used to develop air quality regulatory measures, to support environmentally sensitive decisions in industry, and to address a variety of scientific questions involving atmospheric composition. Evaluating model performance against measurement data is critical to understanding the models' limits and the degree of confidence that can be placed in their results. The CHIMERE CTM (http://www.lmd.polytechnique.fr/chimere/) is a French national tool for operational forecasting and decision support and is widely used in the international research community in various areas of atmospheric chemistry and physics, climate, and environment (http://www.lmd.polytechnique.fr/chimere/CW-articles.php). This work presents the model evaluation framework applied systematically to new CHIMERE versions in the course of continuous model development. The framework uses three of the four CTM evaluation types identified by the Environmental Protection Agency (EPA) and the American Meteorological Society (AMS): operational, diagnostic, and dynamic. It makes it possible to compare overall model performance across subsequent model versions (operational evaluation), identify specific processes and/or model inputs that could be improved (diagnostic evaluation), and test the model's sensitivity to changes in air quality, such as emission reductions and meteorological events (dynamic evaluation). The observation datasets currently used for the evaluation are EMEP (surface concentrations), AERONET (optical depths), and WOUDC (ozone sounding profiles). The framework is implemented as an automated processing chain and allows interactive exploration of the results via a web interface.
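
    The operational part of such an evaluation typically reduces to a handful of statistics computed over paired model and observation series. A minimal sketch, with placeholder series standing in for CHIMERE output and EMEP measurements:

      # Operational evaluation statistics for paired model/observation series (placeholder data).
      import numpy as np

      rng = np.random.default_rng(4)
      obs = rng.gamma(shape=2.0, scale=10.0, size=365)     # e.g. daily surface concentrations
      mod = obs * 0.9 + rng.normal(0.0, 5.0, obs.size)     # e.g. model output at the same site

      bias = np.mean(mod - obs)
      rmse = np.sqrt(np.mean((mod - obs) ** 2))
      corr = np.corrcoef(mod, obs)[0, 1]
      print(f"mean bias: {bias:.2f}  RMSE: {rmse:.2f}  correlation: {corr:.2f}")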

  19. An event-triggered control approach for the leader-tracking problem with heterogeneous agents

    NASA Astrophysics Data System (ADS)

    Garcia, Eloy; Cao, Yongcan; Casbeer, David W.

    2018-05-01

    This paper presents an event-triggered control and communication framework for the cooperative leader-tracking problem with communication constraints. Continuous communication among agents is not assumed in this work and decentralised event-based strategies are proposed for agents with heterogeneous linear dynamics. Also, the leader dynamics are unknown and only intermittent measurements of its states are obtained by a subset of the followers. The event-based method not only represents a way to restrict communication among agents, but it also provides a decentralised scheme for scheduling information broadcasts. Notably, each agent is able to determine its own broadcasting instants independently of any other agent in the network. In an extension, the case where transmission of information is affected by time-varying communication delays is addressed. Finally, positive lower-bounds on the inter-event time intervals are obtained in order to show that Zeno behaviour does not exist and, therefore, continuous exchange of information is never needed in this framework.
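
    The central event-triggered idea, broadcasting a state only when the discrepancy between the current state and the last broadcast value exceeds a threshold, can be sketched in a few lines. The scalar dynamics, gain and threshold below are illustrative and are not the controller designed in the paper.

      # Event-triggered broadcasting sketch: transmit only when the state error exceeds a threshold.
      dt, n_steps = 0.01, 1000
      threshold = 0.05                       # illustrative trigger threshold
      x = 1.0                                # scalar agent state
      x_broadcast = x                        # last value sent to neighbours
      broadcasts = 0

      for _ in range(n_steps):
          u = -1.0 * x_broadcast             # control uses only the last broadcast state
          x += dt * (0.2 * x + u)            # simple scalar dynamics stabilised by u
          if abs(x - x_broadcast) > threshold:   # event-trigger condition
              x_broadcast = x
              broadcasts += 1

      print(f"broadcasts used: {broadcasts} of {n_steps} steps")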

  20. Security Event Recognition for Visual Surveillance

    NASA Astrophysics Data System (ADS)

    Liao, W.; Yang, C.; Yang, M. Ying; Rosenhahn, B.

    2017-05-01

    With the rapidly increasing deployment of surveillance cameras, reliable methods for automatically analyzing surveillance video and recognizing special events are demanded by many practical applications. This paper proposes a novel and effective framework for security event analysis in surveillance videos. First, a convolutional neural network (CNN) framework is used to detect objects of interest in the given videos. Second, the owners of the objects are recognized and monitored in real time. Whenever someone moves an object, the system verifies whether that person is its owner; if not, the event is further analyzed and distinguished between two different scenes: moving the object away or stealing it. To validate the proposed approach, a new video dataset consisting of various scenarios is constructed for these more complex tasks. For comparison purposes, experiments are also carried out on benchmark databases for abandoned luggage detection. The experimental results show that the proposed approach outperforms state-of-the-art methods and is effective in recognizing complex security events.

  1. Applying Kaplan-Meier to Item Response Data

    ERIC Educational Resources Information Center

    McNeish, Daniel

    2018-01-01

    Some IRT models can be equivalently modeled in alternative frameworks such as logistic regression. Logistic regression can also model time-to-event data, which concerns the probability of an event occurring over time. Using the relation between time-to-event models and logistic regression and the relation between logistic regression and IRT, this…

  2. Prior probability modulates anticipatory activity in category-specific areas.

    PubMed

    Trapp, Sabrina; Lepsien, Jöran; Kotz, Sonja A; Bar, Moshe

    2016-02-01

    Bayesian models are currently a dominant framework for describing human information processing. However, it is not clear yet how major tenets of this framework can be translated to brain processes. In this study, we addressed the neural underpinning of prior probability and its effect on anticipatory activity in category-specific areas. Before fMRI scanning, participants were trained in two behavioral sessions to learn the prior probability and correct order of visual events within a sequence. The events of each sequence included two different presentations of a geometric shape and one picture of either a house or a face, which appeared with either a high or a low likelihood. Each sequence was preceded by a cue that gave participants probabilistic information about which items to expect next. This allowed examining cue-related anticipatory modulation of activity as a function of prior probability in category-specific areas (fusiform face area and parahippocampal place area). Our findings show that activity in the fusiform face area was higher when faces had a higher prior probability. The finding of a difference between levels of expectations is consistent with graded, probabilistically modulated activity, but the data do not rule out the alternative explanation of a categorical neural response. Importantly, these differences were only visible during anticipation, and vanished at the time of stimulus presentation, calling for a functional distinction when considering the effects of prior probability. Finally, there were no anticipatory effects for houses in the parahippocampal place area, suggesting sensitivity to stimulus material when looking at effects of prediction.

  3. Which resilience of the continental rainfall-runoff chain?

    NASA Astrophysics Data System (ADS)

    Fraedrich, Klaus

    2015-04-01

    Processes along the continental rainfall-runoff chain are extremely variable over a wide range of time and space scales. A key societal question is the multiscale resilience of this chain. We argue that an adequate framework to tackle this question can be obtained by combining observations (ranging from minutes to decades) with minimalist concepts: (i) Rainfall exhibits 1/f-spectra if represented as binary events (tropics), and extrema worldwide increase with duration according to Jennings' scaling law, as simulated by a censored first-order autoregressive process representing vertical moisture fluxes. (ii) Runoff volatility (Yangtze) shows data collapse which, linked to an intra-annual 1/f-spectrum, is represented by a single function (Gumbel), not unlike physical systems at criticality, while short and long return times of extremes are Weibull-distributed. (iii) Soil moisture, interpreted by a biased coin-flip Ansatz for rainfall events, provides an equation of state to the surface energy and water flux balances comprising Budyko's framework for quasi-stationary watershed analysis. (iv) Vegetation greenness (NDVI), included as an active tracer, extends Budyko's eco-hydrologic state space analysis, supplements the common geographical presentations, and may be linked to a minimalist biodiversity concept. (v) Finally, attributions of change to external (or climate) and internal (or anthropogenic) causes are determined by eco-hydrologic state space trajectories using surface flux ratios of energy excess (loss by sensible heat over supply by net radiation) versus water excess (loss by discharge over gain by precipitation). Risk estimates (by GCM emulators) and possible policy advice mechanisms enter the outlook.

  4. Decision Aids Using Heterogeneous Intelligence Analysis

    DTIC Science & Technology

    2010-08-20

    developing a Geocultural service, a software framework and inferencing engine for the Transparent Urban Structures program. The scope of the effort...has evolved as the program has matured and now includes multiple data sources, as well as interfaces out to the ONR architectural framework. Tasks...

  5. Modelling highly variable environmental factors to assess potential microbial respiration in complex floodplain landscapes

    PubMed Central

    Tritthart, Michael; Welti, Nina; Bondar-Kunze, Elisabeth; Pinay, Gilles; Hein, Thomas; Habersack, Helmut

    2011-01-01

    The hydrological exchange conditions strongly determine the biogeochemical dynamics in river systems. More specifically, the connectivity of surface waters between main channels and floodplains is directly controlling the delivery of organic matter and nutrients into the floodplains, where biogeochemical processes recycle them with high rates of activity. Hence, an in-depth understanding of the connectivity patterns between main channel and floodplains is important for the modelling of potential gas emissions in floodplain landscapes. A modelling framework that combines steady-state hydrodynamic simulations with long-term discharge hydrographs was developed to calculate water depths as well as statistical probabilities and event durations for every node of a computation mesh being connected to the main river. The modelling framework was applied to two study sites in the floodplains of the Austrian Danube River, East of Vienna. Validation of modelled flood events showed good agreement with gauge readings. Together with measured sediment properties, results of the validated connectivity model were used as basis for a predictive model yielding patterns of potential microbial respiration based on the best fit between characteristics of a number of sampling sites and the corresponding modelled parameters. Hot spots of potential microbial respiration were found in areas of lower connectivity if connected during higher discharges and areas of high water depths. PMID:27667961
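
    The connectivity statistics such a framework produces per mesh node, a connection probability and the durations of connection events, can be derived from a discharge series and a per-node connection threshold. A minimal sketch with a hypothetical threshold and a synthetic hydrograph:

      # Connectivity statistics from a daily discharge series (hypothetical threshold and data).
      import numpy as np

      rng = np.random.default_rng(5)
      discharge = rng.gamma(shape=2.0, scale=900.0, size=3650)   # ~10 years of daily flow, m^3/s
      threshold = 2500.0                                          # flow at which this node connects

      connected = discharge >= threshold
      prob_connected = connected.mean()                           # fraction of time connected

      # Durations of individual connection events (runs of consecutive connected days).
      edges = np.diff(connected.astype(int))
      starts = np.flatnonzero(edges == 1) + 1
      ends = np.flatnonzero(edges == -1) + 1
      if connected[0]:
          starts = np.r_[0, starts]
      if connected[-1]:
          ends = np.r_[ends, connected.size]
      durations = ends - starts

      print(f"connection probability: {prob_connected:.2%}")
      print(f"mean event duration: {durations.mean():.1f} days")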

  6. Novel Real-time Alignment and Calibration of the LHCb detector in Run2

    NASA Astrophysics Data System (ADS)

    Martinelli, Maurizio; LHCb Collaboration

    2017-10-01

    LHCb has introduced a novel real-time detector alignment and calibration strategy for LHC Run2. Data collected at the start of the fill are processed in a few minutes and used to update the alignment parameters, while the calibration constants are evaluated for each run. This procedure improves the quality of the online reconstruction. For example, the vertex locator is retracted and reinserted for stable beam conditions in each fill to be centred on the primary vertex position in the transverse plane. Consequently its position changes on a fill-by-fill basis. Critically, this new real-time alignment and calibration procedure allows identical constants to be used in the online and offline reconstruction, thus improving the correlation between triggered and offline-selected events. This offers the opportunity to optimise the event selection in the trigger by applying stronger constraints. The required computing time constraints are met thanks to a new dedicated framework using the multi-core farm infrastructure for the trigger. The motivation for a real-time alignment and calibration of the LHCb detector is discussed from both the operational and physics performance points of view. Specific challenges of this novel configuration are discussed, as well as the working procedures of the framework and its performance.

  7. Integration of a Self-Coherence Algorithm into DISAT for Forced Oscillation Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follum, James D.; Tuffner, Francis K.; Amidan, Brett G.

    2015-03-03

    With the increasing number of phasor measurement units (PMUs) on the power system, behaviors that previously were not observable are becoming more apparent. Oscillatory behavior, notably forced oscillations, is one such phenomenon. However, the large amount of data coming from the PMUs makes manually detecting and locating these oscillations difficult. To automate portions of the process, an oscillation detection routine was coded into the Data Integrity and Situational Awareness Tool (DISAT) framework. Integration into the DISAT framework allows forced oscillations to be detected and information about the event to be provided to operational engineers. The oscillation detection algorithm integrates with the data handling and atypical-data detection capabilities of DISAT, building on a standard library of functions. This report details that integration, with information on the algorithm, some implementation issues, and some sample results from the western United States' power grid.
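
    The self-coherence idea can be illustrated independently of DISAT: a sustained forced oscillation remains coherent with a time-delayed copy of itself, while ambient noise does not. The sketch below (Python/SciPy) uses a synthetic PMU-like signal; the sampling rate, delay and oscillation frequency are illustrative.

      # Self-coherence sketch for forced-oscillation detection (illustrative, not DISAT's code).
      import numpy as np
      from scipy.signal import coherence

      fs = 30.0                                    # PMU reporting rate, samples/s
      t = np.arange(0, 600, 1 / fs)                # ten minutes of data
      rng = np.random.default_rng(6)
      signal = (0.02 * np.sin(2 * np.pi * 1.3 * t)     # weak forced oscillation at 1.3 Hz
                + rng.normal(0.0, 0.05, t.size))       # ambient noise

      lag = int(60 * fs)                           # compare the signal with a 60 s delayed copy
      f, cxy = coherence(signal[:-lag], signal[lag:], fs=fs, nperseg=1024)

      peak = f[np.argmax(cxy)]
      print(f"highest self-coherence at {peak:.2f} Hz (value {cxy.max():.2f})")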

  8. A Conceptual framework of Strategy, Structure and Innovative Behaviour for the Development of a Dynamic Simulation Model

    NASA Astrophysics Data System (ADS)

    Konstantopoulos, Nikolaos; Trivellas, Panagiotis; Reklitis, Panagiotis

    2007-12-01

    According to many researchers of organizational theory, a great number of problems encountered by the manufacturing firms are due to their failure to foster innovative behaviour by aligning business strategy and structure. From this point of view, the fit between strategy and structure is essential in order to facilitate firms' innovative behaviour. In the present paper, we adopt Porter's typology to operationalise business strategy (cost leadership, innovative and marketing differentiation, and focus). Organizational structure is built on four dimensions (centralization, formalization, complexity and employees' initiatives to implement new ideas). Innovativeness is measured as product innovation, process and technological innovation. This study provides the necessary theoretical framework for the development of a dynamic simulation method, although the simulation of social events is a quite difficult task, considering that there are so many alternatives (not all well understood).

  9. A Web-Based Framework For a Time-Domain Warehouse

    NASA Astrophysics Data System (ADS)

    Brewer, J. M.; Bloom, J. S.; Kennedy, R.; Starr, D. L.

    2009-09-01

    The Berkeley Transients Classification Pipeline (TCP) uses a machine-learning classifier to automatically categorize transients from large data torrents and provide automated notification of astronomical events of scientific interest. As part of the training process, we created a large warehouse of light-curve sources with well-labelled classes that serve as priors to the classification engine. This web-based interactive framework, which we are now making public via DotAstro.org (http://dotastro.org/), allows us to ingest time-variable source data in a wide variety of formats and store it in a common internal data model. Data is passed between pipeline modules in a prototype XML representation of time-series format (VOTimeseries), which can also be emitted to collaborators through dotastro.org. After import, the sources can be visualized using Google Sky, light curves can be inspected interactively, and classifications can be manually adjusted.

  10. Unraveling dynamics of human physical activity patterns in chronic pain conditions

    NASA Astrophysics Data System (ADS)

    Paraschiv-Ionescu, Anisoara; Buchser, Eric; Aminian, Kamiar

    2013-06-01

    Chronic pain is a complex disabling experience that negatively affects cognitive, affective and physical functions as well as behavior. Although the interaction between chronic pain and physical functioning is a well-accepted paradigm in clinical research, understanding how pain affects individuals' daily-life behavior remains a challenging task. Here we develop a methodological framework that allows disruptive pain-related interferences with real-life physical activity to be documented objectively. The results reveal that meaningful information is contained in the temporal dynamics of activity patterns, and that an analytical model based on the theory of bivariate point processes can be used to describe physical activity behavior. The model parameters capture the dynamic interdependence between periods and events and determine a 'signature' of the activity pattern. The study is likely to contribute to the clinical understanding of complex pain/disease-related behaviors and to establish a unified mathematical framework for quantifying the complex dynamics of various human activities.

  11. Data Assimilation Results from PLASMON

    NASA Astrophysics Data System (ADS)

    Jorgensen, A. M.; Lichtenberger, J.; Duffy, J.; Friedel, R. H.; Clilverd, M.; Heilig, B.; Vellante, M.; Manninen, J. K.; Raita, T.; Rodger, C. J.; Collier, A.; Reda, J.; Holzworth, R. H.; Ober, D. M.; Boudouridis, A.; Zesta, E.; Chi, P. J.

    2013-12-01

    VLF and magnetometer observations can be used to remotely sense the plasmasphere: VLF whistler waves can be used to measure the electron density, and magnetic field line resonance (FLR) measurements can be used to measure the mass density. In principle it is then possible to remotely map the plasmasphere with a network of ground-based stations, which are also less expensive and more permanent than satellites. The PLASMON project, funded by the EU FP-7 program, is in the process of doing just this. A large number of ground-based observations will be input into a data-assimilative framework which models the plasmasphere structure and dynamics. The data assimilation framework combines the Ensemble Kalman Filter with the Dynamic Global Core Plasma Model. In this presentation we will describe the plasmasphere model, the data assimilation approach that we have taken, PLASMON data, and data assimilation results for specific events.
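
    A single Ensemble Kalman Filter analysis step, of the kind repeated each time new observations arrive, can be sketched compactly. The ensemble below is a random placeholder rather than Dynamic Global Core Plasma Model output, and a single scalar density observation is assumed.

      # One Ensemble Kalman Filter analysis step (placeholder ensemble, not the DGCPM model).
      import numpy as np

      rng = np.random.default_rng(7)
      n_members, n_state = 50, 20
      ensemble = rng.normal(10.0, 2.0, (n_members, n_state))   # forecast ensemble (e.g. densities)

      H = np.zeros(n_state)
      H[5] = 1.0                              # observe a single state element
      obs, obs_var = 13.0, 0.5                # observation and its error variance

      hx = ensemble @ H                                    # ensemble mapped to observation space
      x_mean = ensemble.mean(axis=0)
      cov_xy = (ensemble - x_mean).T @ (hx - hx.mean()) / (n_members - 1)
      var_y = hx.var(ddof=1) + obs_var
      gain = cov_xy / var_y                                # Kalman gain, shape (n_state,)

      # Perturbed-observation update of every ensemble member.
      perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), n_members)
      analysis = ensemble + np.outer(perturbed - hx, gain)
      print("prior mean at obs point:", x_mean[5], " posterior:", analysis.mean(axis=0)[5])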

  12. Spectrum of Slip Processes on the Subduction Interface in a Continuum Framework Resolved by Rate- and State-Dependent Friction and Adaptive Time Stepping

    NASA Astrophysics Data System (ADS)

    Herrendoerfer, R.; van Dinther, Y.; Gerya, T.

    2015-12-01

    To explore the relationships between subduction dynamics and megathrust earthquake potential, we have recently developed a numerical model that bridges the gap between processes on geodynamic and earthquake-cycle time scales. In a self-consistent, continuum-based framework including a visco-elasto-plastic constitutive relationship, cycles of megathrust earthquake-like ruptures were simulated through a purely slip-rate-dependent friction, albeit with very low slip rates (van Dinther et al., JGR, 2013). In addition to much faster earthquakes, a range of aseismic slip processes operate at different time scales in nature. These aseismic processes likely accommodate a considerable amount of the plate convergence and are thus relevant for estimating the long-term seismic coupling and related hazard in subduction zones. To simulate and resolve this wide spectrum of slip processes, we have implemented rate- and state-dependent friction (RSF) and adaptive time stepping in our continuum framework. The RSF formulation, in contrast to our previous friction formulation, takes the dependency of frictional strength on a state variable into account. It thereby allows for continuous plastic yielding inside rate-weakening regions, which leads to aseismic slip. In contrast to the conventional RSF formulation, we relate slip velocities to strain rates and use an invariant formulation; we therefore do not require the a priori definition of infinitely thin, planar faults in a homogeneous elastic medium. With this new implementation of RSF, we succeed in producing consistent cycles of frictional instabilities. By changing the frictional parameters a and b and the characteristic slip distance, we observe a transition from stable sliding to stick-slip behaviour. This transition is in general agreement with predictions from theoretical estimates of the nucleation size, thereby validating our implementation to first order. By incorporating adaptive time stepping based on a fraction of the characteristic slip distance over the maximum slip velocity, we are able to resolve stick-slip events and increase computational speed. In this better-resolved framework, we examine the role of aseismic slip in the megathrust cycle and its dependence on subduction velocity.
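
    The adaptive time-stepping rule mentioned at the end, a step size set to a fraction of the characteristic slip distance divided by the peak slip velocity, is simple enough to state directly. The numbers below are illustrative, not the values used in the model.

      # Adaptive time step as a fraction of characteristic slip distance over peak slip rate.
      def adaptive_dt(v_max, d_c=0.01, fraction=0.2, dt_max=3.15e7):
          """Illustrative rule: dt = fraction * D_c / V_max, capped at roughly one year (seconds)."""
          return min(fraction * d_c / v_max, dt_max)

      for v in (1e-9, 1e-6, 1e-2, 1.0):        # slip rates from plate motion to coseismic, m/s
          print(f"V_max = {v:.0e} m/s  ->  dt = {adaptive_dt(v):.2e} s")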

  13. Disambiguating past events: Accurate source memory for time and context depends on different retrieval processes.

    PubMed

    Persson, Bjorn M; Ainge, James A; O'Connor, Akira R

    2016-07-01

    Current animal models of episodic memory are usually based on demonstrating integrated memory for what happened, where it happened, and when an event took place. These models aim to capture the testable features of the definition of human episodic memory which stresses the temporal component of the memory as a unique piece of source information that allows us to disambiguate one memory from another. Recently though, it has been suggested that a more accurate model of human episodic memory would include contextual rather than temporal source information, as humans' memory for time is relatively poor. Here, two experiments were carried out investigating human memory for temporal and contextual source information, along with the underlying dual process retrieval processes, using an immersive virtual environment paired with a 'Remember-Know' memory task. Experiment 1 (n=28) showed that contextual information could only be retrieved accurately using recollection, while temporal information could be retrieved using either recollection or familiarity. Experiment 2 (n=24), which used a more difficult task, resulting in reduced item recognition rates and therefore less potential for contamination by ceiling effects, replicated the pattern of results from Experiment 1. Dual process theory predicts that it should only be possible to retrieve source context from an event using recollection, and our results are consistent with this prediction. That temporal information can be retrieved using familiarity alone suggests that it may be incorrect to view temporal context as analogous to other typically used source contexts. This latter finding supports the alternative proposal that time since presentation may simply be reflected in the strength of memory trace at retrieval - a measure ideally suited to trace strength interrogation using familiarity, as is typically conceptualised within the dual process framework. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Framework for Modeling High-Impact, Low-Frequency Power Grid Events to Support Risk-Informed Decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.

    2015-12-03

    Natural and man-made hazardous events resulting in the loss of grid infrastructure assets challenge the electric power grid's security and resilience. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic, as there exists an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency (HILF) events is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on the development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMD), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated based on historic experience; however, there exists a stable of risk modeling techniques for rare events that have proven of value across a wide range of engineering application domains. There is an active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of it in the context of grid modernization activities. The availability of a grid HILF risk model, integrated across multi-hazard domains, which, when interrogated, can support transparent, defensible and effective decisions, is an attractive prospect for these communities. In this report, we document an integrated HILF risk framework intended to inform the development of risk models. These models would be based on the systematic and comprehensive (to within scope) characterization of hazards to the level of detail required for modeling risk, identification of the stressors associated with the hazards (i.e., the means of impacting grid and supporting infrastructure), characterization of the vulnerability of assets to these stressors and the probabilities of asset compromise, the grid's dynamic response to the asset failures, and assessment of the subsequent severities of consequence with respect to selected impact metrics, such as power outage duration and geographic reach. Specifically, the current framework is being developed to: (1) provide the conceptual and overarching technical paradigms for the development of risk models; (2) identify the classes of models required to implement the framework, providing examples of existing models and identifying where modeling gaps exist; (3) identify the types of data required, addressing circumstances under which data are sparse and the formal elicitation of informed judgment might be required; and (4) identify means by which the resultant risk models might be interrogated to form the necessary basis for risk management.

  15. Setting the stage to advance the adverse outcome pathway (AOP) framework through horizon scanning

    EPA Science Inventory

    Recognizing the international interest surrounding the adverse outcome pathway framework, which captures existing information describing causal linkages between a molecular initiating event through levels of biological organization to an adverse outcome of regulatory significance...

  16. Global Infrasound Association Based on Probabilistic Clutter Categorization

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Mialle, P.

    2015-12-01

    The IDC collects waveforms from a global network of infrasound sensors maintained by the IMS, automatically detects signal onsets, and associates them to form event hypotheses. However, a large number of signal onsets are due to local clutter sources such as microbaroms (from standing waves in the oceans), waterfalls, dams, gas flares, and surf (ocean breaking waves). These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on the categorization of clutter using long-term trends in detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NET-VISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydro-acoustic and infrasound processing built on a unified probabilistic framework. Notes: the attached figure shows all the unassociated arrivals detected at IMS station I09BR for 2012, distributed by azimuth and center frequency (the figure title displays the bandwidth of the kernel density estimate along the azimuth and frequency dimensions). The plot shows multiple microbarom sources as well as other sources of infrasound clutter. A diverse clutter field such as this one is quite common for most IMS infrasound stations, and it highlights the dangers of forming events without due consideration of this source of noise. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NET-VISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.
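
    The per-station clutter model described, built from long-term detection azimuths and centre frequencies, can be sketched as a two-dimensional kernel density estimate that assigns a clutter density to any new detection. The detections below are synthetic; they are not I09BR data.

      # 2-D kernel density estimate over detection azimuth and centre frequency (synthetic data).
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(8)
      # Synthetic clutter: a microbarom-like cluster plus a uniform background of detections.
      azimuth = np.r_[rng.normal(240.0, 8.0, 800), rng.uniform(0.0, 360.0, 200)]
      freq = np.r_[rng.normal(0.2, 0.05, 800), rng.uniform(0.1, 4.0, 200)]

      kde = gaussian_kde(np.vstack([azimuth, freq]))

      # Score a new detection: high density suggests clutter, low density a candidate event signal.
      new_detection = np.array([[241.0], [0.22]])
      print("clutter density at new detection:", kde(new_detection)[0])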

  17. Combating terrorism : selected challenges and related recommendations

    DOT National Transportation Integrated Search

    2001-01-01

    This General Accounting Office (GAO) report was already scheduled for release before the events of September 11. The report summarizes federal efforts to combat terrorism prior to these events. This report assesses (1) the current framework for leade...

  18. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    NASA Astrophysics Data System (ADS)

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose using the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between highly technical risk-management discussions and the public risk-aversion debate. We also propose that the framework could be used for stress-testing resilience.

  19. Global Infrasound Association Based on Probabilistic Clutter Categorization

    NASA Astrophysics Data System (ADS)

    Arora, Nimar; Mialle, Pierrick

    2016-04-01

    The IDC advances its methods and continuously improves its automatic system for infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine the signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, and surf (ocean breaking waves). These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on the categorization of clutter using long-term trends in detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.

  20. Physics process level discrimination of detections for GATE: assessment of contamination in SPECT and spurious activity in PET.

    PubMed

    De Beenhouwer, Jan; Staelens, Steven; Vandenberghe, Stefaan; Verhaeghe, Jeroen; Van Holen, Roel; Rault, Erwann; Lemahieu, Ignace

    2009-04-01

    The GEANT4 Application for Tomographic Emission (GATE) is one of the most detailed Monte Carlo simulation tools for SPECT and PET. It allows for realistic phantoms, complex decay schemes, and a large variety of detector geometries. However, only a fraction of the information in each particle history is available for postprocessing. In order to extend the analysis capabilities of GATE, a flexible framework was developed. This framework allows all detected events to be subdivided according to their type: in PET, true coincidences from others, and in SPECT, geometrically collimated photons from others. The authors' framework can be applied to any isotope, phantom, and detector geometry available in GATE. It is designed to enhance the usability of GATE for the study of contamination and for the investigation of the properties of current and future prototype detectors. The authors apply the framework to a case study of Bexxar, first assuming labeling with 124I, then with 131I. It is shown that with 124I PET, results with an optimized window improve upon those with the standard window but achieve less than half of the ideal improvement. Nevertheless, 124I PET shows improved resolution compared to 131I SPECT with triple-energy-window scatter correction.
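
    For PET, the subdivision of detected events amounts to simple per-coincidence logic once the relevant history information has been kept. The sketch below uses hypothetical record fields, not GATE's actual output format.

      # Sort PET coincidences into trues, scatters and randoms (hypothetical record fields).
      from dataclasses import dataclass

      @dataclass
      class Coincidence:
          event_id_1: int          # simulated decay that produced the first photon
          event_id_2: int          # simulated decay that produced the second photon
          scattered: bool          # whether either photon scattered before detection

      def classify(c: Coincidence) -> str:
          if c.event_id_1 != c.event_id_2:
              return "random"                  # photons from two different decays
          return "scatter" if c.scattered else "true"

      sample = [Coincidence(1, 1, False), Coincidence(2, 3, False), Coincidence(4, 4, True)]
      print([classify(c) for c in sample])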
